
Vol. 8 Issue 1


A Harvard College Undergraduate Research Association Publication


Fall 2020

On The Cover: Neurographica artwork by Anton Antokhin



Masthead

Editors-in-Chief: Marissa Sumathipala, Minjue Wu
Primary Research: Patrick Magahis, Editor
Features: Brennan Disdar, Editor
Design: Kemi Ashing-Giwa, Editor

Sponsors: Brevia would like to thank the Harvard Undergraduate Council and the Harvard College Office of Undergraduate Research and Fellowships for their generous support of our publication.

Writers: Harris Allen, Kemi Ashing-Giwa, Armeen Barghi, Gabriela Escalante, Abigail Miller, Tasnia Nabil, Eric Costa Pimentel, Geela Margo Ramos, Andrea Rivera, Marissa Sumathipala, Isabella Trasolini, Lara van Rooyen, Hilina Woldemichael

Acknowledgements: Thank you to Gregory Llacer for his extensive support of Brevia and HCURA.

Notes: The views expressed in Brevia articles are solely those of the authors. They do not represent the official stance of the magazine or any of its sponsors or affiliates.




Humans are creatures of perception. Sight, touch, smell, hearing, and taste: these five senses hold the power to shape how we experience the world.

Perception takes the objective reality of our material world and transforms it into the abstract. Photons from a red-orange sunset become transient electrical impulses flickering across our brain to evoke a sense of joy. Research straddles the objective and abstract, interpreting tangible data to form new ideas. In this issue, you will learn about why white has no color, how we perceive mental illness, and what happens when we lose our senses. So turn the page and watch as we transform our perceptions of research into ideas that matter.

Marissa Sumathipala
Editor-in-Chief



Table of Contents

Covers: Perceive

Why White has No Color: Receptive Fields for Color Vision, by Marissa Sumathipala
Oral Microbiome and Taste Perception, by Hilina B. Woldemichael
The Balancing Act Between Senses, by Lara van Rooyen (p. 14)
Osteocalcin: The Skeleton of the Acute Phase Response, by Gabriela Escalante
The Middle Ground, by Abigail Miller
A Fresh Approach to Genome Editing, by Chibuike Uwakwe (p. 22)
How Our Brains Perceive Mental Illness… Without Us Knowing, by Isabella Trasolini
Healthcare Wearable Technologies: Exploring Remote Medical Tracking and Personalized Telemedicine, by Eric Costa Pimentel
Origin of Life: the Initiative and Beyond, by Kemi Ashing-Giwa





Why White Has No Color: Receptive Fields for Color Vision

By Marissa Sumathipala

Is white a color? This age-old question has puzzled scientists, physicists, and philosophers alike for nearly as long as the question of whether color even exists. Nonetheless, almost all of us have experienced the wonders of color perception. Surprisingly, the dizzying array of hues we perceive in everyday life arises from combinations of just three types of color receptors in our eyes: red, green, and blue.

When light reflects off an object, such as a bright-red apple, a photon travels at a certain wavelength and enters the retina. There, it triggers a chemical cascade in specialized photoreceptor cells, called cones, that respond to different colors of light by relaying an electrical signal to other cells. In the case of the apple, only the red cones will send a signal. Eventually, this signal reaches the last step in the retinal pathway: retinal ganglion cells. These cells extend out of the retina and relay the visual information all the way to different brain regions. Along the way, the information is combined and processed to identify features like shape and hue, ultimately allowing us to recognize the object as a delicious red apple.

What about objects that appear white? What we perceive as white light is actually composed of all wavelengths from across the entire visible spectrum. This may seem counterintuitive, but it is the phenomenon responsible for rainbows: water droplets separate white light into its component colors, producing a colorful spectrum. However, if white light is composed of the entire color spectrum, why do we perceive it as having no color?

The answer lies in the receptive field of the retinal ganglion cells. A neuron’s receptive field is the region of the world where a stimulus can activate the cell. Pioneering neuroscientists Hubel and Wiesel, who made some of the



earliest discoveries of how the visual system works, were among the first to characterize the receptive fields of ganglion cells in mammals. To do so, they first developed a new method to measure how electrical signals in a single neuron change with visual stimuli by inserting a tiny electrode into the brain of their model organism, a spider monkey.1

A small bright spot of light in the center of a cell’s receptive field caused it to fire more electrical signals. A donut-shaped light stimulus had the opposite effect, making the cell fire less. Lastly, a large spot of light had no effect on the cell’s activity. From this, Hubel and Wiesel concluded that ganglion cells have a concentrically organized receptive field (Figure 1).1 Stimuli in the center of the receptive field excite the cell, and stimuli around the edges inhibit it.

But the earliest of these studies were done on the retinal cells responsible for our black-and-white vision. The question of how color vision worked remained unanswered until a few years later. Careful experiments by Hubel, Wiesel, and others systematically characterized the response properties of hundreds of cells in the retina and uncovered a population of ganglion cells that responded to color.2 Astonishingly, these color-responsive ganglion cells had the same concentric receptive field organization as the ones for black-and-white vision. But instead of being activated by light in the center and inhibited by light on the edges, these cells were activated by red in the center and inhibited by green on the edges. In other words, a visual stimulus with red in the center caused maximal activation




of the cell, while a stimulus with green on the edges caused maximal inhibition of the cell. A stimulus with red in the center and green on the edges had no effect on the cell.2 Hubel and Wiesel also found cells with the opposite pattern: activated by green in the center and inhibited by red on the edges. Later studies by others characterized a similar receptive field pattern for blue and yellow light.3 These color-responsive ganglion cells can be grouped into four broad categories, shown in Figure 2.

These findings explain why we perceive white as having no color. When white light hits the retina, none of the four ganglion cell types change their firing rate. Take the red-on-center cell as an example: red light in the center activates the cell, but white light also contains green light, and the green light falling on the edges of the receptive field inhibits the cell. In essence, the two effects compete and cancel each other out. Thus, even though both red and green light are hitting the retina, they have no net effect on the cell. This competition between activation in the center and inhibition on the edges occurs for all four cell types, since white light contains every color of light, resulting in the perception of no color.

These landmark discoveries transformed our understanding of how vision works. They paved the way for further studies characterizing the increasingly complex receptive fields of neurons in the visual cortex. Though these studies were published over four decades ago, they remain a remarkable testament to the elegant principles that guide even the most complex of processes like vision.
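The cancellation described above is, at heart, simple arithmetic. Below is a toy sketch (not a model from the original studies; the function name, baseline, and numbers are invented for illustration) of a red-on-center, green-off-surround ganglion cell, whose response is baseline firing plus center excitation minus surround inhibition:

```python
# Toy illustration of a red-ON-center / green-OFF-surround ganglion cell.
# Response = baseline firing + center excitation - surround inhibition.
def opponent_cell_response(red_center, green_surround, baseline=10.0):
    """Firing rate of a hypothetical red-on/green-off cell.

    red_center, green_surround: intensities (arbitrary units) of the red
    light falling on the center and the green light falling on the surround.
    """
    excitation = red_center        # red in the center excites the cell
    inhibition = green_surround    # green on the edges inhibits it
    return baseline + excitation - inhibition

# Red spot in the center only: firing rises above baseline.
print(opponent_cell_response(red_center=5, green_surround=0))   # 15.0
# Green on the edges only: firing is suppressed below baseline.
print(opponent_cell_response(red_center=0, green_surround=5))   # 5.0
# White light contains both: excitation and inhibition cancel -> baseline.
print(opponent_cell_response(red_center=5, green_surround=5))   # 10.0
```

The last call is the white-light case: because white light stimulates the center and the surround equally, the cell's firing rate does not change, and no color signal is sent.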



Figure 1. Schematic of concentric receptive fields for black and white vision.

Figure 2. Color-opponent concentric receptive fields in retinal ganglion cells.4




Oral Microbiome and Taste Perception

By Hilina Woldemichael

The oral microbiome consists of all of the microscopic organisms—such as protozoa, fungi, viruses, and bacteria—that live inside the oral cavity, and it is second only to the gut microbiome in diversity. In your mouth live over 700 different species of bacteria alone. The oral microbiome was first characterized in 1674, when Antony van Leeuwenhoek observed “little living animalcules prettily moving” on his own dental plaque. These microbes form a symbiotic relationship with us, helping to ward off disease-causing bacteria while they make our mouths their home.1

As more research shows how oral microbes play a role in homeostasis and immune response, there is great interest in whether these microbes have other effects, such as influencing the way that we perceive taste. Camilla Cattaneo and her research team at the University of Milan wanted to explore the relationship between taste perception, food preferences, and the microbiota. Taste perception plays a crucial role in our ability to assess the nutrient content of food as well as to distinguish


between toxic and non-toxic substances. Moreover, preferences in foods can lead to unhealthy eating tendencies, such as the overconsumption of fat and sugar, which can lead to a nutrient-poor or overfed state. The potential of microbes to influence taste perception, and consequently food preference, is not well studied, and Cattaneo’s team wanted to elucidate the relationship between the two. Specifically, they studied the microbiota of two groups of people: super-tasters and non-tasters.2

Super-tasters were first classified by Linda Bartoshuk in studies of how genetic differences influence taste. She found that some people are extremely sensitive to the bitter-tasting compound 6-n-propylthiouracil (PROP). About 25-30% of the population are sensitive to PROP, while 25-30% cannot taste it at all, and 45-50% fall somewhere in between.3 The reason appears to be that super-tasters have a greater density of fungiform papillae on their tongues than do non-tasters.


Super-tasters tend to dislike many foods due to their elevated senses.4 This sensitivity has been linked to a propensity to avoid bitter foods such as vegetables, as well as hot, spicy foods.5,6 Non-tasters, on the other hand, are more likely to be attracted to sweet, high-fat foods and to have a preference for alcohol.7 Overall, there is clear evidence that super-tasters and non-tasters have distinct food preferences. Given that the types of food consumed have a direct impact on health, there is a link between sensitivity to PROP and health status, although there is still much discussion about whether the relationship is causative.8

The researchers studied fifty-nine subjects: thirty non-tasters and twenty-nine super-tasters. They measured the participants’

taste sensitivity, fungiform papillae density (FPD), and tongue microbiota composition—three characteristics usually studied independently. Taste sensitivity was judged by the threshold concentration at which individuals could distinguish a specific taste—sweet, salty, bitter, or sour. The non-taster group showed a much higher threshold than did the super-taster group, supporting previous work that determined super-tasters more strongly perceive all tastes, not just bitter. FPD, which is hypothesized to correlate with greater responsiveness to taste, was greater in super-tasters.

The researchers obtained samples from swabs of the subjects’ dorsal tongue surface and characterized the microbes present through sequencing. Although the diversity of microbes did not differ between the two groups, five bacterial genera were present in significantly greater amounts in the super-taster group—Actinomyces, Oribacterium, Solobacterium, Catonella, and Campylobacter—while members of the family Lachnospiraceae and class Erysipelotrichaceae were depleted. These findings are significant, as previous research has shown that bacteria such as Actinobacteria, the phylum to which



Actinomyces belongs, are connected to an increased taste sensitivity, particularly to bitter tastes. Additionally, Actinobacteria are known to produce compounds that are precursors to bitter acids or that enhance the bitter taste of foods. The authors ultimately concluded


that there is a statistically significant difference between the oral microbial compositions of super-tasters and non-tasters, and that this difference may play a role in taste perception and food preferences. However, they also noted that further work must be done to establish a causal


relationship. A larger sample size in both groups is necessary, as well as more in-depth study of eating habits. Furthermore, this study focused on normal-weight young adults. Given the influence of food preferences on weight, research on a more diverse group would yield more insight into the role oral microbiota play in an individual’s food

choices and consequent health outcomes. As more work is done to determine exactly how oral bacteria contribute to our sense of taste, we will be able to better understand the complexities of taste and its interactions with the oral microbial population.




The Balancing Act Between Senses

By Lara van Rooyen

Many of us take our senses for granted, often failing to note that the mechanisms by which we perceive our world are actually mediated by a precisely tuned cascade of molecules and sensors. Similarly, we interact with the world around us every day without being aware of our brain’s plasticity: its ability to change and adapt to the input it receives. These changes can occur through physical reorganization of which parts of the brain perform which functions, as well as through changes in the connections between neurons and the extent to which they communicate with one another.

While neuroscientists once believed that the brain was essentially unchanged once adulthood was reached, it is now well accepted that our daily experiences in adulthood can still alter our brains over time. Most excitingly, this property of the brain can be exploited in people with brain injuries: brain functions that have been lost can be regained as responsibility for performing those functions is transferred to a non-injured area of the brain.1



In November 2019, a group of scientists from Maryland took advantage of the fact that the brain can be reorganized and showed how neuroplasticity allowed the brains of mice kept in the dark for a week to compensate by changing connections in the parts of the brain responsible for hearing.2

This result was surprising, considering that even such a short period of visual deprivation could cause the brain to compensate by enhancing other senses. Researchers have long debated the length of sensory deprivation needed to induce compensatory mechanisms, with some scholars holding that only those with long-term sensory deficits, such as people born blind, show any meaningful improvement in the acuity of their other senses.

This study, published in the Journal of Neuroscience, measured changes in the primary auditory cortex through a technique called two-photon calcium imaging, a highly accurate way for scientists to track the activity of single neurons. In doing so, the researchers discovered changes in the way individual neurons of the primary auditory cortex responded to sounds of different frequencies: the overall population now contained more cells that responded best to higher-frequency sounds.




This finding is present in humans as well: those who have been blind from an early age show analogous changes in the auditory cortex that make them more proficient at processing sounds, and therefore at tasks involving locating the sources of sounds and perceiving pitch.4 However, this new study draws attention to the possibility that even those who lose their sight or other senses much later in life can accrue these adaptations in their other modes of perception. As those who have lost a sense or suffered a brain injury practice using their complementary senses, they form new neural connections and strengthen the communication between existing neurons, thereby allowing the individual to recover.

Giving the brain certain types of input does not only help it transmit neural impulses more effectively; it can also alter the organization of modules of the brain. For example, if no visual input is received for an extended period of time when someone is blind, parts of the visual cortex are actually repurposed to process the sensations of touch and hearing: the brain truly does not let any unused parts go to waste!5 So, while most of us are fortunate


enough to have possession of all of our senses, we can rest assured that our brain is a magnificently complex organ, constantly rearranging its neural pathways to help us adapt, even in the face of traumatic injury. The brains of those with impairments in one or more senses are reconfigured at a molecular and cellular level to provide the best functionality possible, compensating for the loss of sensory input through increased development of other sensory pathways.

Despite the research done on this subject so far, much remains to be done in elucidating the molecular mechanisms that drive the reorganization of parts of the brain. Once these mechanisms are properly understood, we will be able to make more accurate predictions of how long sensory deprivation needs to be experienced before functional changes start occurring. In addition, knowing the exact processes occurring in neurons during this period of functional reallocation could help us determine whether drugs could be developed to enhance these changes and provide people with better sensory capacities after their recovery from injury.




Osteocalcin: The Skeleton of the Acute Phase Response

By Gabriela Escalante

You may have heard stories of mothers who suddenly become strong enough to lift a car or fight off a polar bear to save their children. You may even have experienced something similar when a professor calls on you unexpectedly in class: a pounding heart, rapid breathing, a pale face, and a sudden desire to bolt from the room. It’s the acute stress response, or ASR, also known as the “fight or flight” reaction.

You may believe that the cause of the ASR is simply the release of adrenalin into circulation. First described by the eminent physiologist Walter Cannon in 1915, the ASR is commonly attributed to the sympathetic nervous system going into overdrive. But it turns out that it may not be quite that simple.

When conjecturing about the purpose of an organ, scientists sometimes look back at the evolutionary record for ideas. A group of scientists headed by Gerard Karsenty from the Columbia University Medical Center did just that, as described in their paper “Mediation of the Acute Stress Response by the Skeleton,”


published in the November 5, 2019 issue of Cell Metabolism.1 Karsenty and colleagues surmised that the skeleton must have developed in early vertebrates to protect vital organs and to facilitate locomotion in the wild environment of the early Earth; however, a skeleton alone would not be enough to save an animal from peril. Facing danger, an animal needs not just a skeleton, but also strong muscles to move the bones, fuel for the muscles to contract swiftly, a circulatory system to supply the fuel, and an alert nervous system to inform the animal of the best counterattack strategy or escape route. All of these conditions are primed by the acute stress response, a series of changes involving neurotransmitters and hormones—including glucocorticoids and epinephrine, also known as adrenalin—that prepare the animal for fight or flight: an increase in heart rate, elevation of blood pressure, mobilization of glucose from tissue stores, dilation of muscle arteries, and increased attentiveness. Karsenty and his colleagues argued that if the skeleton evolved for defense, then it should not be surprising that bones have a role

in the ASR. Supporting their hypothesis with an impressive series of experiments,

Karsenty and colleagues pointed to osteocalcin, a hormone produced in the skeleton, as a principal driver of the acute stress response in vertebrates. The authors showed that, under stressful conditions, mice, rats, and humans exhibit a burst in blood levels of osteocalcin.1

To stress the mice, the authors restrained them inside a test tube (with breathing holes) for 45 minutes or delivered electrical shocks. Both stimuli raised blood osteocalcin. They also allowed mice to smell a component of fox urine, which caused osteocalcin to rise, whereas rabbit urine did not. The authors then transfected the mice’s fear center in the brain with a designer receptor exclusively activated by designer drugs, or DREADD; activating this receptor with the designer drug clozapine-N-oxide blocked the osteocalcin rise after the mice smelled fox urine, suggesting that stressors signal through the central nervous system. In fact, osteocalcin is produced by osteoblasts in bone not in response to circulating hormones or mediators, but in response to glutamate released by neuronal endings in the bone.

Karsenty and colleagues found that osteocalcin is necessary for the ASR: osteocalcin-knockout mice did not display the ASR when exposed to stressors. Mice lacking the peripheral receptor for osteocalcin did not display the ASR after exposure to stressors either, but mice deficient in the central osteocalcin receptor did, suggesting that osteocalcin acts peripherally.

Intrigued by their results in mice, Karsenty et al. also studied humans. To stress volunteers, the authors sham-accused them of a transgression and then gave them two minutes to prepare a three-minute speech defending themselves. The subjects were then made to watch a videotape of their speech. These trying activities caused rises in the volunteers’ heart rate, blood pressure, and even white blood cell count. Osteocalcin also rose 50% over baseline in the humans put through the ordeal.

So it seems that what we thought we knew about the acute stress response was incomplete. There is a new player on the team: osteocalcin. Now, the next time a professor cold-calls you in class, you’ll know it’s your skeleton that’s making you want to flee the classroom!




The Middle Ground

By Abigail Miller

Dreams, ingrained into sleep, are as much a part of our day as the waking state: common to all, but distinct among individuals. As kids, we may be plagued by typical nightmares of the monster under the bed or the boogeyman. As we age, however, dreams may echo the worries of daily life, such as job interviews and tests. From Achilles’ dream of Patroclus in Homer’s The Iliad to Carl Jung’s theory of the collective unconscious, dreams and their interpretations have fascinated people since ancient times, yet much about them remains unknown.

Pindar, a Greek lyric poet of the fifth century BC, attempted to explain the nature of dreams in a lament (of which we have only a fragment): “The body of all men is subject to overpowering death, but a living image of life remains for it alone is from the gods. It slumbers while the limbs are active, but to men as they sleep, in many dreams it reveals an approaching decision of things pleasant or distressful.”1 The powers that govern the waking state are separate from those of the dream state. The soul, seen here as the “living image of life,” is awake when the individual is asleep, while the mind is said to

have governed the waking state.2 There was a separation between the mind and the soul, the physical and the spiritual. The division of labor between the soul and the mind over the dream state and waking state, respectively, may reflect an insight that the two states require different levels of mental processing. This perception is retained today, as modern experiments on sleep show differences in metacognition between those who are awake and those who are asleep. Metacognition, typically defined as awareness of one’s own thoughts, can be described in this case as awareness of the dream state.

The idea of separation does not go away entirely, and evidence for it exists in lucid dreams: dreaming with consciousness. Situated between the waking and dream states, lucid dreams occur when the individual’s mind is active and aware of the dream. In Pindar’s terms, lucid dreaming would be an example of the mind, the authority of the waking state, entering the domain of the sleeping state, the soul, the living image of life. In that moment, the edges of the waking state and our conscious mind touch the borders of

the dream state, and one is able to perceive the difference between the two otherwise distinct states.

In a 2020 study published in Frontiers in Psychology, a team led by Yu and Shen at South China Normal University looked at the metacognition associated with lucid and non-lucid dreams. Participants, all undergraduates or postgraduates, completed an online questionnaire. The questionnaire consisted of a lucid dream frequency scale, an eight-point scale measuring how often one has lucid dreams (from 0, never, to 7, several times a week), and a dream recall frequency scale, a seven-point scale measuring how often one recalls dreams (from 0, never, to 6, almost every morning).3 Metacognition was measured using the Self-Reflection and Insight Scale from Franklin and Grant’s 2002 paper.4 Bizarreness of dream content was also assessed and shown to be inversely related to metacognitive function, so a higher level of metacognitive function was associated with lower bizarreness of the dream.3 Bizarreness was also significantly lower in lucid dreams than in non-lucid dreams.3 Lucid dreaming is associated with higher levels of metacognition than non-lucid dreaming, which supports the idea that lucid dreams exist as an intermediate

between the waking and sleep states. There are two different levels of consciousness: primary and secondary. Primary consciousness includes emotion and perception; secondary consciousness allows individuals to think conceptually and to interpret emotions and actions.3 A 2009 study at Harvard Medical School by J. Allan Hobson added that individuals in the waking state have secondary consciousness in addition to primary consciousness, while the dream state is dominated by primary consciousness.5 Here, the dream state is seen as separate because it does not reach the full capabilities of the waking state. This does not mean that it is subordinate to the waking state, but rather that the two accomplish different roles.

This study demonstrates that past beliefs are not so different from what is believed now; advances in science simply allow us to better describe certain features and add to the overall narrative. The separation of ability between the dream state and the waking state noted by Pindar in the fifth century BC is echoed in the varying metacognition levels observed in the 2020 study by Yu and Shen’s team. Though the language of the two cultures differs, it is evident that Pindar’s original words have made a lasting impression on the perception of consciousness and dreams.



A Fresh Approach to Genome Editing

By Chibuike Uwakwe


Our genome, made up of approximately 3 billion DNA base pairs and about 25,000 genes, is everything. It is how traits like tall stature, eye color, and hair color get passed down through generations of your family. It is why a stranger might exclaim “you look just like your mom/dad!” when seeing you together for the first time. It is why male pattern baldness and familial predispositions to disease exist.

The genes we inherit are canonically unchangeable. If you inherit sickle cell genes from your parents, you will have sickle cell anemia. If you inherit certain mutations in your CFTR gene, you will have cystic fibrosis. Scientists developing genome editing techniques are interested in overcoming the aspects of gene inheritance that are seemingly set in stone. Although many viable genome editing techniques appeared in the 21st century, one tool that was more accurate, faster, and more efficient swept the scientific community off its feet: CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) in conjunction with the Cas9 enzyme (CRISPR-associated protein 9).1 CRISPR-Cas9 genome editing involves the use of a

small segment of RNA (ribonucleic acid) that binds to a particular DNA (deoxyribonucleic acid) target sequence where the edit is to be made. The Cas9 enzyme then cuts the double-stranded DNA, and natural DNA repair machinery adds or deletes portions of genetic material. To complete the edit, specifically customized DNA sequences are incorporated via homology-directed repair (HDR).1

CRISPR-Cas9 has been the star genome editing tool for the bulk of the past decade, and for good reason. Genetic diseases like sickle cell anemia and cystic fibrosis are potentially curable with CRISPR-Cas9, and crops with desirable traits like pest or drought resistance can be engineered. Despite these advantages, CRISPR-Cas9 still has its fair share of limitations. The double-stranded breaks it makes are susceptible to undesirable outcomes—dubbed off-target effects—that result from a mixture of random insertions and deletions within DNA.1 Additionally, CRISPR-Cas9 is incapable of making targeted and accurate insertions, deletions, and base-to-base conversions on the scale required for correcting the vast

majority of genetic diseases.2

In the Liu Lab at the Broad Institute of MIT and Harvard, Andrew V. Anzalone and colleagues developed a new genome editing technique that addresses these considerations, dwarfing the feats of CRISPR-Cas9 technology in comparison. Yes, they figured out a way to make precise insertions, deletions, and base-to-base conversions, through a technique called prime editing. Anzalone developed this new method using many of the same components involved in CRISPR. First, he engineered a prime editing guide RNA (pegRNA) to specify the portion of DNA he wished to edit and to serve as a template for the edited DNA. Next, he engineered a prime editor protein made up of a disabled Cas9 enzyme fused to another enzyme, a reverse transcriptase, which can create DNA strands from a template (i.e., the pegRNA).2 The prime editing system, made up of the pegRNA and the prime editor protein, binds to the target DNA and synthesizes a strand of edited DNA, resulting in a new flap of DNA that must be incorporated. Luckily, the prime editing system is able to help incorporate the edited DNA so that both strands of the double helix contain the pegRNA-encoded edit.2 Because the pegRNA can be practically any sequence, it can be used to

perform all possible mutations, insertions, deletions, and combinations thereof in DNA. Anzalone et al.’s paper reports more than 175 edits in human cells using prime editors, including successful tests in four different human cell lines.2 With prime editors, they performed various disease-relevant edits that were once too complex to make efficiently. That being said, David Liu, principal investigator of the Liu Lab, noted that “In many respects, this first report of prime editing is the beginning, rather than the end, of a longstanding aspiration to be able to make any DNA change in any position of a living cell or organism, including human patients with genetic diseases.” With the development of prime editing, one could say that the Liu Lab has rendered CRISPR-Cas9 genome editing technology obsolete, but that would be quite a radical perspective.

Instead, people should view prime editing as a nifty addition to the genome editing toolkit, one with immense potential to change the way we perceive human genetics.




How Our Brains Perceive Mental Illness… Without Us Knowing

By Isabella Trasolini

When Nicky's boss discovered her anxiety disorder, he stopped assigning her projects until she was left with nothing to do.1 After Helen explained that she was struggling with depression, her friend asked, "Is that all you're depressed about?"1 Lizzie battled anorexia throughout college, but she was reluctant to seek help because she felt "embarrassed, ashamed and that [she] had let people down."1

Poor mental health is common throughout the United States, with about 1 in 5 adults diagnosed with a mental illness every year.2 Nevertheless, victims of mental illness often encounter social stigma due to the misconceptions and prejudices surrounding their conditions. Research indicates that people tend to perceive those with mental illness as more incapable, dangerous, unpredictable, and generally different than mentally healthy individuals.3 Such stigmatization makes it difficult for victims of mental illness to excel in the workplace, maintain relationships, and obtain equal housing and education.4 Furthermore,


it also affects them on a more fundamental level. For example, the most common barrier to help-seeking amongst individuals with mental illness is the fear that others will judge and stigmatize them.5 Consequently, only about 25% of adolescents experiencing depression or anxiety seek medical attention for mental health.5 When people do not seek help, it is impossible for healthcare professionals to diagnose and treat them, and the result is unaddressed illnesses that worsen over time. In a phenomenon known as self-stigmatization, individuals with poor mental health often internalize social stigma, leading to prejudices about their own conditions.4 Compared with other mental illness victims, those with high levels of self-stigmatization tend to report lower quality of life, demonstrating that stigma exacerbates the negative thought patterns associated

with many mental illnesses.4 Considering the destructive effects of stigma, it is clear that change is needed. Of course, this is no recent realization, and researchers have been searching for stigma-reduction strategies for decades. Dr. Rebecca Young, a clinical psychologist in Toronto, Canada, is interested in the implicit stigma associated with mental illness.4 Most interventions have focused on educating the public about mental illnesses and increasing contact between those with and without mental conditions.4 While these interventions have seen some success, Dr. Young and her colleagues insist that scientists are only addressing half of the problem. Explicit stigma is typically easy to identify, and people are aware of their explicit biases.4 In contrast, people are typically unaware of their implicit biases, making them more difficult to address.4 To illustrate the difference, imagine two people who dislike brussels sprouts. The first has tried the vegetable numerous times but simply cannot tolerate the bitter taste. The second cannot pinpoint why she hates brussels sprouts, but something about them

makes her gag. The first is an example of explicit bias because the person is fully aware of his reason for abhorring brussels sprouts, while the second is an example of implicit bias because her issue with brussels sprouts is subconscious. According to Dr. Young, previous stigma-reduction efforts have primarily targeted explicit mental illness biases.4 Convinced that implicit biases also play an important role, Dr. Young set out to identify the implicit biases associated with mental health and methods of alleviating implicit stigma.

The group began by gathering 65 mentally healthy subjects to participate in a Go/No-Go Association Task (GNAT).4 A GNAT is a computerized test that helps reveal specific implicit biases by measuring associations between categories and attitudes.4 For instance, a GNAT designed to measure the association between mental illness and negativity would consist of stimuli representing negative mental illness, stimuli representing positive mental illness, and distractors that fit into neither category. If a subject can more easily distinguish between negative mental illness stimuli and distractors than between positive mental illness stimuli and distractors, then the subject has an implicit bias that mental illness is negative.
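For readers curious how responses on such a task become a bias score, the sketch below computes d′, a standard signal-detection sensitivity measure commonly used to score GNATs, from hypothetical trial counts. This is an illustration only, not the analysis code used in the study, and the counts are invented.

```python
# Illustrative only: not the study's actual analysis code.
# d-prime (d') measures how well a subject separates target stimuli
# ("go" trials) from distractors ("no-go" trials).
from statistics import NormalDist

def d_prime(hits: int, misses: int, false_alarms: int,
            correct_rejections: int) -> float:
    # Log-linear correction keeps z-scores finite at rates of 0 or 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# Hypothetical counts for one subject:
# block pairing "mental illness" with negative words vs. distractors,
negative_block = d_prime(44, 6, 5, 45)
# and block pairing "mental illness" with positive words vs. distractors.
positive_block = d_prime(32, 18, 12, 38)

# Distinguishing the negative pairing more easily than the positive one
# is the pattern described above as an implicit negative association.
implicit_negative_bias = negative_block > positive_block
```

A larger d′ for the negative block than for the positive block would correspond to the implicit negative bias described in the passage above.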




After all of the participants had taken the GNAT, Dr. Young found that many of them implicitly perceived those with mental illnesses as dangerous, helpless, and negative.4 They were biased against people with poor mental health, and they did not even realize it. Next, Dr. Young began searching for ways to reduce the stigma associated with mental illnesses by addressing implicit bias. To accomplish this, she chose three interventions that she predicted might decrease people's subconscious prejudices against those with poor mental health: education about the definition and impact of implicit bias, feedback on personal implicit biases based on a GNAT, and contact with individuals with mental illnesses.4 The researchers gathered 195 subjects and assigned them to various groups.4 Some participated in only the education and feedback interventions, some participated in only the contact intervention, and others participated in all three.4 At the end of the study, Dr. Young compared the participants' initial GNAT results to their post-intervention results, and the findings were promising. The group that participated in all three interventions had significantly lower levels of association between mental illnesses and


danger (Figure 1).4 Furthermore, the group that participated in only the education and feedback interventions showed much weaker associations between mental illness and helplessness and between mental illness and negativity (Figure 3; Figure 2).4

Overall, the results indicate that making people aware of their implicit biases (through education and feedback) and the impact of their implicit biases (through education and contact) has the power to reduce the implicit prejudices associated with mental illness.4 Combined with efforts to minimize explicit bias, these interventions could help lift the burden of stigma from the shoulders of those suffering from poor mental health. Without stigma, we can replace judgement and exclusion with support and recovery.


Figure 1: The subjects had a weaker association between danger and mental illness if they underwent the education plus feedback intervention or all three interventions. The contact-only intervention did not successfully reduce the association.4

Figure 3: After the study, the participants were less likely to associate mental illness with helplessness. Similar to the results for negativity, the education plus feedback intervention proved most effective.4

Figure 2: After undergoing implicit bias interventions, the participants were less likely to associate mental illness with negativity. This was especially true for the group that undertook the education plus feedback intervention.4




Healthcare Wearable Technologies: Exploring Remote Medical Tracking and Personalized Telemedicine

By Eric Costa Pimentel

Healthcare wearable technologies have become increasingly popular within the fields of patient monitoring, small data acquisition, and remote non-acute care alternatives. These technologies include both hospital-grade devices and consumer products such as Fitbits and related smartwatches. The devices are multifunctional and allow for constant vital sign monitoring in patients currently undergoing or recovering from treatment. They are equipped with embedded sensors to facilitate the accurate, noninvasive collection of user data. "Wearables" are becoming increasingly crucial to the development of future healthcare delivery practices aimed at remote, personalized, and non-acute care.1 The majority of US healthcare institutions, including hospitals, clinics, and specialized care facilities, are considering incorporating wearables into the process of patient data collection and its subsequent transfer to electronic health record (EHR) providers.3 This change would drastically improve closed-loop communication among healthcare providers and organizational capabilities to deliver sensitive patient-related information to patients, doctors, nurses, and caretakers.2

Researchers at NYU and Accenture Inc. conducted a recent survey analysis study intended to scope both current and potential markets for wearables, as well as the likelihood of integration among established healthcare institutions.3 They sent the surveys to businesses that were developing wearables or highly interested in incorporating data from smart devices into standard patient health records. The study examined both the opportunities and challenges of instituting wearables, gathering enough information for developers to create an integrative roadmap detailing the future direction of these devices. As such, researchers and developers alike intend to standardize the large-scale use of wearables for the collection and relay of sensitive data, as well as to establish a cloud database of organized patient information portfolios based on relevant healthcare delivery components such as genetic biomarkers, preceding patient-related cases, and real-time vital sign monitoring.

Companies such as Apple, Samsung, and Fitbit currently dominate the consumer-

based wearables market with their smartwatch devices, which have grown in popularity in recent years.1 These devices are designed to collect users' personal health and exercise data on a regular basis. Embedded sensors, including optical heart sensors, electrocardiograms, and pulse oximeters, actively measure vital signs and develop an integrated user profile by systematically organizing the collected data.4 Wearables, with their personalization and customization features, have increased self-tracking of health for both patients and regular users before, during, and after treatment.1

The NYU and Accenture research team developed a set of detailed surveys designed to scope existing efforts to integrate wearables within standard EHR systems, complemented by inclusive, qualitative searches of several databases and client lists.3 They were consequently able to offer deeper insight into the wearables industry and its economic potential by detailing both novel features and the current issues hindering the expected timeline for integrating wearables into healthcare delivery and monitoring systems. According to their analysis, ten tech startups and four insurance companies—each exploring the possibilities of wearables development—expressed

common goals including a personalized patient experience, reward and incentive programs, data analytics, remote monitoring capabilities, access to patient records, and AI-enabled functions.3 The combination of these components enables effective, decentralized healthcare delivery for remote patients based on their data-oriented health goals and medical history.2

The study also demonstrated that, although wearables offer a promising future of home-care monitoring through decentralized telemedicine, there are still significant challenges to overcome before their widespread implementation in the medical world, including system privacy, interoperability between devices, and patient data overload.3 First, we need a secure method of relaying sensitive user data collected from wearables to the proper healthcare providers. Insecure healthcare devices and wearable technologies could suffer breaches or intercepted data transmissions, exposing sensitive patient files to unsolicited parties such as hackers. Health organizations including the FDA and the MITRE Corporation are actively trying to mitigate these threats by implementing initiatives such as a cybersecurity sandbox—an isolated testing environment designed to enable security research and technical


evaluation of medical device vulnerabilities.
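As one concrete, simplified illustration of what "secure relay" can involve, the sketch below attaches a keyed hash (HMAC) to a hypothetical vital-sign payload so a receiving system can detect tampering. The key, field names, and message format are all assumptions for illustration; real medical deployments would layer this kind of integrity check under encrypted transport such as TLS and regulated key management.

```python
# Simplified sketch; production systems would use TLS and managed keys.
import hashlib
import hmac
import json

SHARED_KEY = b"device-provisioning-secret"  # hypothetical pre-shared key

def sign_payload(reading: dict) -> dict:
    """Attach an HMAC tag so the receiving EHR system can verify that
    the vital-sign payload was not altered in transit."""
    body = json.dumps(reading, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return {"body": reading, "tag": tag}

def verify_payload(message: dict) -> bool:
    body = json.dumps(message["body"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    # Constant-time comparison resists timing attacks.
    return hmac.compare_digest(expected, message["tag"])

msg = sign_payload({"patient_id": "P-001", "heart_rate_bpm": 72})
assert verify_payload(msg)            # intact payload passes

msg["body"]["heart_rate_bpm"] = 190   # simulated tampering in transit
assert not verify_payload(msg)        # altered payload is rejected
```

This only protects integrity and authenticity of a message; confidentiality is a separate concern handled by encryption.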

Interoperability between wearables, meaning the ability of interconnected devices within a system to both exchange and make use of relevant information, is also crucial in seamlessly connecting remote patients to their healthcare providers, as well as relaying their vital sign readings in real time.2
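The sketch below illustrates the interoperability idea in miniature: readings arriving in different vendor-specific shapes are normalized into one common record that any downstream EHR consumer can use. The device formats and field names here are hypothetical; a real integration would map onto a shared standard such as HL7 FHIR rather than this toy schema.

```python
# Toy normalization layer over two hypothetical device formats.
from dataclasses import dataclass

@dataclass
class VitalReading:
    patient_id: str
    metric: str   # e.g., "heart_rate"
    value: float
    unit: str     # normalized unit, e.g., "bpm"

def normalize(raw: dict) -> VitalReading:
    vendor = raw.get("vendor")
    if vendor == "watch_a":    # reports beats per minute directly
        return VitalReading(raw["pid"], "heart_rate", float(raw["hr"]), "bpm")
    if vendor == "monitor_b":  # reports pulse frequency in hertz
        return VitalReading(raw["patient"], "heart_rate",
                            raw["pulse_hz"] * 60.0, "bpm")
    raise ValueError(f"unknown device format: {vendor!r}")

# Two vendors, two shapes, one common record:
a = normalize({"vendor": "watch_a", "pid": "P-001", "hr": 72})
b = normalize({"vendor": "monitor_b", "patient": "P-001", "pulse_hz": 1.2})
assert a.unit == b.unit == "bpm"
```

Once every device's output is expressed in the same record shape and units, downstream systems can compare, store, and alert on readings without caring which vendor produced them.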

These features promote increased patient safety and reliability, allowing healthcare professionals to act rapidly on adverse medical reactions as soon as they are detected.4 Healthcare researchers and firms continually seek to improve system interoperability by incorporating disruptive technologies such as 5G network communication, artificial intelligence, and the Internet of Things (IoT).

Patient data overload is another obstacle to the adoption of wearables. This issue describes the challenge of efficiently retaining and assessing an abundance of incoming patient data. Cloud-enabled databases are key to providing a seamless and constant flow of information throughout the standardized EHRs found within healthcare institutions, thus facilitating the integration of wearables into patient data collection. Moreover, if these cloud databases lack interoperability, it becomes more difficult to securely store patient files and maintain seamless system accessibility, making it strenuous for healthcare professionals to both accurately diagnose and quickly treat patients.2 Software developers within the healthcare industry continue working meticulously to establish a platform that can accommodate personalized notification protocols, such as customizable alert frequencies for patients receiving different treatments.

Figure 1: "Wearable technology set to drive EHR growth through 2020." A graphic detailing the projected economic growth of several wearable healthcare technologies through the upcoming year, as well as survey analytics depicting current public and user perceptions of wearable devices intended for health management.

The health wearables industry is still developing, and unresolved questions remain about how these devices will be integrated into future healthcare monitoring protocols. However, numerous tech companies and research teams across the US are steadily making strides in architecture development, and these devices may soon become standard in healthcare. By further decentralizing current healthcare delivery and monitoring, as well as establishing an interconnected architecture consisting of wearables and other smart medical devices, remote patients will receive high-quality and cost-effective treatment opportunities.1 We can expect the next generation of wearables to play a significant role in chronic illness prevention, and potentially to help researchers better understand the genetic implications of certain biomarkers through real-time patient monitoring, aiding in the ongoing development of groundbreaking cures targeting major diseases.

Figure 2: "Consumers increasingly using a variety of technologies to manage their health." A bar graph depicting the percentages of consumers and patients currently using different methods to measure their vital signs and manage their health.



OPINIONS

Origin of Life: The Initiative and Beyond

By Kemi Ashing-Giwa

Founded in 2007 by Phillips Professor of Astronomy Dimitar Sasselov, the Origins of Life Initiative is an interfaculty community of over 150 professors, lecturers, senior researchers, postdoctoral fellows, graduate students, and undergraduates. The team spans the University in Cambridge and in Boston, where researchers at the Harvard Medical School and Massachusetts General Hospital campuses collaborate with their colleagues at the University. Together, the members of the Initiative study a vast array of life and physical sciences, all aimed in some way at uncovering the origin of life in the cosmos.

The work carried out by the Origins community can be categorized into four main groups. There are astronomers searching for undiscovered planets that might host life. These researchers utilize special instruments designed by experimental physicists and are at the cutting edge of planetary science. Meanwhile, geochemists analyze sedimentary rocks on Earth and Mars to examine planetary processes and environmental changes throughout history. Chemists and chemical biologists work to uncover the pathways atoms took to assemble complex self-replicating molecules. They study the simple molecules thought to exist on primitive planets, hoping to replicate the miraculous process that occurred on Earth—and perhaps on countless other worlds. At the same time, molecular biologists investigate how chemistry gave rise to biology on host planets.

To uncover the origin of Origins, I spoke with Professor Sasselov and Fisher Professor of Natural History Andrew Knoll, both members of the Initiative's Council. Professor Sasselov, a founding exoplanet researcher, realized that communication between astronomers and biologists was necessary to truly advance the emerging field of astrobiology. Such a realization might be taken for granted, but although cooperation between scientists is certainly encouraged, it is often rare in the competitive academic environment. Most scientists worked alone or in pairs, and many continue to do so. Further, the hyper-interdisciplinary science currently taking the contemporary world by storm is relatively new. In Professor Sasselov's own words, "astronomers need to learn more about life to find it," just as biologists and chemists must be well informed about the extrasolar environments and conditions that might be suitable for life in the first place.

The Initiative's doors are open to everyone in the Harvard community. All it takes is curiosity. The Initiative offers a graduate consortium/certificate program, as well as a postdoctoral fellowship. Undergraduates are given the opportunity to learn about and participate in research pertinent to astrobiology. There are no requirements, and I can personally confirm that joining the Initiative as an undergrad is a great way to get involved in the origins

of life work being done at the University and beyond.

So what, exactly, is life? According to Professors Knoll and Sasselov, any self-sustaining chemical entity capable of Darwinian evolution qualifies, but the final answer is unclear. The working definition is more a list of attributes than a set description, and even the traits on that list are still under study. In short, scientists are still working on it.

Professor Sasselov's research focuses on the many modes of interaction between radiation and matter. Chemistry can be driven by ultraviolet light in a process where photons interact with atoms. It is thought that the products of such reactions may have created the building blocks of life, which, in turn, gave rise to life. Sasselov's groundbreaking work involves using a laser to simulate the light from a star and studying the chemistry that results. Recently, his research has led him to explore the nature of planets orbiting other stars. Professor Sasselov was recently involved in the NASA Kepler mission, which determined that Earth-like planets are extremely common in our galaxy, far more so than previously expected.

Meanwhile, Professor Knoll is investigating the paleontology and sedimentary geology of Precambrian terrains, as well as the evolution of vascular plants in geologic time. He first became interested in the origins of life during his sophomore year of college, when he enrolled in a multitude of STEM courses. Unsurprisingly, he loved geology and biology most of all, finding them to be "two sides of the same coin." By the time the year was over, Knoll knew

he wanted to study the intersection of life and Earth, a conviction that was only strengthened as he read about new ideas on the origin and evolution of life. Professor Knoll served on the science team for NASA's MER mission to Mars for the past decade, interpreting data and images sent back by Opportunity. His team reconstructed environmental history on another planet using the same tools used to study geologic history on Earth. In the end, the researchers were able to draw genuine inferences about the Martian surface. According to Knoll, there was no single most fascinating finding, though "seeing stacks of sedimentary rock on another planet and being able to do something about it" was particularly exciting.

When I asked the professors whether they thought humanity might discover life on other worlds within our lifetimes, I received two very similar answers. According to Knoll, there are three different ways we can look for life. We can search in our own solar system (there are several upcoming NASA and ESA missions to Jupiter's moon Europa). Another route involves obtaining quantitative data about the atmospheric composition of planets around other stars. Astrochemists search for substances that can be used to diagnostically prove that life is present; methane and oxygen are promising signs. For the rest of the universe, Professor Knoll said, all we can do is "sit by the telephone and hope someone answers." But the final answer to my question, of course, is that the professors certainly hope so.



REFERENCES

Why White has No Color: Receptive Fields for Color Vision 1. Hubel, D. H. & Wiesel, T. N. Receptive fields of optic nerve fibres in the spider monkey. J. Physiol. 154, 572–580 (1960). 2. De Monasterio, F. M. & Gouras, P. Functional properties of ganglion cells of the rhesus monkey retina. J. Physiol. 251, 167–195 (1975). 3. Wiesel, T. N. & Hubel, D. H. Spatial and chromatic interactions in the lateral geniculate body of the rhesus monkey. J. Neurophysiol. 29, 1115–1156 (1966). 4. Kandel, E. R., Schwartz, J. H., Jessell, T. M., Siegelbaum, S. A., & Hudspeth, A. J. (Eds.). (2012). Principles of Neural Science, Fifth Edition. McGraw-Hill Education / Medical. The Balancing Act Between Senses 1. Chen, R, et al. “Nervous System Reorganization Following Injury.” Neuroscience, vol. 111, no. 4, 2002, pp. 761–773. 2. Solarana, Krystyna, et al. “Temporary Visual Deprivation Causes Decorrelation of Spatiotemporal Population Responses in Adult Mouse Auditory Cortex.” eNeuro, vol. 6, no. 6, 2019, pp. 0269–19. 3; image 1. The Learning Corp. “The Science of Neuroscience.” Brainwire, 25 Nov. 2016, brainwire/brain-plasticity-is-the-key-to-recovery-after-brain-injury-or-stroke/. 4. Huber, Elizabeth, et al. “Early Blindness Shapes Cortical Representations of Auditory Frequency within Auditory Cortex.” The Journal of Neuroscience, vol. 39, no. 26, 2019, pp. 5143–5152. 5. Renier, Laurent A, et al. “Preserved Functional Specialization for Spatial Processing in the Middle Occipital Gyrus of the Early Blind.” Neuron, vol. 68, no. 1, 2010, pp. 138–148. Oral Microbiome and Taste Perception 1. Deo, Priya Nimish, and Revati Deshmukh. 2019. “Oral



Microbiome: Unveiling the Fundamentals.” Journal of Oral and Maxillofacial Pathology: JOMFP 23 (1): 122–28. 2. Cattaneo, Camilla, Giorgio Gargari, Ranjan Koirala, Monica Laureati, Patrizia Riso, Simone Guglielmetti, and Ella Pagliarini. 2019. “New Insights into the Relationship between Taste Perception and Oral Microbiota Composition.” Scientific Reports 9 (March). https://doi.org/10.1038/s41598-019-40374-3. 3. Bartoshuk, L. M., V. B. Duffy, I. J. Miller, PTC/PROP Tasting: Anatomy, Psychophysics, and Sex Effects. Physiol Behav, 1994; 56: 1165–1171. 4. Prescott, J. and B. J. Tepper, eds., Genetic Variation in Taste Sensitivity, Marcel Dekker, New York, 2004. 5. Hayes, J. E., B. S. Sullivan and V. B. Duffy, Explaining Variability in Sodium Intake Through Oral Sensory Phenotype, Salt Sensation and Liking. Physiol Behav, 2010; 100(4): 369–380. 6. Giovannucci, E., Modifiable Risk Factors for Colon Cancer. Gastroenterol Clin North Am, 2002; 31: 925–943. 7. DiCarlo, S. T. and A. S. Powers, Propylthiouracil Tasting as a Possible Genetic Association Marker for Two Types of Alcoholism. Physiol Behav, 1998; 64: 147–152. 8. Mattes, R. D., 6-n-Propylthiouracil Taste Status-Dietary Modifier, Marker, or Misleader? In J. Prescott and B. J. Tepper, eds., Genetic Variation in Taste Sensitivity, Marcel Dekker, New York, 2004, pages 229–250. Osteocalcin: The skeleton of the Acute Phase Response 1. Berger JM, Singh P, Khrimian L, Morgan DA, Chowdhury S, Arteaga-Solis E, Horvath TL, Domingos AI, Marsland AL, Yadav VK, Rahmouni K, Gao XB, Karsenty G. Mediation of the Acute Stress Response by the Skeleton. Cell Metab. 2019 Nov 5;30(5):890–902. The Middle Ground 1. Pindar. Fragments. 1997,

tI&result=1&rskey=ACHHB9. 2. Granger, Herbert. “Death’s Other Kingdom: Heraclitus on the Life of the Foolish and the Wise.” Classical Philology, vol. 95, no. 3, 2000, pp. 260–281. JSTOR, www.jstor.org/stable/270378. Accessed 28 Mar. 2020. 3. Yu, Chunyun, and Heyong Shen. “Bizarreness of Lucid and Non-lucid Dream: Effects of Metacognition.” Frontiers in Psychology, vol. 10, 2946, 9 Jan. 2020, doi:10.3389/fpsyg.2019.02946. 4. Grant A. M., Franklin J., Langford P. The self-reflection and insight scale: a new measure of private self-consciousness. Soc. Behav. Pers. 30, 821–835 (2002). 5. Hobson, J. Allan, et al. “Dreaming and the Brain: Toward a Cognitive Neuroscience of Conscious States.” Behavioral and Brain Sciences, vol. 23, no. 6, 2000, pp. 793–842, doi:10.1017/S0140525X00003976. A Fresh Approach to Genome Editing 1. Urnov, F. D. (2018). Genome Editing BC (before CRISPR): lasting lessons from the “old testament”. The CRISPR Journal, 1(1), 34–46. 2. Anzalone, Andrew V., et al. “Search-and-replace genome editing without double-strand breaks or donor DNA.” Nature 576.7785 (2019): 149–157. How Our Brains Perceive Mental Illness… Without Us Knowing 1. Time to Change. “Personal Stories.” 2020, https://www. Accessed 22 February 2020. 2. National Alliance on Mental Illness. “Mental Health By The Numbers.” 2019, mental-health-by-the-numbers. Accessed 22 February 2020. 3. Boysen, Guy A, et al. “Evidence for Blatant Dehumanization of Mental Illness and Its Relation to Stigma.” The Journal of Social Psychology, 2019, pp. 1–11. 4. Young, Rebecca E., et al. “The Subtle Side of Stigma:

Understanding and Reducing Mental Illness Stigma from a Contemporary Prejudice Perspective.” Journal of Social Issues, vol. 75, no. 3, 2019, pp. 943–971. 5. Gulliver, Amelia, et al. “Perceived Barriers and Facilitators to Mental Health Help-Seeking in Young People: A Systematic Review.” BMC Psychiatry, vol. 10, no. 1, 2010, p. 113. Healthcare Wearable Technologies: Exploring Remote Medical Tracking and Personalized Telemedicine 1. Bove, Lisa Anne. “Increasing Patient Engagement Through the Use of Wearable Technology.” The Journal for Nurse Practitioners, vol. 15, no. 8, 2019, pp. 535–539. 2. Costa Pimentel, Eric, and Julio Cesar Gomes Pimentel. “A Case for the Development of Medication Delivery Systems Under Health 4.0 Model.” 2019 BMES Annual Meeting, Biomedical Engineering Society, 16–19 October 2019, Philadelphia, PA. Conference Presentation. 3. Dinh-Le, Catherine, et al. “Wearable Health Technology and Electronic Health Record Integration: Scoping Review and Future Directions.” JMIR mHealth and uHealth, JMIR Publications, 11 Sept. 2019. 4. Zheng, Jiewen, et al. “Emerging Wearable Medical Devices towards Personalized Healthcare.” Proceedings of the 8th International Conference on Body Area Networks, 2013.



Brevia is a forum for science, culture, and other big ideas. We are committed to bringing all disciplines of research out of the ivory tower and into the discourse of the interested public. Through our opinion, features, and primary research articles, we explore the myriad connections in the world of intellectual endeavor. Our stories are brief because we want to make knowledge accessible and interesting, providing a palette of perspectives on the world around us.

Brevia Volume 8 • Issue 1 • Fall 2020 © President and Fellows of Harvard College
