Volume 17, Issue 1: Fall 2020


COLUMBIA SCIENCE REVIEW FALL 2020 Volume 17 Issue I



Cover illustrated by Aeja Rosette.
Layout Editors (Articles): Alejandra Nunez (Changing the Vaccine Paradigm), Amanda Klestzick (H. Pylori: A Warning from the Unknown Superbug), Ashley Chung (Time in a Timeless Universe), Christine Shao (Letters + Table of Contents), Kevin Li (Development and Ethics of Brain-Computer Interfaces), Lia Chen (Cocktail Articles), Ningxin Luo (Looking Through the Eyes of Our Furry Friends), Sally Hwang (Leveling the Playing Field)

Fair Use Notice Columbia Science Review is a student publication. The opinions represented are those of the writers. Columbia University is not responsible for the accuracy and contents of Columbia Science Review and is not liable for any claims based on the contents or views expressed herein. All editorial decisions regarding grammar, content, and layout are made by the Editorial Board. All queries and complaints should be directed to the Editor-In-Chief. This publication contains or may contain copyrighted material, the use of which has not always been specifically authorized by the copyright owner. We are making such material available in our efforts to advance understanding of issues of scientific significance. We believe this constitutes a “fair use” of any such copyrighted material, as provided for in section 107 of the US Copyright Law. In accordance with Title 17 U.S.C. Section 107, this publication is distributed without profit for research and educational purposes. If you wish to use copyrighted material from this publication for purposes of your own that go beyond “fair use,” you must obtain permission from the copyright owner.


EDITORIAL BOARD
EDITOR-IN-CHIEF: SARAH HO
CHIEF DESIGN OFFICER: AIDA RAZAVILAR
CHIEF ILLUSTRATOR: LIZKA VAINTROB
EDITORS: AIDA RAZAVILAR, ALICE SARDARIAN, ANNA CHRISTOU, CHERYL PAN, EDWARD KIM, EMILY SUN, ENOCH JIANG, ETHAN WU, KIMIA HEYDARI, LUCAS MELO, RACHEL POWELL, SARAH BOYD, SASHA HE, SERENA CHENG
ILLUSTRATORS: AEJA ROSETTE, CHERIE LIU, KARENNA CHOI, KATE STEINER, MEGAN ZOU, REBECCA SIEGEL, SABRINA RUSTGI, TIFFANY QIAN, VANSHIKA SRIRAM, YI QU, ZOE HEIDENRY

MANAGING EDITOR: LINGHAO KONG
WRITERS: ALAN ZHAO, ALLISON LIN, ANGELA MU, ANUVA BANWASI, APARNA KRISHNAN, CHARLES BONKOWSKY, CLARE NIMURA, ELAINE ZHU, ELEANOR LIN, ELLEN ALT, ETHAN FENG, HANNAH PRENSKY, JEFFREY XIONG, JENNA EVERARD, JIMMY LIU, JOSHUA YU, KEVIN WANG, MICHELLE LU, SHIVANI TRIPATHI, TANISHA JHAVERI, VICTORIA COMUNALE
LAYOUT EDITORS: ALEJANDRA NUNEZ, AMANDA KLESTZICK, ASHLEY CHUNG, CHRISTINE SHAO, KEVIN LI, LIA CHEN, NINGXIN LUO, SALLY HWANG

EXECUTIVE BOARD
PRESIDENT: JASON WANG
PUBLIC RELATIONS: JOHN NGUYEN
SECRETARY: AROOBA AHMED
TREASURER: KAT WU
SENIOR OCMs: ABHISHEK SHAH, ALLI GREENBERG, HANNAH LIN
MEDIA TEAM: BRENDON CHOY, CHENOA GALE BUNTS-ANDERSON, MAGGIE ZHONG, NICK ZUMBA

VICE PRESIDENT: ADRIANA KULUSIC-HO
OCMs: AIDA RAZAVILAR, ALANA PALOMINO, BOYUAN CHEN, CHINMAYI BALUSU, CHRISHON CAMPBELL, EMILY KHINE, ESME LI, EVERETT MCARTHUR, JOJO WU, JOSHUA YU, PRANAY TALLA, SAVVY VAUGHAN-WASSER, SONALI DASARI

The Executive Board represents the Columbia Science Review as an ABC-recognized Category B student organization at Columbia University.


Table of Contents

Fall 2020 Issue

Letters from the Editors (Sarah Ho & Linghao Kong)
Cocktail Articles (Alan Zhao, Kimia Heydari, Ethan Feng, Alice Sardarian)
Looking Through the Eyes of Our Furry Friends (Victoria Comunale)
H. Pylori: A Warning from the Unknown Superbug (Alan Zhao)
Development and Ethics of Brain-Computer Interfaces (Jeffrey Xiong)
Changing the Vaccine Paradigm (Anuva Banwasi)
Time in a Timeless Universe (Jenna Everard)
Leveling the Playing Field (Allison Lin)


LETTERS FROM THE EDITORS

Dear Reader,

Welcome to the Columbia Science Review's Fall 2020 print issue! As you peruse this issue, you'll find articles such as Anuva Banwasi's timely explanation of vaccine delivery modes and Jeffrey Xiong's exploration of brain-computer interfaces and the ethical challenges that they present.

A remote semester does not create the easiest conditions under which to produce a fully formed publication. This issue, the third one published since the beginning of the pandemic, is a product of many Zoom meetings, Slack messages, and emails, not to mention the hard work of everyone on the Editorial Board. I want to extend a huge thank you to every writer, editor, illustrator, and layout editor who contributed to this issue despite time zone differences, disrupted access to necessary software, and the endless personal difficulties that the year 2020 presented. This issue would not have been possible without your dedication and diligence!

News about the pandemic and the ensuing vaccine development has introduced a lot of scientific jargon into everyday conversations. While this has demonstrated how deeply relevant science is to our day-to-day lives, the past few months have also presented science as a complicated and intimidating discipline. My hope is that after reading through this issue, you'll find science a bit more exciting and enjoyable.

Happy reading!

Sarah Ho

Editor-in-Chief



Dear Reader,

I'm really excited to welcome you to the Fall 2020 print issue of the Columbia Science Review! Many have worked tirelessly to bring you this edition, and I hope that you find the topics, ranging from prime numbers to dog neurobiology, as insightful and interesting as I did.

CSR was founded on the principle of making science accessible, and never in recent memory has that principle been more important than now. In addition to our standard biannual publications, we just came out with a special COVID-19 edition, and I hope that its articles, along with several in this edition, can help clear up some of the confusion surrounding this pandemic.

These past months have been incredibly challenging for all of us, and I am incredibly grateful for the hard work that members of the CSR have put into making this edition. Without their passion and diligence, this edition would not have been as complete or as cohesive as it is, especially under these circumstances.

Even as the pandemic consumes much of our attention and that of the scientific community, I hope that after you read this edition, you can rest assured that science as a whole continues on regardless of the challenges. Stay safe!

Sincerely,
Linghao Kong

Managing Editor



COCKTAIL ARTICLES

Flipping Coins with Prime Numbers

Written by Alan Zhao
Illustrated by Zoe Heidenry

The typical process of flipping a coin works because both people have direct access to the result, and so a winner can be agreed upon. But what if these people were thousands of kilometers apart, where lying about the result would be easy? This is where prime numbers come in.

From grade school, we are all familiar with the concept of remainders, produced when you divide one integer by a positive integer n. Now, suppose we group together integers which have the same remainder, and represent each group by a remainder in the list 0, 1, ..., n − 1 (mathematicians call this the integers "modulo n"). Observe that these remainders multiply exactly like normal integers. For example, given the remainders of 7 and 8 upon division by 5 (2 and 3, respectively), I can calculate the remainder of their product 56 upon division by 5: 2 × 3 = 6, which is represented by 1.

Given this, we can attempt to solve x² − a = 0 modulo p, where p is prime and a is an integer. In grade school we are taught that, normally, x² − a = 0 has two or zero real-valued solutions. It is no different when working modulo p. And if we find that b² − a = 0 modulo p (and infinitely many p may be chosen to guarantee this), then (−b)² − a = 0 modulo p as well. This is enough to flip a coin!

Let's say Alice and Bob are 100 kilometers apart. Alice can choose a pair of primes p and q. Alice sends the product pq to Bob, who then chooses a perfect square b² and forwards its value modulo pq back to Alice. Now here comes the key idea: because p and q are prime, by the Chinese Remainder Theorem, this square uniquely corresponds to the pair (b² modulo p, b² modulo q). And so by solving x² − b² = 0 modulo p and x² − b² = 0 modulo q, we obtain 4 solutions to x² − b² = 0 modulo pq: x = ±b, ±c. Alice can now send one of b or c to Bob (these represent the sides of a coin!). And if Alice matches Bob's square root, she wins.

Here is the crux: say that p and q are chosen appropriately and pq is faithfully sent to Bob. What happens if Alice matches Bob's square root and he lies? If his claim were true, Bob would have every solution to x² − b² = 0 modulo pq, and since knowing both b and c allows one to factor pq, Alice can reasonably demand that Bob send her back p and q within some time frame. Given large enough primes, with what is currently widely accessible, Bob will fail to do this if he was lying and will succeed if he was truthful.

In summary, this mechanism achieves what we wanted at the outset: to flip coins both remotely and faithfully. A parting question: can we do the same thing with a dice roll?
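For the curious, here is a minimal Python sketch of the protocol just described. It is our own toy illustration, not a production implementation: the primes are tiny (real security needs hundreds of digits), and both are chosen congruent to 3 mod 4 so that square roots modulo each prime are easy to compute.

```python
import random

def sqrt_mod_prime(a, p):
    # Square root of a modulo a prime p, assuming p % 4 == 3.
    return pow(a, (p + 1) // 4, p)

def crt(rp, rq, p, q):
    # Combine x = rp (mod p) and x = rq (mod q) into one residue mod p*q.
    inv = pow(p, -1, q)  # modular inverse of p modulo q (Python 3.8+)
    return (rp + p * ((rq - rp) * inv % q)) % (p * q)

# Alice's secret primes; she sends only their product n to Bob.
p, q = 1019, 1367
n = p * q

# Bob picks a secret b and sends back its square modulo n.
b = random.randrange(2, n)
s = b * b % n

# Alice computes all four square roots of s modulo n via the CRT.
rp, rq = sqrt_mod_prime(s % p, p), sqrt_mod_prime(s % q, q)
roots = {crt(rp, rq, p, q), crt(rp, q - rq, p, q),
         crt(p - rp, rq, p, q), crt(p - rp, q - rq, p, q)}

# Alice sends one root back; she wins if it matches Bob's +-b.
guess = random.choice(sorted(roots))
print("Alice wins!" if guess in (b, n - b) else "Bob wins!")
```

If Bob falsely claims a mismatch, he is claiming to know a second pair of roots, which is equivalent to being able to factor n; producing p and q on demand is the proof he cannot fake.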


What Makes a Vaccine Intranasally Viable?
Written by Kimia Heydari
Illustrated by Rebecca Siegel

Most vaccines are injected intramuscularly into the arm's deltoid tissue. However, as noted by the editorial board of The New York Times, the intramuscular route of vaccine administration demands logistical necessities that may be difficult to meet amidst this pandemic [1]. For example, most intramuscular vaccines need to be kept at low temperatures (as low as -20°C) before delivery [2]. This temperature requirement exists because the immunizing material (mRNA, for example) is encapsulated in chemicals that would decompose at room temperature. In light of this logistical limitation, many study groups draw from past intranasal vaccination research to propose intranasal mRNA vaccine products [3].

Generally, the effectiveness of RNA vaccines (like the Moderna and Pfizer vaccines against the coronavirus) is related to the amount of vaccine product that can traverse multiple barriers, namely the lipid bilayer that defines the borders of cells, to reach and use cells' RNA translation machinery. In an effort to maximize this amount of RNA immunizing material, scientists usually encapsulate the RNA with a coating of lipid nanoparticles (LNPs). LNPs bundle up and protect the RNA material to cushion its passage through the lipid bilayer [4]. However, this bundling material requires 4°C or -20°C storage to maintain the integrity of the mRNA as it travels across cell membranes [2]. Since maintaining this temperature range is difficult in global vaccine distribution, this cold-chain requirement makes intramuscular vaccine products less publicly available and therefore less efficacious in a pandemic response.

Unlike intramuscular vaccines, which regularly include temperature-sensitive LNPs, intranasal RNA vaccines include material other than LNPs and can be stored as dry powder formulations that extend shelf life and vaccine durability [5]. In short, intranasal vaccines don't need freezers and cold-chain maintenance [5]. Besides this logistical benefit of increased stability, intranasal vaccines would also require only one dose, instead of multiple, to confer long-term immunity. This single-dose sufficiency is because intranasal immunization delivers the mRNA directly to nasal-associated lymphoid tissue, which is rich with professional immune cells [6]. This ease of delivery of vaccine product to immune cells stands in contrast to intramuscular administration, in which the vaccine product is delivered into muscle tissue, which has far fewer immune cells than lymphoid tissue.


While current intramuscular COVID vaccine administration is facilitating the global community's progress into a post-pandemic world, considering options for intranasal routes of delivery is a pressing scientific question whose answers could save both time and money in our current and future practices of preventive and emergency public health.




The Bayesian Intuition
Written by Ethan Feng
Illustrated by Tiffany Qian

Consider the following description: "Steve is shy and withdrawn but loves to help others; he is meek, tidy, and has a passion for detail." Is Steve more likely to be a librarian or a farmer? Instinctively, most respondents gravitate towards picking librarian as the more likely option, relying on typical mental images of each profession, and this is not an unreasonable conclusion. However, regardless of whether this stereotype is accurate or not (that is a separate issue), this line of thinking misses a crucial step: almost nobody stops to consider the fact that there are far more farmers than librarians in the world.

To illustrate why factoring in this information is important, let us quantify things. (1) Say there are roughly 20 times more farmers than librarians; (2) additionally, let us estimate that 80 percent of librarians fit the initial personality description, while only 20 percent of farmers do (that is, suppose the stereotype is correct to a degree). Now, imagine a group of 105 people, all of whom hold one of the two professions. Of them, approximately 100 are farmers and 5 are librarians. Based on the given percentages, of the 100 farmers, there are 20 who fit the initial description; of the 5 librarians, 4 fit the description. As we can see, even if the stereotype of librarians being more "meek, tidy," etc., is accurate, Steve is still more likely to be a farmer, because the probability is overwhelmed by the sheer number of farmers.

The point of this exercise is not to argue over the specific numbers, but rather to test whether one should even consider the number of farmers and librarians in the first place. This illustrates one of the most important concepts in probability: new data should not completely determine one's conclusion; instead, it should update previously known information. That is, in this scenario, if you were asked whether a random person is more likely to be a farmer or librarian, with no other information, you would pick "farmer." The new data about Steve's personality should not totally override this knowledge, but instead should shift the scales in one direction. In statistics, this is the central intuition behind Bayes' theorem. It can seem counterintuitive at first, but following it allows us to obtain more accurate answers. I suggest you keep it in mind the next time you make decisions based on data, from scientific research to everyday choices.

This article was inspired by the paper by Kahneman et al. [1], where the question about Steve was first articulated, and the excellent video on the subject by math educator 3Blue1Brown [2].
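To make the update explicit, here is a small Python sketch (ours, using the numbers assumed above) that applies Bayes' theorem to the Steve question:

```python
# Assumed numbers from the article: 20x more farmers than librarians;
# 80% of librarians and 20% of farmers fit the description.
p_librarian = 1 / 21            # prior: 1 librarian per 20 farmers
p_farmer = 20 / 21
p_desc_given_lib = 0.80         # likelihoods (the "stereotype")
p_desc_given_farm = 0.20

# Bayes' theorem: posterior = likelihood * prior / total evidence
posterior = (p_desc_given_lib * p_librarian) / (
    p_desc_given_lib * p_librarian + p_desc_given_farm * p_farmer)
print(f"P(librarian | description) = {posterior:.2f}")  # ~0.17
```

Even with a stereotype that strongly favors librarians, the posterior probability that Steve is a librarian is only about 1 in 6, exactly the 4-out-of-24 head count from the paragraph above.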


Achieving Clarity
Written by Alice Sardarian
Illustrated by Lizka Vaintrob

The part of the eye that is primarily responsible for the clarity of our vision is the lens. With the assistance of the ciliary muscles, this biconvex and flexible structure can alter its thickness and curvature to focus incoming light rays on the retina at the back of the eye. In a process known as accommodation, the lens adjusts so that we may perceive objects at various distances with the appropriate clarity [1].
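As a rough illustration of what accommodation accomplishes, the thin-lens equation 1/f = 1/d_o + 1/d_i shows how the required focusing power grows as objects get closer. The Python sketch below uses a simplified air-lens model with an illustrative image distance; it is not a faithful model of the eye's true optics.

```python
# Idealized accommodation demo: with the retina at a fixed image distance,
# nearer objects demand a more strongly curved (higher-power) lens.
D_IMAGE = 0.017  # lens-to-retina distance in meters (illustrative value)

def required_power(d_object_m: float) -> float:
    """Lens power in diopters (1/f) needed to focus an object on the retina."""
    return 1 / d_object_m + 1 / D_IMAGE

for d in (10.0, 1.0, 0.25):  # far, medium, and reading distance
    print(f"object at {d:>5} m -> {required_power(d):.1f} diopters")
```

In this toy model, reading distance requires a few extra diopters over distant viewing, which is the adjustment the ciliary muscles supply by curving the lens more steeply.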


The lens is constantly supported by renewed cells, which lack organelles; however, these cells may shift from transparency to opacity as a result of photooxidation. An accumulation of these opaque cells prevents the lens from being able to focus light on the retina, thus reducing the field and clarity of vision. These opacities, also known as cataracts, can form on various areas of the lens depending on their cause. Cataracts are a common ophthalmic affliction, resulting from normal senescence, trauma, or other conditions, like diabetes [2].

Given the prevalence of cataracts, surgical techniques have been refined to allow for a complete recovery of normal vision. The phacoemulsification method involves the use of an injected gel to stabilize the eye and a needle to emulsify the lens and then withdraw its pieces through a small, self-healing incision [3]. The only component that remains in the eye after this procedure is the anterior portion of the lens capsule, which may be used to support the introduction of a new lens. This procedure has been advanced by the introduction of laser-assisted surgery, which can perform each of the aforementioned steps without relying on manual manipulation.

References

What Makes a Vaccine Intranasally Viable?

[1] The Editorial Board. "We Came All This Way to Let Vaccines Go Bad in the Freezer?" The New York Times, 31 Dec. 2020, https://www.nytimes.com/2020/12/31/opinion/coronavirus-vaccines-expiring.html
[2] Hassett, Kimberly J., et al. "Optimization of Lipid Nanoparticles for Intramuscular Administration of MRNA Vaccines." Molecular Therapy. Nucleic Acids, vol. 15, Feb. 2019, pp. 1–11. PubMed Central, doi:10.1016/j.omtn.2019.01.013.
[3] Li, Man, et al. "Enhanced Intranasal Delivery of MRNA Vaccine by Overcoming the Nasal Epithelial Barrier via Intra- and Paracellular Pathways." Journal of Controlled Release, vol. 228, Apr. 2016, pp. 9–19. ScienceDirect, doi:10.1016/j.jconrel.2016.02.043.
[4] Pardi, Norbert, et al. "MRNA Vaccines - a New Era in Vaccinology." Nature Reviews. Drug Discovery, U.S. National Library of Medicine, Apr. 2018, www.ncbi.nlm.nih.gov/pmc/articles/PMC5906799/
[5] Birkhoff, M., et al. "Advantages of Intranasal Vaccination and Considerations on Device Selection." Indian Journal of Pharmaceutical Sciences, Medknow Publications, 2009, www.ncbi.nlm.nih.gov/pmc/articles/PMC2846493/.
[6] Liang, Bin, et al. "Nasal-Associated Lymphoid Tissue Is a Site of Long-Term Virus-Specific Antibody Production Following Respiratory Virus Infection of Mice." Journal of Virology, American Society for Microbiology Journals, 1 June 2001, jvi.asm.org/content/75/11/5416.

After cataract surgery, patients are missing a lens and will have difficulty seeing until they are provided glasses or fitted with an intraocular lens. Current intraocular lenses continue to be improved, avoiding astigmatism and other unfavorable outcomes after treatment and even allowing patients to see clearly without the need for glasses. The types of lenses that currently exist include foldable, multifocal, refractive, and toric intraocular lenses [4]. Each type is designed to correct for different deficits such as near- or far-sightedness as well as astigmatism. Surgical techniques and lenses continue to be improved, all while helping patients to acquire visual clarity and a resulting higher quality of life.


The Bayesian Intuition

[1] Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124-1131. https://science.sciencemag.org/content/185/4157/1124/tab-pdf
[2] Sanderson, G. (2019, December 22). Bayes Theorem [Video]. YouTube. https://youtu.be/HZGCoVF3YvM





Looking Through the Eyes of Our Furry Friends
Written by Victoria Comunale
Illustrated by Karenna Choi

Faithful companions of all shapes and sizes, dogs are one of the most ubiquitous domestic animals in the world [1]. A recent survey found that over 63 million United States households own a dog, far surpassing the ownership of any other pet [2]. Dogs have been a central part of human lives for an extremely long time: human-canine relationships can be traced back as far as 20,000 years [3]. Dogs even dominate the media, acting as central figures in movies and literature or as social media stars. Despite our familiarity with dogs, they remain enigmatic and puzzling creatures. Dogs communicate with their owners primarily via actions, whether it be scratching at the door to announce their presence or nudging their bag of kibble to indicate that it's time for their meal. Unfortunately, we remain unfamiliar with the inner workings of our furry friends' minds. For example, we don't even know how exactly they see the world around them. Dogs are often praised for their keen senses of smell and hearing. Their vision, on the other hand, gets little attention, despite the demand for dogs to perform high-performance visual activities such as serving as seeing-eye dogs or helping with police work.

It is easy to compare dog vision to that of humans, since we have analogous mechanisms and structures with which to receive visual information. In both humans and dogs, rod and cone photoreceptors are important in receiving visual information [4]. Rods are more sensitive to lightness and darkness and are more active than cones at lower light levels. Cones are more active at higher light levels and are responsible for color perception. Although both photoreceptors are present in dogs, they have one fewer type of cone than humans: we have three and dogs have two. As a result, dogs are only able to experience a limited range of colors, similar to people with red-green colorblindness. In this type of colorblindness, red and green hues are indistinguishable, so a bowl of red and green apples is perceived in a single light olive-green hue. In addition to their inability to perceive the entire color spectrum, dogs are also nearsighted [5]. Humans can typically see clearly up to 75 feet away, yet this range of clarity is only available to dogs up to a mere 20 feet.

Given this information, one might conclude that dog vision is weaker than human vision. Nevertheless, there are other elements of vision beyond color spectrum perception and distance of focal clarity. What dogs lack in color perception, they make up for with a greater presence of rods, which provides greater sensitivity at lower light levels and a better ability to distinguish shades of gray [4]. This heightened low-light sensitivity allows dogs to better identify shapes and motion. This advantage is most likely due to the fact that before domestication, wild dogs had to rely on their senses to exploit their ecological niche and to hunt and protect themselves [4]. It would appear that dog vision has evolved in tandem with their changing environments and living conditions: they have retained their low-light sensitivity but are still able to function relatively well in many lighting conditions.



Considering that humans have been a part of dogs' lives for nearly 20,000 years, how exactly do dogs process visual cues from us? As humans, we dedicate much of our visual processing to facial recognition, and our brains become particularly active at the sight of a face. A recent study found that, for dogs, this pattern may not hold true [6]. Researchers in Hungary analyzed the brain activity of 20 dogs using MRI scans. As the dogs lay in the MRI machine, they were shown video clips of the fronts and backs of human heads as well as the fronts and backs of dog heads. Thirty human participants were shown the videos under the same conditions. When dog or human faces (the front views of the heads) appeared onscreen for human participants, there was a spike of activity in the visual cortex of their brains. The activity became quieter when the backs of heads were shown. Interestingly, when dogs were shown these video clips, there was no significant difference in activity between their seeing the front or back of human or dog heads. These findings suggest that the canine brain is not as devoted to recognizing faces as the human brain is. What this tells us about dogs' emotions and feelings is still up for debate, because a lack of a spike in activity does not necessarily indicate that they do not care about dog or human faces.

Although dog brains do not respond as intensely to facial images as human brains do, their behavior suggests that such images are still important and can inform their actions [7]. A study conducted in 2010 at the University of Padua explored dogs' ability to distinguish the faces of their owners from the faces of strangers. In this study, researchers had 30 dogs sit at a distance from two doors. Their owner and a stranger would enter the room through their respective doors, walk across the path between the doors, and exit through the opposite door. The researchers found that the dogs primarily fixed their gaze on the door through which their owner had departed, closely tracking their movements. Dogs regarded strangers as a generic stimulus and did not take more than a quick glance at them. In order to ensure that the dogs were able to track their owners by sight alone, and not by smell or gait, the experiment was repeated with the stranger and the owner placing bags over their heads. In this variation of the experiment, the results were highly variable, as the dogs did not pay particular attention to either their owner or the stranger. This experiment demonstrated that a dog's vision plays a key role in informing their actions, specifically when attention is focused on the faces of their human counterparts. Although the previous study found that dog brains do not become excited at the sight of a face, dogs are still able to recognize their owner's face, and this bond impacts the focus of their attention.

While the mechanisms behind dog vision are relatively well understood, determining how dogs analyze and interpret the visual information they encounter is still a promising frontier to be explored. Delving further into this field could help us understand more about how dogs interpret the world around them and how important their vision is compared to other, more notable senses, such as smell and hearing.

References
[1] Dog. (n.d.). Retrieved from https://www.britannica.com/animal/dog
[2] Facts Statistics: Pet statistics. (n.d.). Retrieved from https://www.iii.org/fact-statistic/facts-statistics-pet-statistics
[3] Briggs, H. (2017, July 18). How did dogs become our best friends? New evidence. Retrieved from https://www.bbc.com/news/science-environment-40638584#
[4] Miller, P., & Murphy, C. (1995). Vision in dogs. Journal of the American Veterinary Medical Association, 207(12), 1623-34.
[5] Mcalone, N., Business Insider. (n.d.). How Dogs See The World Compared to Humans. Retrieved from https://www.sciencealert.com/how-dogs-see-the-world-compared-to-humans
[6] Bunford, N., Hernández-Pérez, R., Farkas, E. B., Cuaya, L. V., Szabó, D., Szabó, Á. G., ... Andics, A. (2020, October 21). Comparative Brain Imaging Reveals Analogous and Divergent Patterns of Species and Face Sensitivity in Humans and Dogs. Retrieved from https://www.jneurosci.org/content/40/43/8396
[7] Mongillo, P., Bono, G., Regolin, L., & Marinelli, L. (2010, October 14). Selective attention to humans in companion dogs, Canis familiaris. Retrieved from https://www.sciencedirect.com/science/article/abs/pii/



H. pylori: A Warning from the Unknown Superbug
Written by Alan Zhao
Illustration by Vanshika Sriram



Scientific developments since the 1800s have contributed numerous powerful tools to the treatment and prevention of disease. Medicines are available as effective cures for an ever-increasing proportion of common viral and bacterial illnesses. Vaccines have eradicated polio and smallpox. And the practice of surgery has become so precise that confidence is even placed in replacing heart valves and opening the skull to implant medical devices within the brain, where the potential risks include, well, no longer having the prerequisites to live. Remarkably, these novel ideas have made their way into the twenty-first-century medical canon, so that, at least in theory, scientists have concocted a treatment option for the vast majority of patients that does something meaningful to reverse the course of illness. Consequently, there is an increasing attitude that the public can expect to look to science to resolve the greatest pathogenic threats. Indeed, if you are skeptical, look no further than what is at your feet: the hype surrounding the race for a COVID-19 vaccine.

This attitude carries an obvious but incorrect implication: patients may come to associate a disease or impairment with terms like "treatable" (almost always meaning a "cure," see [1]) or, in a similar vein, believe that their immune system can fight it off. For this article, I will refer to both situations as representing a "cure." This association can become so strong it can be seen as "canonical": some Grand Unified Theory associating each disease with at least one cure (e.g., [2], where attempting to contract COVID-19 naturally assumes the ability to survive the infection). Conversely, diseases not in line with this theory are wrongly forgotten, thought of as "rare," and labeled "anti-canonical." However, they will rightfully laugh in our faces if we keep forgetting them. So, I would like to explain a counterexample to reverse the implications of this "Theory."

Now, of course, one counterexample to a canonical association between diseases and cures would be "superbugs," or antibiotic-resistant bacteria. In 2014, these bacteria infected more than two million people in the U.S. alone, causing at least 23,000 deaths [3]. But as of writing (November 2020), the U.S. has logged more than 10 million COVID-19 cases and at least 230,000 deaths, yet nationwide restrictions and the resolve to social distance seem to have only lessened while transmission has been at one of its highest levels since January 2020 [4]. The implication, then, is that explaining superbugs and COVID-19 does not suffice to reverse a canonical association of a disease to a cure. So let us move beyond these two to a disease with even greater numbers.

Welcome H. pylori, whose infections outnumber global COVID-19 cases 7-to-1 (this amounts to about 50 percent of the global population, making H. pylori a very common infection) [5]. Anchoring and burrowing itself into the interior lining of the stomach (think about that for a moment), H. pylori inflames the stomach lining and causes general gastrointestinal distress. Now, there is no absolute cure for H. pylori. Not only that, but the immune system seems to be wholly incapable of fighting it off!

In summary, by blowing up every condition of what many believe to be a "canonical" disease, H. pylori must be "anti-canonical." However, because it is very common, it does not fit the label of "rare" that I explained must come with the "anti-canonical." Hence, the popular conception of a "canonical" association of a disease to its "cure" is purely fiction.

Before I explain these things, I want to preface this by saying that this discussion is not meant to demonstrate that H. pylori will be a permanent handicap should one become infected with it. Indeed, I have previously been cured of this infection, and much is being done to come up with better treatments than the shot-in-the-dark treatment I had to face.

Because of my experience, I wanted to first discuss treatment. In the current state of things, H. pylori strongly resists many antibiotics, basically making it a superbug. The standard course of treatment is a two-week course of two antibiotics and a stomach acid reducer [6]. In my case, this amounted to two acid reducers, five antibiotics, and a million side effects per day. In the literature, this is (unfortunately) referred to as the "standard" triple therapy [6]. The typical antibiotics used in this therapy are metronidazole (fear this one for its side effects) and clarithromycin. Among developed countries, on average, 1 in 5 instances of H. pylori laugh at clarithromycin and 1 in 2 laugh at metronidazole (making its side effects somehow worse) [6].

So now that I have listed these numbers, one might ask, "What is the upshot?" Besides the obvious "Uh oh," it is recommended, as a benchmark, that this triple therapy be left unused or re-evaluated in areas where H. pylori's resistance is 15 to 20 percent [7]. In other words, the average 1-in-5 H. pylori resistance to clarithromycin tells us that it is time for science to move beyond this triple therapy as the standard course of treatment. And these musings aren't just theoretical. In Europe, much of eastern Asia, and the U.S., this therapy fails to treat 1 in 5 cases, which is science-speak for a failed treatment [8]. Moreover, this failure can occur even after more antibiotics are prescribed.

There is now a scramble for new, effective treatments. But they are far from elegant: for instance, a two-week regimen of three antibiotics, followed by ten days of more antibiotics [7]. And at this point, it is not just a discussion of antibiotic resistance. These new treatments feature an even greater number of potent antibiotics that take their toll on the human body. Many patients are left unable to tolerate treatment and simply stop. Even with my treatment of just two antibiotics, the head pharmacist at my local grocery store, understanding of the body's intolerance for so many antibiotics, felt the need to come out and urge me to finish the medicine I had just picked up.


Ultimately, I celebrated finishing them by throwing the empty orange bottles on the kitchen floor. Multiple times.

This all brings me back to the remark I made about H. pylori not being a permanent handicap. While this is certainly true, for the sake of debunking the fictional association between diseases and cures (what I described as the notion of a canonical disease), I must also emphasize the converse: H. pylori is trending towards becoming a permanent handicap. The new treatments I described above simply are not sustainable. For such a common disease, H. pylori patients cannot be expected to swallow more than 50 pills a week. This is part of a much broader warning to us all that modern treatments do not allow us to become medically complacent. It ought to be a warning to dispose of the on-the-rise notion of the canonical disease.

The stakes are particularly high for H. pylori, as it is the strongest known risk factor for gastric cancer, the second leading cause of cancer deaths worldwide [5]. This, in particular, is why so much energy has been spent on researching this bacterium in the twenty-first century. In fact, infection with H. pylori is classified as a definitive cancer-causing agent by the World Health Organization, sharing ranks with the well-known asbestos, a material whose history also captures the dangers of medical complacency [9].



Now, the fact that research has shown that H. pylori does not cause symptoms in most patients (meaning no need for treatment, according to my gastroenterologist) does not free one from my counterexample against the canonical disease, because H. pylori is known to be able to persist for the lifetime of the host (us humans), and so the potential for symptomatic H. pylori always lingers. Moreover, there is no guarantee that asymptomatic H. pylori infection does not increase the risk for gastric cancer.

For H. pylori, this is where the magic begins. H. pylori is a bacterium living within the stomach. It is wholly capable of thriving amongst our stomach acid, which one might think of as a bubbling cauldron of caustic death juice which no living thing can cross. But H. pylori has a simple solution: first, it burrows into the human stomach lining to find a more neutral environment. Second, it neutralizes the remaining acid by locally immersing itself in ammonia. As long as H. pylori can keep this up (which it does), it can survive human stomach acid. The only remaining defense is the human immune system, which unfortunately is completely inept at killing off H. pylori. In short, cells hosting H. pylori must alert immune cells of their invader in order to receive help. This communication takes place between protein receptors on both classes of cells, and H. pylori conveniently destroys the receptor of the host cell. This creates a small haven in which H. pylori can survive an immune response, which only acts on uninfected cells bordering the host [10]. Thus, outsmarting both our stomach acid and our immune system, H. pylori can stay as an unwelcome guest for as long as it pleases.

This is no longer magic, but sorcery. And it is sorcery which should humble us, remind us that the notion of the canonical disease is purely fiction, and warn us never to become complacent about even the most common diseases. Modern science has been remarkable in preventing disease and coming up with new, inventive treatments for patients. But at the same time, it is enough to shroud people in a feeling of medical invincibility that seems to almost phase out the need to be concerned about the human medical condition. And the belief in this myth is an unfortunate ignorance that is symptomatic of a larger ignorance of the physical and base necessities of the human condition. Modern culture has commanded us to reach our heads towards the clouds but has failed to tell us that our feet should not be far from the ground. What I mean is, taking care of oneself and others should not be a novelty.


References
[1] Batten, Jason N., Kruse, Katherine E., Kraft, Stephanie A., Fishbeyn, Bela, & Magnus, David C. (2019, March). What Does the Word "Treatable" Mean? Implications for Communication and Decision-Making in Critical Illness. Critical Care Medicine, 47(3), 369-376. 10.1097/CCM.0000000000003614
[2] Tanner, C. (2020, October 14). BYU-Idaho says students may be trying to get COVID-19 so they can sell their plasma. The Salt Lake Tribune. https://www.sltrib.com/news/education/2020/10/13/byu-idaho-says-students/
[3] NIH. (2014, February). Stop the Spread of Superbugs: Help Fight Drug-Resistant Bacteria. NIH: News in Health. https://newsinhealth.nih.gov/2014/02/stop-spread-superbugs
[4] The New York Times. (2020). Covid in the U.S.: Latest Map and Case Count. The New York Times. https://www.nytimes.com/interactive/2020/us/coronavirus-us-cases.html
[5] Wroblewski, Lydia E., Peek, Richard M., Jr., & Wilson, Keith T. (2010, October). Helicobacter pylori and Gastric Cancer: Factors That Modulate Disease Risk. Clinical Microbiology Reviews, 23(4), 713-739. 10.1128/CMR.00011-10
[6] Kim, S., Choi, D., & Chung, J. (2015, November). Antibiotic treatment for Helicobacter pylori: Is the end coming? World Journal of Gastrointestinal Pharmacology and Therapeutics, 6(4), 183-198. 10.4292/wjgpt.v6.i4.183
[7] Graham, David Y., & Fischbach, L. (2010, August). Helicobacter pylori treatment in the era of increasing antibiotic resistance. Gut, 59(8), 1143-53. 10.1136/gut.2009.192757
[8] Graham, David Y., Lu, H., & Yamaoka, Y. (2007, July). A Report Card to Grade Helicobacter pylori Therapy. Helicobacter, 12(4), 275-8. 10.1111/j.1523-5378.2007.00518.x
[9] International Agency for Research on Cancer. (2020). Agents Classified by the IARC Monographs, Volumes 1-127. World Health Organization. https://monographs.iarc.fr/agents-classified-by-the-iarc/
[10] Wroblewski, Lydia E., Peek, Richard M., Jr., & Wilson, Keith T. (2010, October). Helicobacter pylori and Gastric Cancer: Factors That Modulate Disease Risk. Clinical Microbiology Reviews, 23(4), 713-739. 10.1128/CMR.00011-10



Development and Ethics of Brain-Computer Interfaces
Written by Jeffrey Xiong
Illustrations by Kate Steiner

From Robocop to Blade Runner, the enhancement of the human body through technology has always captured the imagination of the public, promising everything from extraordinary superhuman-like powers to therapeutic benefits. Brain-computer interfaces (BCIs) are the focus of a rapidly growing field of neuroengineering that promises to make at least part of that imagination a reality, with brilliant commercial and clinical possibilities.

So where does the need for BCIs come from? Normally, the brain and the nervous system communicate with each other through small electrical impulses. These impulses transmit information that is important for the brain to process (e.g. pain) or for the nervous system to carry out (e.g. movement). In addition, the brain uses these electrical signals to perform unconscious processes, such as forming memories or breathing. Clinicians and scientists can measure these small impulses with an electroencephalogram (EEG) trace, similar to a heart monitor. However, the interpretation of these brain signals is vastly complicated and has remained one of the most pressing problems in the field of neuroscience.

A heart monitor works by measuring the small electrical signals produced by pacemaker cells in the center of the heart, which tell the heart how and when to beat. Changes in these electrical signals can help clinicians diagnose heart problems, such as a heart attack. In theory, clinicians should be able to use brain signals to detect problems with the brain. The issue with brain signals is that unlike hearts, which have only a small set of cells with electrical activity, brains produce much more complicated electrical signals in enormous quantities, which means that brain monitors give clinicians too much data and too much noise with which to properly evaluate specific functions or malfunctions [1].

For example, anesthesiologists look at EEGs during surgery to discern information on a patient's level of unconsciousness and to adjust their dosage accordingly. If the anesthesiologist subdues the patient's nervous system too much, the patient could never wake up. Likewise, if they don't suppress the nervous system enough, the patient could end up being awake enough to feel pain during the surgery. However, since EEGs provide data that is mostly too complicated for the human mind to interpret, it is very difficult for anesthesiologists to determine the exact degree of unconsciousness a patient is experiencing. This can have intensely problematic consequences: roughly 1 in 1,000 patients during surgery are placed under insufficient anesthesia and are thus able to feel pain during surgery [2].

BCIs are a class of emerging neurotechnologies that aim to assist researchers and doctors in analyzing the brain's signals. BCIs use modern computational tools to extract relevant data (e.g. memory formation, decision-making, awareness) and convert it into something that is easily interpretable on a screen. Usually this is done through non-invasive electrodes attached to the scalp. The electrodes measure electrical changes within the brain and transmit the data to a computational machine that the user wears. While anesthesiologists have difficulty interpreting the entirety of an EEG, a BCI can process the data much more effectively and translate it into something that is easier to understand. Due to their potential and versatility, BCIs have found their way into everyday life. A recent example of BCIs in day-to-day use was when a Chinese primary school had students wear headbands that measured their levels of attention [3]. The BCIs involved in this case graphed brain waves associated with attention, isolating only the specific information needed from the continuous, overwhelming stream of data coming in from the brain.
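As a concrete illustration of the kind of processing involved, here is a minimal Python sketch of one common simplification: reducing a noisy EEG-like signal to the power in a single frequency band. This is our own toy example on synthetic data, not the algorithm of any particular BCI product; the sampling rate and band edges are illustrative assumptions.

```python
import numpy as np
from scipy.signal import welch

fs = 256                                  # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)              # ten seconds of samples
# Synthetic "EEG": a 10 Hz alpha rhythm buried in broadband noise
eeg = 0.5 * np.sin(2 * np.pi * 10 * t) + np.random.randn(t.size)

# Estimate the power spectral density, then integrate the alpha band (8-12 Hz)
freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)
band = (freqs >= 8) & (freqs <= 12)
alpha_power = np.trapz(psd[band], freqs[band])
total_power = np.trapz(psd, freqs)

# One interpretable number out of a noisy stream of thousands of samples
print(f"Relative alpha-band power: {alpha_power / total_power:.1%}")
```

A headband like the one described above would presumably compute indices of this sort continuously, displaying a trend line rather than the raw trace.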


Just as an Apple Watch can measure your heartbeat and tell if you're having a heart attack, a BCI can measure brain signals and tell if you're having problems focusing. BCIs have also had substantial benefits in the clinic. Paralyzed patients who used BCIs in their physical therapy sessions had improved results. In fact, patients using more complex BCIs have demonstrated recovery of motor function in paralyzed areas [4]. With regard to advanced neurological diseases, such as amyotrophic lateral sclerosis (ALS) or Guillain–Barré syndrome, patients have undergone physical therapy with BCIs analyzing changes in mobility to provide more effective therapy; BCIs have also helped patients communicate through tracking eye movement [5]. In addition, BCIs can potentially help recover lost emotional capacity (e.g. recovery from depressive states) in neuropsychiatric disorders, a potential path in future mental health treatment [4].


BCIs are usually designed specifically for certain goals, which makes them easily employable in a variety of situations and industries. However, the common thread is the same: taking brain data, storing it within the BCI or in a cloud database, and applying proprietary algorithms to output something useful. Many commercial models are now rapidly entering the market, offering widespread commercial benefits such as helping people learn faster or improving meditation and focus [6-7]. All in all, new technologies have led to rapid growth: the BCI industry was valued at $1.2 billion in 2019 and is projected to more than triple to $3.7 billion by 2027 [8].

But the widespread use of BCIs has also opened up a Pandora's box of ethical concerns, ranging from hacking to surveillance to even brainwashing. As such, the development and use of BCIs should proceed with extreme caution amidst the complex modern debate between technology and privacy. Given the benefits of BCIs and the rate of their expansion, it is difficult to imagine a future without their use in daily life [9]. However, some of the proposed benefits of BCIs can also serve as major drawbacks. Neuromarketing, or the use of brain wave data in advertising, has already been employed by major companies like Google, Disney, and CBS [10]. Generally, neuromarketing entails tracking changes in brain activity through BCIs, which is then used to craft marketing techniques and ads that manipulate unconscious processes, similar to how browsing data is used [10]. Though this will be a major boon for the marketing industry, popular adoption of BCIs will incentivize companies to take advantage of greater data availability and invade customer privacy. This is exacerbated by the lack of any present protections against brain data collection [10].

The popular adoption of BCIs also opens up the possibility of hacking. In the past, electronic implants such as defibrillators or insulin pumps have been targeted remotely, in some cases resulting in death [11]. Since BCIs deal directly with the brain, hacking can only become more dangerous. Major concerns include the potential for BCI hacking to alter motor control, cause immense pain, and modify emotions [11]. As with phones, it can be difficult to even track down, let alone prosecute, hackers, due to institutional weaknesses in law enforcement and strategy planning [12]. Furthermore, the quantity of data stored on BCIs makes them very attractive targets, since each cyberattack would yield far more data (and far more personal data) than a normal cyberattack on a computer or phone. As a result, hacking may become exponentially more likely, and the consequences are dire.

BCIs also pose a threat in the form of authoritarian control. In China, some factories have begun requiring workers to wear brainwave trackers, ostensibly to increase efficiency and safety, but nevertheless raising concerns of invasion of privacy and unwarranted surveillance [13]. After international outrage over the earlier example of BCIs used to track attention in schools, the company responsible pulled out, but similar projects have continued elsewhere, such as in some Boston schools, further demonstrating that institutions anywhere can be lured astray by the promises of BCIs [14-15]. In addition, certain techniques like transcranial magnetic stimulation (TMS) can be used to alter political beliefs, which has major implications for brainwashing [16]. Transcranial magnetic stimulation, which uses magnetic fields to excite neurons in the brain in particular patterns, is often paired with BCIs [17]. Normally, it is used to treat depression and other psychiatric illnesses, and it is currently being applied in BCI research as well, to positive effect [16]. Through this research, it was found that such stimulation can end up altering or skewing one's mentality or beliefs [16]. Combined with the likelihood of hacking, this makes BCIs particularly dangerous in the hands of rogue actors. If the opinions of the populace can be altered with ease, democracy and free will may be rendered moot.

The European Union's Charter of Fundamental Rights states that each individual has the right to physical and mental integrity. Especially important is the right to assert the cognition of the self, which lays the foundation for greater rights, such as those of speech, press, and religion [8]. Inherent vulnerabilities in BCIs will make the violation of these important rights inevitable if stricter regulations are not added. There are certainly great benefits to BCIs, clinically and commercially speaking, and they could very well become a major part of everyday life akin to smartphones, which themselves have very similar benefits. And like smartphones, advanced BCIs have ethical ramifications that make their future deeply concerning. From hacking to surveillance to brainwashing, BCI development is weighed down by the possibility of unethical usage. Moving forward, it would be prudent to severely limit the commercial development of BCIs with strict legislation to prevent abuse. Though this may limit research and innovation, that is a small price to pay for the integrity and freedom of people around the world.


References
[1] Purdon, P. L., Sampson, A., Pavone, K. J., & Brown, E. N. (2015). Clinical Electroencephalography for Anesthesiologists. Anesthesiology, 123(4), 937-960. doi:10.1097/aln.0000000000000841
[2] American Society of Anesthesiologists. (2020). Waking Up During Surgery - Made for This Moment. Retrieved November 20, 2020, from https://www.asahq.org/madeforthismoment/preparing-for-surgery/risks/waking-up-during-surgery/
[3] Jing, M., & Soo, Z. (2019, April 9). Brainwave-tracking start-up in controversy over tests on children. South China Morning Post. https://www.scmp.com/tech/start-ups/article/3005448/brainwave-tracking-start-china-schoolchildren-controversy-working
[4] Shanechi, M. M. (2019). Brain–machine interfaces from motor to mood. Nature Neuroscience, 22(10), 1554–1564. https://doi.org/10.1038/s41593-019-0488-y
[5] Blasco, J. S., Iáñez, E., Úbeda, A., & Azorín, J. (2012). Visual evoked potential-based brain–machine interface applications to assist disabled people. Expert Systems with Applications, 39(9), 7908–7918. https://doi.org/10.1016/j.eswa.2012.01.110
[6] Livni, E. (2019, February 21). You may already know the best brain hack of all. Quartz. https://qz.com/1538407/learning-faster-might-be-possible-with-this-wearable-headset/
[7] Praderio, C. (2019, January 7). I tried meditating with a $249 headband that gives real-time feedback on your brain activity. Here's what I thought. Insider.
[8] Grandview Research. (2020, February). Brain Computer Interface Market Size: Industry Report, 2027. https://www.grandviewresearch.com/industry-analysis/brain-computer-interfaces-market
[9] Park, C. S., & Kaye, B. K. (2018). Smartphone and self-extension: Functionally, anthropomorphically, and ontologically extending self via the smartphone. Mobile Media & Communication, 7(2), 215–231. https://doi.org/10.1177/2050157918808327
[10] Ienca, M., & Andorno, R. (2017). Towards new human rights in the age of neuroscience and neurotechnology. Life Sciences, Society and Policy, 13(1). https://doi.org/10.1186/s40504-017-0050-1
[11] Pycroft, L., Boccard, S. G., Owen, S. L., Stein, J. F., Fitzgerald, J. J., Green, A. L., & Aziz, T. Z. (2016). Brainjacking: Implant Security Issues in Invasive Neuromodulation. World Neurosurgery, 92, 454–462. https://doi.org/10.1016/j.wneu.2016.05.010
[12] Eoyang, M., Allison, P., Ishan, M., & Brandon, G. (2018, October 29). To Catch a Hacker: Toward a Comprehensive Strategy to Identify, Pursue, and Punish Malicious Cyber Actors. Third Way. https://www.thirdway.org/report/to-catch-a-hacker-toward-a-comprehensive-strategy-to-identify-pursue-and-punish-malicious-cyber-actors
[13] Chan, T. F. (2018, May 1). China is monitoring employees' brain waves and emotions - and the technology boosted one company's profits by $315 million. Business Insider. https://www.businessinsider.com/china-emotional-surveillance-technology-2018-4
[14] Johnson, S. (2018, December 27). This Company Wants to Gather Student Brainwave Data to Measure 'Engagement'. EdSurge. https://www.edsurge.com/news/2017-10-26-this-company-wants-to-gather-student-brainwave-data-to-measure-engagement
[15] Ebben, P. (2019, December 17). Catholic Memorial Students Use Headbands To Harness Brainpower. CBS Boston. https://boston.cbslocal.com/2019/12/16/catholic-memorial-brainco-headset-technology/
[16] Holbrook, C., Izuma, K., Deblieck, C., Fessler, D. M. T., & Iacoboni, M. (2015). Neuromodulation of group prejudice and religious belief. Social Cognitive and Affective Neuroscience, 11(3), 387–394. https://doi.org/10.1093/scan/nsv107
[17] Mayo Clinic. (2018, November 27). Transcranial magnetic stimulation. Retrieved November 20, 2020, from https://www.mayoclinic.org/tests-procedures/transcranial-magnetic-stimulation/about/pac-20384625



CHANGING THE VACCINE PARADIGM
Written by Anuva Banwasi
Illustrated by Sabrina Rustgi

Connecting the Past and Present: A Brief History of Vaccines

While the COVID-19 pandemic has shed light on problems in our healthcare system, it has also illuminated opportunities for novel solutions in medicine. One particular field that is receiving much more attention as a result of the ongoing pandemic is vaccinology, the study of vaccines. For many people, receiving a vaccine means waiting anxiously in a hospital waiting room to experience a painful injection. However, this injection is not the vaccine itself, but rather the vehicle for vaccine delivery. A vaccine, at the most basic level, is intended to provide immunity from an infectious disease by stimulating the body to respond to a biological agent. While hypodermic injection with a needle is the most common vaccine administration method, several new delivery technologies are just now being developed, such as edible vaccines, nasal sprays, and more.

Vaccines first emerged in response to diseases like polio and measles and dramatically changed the health landscape. When Jonas Salk developed the polio vaccine, for example, cases in the United States decreased from nearly 17,000 in 1954 to less than 1,000 in 1962 [1]. However, when compared to innovation in other areas of medicine, such as cancer therapies and antibiotics, the method of producing and delivering vaccines has stayed relatively constant. For over 70 years, many vaccines, including the annual influenza vaccine, have been developed through an egg manufacturing process in which the virus is injected and subsequently incubated in fertilized hen eggs [2].

Scientists then harvest these viruses and kill them with chemicals such as formaldehyde to create an inactivated, or weakened, version of the virus to serve as a vaccine [2]. When injected into the body, these weakened viruses stimulate memory B cells and antibody-producing cells so that they can fight future infections.

SARS-CoV-2, commonly known as the novel coronavirus, has posed many challenges for vaccine development. Growing and harvesting viruses for traditional inactivated vaccines takes time and resources; in fact, researchers have already found that the coronavirus cannot be replicated in eggs using traditional methods [3]. As a result, many scientists are pioneering new methods for developing vaccines. Two American biotechnology companies, Moderna and Pfizer, have developed mRNA-based vaccines that both demonstrated over 90% efficacy. Instead of containing viral proteins, these vaccines contain mRNA, which serves as a biological message or "recipe" that cells use to make viral proteins themselves, which in turn trigger the production of antibodies [4].

However, developing a safe and effective vaccine is only one part of the challenge. The next big question is scaling up production and distribution. The World Health Organization (WHO) estimated in 2019 that 19.7 million children under the age of 1 did not receive the most basic vaccines [5]. That's nearly 20 million children left defenseless against a range of diseases, from polio to hepatitis. In addition to creating a vaccine, it is crucial to make such a product accessible. Therefore, we need just as much of an innovative approach to vaccine production and distribution as we do to vaccine development.

Understanding the Challenges in Access to Vaccines

Before diving into potential solutions for vaccine production and distribution, we first need to understand the problem. The key issue with vaccine distribution lies in the cold chain: just like fruits and vegetables, vaccines must be kept cold throughout transportation. Exposure outside the 2℃ to 8℃ temperature range renders most vaccines useless [6]. In areas without refrigeration or access to stable power, this poses serious problems. For example, a study across ten states in India found that up to two-thirds of vaccines are damaged by temperature exposure during transportation [7]. In fact, the WHO estimates that around 50 percent of vaccines are wasted, in large part due to such temperature control issues [8].
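To make the monitoring problem concrete, here is a minimal sketch of the kind of check a shipment data logger enables. The 2℃ to 8℃ window comes from the article; the readings, function name, and one-reading-per-hour assumption are invented for illustration.

# Minimal cold-chain sketch: flag readings from a hypothetical shipment
# logger that fall outside the safe 2-8 degrees Celsius window.
SAFE_MIN_C = 2.0
SAFE_MAX_C = 8.0

def find_excursions(readings_c):
    """Return (hour, temperature) pairs that fall outside the safe window."""
    return [(hour, t) for hour, t in enumerate(readings_c)
            if not SAFE_MIN_C <= t <= SAFE_MAX_C]

# Invented hourly log: a power outage around hours 5-7 lets the box warm up.
log = [4.1, 4.3, 5.0, 5.2, 6.8, 9.4, 11.2, 10.5, 7.9, 6.0]
for hour, temp in find_excursions(log):
    print(f"hour {hour}: {temp:.1f} C is out of range; doses may be unusable")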

To better understand this issue beyond facts and figures, I spoke with Marsiela Sawka, a nurse at the Premier Hospital in Mombasa, Kenya. During our interview, Sawka described how the hospital has to throw out 30 percent of the vaccines it receives. She cited faulty refrigerators, power outages, and exposure during transportation as the primary causes of this wastage. For instance, she mentioned that many vaccines are exposed to temperatures outside the 2℃ to 8℃ range when being transported from main hospitals to regional clinics. Speaking with Sawka, I realized that vaccines encompass much more than research and development in the lab. To truly improve global health, we must solve these fundamental issues in production and distribution.



Escaping the Cold Chain

To help reduce vaccine wastage, some companies are working on better storage containers that keep vaccines cold. One challenge with such endeavors, however, is that high manufacturing costs often make the containers unaffordable for the rural hospitals and clinics that need them most [5]. For example, Sawka explained that most hospitals are unlikely to invest in costly new storage technology and would rather continue using traditional cold boxes, even if they result in vaccine wastage.

This raises a question: what if we broke free of the cold chain altogether? Could vaccines be stabilized so that they can be stored at room temperature and no longer require cold-chain transport? Beyond reducing dependency on the cold chain, such a development could lower maintenance costs and increase access to vaccines. One promising idea is an orally dissolving film vaccine: the vaccine would first be stabilized in a sugar or polymer matrix, then dried and cut into films that dissolve in the mouth. Researchers at the University of Texas at Austin have shown that vaccines can be stabilized in a sugar matrix and stored at room temperature for up to three months [9]. Another group, at McMaster University, described a similar procedure for stabilizing vaccines in a recent paper in Scientific Reports [10]. By analyzing different combinations of pullulan and trehalose, they found an optimal mixture with which to “immobilize” the vaccine and render it temperature-stable. The group then found that all mice treated with the films survived infection with Herpes Simplex Virus Type 2 (HSV-2) [10]. These results suggest that orally dissolving films could serve as a viable method of vaccine delivery.

A successful orally dissolving film vaccine could drastically transform our approach to vaccination. Such a vaccine could potentially be self-administered, much like over-the-counter medications, which would be especially useful in rural areas that lack access to health clinics. Moreover, traditional injectable vaccines require glass vials for storage and transportation, and there are concerns about potential shortages of these materials [11]. An orally dissolving film vaccine would sidestep this problem, since it requires none of the glass vials, syringes, swabs, and other equipment needed for traditional injections.


Scaling up Production: from Lab to Patient

The key to any development in medicine, especially one as important as vaccines, is tackling the challenge of scaling up production. Researchers have successfully stabilized vaccines as orally dissolving films in the lab, but how do we design an efficient production process that brings them to the everyday person? In industry, orally dissolving films are typically manufactured with high heat to speed up the drying process [12]. That heat, however, could destroy vaccines, which are often heat-sensitive. Researchers are therefore working on a new way of producing orally dissolving films based on printing technologies such as inkjet and flexographic printing: the vaccine is printed onto films that can then be fan-dried with low-temperature air [12]. This procedure could both increase production efficiency and preserve the integrity of the vaccine.

The COVID-19 pandemic has taught us all to rethink the way we do medicine. Even once we have safe and effective vaccines for the coronavirus, we must ensure that they are easily accessible to all. Novel delivery methods such as orally dissolving film vaccines could be especially useful in areas with very low immunization rates due to challenges in storage and distribution. For both this pandemic and future crises, it is crucial that we explore new techniques and invest time and energy now to scale up vaccine production and distribution.

References
[1] Centers for Disease Control and Prevention. (1999, April 2). Achievements in public health, 1990-1999: Impact of vaccines universally recommended for children - United States. Retrieved from https://www.cdc.gov/mmwr/preview/mmwrhtml/00056803.htm
[2] World Health Organization. (2009, August 6). Pandemic influenza vaccines manufacturing process and timeline. Retrieved from https://www.who.int/news/item/06-08-2009-pandemic-influenza-vaccine-manufacturing-process-and-timeline
[3] Barr, I., Rynehart, C., Whitney, P., & Druce, J. (2020). SARS-CoV-2 does not replicate in embryonated hen’s eggs or in MDCK cell lines. Retrieved from https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7331139/
[4] Centers for Disease Control and Prevention. Retrieved from https://www.cdc.gov/coronavirus/2019-ncov/vaccines/about-vaccines/how-they-work.html
[5] Okwo-Bele, J.-M. (2015, April 22). Together we can close the immunization gap. World Health Organization. Retrieved from https://www.who.int/mediacentre/commentaries/vaccine-preventable-diseases/en/
[6] Ashok, A., Brison, M., & LeTallec, Y. (2017). Improving cold chain systems: Challenges and solutions. Vaccine, 35(17), 2217-2223. https://www.sciencedirect.com/science/article/pii/S0264410X16307307
[7] Murhekar, M. V., Dutta, S., et al. (2013, December 1). Frequent exposure to suboptimal temperatures in vaccine cold-chain system in India: Results of temperature monitoring in 10 states. Retrieved from https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3845272/
[8] Kartoglu, U. (2005, May). Monitoring vaccine wastage at country level: Guidelines for programme managers. World Health Organization. Retrieved from https://apps.who.int/iris/bitstream/handle/10665/68463/WHO_VB_03.18.Rev.1_eng.pdf?sequence=1&isAllowed=y
[9] Croyle, M. A., Bajrovic, I., Schafer, S. C., & Romanovicz, D. K. (2020). Novel technology for storage and distribution of live vaccines and other biological medicines at ambient temperature. Science Advances, 6(10). https://doi.org/10.1126/sciadv.aau4819
[10] Leung, V., Mapletoft, J., Zhang, A., et al. (2019). Thermal stabilization of viral vaccines in low-cost sugar films. Scientific Reports, 9, 7631. https://doi.org/10.1038/s41598-019-44020-w
[11] Janßen, E. M., Schliephacke, R., Breitenbach, A., & Breitkreutz, J. (2013). Drug-printing by flexographic printing technology: A new manufacturing process for orodispersible films. International Journal of Pharmaceutics, 441(1-2). https://doi.org/10.1016/j.ijpharm.2012.12.023
[12] Janßen, E. M., Schliephacke, R., Breitenbach, A., & Breitkreutz, J. (2013). Drug-printing by flexographic printing technology: A new manufacturing process for orodispersible films. International Journal of Pharmaceutics, 441(1-2). https://doi.org/10.1016/j.ijpharm.2012.12.023



Time in a Timeless Universe

Written by Jenna Everard & Illustrated by Zoe Heidenry

Whether we’re running late for a meeting, trying to catch up on our favorite show, or finishing that last essay in the Butler stacks at 11 o’clock on a Sunday night, there is something we never seem to have enough of: time. We’re so fixated on this concept that we structure our entire society around it; yet, simultaneously, we struggle to define it. We all experience the passage of time in our daily lives, but we have never seen or even felt it. So then, what is time? Can time be influenced to change? Does time actually exist? These are all questions that scientists today are still asking, and ones that we will attempt to explore here.

Perhaps one reason time cannot be easily defined is that we each perceive it differently. Generally, people can be grouped into two categories based on their perception of time, ego-moving or time-moving, each with its own psychological implications [1]. One of the most common ways to categorize people is through a prompt such as this: “You have a meeting scheduled for Tuesday. It has now been moved forward by a day. What day is the meeting on now?” Those who answer “Monday” have a time-moving perspective: they see time as moving while they themselves remain stationary. Those who answer “Wednesday” have an ego-moving perspective: they see themselves as moving through time. Psychologists have spent years delving into the implications of this difference in perception, and many believe it reflects different situations and human emotions [1].

As odd as it may be to think that people experience time in different manners, these differences actually go beyond mere perception. In fact, time may pass differently for people in different situations. The story begins with the seemingly unrelated Michelson and Morley experiments of the 1880s [2]. In those days, physicists attempting to understand the propagation of light believed in the existence of a “luminiferous ether,” a physical medium through which light traveled. If such a medium existed, Michelson and Morley postulated, the velocity of a beam of light would differ based on the direction it traveled through this ether. However, in testing their hypothesis, they did not observe any significant differences. Regardless, scientists continued to push the concept of the ether until about 20 years later, when Einstein presented his theory of special relativity [3].


For the basis of his theory, Einstein relied on an alternative explanation for the propagation of light: that it travels through a vacuum at a constant speed [3]. Hence no medium, no luminiferous ether, was needed, and the theory revealed two important things about time [3]. The first is that time is truly relative: it depends on an individual’s point of reference and their motion relative to their surroundings. This realization stemmed from the constancy of the speed of light. Einstein observed that if one could travel alongside a beam of light at the same speed, then, to that person, the light would appear to have a relative speed of zero [4]. This contradicts the constancy of the speed of light, which led Einstein to a second important idea about time: the concept of spacetime. He realized that if the speed of light must be the same for all observers, the apparent paradox could only be resolved by variations in physical space and relative time. Space and time are therefore intertwined; together they drive relativity in a coordinate grid of reference frames [3]. By 1915, this theory had evolved into the theory of general relativity, which proposed that time additionally depends on gravity [3].

This leads us to a consequence of Einstein’s conclusions: the phenomenon of time dilation. If time is relative, then different factors can influence how one experiences it, most prominently velocity and gravity. Velocity time dilation follows from special relativity, which relied heavily on the preexisting Lorentz transformations [3, 5]. These are a set of equations within a spacetime coordinate grid that relate two objects moving at different velocities [3]. From these equations, it follows that time passes more slowly for an object traveling at a higher velocity than for its reference point. Gravitational time dilation was derived from the subsequent theory of general relativity. Because that theory asserts that objects of greater mass produce larger distortions in spacetime, it follows that time passes more slowly for objects under a greater gravitational pull [5].
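The next paragraphs quote specific figures for GPS clocks and near-light-speed travel. As a sanity check, the short script below reproduces them from the standard first-order formulas for velocity and gravitational time dilation; it is a sketch, and the orbital constants are common approximate values rather than figures from the article.

import math

C = 299_792_458.0      # speed of light (m/s)
G = 6.674e-11          # gravitational constant (m^3 kg^-1 s^-2)
M_EARTH = 5.972e24     # mass of Earth (kg)
R_EARTH = 6.371e6      # radius of Earth (m)
R_GPS = 2.6571e7       # GPS orbital radius, ~20,200 km altitude (m)
SECONDS_PER_DAY = 86_400.0

def lorentz_factor(v):
    """gamma = 1/sqrt(1 - v^2/c^2): a clock moving at speed v ticks 1/gamma as fast."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

# Special relativity: the satellite's orbital motion makes its clock run slow.
v_gps = math.sqrt(G * M_EARTH / R_GPS)  # circular orbital speed, ~3,874 m/s
slow_us = (lorentz_factor(v_gps) - 1.0) * SECONDS_PER_DAY * 1e6

# General relativity (first order): the weaker gravity aloft makes it run fast.
fast_us = (G * M_EARTH / C**2) * (1.0/R_EARTH - 1.0/R_GPS) * SECONDS_PER_DAY * 1e6

print(f"velocity effect:      {slow_us:.1f} microseconds/day slower")          # ~7.2
print(f"gravitational effect: {fast_us:.1f} microseconds/day faster")          # ~45.7
print(f"net effect:           {fast_us - slow_us:.1f} microseconds/day faster")  # ~38.5

# Near-light-speed travel: onboard time passes at 1/gamma the Earth rate.
for frac in (0.95, 0.999999):
    g = lorentz_factor(frac * C)
    print(f"at {frac * 100:g}% of c: clocks tick at {1.0/g:.3f} of the Earth rate (gamma = {g:.1f})")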


Though it may be hard to imagine that time proceeds at different rates, time dilation is routinely observed in the clocks aboard GPS satellites [6]. It has been experimentally determined that, due to velocity time dilation alone, GPS clocks in orbit run about 7 microseconds slower per day, and that, due to gravitational time dilation alone, they run about 45 microseconds faster per day [6]. The impact of the weaker gravity is more prominent, so the net effect is that these clocks run about 38 microseconds faster per day than clocks on Earth, and scientists have had to adjust the satellite clocks to account for this discrepancy.

You might wonder why this matters, given how minuscule the dilation of these clocks is. While we can only produce and observe small dilations today, future technology may change this. Our fastest spacecraft to date, part of NASA’s New Horizons mission, traveled at around 16,111 meters per second, still far slower than the speed of light at roughly 300,000,000 meters per second [7]. As we develop technology that travels closer to the speed of light, the time dilation we can experience will grow accordingly. Scientists estimate that if a spaceship could travel at 95 percent of the speed of light, time for the astronauts inside would pass at about 0.3 times the rate it passes for the rest of us on Earth [8]. This effect is summarized by the twin paradox, a scenario in which an astronaut travels in a hypothetical spaceship for what they perceive as three years while their twin remains on Earth; the astronaut returns to find that ten years have passed and that they are now younger than their twin.

Many have used such paradoxical examples of time dilation to argue for the potential of time travel. After all, it could be said that the astronaut in the twin paradox traveled forward in time, living only three years and returning to Earth ten years in the future. At even higher velocities, larger “time jumps” may be possible: one estimate holds that one year in space traveling at 99.9999 percent of the speed of light would be equivalent to about 700 Earth years [9]. Although the logic behind these calculations is scientifically and mathematically sound, there are many missing pieces, including the absence of any technology that can travel anywhere near that fast and the question of whether humans could survive such travel.

Though we may have to be content with living out our time travel dreams through science fiction for now, the implications of time dilation continue to raise important questions about our understanding of our lives and the universe as a whole. We draw clear distinctions between the past, the present, and the future, but if time can be dilated, the lines between them begin to blur. It may well be that the concept we know as time applies only to our minute piece of the overall universe [10]. Even if it is not fundamental to the larger universe, time gives us the means to develop a general understanding of our place in it, to draw relations between different entities, and to create a sense of order and structure. We are a society run by time, a concept modeled on a timeless universe.

REFERENCES
[1] Lee, A., & Ji, L. J. (2014). Moving away from a bad past and toward a good future: Feelings influence the metaphorical understanding of time. Journal of Experimental Psychology: General, 143(1), 21-26. doi:10.1037/a0032233
[2] Jones, A. Z. (2019, February 28). History of the Michelson-Morley experiment. Retrieved from https://www.thoughtco.com/the-michelson-morley-experiment-2699379
[3] Jones, A. Z. (2020). Einstein’s theory of relativity. Retrieved from https://www.thoughtco.com/einsteins-theory-of-relativity-2699378
[4] Howell, E. (2017). Einstein’s theory of special relativity. Retrieved from https://www.space.com/36273-theory-special-relativity.html
[5] Matson, J. (2010). How time flies: Ultraprecise clock rates vary with tiny differences in speed and elevation. Retrieved from https://www.scientificamerican.com/article/time-dilation/
[6] Van Sickle, J., & Dutton, J. A. (2020). The satellite clock. Retrieved from https://www.e-education.psu.edu/geog862/node/1714
[7] Redd, N. T. (2017). How long does it take to get to Mars? Retrieved from https://www.space.com/24701-how-long-does-it-take-to-get-to-mars.html
[8] Uses and examples of time dilation. Retrieved from https://users.sussex.ac.uk/~waa22/relativity/Uses_of_Time_Dilation.html
[9] Deel, G. (2019). Immortality, space travel and Einstein’s time dilation. Retrieved from https://inspacenews.com/immortality-einsteins-time-dilation/
[10] Callender, C. (2010). Is time an illusion? Scientific American, 302(6), 58-65. Retrieved November 13, 2020, from http://www.jstor.org/stable/26002066



Leveling the Playing Field:

A Look into Mother Nature’s Balancing Act and the Heterozygous Advantage

Written by Allison Lin & Illustrated by Aeja Rosette

On average, adult humans have roughly 1.2 to 1.5 gallons of blood in their bodies, making up about 8 percent of their body weight [1]. This essential liquid is responsible for transporting nutrients, oxygen, and necessary proteins throughout the body’s organ systems. Disturbances to the delicate composition of blood can jeopardize homeostasis and seriously affect one’s health [1]. Within the blood, red blood cells (RBCs) are especially important because they carry hemoglobin, a protein that delivers oxygen to the rest of the body. Having too few RBCs is characteristic of anemia, a class of blood disorders that can leave a person feeling tired or irritable [1]. This deficiency can have various underlying causes: the body might be making too few RBCs, destroying its supply of RBCs, or losing RBCs through other mechanisms [1]. While some types of anemia are genetically inherited, others can be due to malnutrition, autoimmune diseases, or sometimes even pregnancy [1].

One specific form of genetically inherited anemia is sickle cell anemia (SCA), which affects the protein hemoglobin, converting it to a form known as “hemoglobin S” [2]. This conversion causes typically round red blood cells to take on a sickle, or crescent, shape [3]. The sickled RBCs are stiff and inflexible, so they easily aggregate in small blood vessels and block oxygen-rich blood flow [3]. Sickled red blood cells can also break down prematurely, further contributing to an oxygen shortage for critical organs [3]. Ultimately, SCA can trigger pulmonary hypertension (high blood pressure in the vessels between the heart and lungs), which can strain the heart and result in heart failure [3].

Humans naturally possess two copies of every gene, one from the maternal side and one from the paternal side. For a given gene, heterozygotes have two different versions of that gene, while homozygotes have two identical copies [4]. Therefore, a recessive disease, one that requires both copies of a gene to carry the disease-causing mutation, will only manifest in homozygous individuals [4].
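To see how carrier status and disease frequency relate numerically, here is a back-of-the-envelope Hardy-Weinberg sketch; the allele frequency is an illustrative value chosen for the example, not a figure from the article.

# Minimal Hardy-Weinberg sketch: with a recessive disease allele at
# frequency q, carriers (2pq) far outnumber affected homozygotes (q^2).
# q = 0.05 is an illustrative value, not a measured population figure.
q = 0.05           # frequency of the mutated (recessive) allele
p = 1.0 - q        # frequency of the normal allele

unaffected = p * p      # two normal copies
carriers   = 2 * p * q  # heterozygotes: one mutated copy, no disease
affected   = q * q      # homozygotes: two mutated copies, disease manifests

print(f"unaffected: {unaffected:.4f}")  # 0.9025
print(f"carriers:   {carriers:.4f}")    # 0.0950 -> roughly 1 in 11
print(f"affected:   {affected:.4f}")    # 0.0025 -> roughly 1 in 400

With these numbers, roughly one person in eleven carries the mutation silently while only one in four hundred is affected, which is why recessive mutations can persist quietly in a population.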

Accordingly, heterozygous carriers possess only one copy of the mutated hemoglobin gene and do not develop sickle cell anemia, but they are called “carriers” precisely because they can pass the mutated gene down to their offspring [4]. Moreover, these carriers possess an interesting additional trait: they are protected against malaria, a mosquito-borne disease caused by a parasite that infects red blood cells and releases toxic agents such as hemozoin pigment [4].


Parasitic components released by infected RBCs when they lyse (burst) cause the human host to experience a plethora of flu-like symptoms at best, and organ complications or even death at worst [5]. More common in developing countries, malaria is a deadly disease, responsible for more than 228 million cases in 2018 [5]. In a series of studies, Miguel P. Soares and Ana Ferreira at the Gulbenkian Institute of Science in Portugal investigated the connection between HBB (hemoglobin subunit beta), the gene that codes for the beta subunit of hemoglobin, and protection against malaria [6]. They discovered that heme, a component of hemoglobin, increased the host’s tolerance of infection by the malaria parasite [6]. Additionally, the researchers found that the unusual sickle shape of RBCs, caused by one copy of the mutated HBB gene, led to porous membranes that limited parasitic growth and proliferation [6].

Ultimately, it is natural to wonder about the benefits and detriments of HBB gene mutations. Did Mother Nature intend for the sickle cell trait to be simply a disease, or a protective mechanism against malaria? Today, malaria is prevalent mainly in tropical areas such as Sub-Saharan Africa and parts of Latin America, even though it seems to have first appeared almost 30 million years ago [8]. It is believed that when early human ancestors inhabited the tropical regions where malaria was most common, an individual was more likely to die of malaria than of sickle cell anemia. This hypothesis was proposed by J.B.S. Haldane in the 1940s and investigated in greater detail by researcher A.C. Allison in 1954 [10]. Thus, before humans migrated out of Africa, the sickle cell trait actually helped humanity survive and, as a result, was favored by natural selection: those with the trait were able to survive in malaria-infested climates and pass it on to their offspring [10].
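The Haldane-Allison argument can be made concrete with a toy balancing-selection model. In the sketch below, the fitness values are invented for illustration only: individuals with no sickle allele suffer malaria mortality, individuals with two copies suffer sickle cell anemia, and carriers escape both. Under such assumptions the mutated allele is neither eliminated nor fixed; it settles at an intermediate equilibrium frequency.

# Toy heterozygote-advantage (balancing selection) model.
# Fitness values are illustrative only, not measured coefficients:
#   AA (no sickle allele)   -> vulnerable to malaria
#   AS (carrier)            -> protected from malaria, no SCA
#   SS (two sickle alleles) -> sickle cell anemia
W_AA, W_AS, W_SS = 0.8, 1.0, 0.2

q = 0.01  # starting frequency of the sickle allele S
for generation in range(200):
    p = 1.0 - q
    mean_w = p*p*W_AA + 2*p*q*W_AS + q*q*W_SS
    # standard single-locus selection recursion for the allele frequency
    q = (p*q*W_AS + q*q*W_SS) / mean_w

# With heterozygote advantage, q converges to s_AA / (s_AA + s_SS),
# where s is the selection coefficient against each homozygote.
s_AA, s_SS = 1 - W_AA, 1 - W_SS
print(f"simulated equilibrium q: {q:.3f}")
print(f"predicted equilibrium q: {s_AA / (s_AA + s_SS):.3f}")  # 0.200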

Yet there’s more to the story of Mother Nature’s apparent mistakes. Consider the similarly curious connection between cystic fibrosis (CF) and cholera. Like sickle cell anemia, CF is an autosomal recessive disease, and it can ultimately lead to respiratory failure. It is caused by mutations in the gene that codes for a chloride channel called the cystic fibrosis transmembrane conductance regulator (CFTR) [11]. Homozygous recessive individuals with CF lack normal function of these important proteins and are unable to regulate the volume of mucus secretions that occur naturally in the body. Insufficient mucus clearance leads to a higher risk of pulmonary infections [11]. Moreover, the epithelial cells lining the small and large intestines, which are responsible for nutrient absorption, are similarly affected by the lack of functioning CFTR channels [11]. This results in major digestive failures and can cause delayed growth, diabetes, liver disease, and other intestinal issues [11]. In addition, CF patients experience a host of symptoms including, but not limited to, damaged airways (seen in abnormal scarring, widening, or narrowing of the bronchial tubes), coughing up blood, and even complete respiratory failure [11].

The fatality rate of CF is notoriously high, but a notable spike in CF carriers, those with only one copy of the mutated gene who did not present with the actual disease, was observed in nineteenth-century Europe, around the time of the continent’s first cholera outbreaks [12]. John Snow, known today as the “Father of Epidemiology,” was the first to pinpoint the source of one such outbreak when he discovered that people who drank from a specific water pump went on to experience violent diarrhea and extreme dehydration [12]. Their skin turned blue, and some developed lethargy or muscle weakness due to a lack of electrolytes and water [12]. Because of this, cholera is sometimes referred to as the “blue death.”

During this period, why did the rate of CF heterozygosity increase so dramatically? Was it somehow linked to the cholera that was spreading over Europe like a deadly blue mist? In a word, yes. Cholera is caused by the bacterium Vibrio cholerae, whose toxin, upon entering the cell, attaches to guanine-nucleotide-binding proteins [13]. This in turn significantly increases the activity of the CFTR chloride channels. In an individual without the CF mutation, the overactive CFTR channels prompt an efflux of chloride ions out of the cell, along with sodium ions and water, instigating a chain of reactions that leads to excessive diarrhea [13]. In heterozygous individuals with one copy of the mutated CFTR gene, functional CFTR channels are reduced in number and activity, so the reaction is not nearly as extensive. Thus, heterozygotes are somewhat protected against the most serious symptoms of cholera [13].

From these two odd pairings, we can gather that nature is not perfect. Something seen today as a dreaded diagnosis may have conferred a benefit at some point in humanity’s timeline. Just as sickle cell heterozygotes might have had increased chances of survival on the early African savanna, people with only one copy of the gene that causes CF may have been better equipped to survive the cholera outbreaks of nineteenth-century Europe.

A third example of nature’s embedded trade-offs can be seen in the nuanced relationship between Tay-Sachs disease and tuberculosis. A rare inherited disorder, Tay-Sachs is a degenerative disease that primarily affects the central nervous system [14]. Symptoms usually present during infancy and involve impaired development, loss of motor movement, and exaggerated reactions. Further progression of the disease results in loss of cognitive abilities, vision and hearing loss, as well as seizures and paralysis [14]. Late-onset forms that present during adulthood are rarer and usually milder than the infantile forms [14].



Tay-Sachs, a recessive disorder, is caused by a mutation in the HEXA gene that creates a defective lysosomal enzyme known as hexosaminidase A [15]. This enzyme is necessary to break down a certain kind of fatty substance; without a functioning enzyme to break down these lipids, they accumulate to toxic levels in the brain and nerve cells [15]. Usual symptoms include muscle weakness, ataxia (loss of muscle coordination), and varied forms of mental illness [15]. Infantile forms of Tay-Sachs develop due to an almost complete lack of the hexosaminidase A protein, while late-onset individuals have a deficiency, but not a complete loss, of the protein, which explains their milder symptoms [15].

Tay-Sachs arises with greater frequency in Ashkenazi Jews [15]. It has been suggested that this population was subject to greater selective pressure for the gene because of segregation, lack of immigration, and crowded living environments. The greatest frequencies of the mutated gene actually correlate with areas notorious for the frequency of another deadly disease: tuberculosis [16]. A highly infectious disease, tuberculosis is caused by a pathogenic mycobacterium that employs a sneaky tactic against natural immunity in humans and animals: it takes refuge inside macrophages, the white blood cells responsible for eliminating foreign invaders from the body [17].

The secret to the relationship between Tay-Sachs and tuberculosis lies with the same hexosaminidase A enzyme introduced earlier. Heterozygous carriers of Tay-Sachs disease show an increased production of hexosaminidase A, which in turn provides greater protection against tuberculosis infection at the molecular level [11]. If a macrophage has been infected with the tuberculosis mycobacterium, the hexosaminidase enzymes within help restrict intracellular growth, preventing the infected macrophage from spreading the infection and limiting the proliferation of the mycobacteria [11]. Moreover, as the lysosomal enzyme is secreted from the macrophage, it acts on macrophage plasma membranes in a way that prevents the uptake of more mycobacteria into cells [17]. Taken together, Tay-Sachs heterozygotes appear to have cellular protection against tuberculosis, which would explain the correlation observed between the two diseases [16].

Ultimately, it is difficult to determine whether these relationships represent genetic mistakes or benefits conferred by Mother Nature. Yet understanding the biological advantage a disease affords some people, simply because of their susceptibility to another disease, could yield great advances in treatment and therapy. It is worth considering that these three diseases, sickle cell anemia, cystic fibrosis, and Tay-Sachs, may have persisted because they once conferred a benefit, healing before they harmed. Salvation at first, devastation later on.


References
[1] National Heart, Lung, and Blood Institute. (2011). In brief: Your guide to anemia. NIH Publication No. 11-7629A. https://www.nhlbi.nih.gov/files/docs/public/blood/anemia-inbrief_yg.pdf
[2] Killip, S., Bennett, J. M., & Chambers, M. D. (2007). Iron deficiency anemia. American Family Physician, 75(5), 671-678. Erratum in: American Family Physician, 78(8), 914. PMID: 17375513. https://www.aafp.org/afp/2007/0301/p671.html
[3] Genetics Home Reference. (2020, August 18). Sickle cell disease. MedlinePlus. https://ghr.nlm.nih.gov/condition/sickle-cell-disease
[4] US National Library of Medicine. (2020, November 3). Autosomal recessive. MedlinePlus. https://medlineplus.gov/ency/article/002052.htm
[5] Division of Parasitic Diseases and Malaria, Global Health. (2020, July 16). Malaria. Centers for Disease Control and Prevention. https://www.cdc.gov/malaria/about/biology/index.html
[6] Rozenbaum, M. (2019, June 19). How sickle cell protects against malaria. Understanding Animal Research. https://www.understandinganimalresearch.org.uk/news/research-medical-benefits/how-sickle-cell-protects-against-malaria-a-sticky-connection/
[7] Luzzatto, L. (2012). Sickle cell anaemia and malaria. Mediterranean Journal of Hematology and Infectious Diseases, 4(1), e2012065. https://doi.org/10.4084/MJHID.2012.065
[8] Poinar, G. (2005). Plasmodium dominicana n. sp. (Plasmodiidae: Haemospororida) from Tertiary Dominican amber. Systematic Parasitology, 61(1), 47-52. doi:10.1007/s11230-004-6354-6. PMID: 15928991.
[9] Khan Academy. (2020). First humans: Homo sapiens. https://www.khanacademy.org/humanities/world-history/world-history-beginnings/origin-humans-early-societies/a/where-did-humans-come-from
[10] Sabeti, P. (2008). Natural selection: Uncovering mechanisms of evolutionary adaptation to infectious disease. Nature Education, 1(1), 13.
[11] Withrock, I. C., Anderson, S. J., Jefferson, M. A., McCormack, G. R., et al. (2015). Genetic diseases conferring resistance to infectious diseases. Genes & Diseases, 2(3), 247-254. https://doi.org/10.1016/j.gendis.2015.02.008
[12] Cohen-Cymberknoh, M., Shoseyov, D., & Kerem, E. (2011). Managing cystic fibrosis: Strategies that increase life expectancy and improve quality of life. American Journal of Respiratory and Critical Care Medicine, 183(11), 1463-1471.
[13] Gabriel, S. E., Brigman, K. N., Koller, B. H., Boucher, R. C., & Stutts, M. J. (1994). Cystic fibrosis heterozygote resistance to cholera toxin in the cystic fibrosis mouse model. Science, 266(5182), 107-109.
[14] Genetics Home Reference. (2020, August 18). Tay-Sachs disease. MedlinePlus. https://medlineplus.gov/genetics/condition/tay-sachs-disease/#causes
[15] National Human Genome Research Institute. (2011, July 16). About Tay-Sachs disease. https://www.genome.gov/Genetic-Disorders/Tay-Sachs-Disease
[16] Petersen, G. M., Rotter, J. I., Cantor, R. M., Field, L. L., Greenwald, S., Lim, J. S., ... & Kaback, M. M. (1983). The Tay-Sachs disease gene in North American Jewish populations: Geographic variations and origin. American Journal of Human Genetics, 35(6), 1258.
[17] Koo, I. C., Ohol, Y. M., Wu, P., Morisaki, J. H., Cox, J. S., & Brown, E. J. (2008). Role for lysosomal enzyme β-hexosaminidase in the control of mycobacteria infection. Proceedings of the National Academy of Sciences, 105(2), 710-715.





