

www.eusci.org.uk | FREE

How war has moulded technology

Fast fashion’s role in the spread of pandemics

A fresh start for the human race?


23 Dressed to kill - Kasi Patel investigates how the exploitation of materials for the fashion industry is linked to the spread of disease and the emergence of pandemics
7 A bombshell discovery - Helena Cornu investigates how the development of the atomic bomb led to the discovery of stem cell therapy
8 Sexism in science: the women who became radical activists… against their wishes - Inspired by the film Picture a Scientist, Ishbel Dyke tells the stories of the female scientists who paved the way for women to succeed in research, but there is still so much more that needs to change
10 From the ashes: volcanoes and earthquakes - Arianna Schneier examines the reform and response to natural disasters, specifically volcanoes and earthquakes, by focusing on documentation and prevention
12 From the ashes: how war has shaped science - Ailie McWhinnie investigates the ways in which war has moulded the technology in our daily lives and even driven changes in the foundations of the scientific system
14 From the ashes: previous pandemics - Pandemics have historically brought rapid change; why is that, and what has been borne of pandemics past? Cristina Huguet Suárez investigates

17 A new beginning for the Arctic? - Lara Watson investigates the consequences of anthropogenic warming and the effects of ice loss in the Arctic
18 Building back better - new beginnings for the construction industry - Emily Oliver examines new materials revolutionising the construction industry
20 Here for good - Hrichika Nag shows how artificial intelligence is helping scientists to answer some of the biggest questions of our time

24 The case for a green recovery - Tom Edwick discusses how the fate of humankind is dependent on how we choose to rebuild our economies after the pandemic
26 To mask, or not to mask? That is the $28tn question - Ellie Bennett explores the psychology behind people who refuse to follow public health measures during a pandemic that is costing the world dearly
28 The digitalisation of science, education & research - Simar Mann examines how science is adapting to the digital age
30 Dial-a-Doctor: a digital movement of healthcare - Belén Hackel and Tara Wagner-Gamble discuss the digital future of healthcare
32 A new era of plastic pollution - Ian Yang explores microplastic and other newly emergent forms of plastic pollution, alongside ways to manage them

35 Grand designs - Sean Thompson looks at some of the wackier ideas which have been proposed to tackle big issues
36 Living on Mars: a new beginning for the human race? - Jessie Hammond investigates how likely it is that humans will be able to colonise Mars in the future
38 Dancing with the data privacy devil - Hamish Salvesen considers the future implications of a shifting data privacy landscape
40 A new beginning for biology: how the science can get over its crisis of identity - Harin Wijayathunga explores the philosophical underpinnings of biology in an attempt to outline potential changes in the science that can bring about more fundamental and theoretical development

42 Feeding the fire: a fresh start for forestry Fire and forests have a complex relationship, which human interference has turned sour, but Heather Jones explores how we might be able to make things right again

44 Infographic by Katharin Balbirnie-Cumming
45 Top 10 tricks to wipe clean your digital footprint - In an era of almost exclusively online interpersonal communication, many have come to view data privacy with a growing sense of futility; Cerys Walsh shares 10 ways you can take back control of your information online and start anew
46 We need diverse representation in STEM, and social media can help - Under-representation is chronic in science, but we can tackle this through visibility campaigns and positive discrimination, argues Zandile Nare
48 How can we be better scientists? BY WRITING BETTER! - “someone please simplify scientific writing for us smarticle particles who don’t yet know complex science” - Isha Prabhu 2020
49 Superior: The Return of Race Science - Angela Saini’s latest book could reshape how you think about science, writes Harry Carstairs
50 Cometh the hour, cometh the mRNA - Farrie Nzvere follows the ups and downs of vaccine development, which culminate in the timely arrival of a successful RNA vaccine for SARS-CoV-2
52 Paving the road to equality - Kate Summerson celebrates five influential women in STEM, all of whom have contributed to the revolution in diversity that is taking place in the industry today
55 Comic strip by Wohrle Marie-Louise
55 Crossword by Sonia Dahal



Meet the Committee

Yen Peng (Apple) Chew - Editor-in-Chief
Jessie Hammond - Head Sub-Editor
Harry Carstairs - Head Sub-Editor
Clara Morris - Art Editor
Eve Miller - Co-layout Editor
Isha Prabhu - Co-layout Editor
Samantha Cargill - Co-treasurer
Shona Richardson - Co-treasurer
Karolina Zieba - President
Heather Jones - Social Media/Marketing Leader
Ian Yang - Social Media/Marketing Leader
Pligne Pratumsuwan - Outreach Officer
Emily Oliver - Website Editor
Kruthika Sundaram - Website Art Editor
Ishbel Dyke - Blog Editor
Ailie McWhinnie - Blog Editor
Tom Edwick - Podcast Host
Alix Bailie - Podcast Manager
Helena Cornu - Podcast Editor
Ben Oscroft - Social Media Analytics
Diana Barreiros Jorge - Social Media
Simran Kapoor - Managing Director

Sub Editors:

Michael Bell, Madison MacLeay, Tom Edwick, Judah Milvidaite, Samantha Cargill, Mika Kontiainen, Katie Pickup, Hristina Gyurova, Arianna Schneier, Andrew Conaboy, Berengere Digard, Laura Shoveller, Niamh Maher, Zak Campbell-Lochrie, Catriona Roy, Eva Klemencic, Karolina Zieba, Tom Leah, Way Wen Loh, Alyssa Brandt, Yen Peng (Apple) Chew, Anna Norton, Ailie McWhinnie, Jing Qi Chong, Miranda Ferguson, Richenda Rae, Wiktoria Kinder, Madison MacLeay, Jessie Hammond

Copy Editors:

Samantha Cargill, Miranda Ferguson, Emma Wilson, Tom Edwick, Mika Kontiainen, Maria-Mihaela Avram, Linta Nasim, Donia Arafa, Niamh Maher, Hristina Gyurova, Rebecca Atkinson, Tuula Ritakari, Maja Milewska, Max Phillips, Vishal Gulati, Andrew Conaboy, Shona Richardson, Max Phillips, Maja Milewska, Heather Jones, Harry Carstairs, Yen Peng (Apple) Chew

Authors:

Ailie McWhinnie, Arianna Schneier, Cristina Huguet Suárez, Ishbel Dyke, Helena Cornu, Ellie Bennett, Simar Mann, Tom Edwick, Eva Klemencic, Emily Oliver, Hrichika Nag, Lara Watson, Kasi Patel, Belén Hackel, Ian Yang, Heather Jones, Jessie Hammond, Sean Thompson, Hamish Salvesen, Harin Wijayathunga, Zandile Nare, Isha Prabhu, Kate Summerton, Harry Carstairs, Farrie Nzvere, Cerys Walsh

Illustrators:

Anna Norton, Ananya Ambekar, Judah Milvidaite, Linta Nasim, Mengxuan Zhang, Liat Adler, Yen Peng (Apple) Chew, Prerna Vohra, Elizabeth Stroud, Eve Miller, Clara Morris, Kruthika Sundaram, Sarah Murphy, Karolina Zieba, Vishal Gulati, Joyce Wang, Alyssa Brandt, Katharin Balbirnie-Cumming, Wohrle Marie-Louise

Crossword: Sonia Dahal


Dear Reader,

2020 has been a rough year for many. With the Covid-19 coronavirus pandemic comes an opportunity to do things differently - and importantly, better. Thus, we wanted to make this issue of EUSci magazine light-hearted and fun to read. Our lives are filled with new beginnings in every chapter: a new job, new goals, and new minds. So it is for STEM (Science, Technology, Engineering, and Mathematics) at many levels: new discoveries, new successes, and new challenges. In the spirit of a new year of 2021, our authors took this time to reflect on new beginnings in STEM yesterday, today and tomorrow.

We first explore new beginnings from the ashes - the past. Here, we take a trip down memory lane to examine how war, natural disasters and previous pandemics have shaped science (p.10-15). Next, we explore new beginnings of our current time - the present. You’ll find articles discussing everything from new eras of AI (p.20-21) and plastic (p.32-33), to digital movements in education and healthcare (p.28-31). Last but not least, we look at potential new beginnings for the future of science. Join us to dance with the data privacy devil (p.38-39), discuss our prospects for a fresh start on Mars (p.36-37) and consider whether we should redefine biology (p.40-41).

In the Regulars section (p.46-54), you’ll find a selection of articles on the intersection between science and society, ethnicity, and sex. As we’re increasing our online presence due to lockdowns, head to page 50 to look at top 10 tricks to wipe clean your digital footprint.

We hope you enjoy this issue! Thank you to all the authors and volunteer editors who made this production possible during these tough times, and to you, the reader, for your continued support.

Best wishes,
Yen Peng (Apple) Chew
Editor-in-Chief

Illustration by Eve Miller



Dear Reader,

How has science begun in the past? How have the beginnings of the past affected today’s world? Join us in exploring the history of new beginnings of technologies, discoveries, and social movements! In this section, there are five articles which explore this theme of previous times when science has had its beginnings.

Helena Cornu discusses how the invention of nuclear weapons, deemed a horrendous creation, led to the discovery of stem cell research, from which so many benefits have been seen (p.7). Ishbel Dyke has been inspired by the film Picture a Scientist to tell the stories of female scientists who have shaped science and made it possible for other women to make their mark on science today (p.8-9). They also explore how there is still a need for change in the present.

We have three authors who explore the theme of science in relation to wars, natural disasters, and pandemics of the past. In the first, Ailie McWhinnie considers the technology that has come from science in wartime and how science itself has developed due to wartime applications (p.12-13). Arianna Schneier focuses on records of earthquakes and volcanoes in history and how devastation can be prevented in the future (p.10-11). Cristina Huguet Suárez talks about past pandemics, giving us an overview of diseases like the Black Death and SARS (p.14-15).

These are amazing articles which explore our overall theme of beginnings, reflecting on the start of science and the impact it has had on our past.

Best wishes,
Jessie Hammond
Head Sub-Editor



A bombshell discovery

Helena Cornu investigates how the development of the atomic bomb led to the discovery of stem cell therapy

An atomic bomb kills three times. Prior to 1945, the first two deadly mechanisms - heat and pressure - were well known from previous bombings. However, the effects of radiation were not. It was only after the attacks on Hiroshima and Nagasaki on August 6th and 9th of that year that the grim details of acute radiation poisoning became clear. Doctors struggled to understand the odd combination of symptoms - hair loss, low white blood cell counts, and destruction of the gut lining - nicknaming it ‘the atomic plague’.

At the turn of the 20th century, the high rates of cancer among radiologists and radium dial painters, who were routinely exposed to radioactivity, were obvious enough to prompt initial research into the effects of ionising radiation. However, the real impetus for this type of research was the Manhattan Project, the program which developed the bombs detonated over Japan. In an effort to understand the consequences of radiation exposure, the US funded thousands of experiments in radiobiology.

A key figure of this research was Leon Jacobson, who demonstrated that the blood system was particularly susceptible to radiation. Jacobson’s work with mice determined that if the spleen was protected, then the animals could recover. Jacobson proposed the existence of a mysterious ‘recovery factor’ helping the body to heal. After the war, Jacobson collaborated with Egon Lorenz, who discovered a similar recovery factor in bone marrow. In 1956, Lorenz demonstrated that mice could survive ostensibly lethal doses of radiation after receiving bone marrow transplants from a healthy donor. Building on this work, bone marrow transplants were soon tested on humans: Dr. E. Donnall Thomas treated a leukemic patient by infusing bone marrow cells from their identical twin.

A 1958 nuclear reactor accident in Yugoslavia prompted the next scientific leap. To treat the physicists who had been heavily exposed to radiation, the French physician Georges Mathé performed the first transplant in which the donor was not the patient’s identical twin. But his most remarkable achievement was the conclusive demonstration that bone marrow cells can integrate into the recipient. In 1963, Mathé cured a patient of leukemia by wiping out their blood cells, then administering an infusion of bone marrow cells. Following the treatment, the patient’s blood type changed to that of the donor: proof that the donor’s cells had replaced the patient’s.

“Following the treatment, the patient’s blood type changed to that of the donor, proof that the donor’s cells had replaced the patient’s”

Though bone marrow transplants were now being used routinely to study the effects of radiation on the blood system, the identity of the recovery factor remained a highly debated mystery. This question was finally solved through the work of the Canadian scientists Ernest McCulloch and James Till. In 1960, while trying to establish the minimum number of transplanted bone marrow cells required to treat mice following a lethal dose of radiation, they noticed lumps on the mice’s spleens - roughly one lump for every 10,000 injected bone marrow cells. They theorised that each of these lumps was made up of clones of a single cell. This idea was followed up by Lou Siminovitch, who showed that, within these colonies of cells, there were certain cells which themselves could give rise to more colonies. In other words, stem cells had finally been discovered.

The existence of stem cells had been postulated since the end of the 19th century. There had to be a cell which could quickly divide to replace the ones that were lost through normal wear and tear. Radiation affects stem cells in particular because it damages their DNA, thereby limiting their ability to create new cells. Therefore, although some victims of the atomic bombings may have appeared to escape unscathed, their stem cells had been damaged. They died slowly as their bodies failed to renew themselves.

In her book, Suffering Made Real: American Science and the Survivors of Hiroshima, Susan Lindee explains that “the bomb was a frightening manifestation of technological evil, so terrible that it needed to be reformed, transformed, managed or turned into the vehicle of a promising future. It was necessary, somehow, to redeem the bomb.” Perhaps the atomic bomb, after killing in three ways, also redeemed itself in three ways: as a diplomatic threat to prevent future war; as a potent source of energy; and as the origin of stem cell therapy.

Helena is a Biomedical Sciences graduate and the editor of the EUSci podcast




Sexism in science: the women who became radical activists… against their wishes Inspired by the film Picture a Scientist, Ishbel Dyke tells the stories of the female scientists who paved the way for women to succeed in research, but there is still so much more that needs to change

Illustration by Ananya Ambekar


The year was 1990. Nancy Hopkins was conducting a pioneering genetics study involving zebrafish at the Massachusetts Institute of Technology. The only trouble was, she needed more fish. More fish means more tanks, and that is not particularly easy to achieve when this senior faculty member had been allotted a mere broom cupboard of a lab. She raised her concerns with the gentleman in charge of allocating lab space. He scoffed at the suggestion that her lab was any smaller than anyone else’s. So what did Nancy do? Like a true scientist, she gathered data. With bated breath, she would wait until the final footsteps around the MIT corridors had fallen silent. Then, armed with her tape measure, she would tiptoe from lab to lab, measuring the area of each one. The results were stark. Nancy had less space than the male junior professors who, on average, had 500 extra square feet to play with. Even worse than this, the male senior faculty members had the equivalent of up to four of Nancy’s labs. She brought the findings of her covert mission back to the man who had dismissed her first request. He point-blank refused to even look at her drawings. And this, Nancy says, is the moment “when I became a radical activist… against my wishes”.

Sexism still plagues science today. Disparities are too often dismissed as a sort of innate gender difference. But study after study has blown these tired tropes out of the water. At Yale, researchers fabricated an application for a lab manager role. They sent the profile out to universities across the United States, asking professors to evaluate the student. What the professors didn’t know is that half of them received the profile with the name ‘Jennifer’ written at the top and the other half received the profile from ‘John’. The results were in. The Yale researchers were dumbfounded. At first, they frantically checked for errors in the code they were using to analyse their data. “This couldn’t be right”, they thought. But there were no errors. ‘John’ had surpassed ‘Jennifer’ on every single metric used to assess the candidates. Despite having identical applications, John was deemed more competent, hired more often, given more mentoring opportunities, and a higher starting salary.

In this study, ‘John’ and ‘Jennifer’ were made-up characters - figments of the Yale researchers’ imaginations. But this doesn’t mean that ‘John’ and ‘Jennifer’ don’t exist. Every day, real-life Johns and Jennifers send out applications for science positions and, according to this study and many more like it, every day the Jennifers fall victim to the internal biases ingrained in the science industry.

So, back to Nancy Hopkins - what did the newly self-proclaimed radical activist do next? She drafted a letter to MIT management about the discrimination that she’d observed. The report went beyond her own lab space not measuring up to her male colleagues’. She outlined the framework of gender-based discrimination that MIT was built on. At lunchtime, in an MIT campus cafeteria, Nancy tentatively slid her drafted letter across the table to the widely-respected biology professor, Mary-Lou Pardue. Pardue read the letter in its entirety with a stern expression. She put the letter down. Nancy had concisely summarized everything that Pardue had been thinking over her decades-long career at MIT, but had never vocalised. Pardue said, “I’d like to sign this letter, and I think we should go and see the president”. They tracked down each of the 15 female tenured professors from the MIT School of Science, who were peppered amongst the 200 male tenured professors. United, they brought their report to MIT president Charles Vest. For these esteemed academics to do this, Nancy said, was “hard, embarrassing, and awkward”, but their efforts paid off. Charles Vest enthusiastically took everything the professors had to say on board. Thanks to Nancy and the other female tenured professors or, as some opponents called them, “hysterical women”, changes started to happen at MIT and across some of America’s top universities. MIT committed to recruiting more female faculty staff, promoting more women to leadership roles, and ending the stigma surrounding maternity leave in academia. Following the report, the number of women in the School of Science faculty increased by 50%. When president Charles Vest left his post in 2004, the role was filled by Susan Hockfield, the first woman to ever govern the institute.

“You don’t need a computer to measure this bias - a sundial will do”

Inroads are being made for women to join the science cause. But we are not there yet. We cannot be ignorant enough to think that sexism is always confrontational. It lies in experiences like those of Nancy Hopkins: being presumed to be a technician, not a senior faculty member, and being expected to wait patiently to use her own equipment while others used it. It lies too in the assumption that John is more competent than Jennifer. Subconscious gender biases like this, often reinforced by well-intentioned people, can be spotted by tests like the Implicit Association Test (IAT), which measures the ability of a participant to associate certain words with different demographics. Even one of the developers of the IAT, despite being a female scientist, instinctively associated science and technology vocabulary with traditional male names, and household tasks with female names. It took her great effort to perform the task the other way around! Her own job title is among the science vocabulary, yet the test has her, along with the majority of the population, battling to associate science with femininity. The IAT has its downsides. It is a useful tool for illustrating the public’s biases but not necessarily the most accurate metric we have at our disposal. However, sexism in science is so prominent that, as the researcher who struggled with her own test put it, “You don’t need a computer to measure this bias - a sundial will do”.

An arguably steeper slope lies ahead for other minorities. Women from the LGBT+ and the Black, Asian and Minority Ethnic (BAME) communities continue to face the compounded challenges that come with experiencing multiple intersecting forms of discrimination. Raychelle Burks, professor of analytical chemistry at American University in Washington D.C., described her feelings of panic as she looked at the progress her peers were making while she felt her academic career was beginning to stagnate. She wondered how others had so much time on their hands to take on so many projects simultaneously. At this time, Raychelle Burks, who is Black, was receiving a barrage of racially-charged and misogynistic emails. She described how she would spend hours drafting eloquent email responses, wary of being “characterized as the angry black woman trope”, as she put it. This took time and energy away from her work. This was the moment she realised that this was something “people not from these groups don’t even have to think about, let alone it be a time suck in their schedule”. Raychelle Burks reports a dichotomy of being both invisible and hyper-visible at the same time: constantly being ignored in meetings but often the only female scientist of colour at the majority of events she attends. Fewer than 1 in 4 speakers at chemistry conferences are women, and fewer than 1 in 25 are women of colour.

The playing field is slowly being evened out, but we need to pick up the pace. With antibiotic resistance, the environmental crisis and the Covid-19 pandemic looming over our heads, this world needs all of the minds it can get its hands on. If scientific industries systematically restrict women, they instantly lose half of the people that could save us from these crises. Women are taking a stand against sexism in science. But these professionals did not go into the field to become radical activists; they wanted to continue the exploration of the natural world and contribute to the development of technology. Nancy Hopkins said that becoming an activist “was the last thing in the world I wanted to do.” She simply wanted to do science. “But”, she said, “I found I couldn’t do science unless something changed”.

Author’s comment: This article was inspired by the film “Picture a Scientist”, an official selection of the 2020 Tribeca Film Festival, directed by Sharon Shattuck and Ian Cheney. This incredibly thought-provoking piece of cinematography tells the story of both Nancy Hopkins and Raychelle Burks, as well as a number of other women in STEM, as they relay their lived experiences with gender-based discrimination in science.

Ishbel Dyke is a third year chemistry student who hopes to pursue a career in science communications

Illustration by Ananya Ambekar



From the ashes: volcanoes and earthquakes

Arianna Schneier examines the reform and response to natural disasters, specifically volcanoes and earthquakes, by focusing on documentation and prevention

Natural disasters have provoked changes not only to geological features and wildlife, but also to government response and technology. Documentation and prevention are the key to handling the threats of natural disasters: documentation to keep track of what has occurred in the past, and prevention to minimise the damage done in the future. In this article, volcanoes provide a glimpse into the past, whereas earthquakes exemplify current technology and government guidelines.

Volcanoes

In 79 AD, Pliny the Younger documented the eruption of Mount Vesuvius, which buried the towns of Pompeii and Herculaneum. As he wrote to Tacitus, “I cannot give you a more exact description of [the eruption] than by likening it to that of a pine-tree, for it shot up to a great height in the form of a very tall trunk, which spread itself out at the top into a sort of branches”. Future reports of the eruption all stemmed from Pliny the Younger’s account, though rarely in the same detail. The historian Suetonius wrote of the eruption in around 110 CE, but focused on the response of the emperor at the time, and how he sent aid to the region. The eruption of Mount Vesuvius resulted in not only the burial of two towns, but also the first documentation of a plinian eruption - documentation being the first essential step in the scientific process.

Circa 237 to 227 million years ago, the Wrangellia igneous eruptions were a massive tectonic event that led to land formation in the eastern Panthalassic Ocean, in what is now part of Canada and Alaska. Apart from generating new land, the Wrangellia eruptions released large volumes of greenhouse gases into the atmosphere, coinciding with the Carnian Pluvial Episode (CPE). The CPE was marked by warming and increased humidity, in contrast to the dry periods that characterised the rest of the Triassic Period. Evidence of the CPE was found in Europe and potentially extended to the southern hemisphere, as a recent study by Adriana Mancuso et al. found. There are suggestions that the greenhouse gas emissions of the Wrangellia eruptions led to acid rain which attacked the flora and fauna of the land and the ocean, though conclusive evidence is lacking. As Jacopo Dal Corso, a geologist at the University of Leeds, explains in his recent review in Science Advances, the oceanic biome was completely reshaped due to this increase in CO2 levels - eerily similar to what we are seeing in the world’s oceans today - affecting marine life with carbonate shells and resulting in a significant decrease in marine diversity. After this stripping of sea life, the ocean was left a bacterial ‘slime world’, which opened the opportunity for the expansion of new carbonate-based organisms, such as corals and dinoflagellates. While oceanic life was being transformed, the acid rain and climate change also resulted in a mass change in terrestrial environments. This was followed by the rise of the dinosaurs and tetrapods; however, there is not enough substantial evidence in the fossil record to conclusively state a causal relationship between the CPE and the evolution of dinosaurs. The extent of causation is still unknown, but there are correlations between the Wrangellia eruption, the CPE, and mass extinction events (both on land and in the ocean), giving way to new creatures that were unable to walk before.

“The extent of causation is still unknown, but there are correlations between the Wrangellia eruption, the CPE, and mass extinction events (both on land and in the ocean), giving way to new creatures that were unable to walk before”

Volcanic eruptions have not only led to new beginnings, but have also provided us with the essential basis to study these events and understand what is happening today. Uranium-lead dating, as used by Mancuso et al., is possible because zircon crystals formed in eruptions trap uranium in the geological strata. The amount of uranium that has decayed can be used to determine timescales for past major events, such as the CPE, thus broadening our understanding of the past.
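To make the arithmetic concrete, here is a minimal sketch of the age calculation behind uranium-lead dating, assuming a simplified single-decay-chain model with no lead initially present in the crystal; the isotope ratio used below is illustrative, not a value from the Mancuso study.

```python
import math

# Decay constant of uranium-238 (per year); its half-life is about 4.468 billion years.
LAMBDA_U238 = math.log(2) / 4.468e9

def u_pb_age(pb206_per_u238: float) -> float:
    """Age in years, from the measured ratio of daughter Pb-206 atoms
    to remaining parent U-238 atoms in a zircon crystal."""
    return math.log(1.0 + pb206_per_u238) / LAMBDA_U238

# A zircon with ~0.037 atoms of Pb-206 per atom of U-238 crystallised
# roughly 234 million years ago - around the time of the CPE.
print(f"{u_pb_age(0.037) / 1e6:.0f} million years")
```

In practice, geochronologists also use the faster-decaying uranium-235 chain as a cross-check and correct for any lead present when the crystal formed, which is what makes zircon dates so precise.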

Earthquakes

Earthquakes are another major natural disaster that has moulded the history and geology of our planet. Earthquakes are the result of tectonic plate movement, which creates friction, releasing seismic waves that cause shocks as they travel to the surface, consequently moving the ground. Invented in 1890, the seismograph arose as a technology to record these ground vibrations. The basis of a seismograph is that the device shifts alongside the earth while the recording element stays motionless; similar devices have been traced back to China in 132 AD. Through the years this technology has been refined and improved to increase sensitivity to movement and accuracy in measuring magnitude.

Even with seismographs and close observation of tectonic plates, earthquakes are still sudden and unpredictable. According to the Seismic Safety Commission of the California Senate Office of Research, every 5.4 years an earthquake between the magnitude of 6.3 and 8.3 strikes California. How do they prevent absolute disaster? Besides keeping the citizens informed, building codes put in place by the government are a key element of prevention. The first California legislation was implemented in response to the 1933 Long Beach earthquake, which caused the collapse of several schools, giving rise to new, widely adopted building codes and the Field Act, which declared that the state could intervene in the construction process of schools. Unfortunately, most of the early policy was reactionary, aiming to prevent similar-scale damage. For example, the San Fernando earthquake in 1971 spurred seismic codes for dams and hospitals, along with other legal action. The challenge to build earthquake-safe structures has generated the field of earthquake engineering. This field aims to design seismic infrastructure or support systems, especially in power plants or lines.

“According to the Seismic Safety Commission of the California Senate Office of Research, every 5.4 years an earthquake between the magnitude of 6.3 and 8.3 strikes California. How do they prevent absolute disaster?”

Volcanoes and earthquakes are two types of natural disasters that have shaped the world we live in, not only through the reshaping of land, but also by spurring government legislation, such as earthquake building codes, and disaster responses, such as the aid sent by the Roman emperor. By documenting and studying these phenomena, we aim to learn enough to not only prevent future catastrophic events, but also, as in the case of uranium-lead dating, perhaps use them in our favour. Volcanoes and earthquakes leave us with many new beginnings to explore.

Arianna Schneier is an undergraduate third year biological sciences student specialising in cell biology


From the ashes: how war has shaped science

The history of conflict extends as far back as our history books. It has evolved alongside, and infiltrated into, almost every aspect of the societies we have today. Science has not escaped its influence, and this influence has never been more apparent than during the 20th century. The first and second World Wars and the Cold War that followed would change the role of science in both combat and community forever.

By the second half of the nineteenth century, science was driving the Industrial Revolution. The application of scientific theories was beginning to displace the practical experience of craftsmen. Science still had its critics though, with some claiming that existing knowledge was sufficient for further technological development, leaving no role for further scientific research. This debate on the relevance of science battled on to the end of the nineteenth century, and may well have for much longer, if it were not for World War I (WWI).


When it became clear that the war was not going to be ‘over by Christmas’, military leaders acknowledged the necessity for new military technology. Scientists, industrialists, military figures, and politicians came together after historically being at loggerheads. Professional researchers began applying scientific theory to build new technologies that changed the nature of warfare forever. Chemists began synthetically producing poisonous gases on a large scale for the first time; physicists worked on radio communication and primitive systems for locating submarines; and the first tanks - once imagined by Da Vinci - rolled onto the battlefields. These technologies caused unprecedented damage and loss of life. Simultaneously, however, there were leaps and bounds in medical practice that meant more soldiers could come home. In 1914, 4 in 5 soldiers suffering from a broken leg would die. By 1916, 4 in 5 would survive.

Such developments changed the place of science in society for good – it proved that further research could improve technology, bringing benefits with it. When the war ended, research began to receive significantly more government funding. Science could proceed faster, and with better resources, than ever before. With this integration came somewhat of a restructuring of the scientific constitution. What was once a collection of hobbyists pursuing their interests became a more bureaucratic system, matching the military and political models. Whilst we must recognise the benefits that structured oversight can have on the organisation and efficiency of research that was once conducted on random whimsy, many argue that stifled creativity is now one of the greatest threats to science. Undoubtedly, this also marked the beginning of politicisation in science, as projects were now dependent on approval from the governments who funded them.

WWI also marked the first time that women worked widely in industry, building what should have been the springboard for women’s participation in life outside the home. However, the Restoration of Pre-War Practices Act meant that women lost their jobs as soon as the war ended. It planted the seeds of possibility, though. When WWII arrived, once again, women were recruited to fill the roles left by men – including in scientific research. This time, there was no order for women to leave their new jobs after the war. WWII thereby created an unprecedented opportunity for women to work in previously inaccessible fields, and they proved their capabilities by contributing greatly to the war effort. Mary Sears, an oceanographer in the US Navy, managed a naval unit that aided strategic manoeuvres by analysing ocean metrics. Grace Hopper, a mathematician, became one of the first computer programmers. Isabella Karle, a chemist, developed a technique to isolate plutonium chloride as part of the Manhattan Project. The involvement of a whole new demographic in science was conducive to new, productive thought, and although it would be many years until their achievements would be properly recognised, it opened the door – if only a sliver – for new faces in science.

“Although it would be many years yet until their achievements would be properly recognised, WWII opened the door – if only a sliver – for new faces in science”

Once again, rapid advances were made in military technology and medicine during WWII. The costs of such development programmes were immense, so it was unsustainable for their fruits to become redundant in just a few years when peacetime arrived. The repurposing of military technology for commercial use was therefore an invaluable strategy for economic recovery, as well as improving commercial technology.

In WWI, cellucotton was developed as a cheap alternative to bandaging during a cotton shortage. It was produced under government contract by Kimberly-Clark Co, and the Red Cross Nurses who were using it on the fronts caught on to its absorbency and began using it during menstruation. Kimberly-Clark repurposed it for sanitary pads after the war, marking a revolution in menstrual hygiene products.

The use of radar during WWII was paramount to locating enemies. One of the engineers involved in microwave radar development noticed that a chocolate bar in his pocket melted while he was working with the equipment, sparking the idea that microwave energy could be used to heat food. Sure enough, the first commercial microwave was manufactured in 1954.

One of the most influential examples of military repurposing came from the Cold War. America was forced to decentralise its command centres after the launching of surveillance satellites by the Soviet Union. They created a network of computers which could send and receive data remotely. This laid the foundations for what we now know as the Internet. The networks quickly went from military to academic use and, in 1990, with the invention of the HTML language at The European Organization for Nuclear Research (known as CERN) by Tim Berners-Lee, the World Wide Web was born. It is hard to think of an innovation that has changed the nature of any part of our society more than the internet – and it was born, like a phoenix, rising from the ashes of war.

There is no doubt that the rapid innovation driven by war can have positive repercussions on the technologies and medicine available to us all afterwards. This is evident not only in products developed in wartime, but also in the knowledge gained and the new techniques developed. The most obvious example is nuclear physics, an understanding of which came about through the development of nuclear weapons, but was then reapplied to the energy sector. All of these advances have come with a cost in human lives. This is particularly true for nuclear physics, which reminds us of the massive loss of life and devastation caused by the dropping of atomic bombs in WWII. Additionally, in order to accelerate the research relevant for war, resources – time, money, and scientists – must be pulled from elsewhere. This means that work previously considered important is neglected. This problem can be seen today, with experts from all scientific backgrounds jumping aboard the Covid-19 research wagon to find a solution as quickly as possible. Inevitably, reports have emerged about the impact of this on other vital fields such as cancer research and care. Undoubtedly, there must have been similar consequences during wartime.

For example, we can only imagine what advances could have been made if the immeasurable resources ploughed into developing the atomic bomb were instead poured into treating heart disease and stroke – the leading causes of death worldwide. ‘What-ifs’ aside, we must focus on the lessons that can be learned from the productivity of wartime science, and how we can apply them to other crises or everyday life. Many wartime practices have already been deployed to aid the current Covid-19 pandemic. The fast-tracking of military tech is being applied to vaccine development. The re-purposing of factories, like the car factories that produced bomber planes in wartime, is echoed today, with alcohol brands producing hand sanitiser, and fashion retailers making face masks. Looking forward, we know that crises like war have always brought social change, and so we should not expect life to return to ‘normal’ when the pandemic ends. Perhaps in this lies the most important lesson of all. WWII saw the birth of universal healthcare in the UK, arguably the country’s greatest achievement. So, instead of fearing the new, we should learn from the past and take the opportunity to build back a better, stronger world.

Ailie McWhinnie investigates the ways in which war has moulded the technology in our daily lives and even driven changes in the foundations of the scientific system.



From the ashes: previous pandemics

Pandemics have historically brought rapid change; why is that, and what has been borne of pandemics past? Cristina Huguet Suárez investigates

Examining the past allows us to learn and grow from previous mistakes and events. Looking back on previous pandemics and the periods that followed can give us an insight and maybe some hope: hope that we can rise from the ashes and reforge our ‘normal’ once again. Additionally, looking back can help us identify discoveries that were borne as a result of pandemics, and allow us to determine how they may have made an unintended impact in scientific advancement. Pandemics bring about periods of constant flux and change, and those of the 21st century have just amplified this point. Outbreaks that were previously confined to local areas can now spread and become global pandemics within months. The 2009 Swine flu pandemic spread across the globe in nine weeks, aided by intercontinental flights and migration of people (population shifts).

“Pandemics bring about periods of constant flux and change, and those of the 21st century have just amplified this point”

This rapid spread allows the disease-causing organism (pathogen) to infect an increasingly greater number of individuals, allowing the pathogen to evolve and mutate accordingly. Furthermore, this spread across the globe allows the pandemic to impact all areas reliant on interaction between people: the economy, travel, trade, social gatherings and important life-events. All of these are affected, and all have to adapt so as not to further the pandemic, causing rapid and significant shifts in the daily lives of individuals. Nevertheless, these times of rapid change can also help to advance scientific discovery and communication.

One of the most well-known and fatal pandemics throughout human history is the Black Death (circa 1341 to 1352 CE). The second and largest plague pandemic caused by the bacterium Yersinia pestis, the Black Death spread throughout Eurasia and North Africa, and outbreaks were seen as late as the 1660s – an example of this being the Great Plague of London. In some cities quarantines and lockdowns were put in place, such as in Dubrovnik and Venice, with other cities, such as Milan, practising social distancing. Incoming ships in Venice were forced into quarantine for 40 days before they were allowed to disembark. These were some of the earliest known public lockdowns and quarantines, and were later shown to be effective in containing some infectious diseases. Furthermore, as a result of the Black Death, medical inspections of individuals became routine, and plague hospitals were built throughout Europe to treat the ill. Plague hospitals contained chapels, space for the disinfection of equipment and other materials, and proto-wards to take care of patients in isolation, similar to how hospitals are structured now.

Although not necessarily a technological marvel, we can also attribute one of the first instances of medical identification tags to pandemics. The Justinianic Plague (circa 541 to 750 CE) is considered to be the first of three plague pandemics caused by Y. pestis. The Justinianic Plague mainly affected the Mediterranean (western Eurasia) and continued to re-occur in western Europe and North Africa in the subsequent two centuries. Individuals would wear identity tags on necklaces and/or bracelets to be identified in case of death, constituting one of the earliest examples of medical identification tags. Medical identification tags are used to inform a first responder, for example a doctor or a paramedic, of the possible medical condition of an individual which may require immediate attention, in the case that the patient is not conscious, too injured, or too young to convey this information. Additional information carried by these tags includes the personal details and emergency contacts of the individual.

Turning to one of the more recent, devastating pandemics in history, the 1918 Spanish Flu ravaged the globe, killing an estimated 50 million people worldwide. The pandemic was caused by the H1N1 strain of influenza virus, which has since been replaced by a new H1N1 strain that emerged in 2009 (the “Swine Flu”). During the pandemic, some physicians administered blood from recovered patients to those who were still sick, potentially reducing the risk of death: unknown at the time, antibodies against the virus were being transferred from the recovered to the sick patients. Furthermore, as a result of this devastating pandemic, the fields of epidemiology and virology were reinvigorated - many important discoveries and advancements were made in virology after the 1918 pandemic.

“As a result of this devastating pandemic, the fields of epidemiology and virology were re-invigorated”

In 1931 it was shown that the influenza virus can grow in fertilised chicken eggs, a method of vaccine production that is still in use today. The first clinical trials of influenza vaccines were conducted in the 1930s, with real-world tests carried out during the 1940s using an inactivated form of the virus. This pandemic brought us the effective vaccine technology we use today.

The SARS outbreak in 2002-2003 ushered in a new era of public health, international cooperation and communication at a level unprecedented at the time. International public health laws were expanded and developed to fully address pandemics in the 21st century, aiming to put safeguards in place with regards to travel and trade. Examples of these include establishing information-sharing networks that made it possible to track and control the spread of SARS in real time, and passenger screening measures that were carried out before boarding flights.

Pandemics are times of volatility, uncertainty, and lack of information. All of these can be - and have been - used to drive scientific research further, to try to understand how and why we have come to the situation that we are in. Pandemics have given us vaccines, public health laws, screening and identification measures, and hospitals, showing how ideas can grow from the ashes of the greatest fires.

Cristina is a 4th year Immunology student at the University of Edinburgh



Dear Reader,

“The distinction between the past, present and future is only a stubbornly persistent illusion” - Albert Einstein

Perhaps we should have heeded Einstein’s advice before attempting to divide this magazine into three distinct sections. As you read on, you will notice how every new beginning is incomplete without its own past, present and future. I would like you to notice, also, that what counts as present is necessarily relative to the observer. You can think of this in a physical sense – that these articles were written, edited and published at certain points in time. But there are also personal and cultural elements. The present is as much about perspective as it is about physics.

The astronauts aboard the first ever commercial flight to the International Space Station would surely argue that 2020 was a new beginning in space travel. Those involved in creating the world’s first room temperature superconductor might view this as the major breakthrough of the year. In the UK we might see Covid-19 as the great current challenge to science and technology. However, farmers in Kenya may remember these times as a battle against devastating plagues of locusts. Equally, our writers have their own viewpoints on what constitutes the here and now.

There is no doubt that this section is topical. It homes in on some big talking points of 2020, from climate change (p.17), plastic pollution (p.34) and sustainability (p.18) through to digitisation (p.30), artificial intelligence (p.20) and, of course, the coronavirus pandemic (p.23, 24, 26, 32). However, do these hotly debated themes truly reflect the most important new beginnings of 2020? The answer to that depends on your reference frame, so I will let you draw your own conclusions.

Best wishes,
Harry Carstairs
Head Copy Editor



A new beginning for the Arctic?

Lara Watson investigates the consequences of anthropogenic warming and the effects of ice loss in the Arctic

Human-driven global warming could be the catalyst for a new beginning in the Arctic, with consequences reaching far beyond the northern region. A new study published by the National Center for Atmospheric Research has concluded that the Arctic is transitioning from a mostly frozen state into a new phase. Rapid warming in the Arctic, accelerated by anthropogenic change, has resulted in decreasing sea ice and an increasing number of rainy days each year. These findings predict long-term changes that will affect local populations and ecosystems, as well as the rest of the world.

Scientists Laura Landrum and Marika Holland input data into various simulations to estimate when the Arctic will enter into a new ‘climate phase’ in various areas. They concluded that the Arctic has already begun its transition from a predominantly frozen state under some modelling parameters. They predicted that, at the latest, the transition will occur at the end of the 21st century, and that the first ice-free years could come as early as 2023.

The new climate is characterised by decreasing sea ice extents (SIEs) in the Arctic. The study found that annual SIE minimums (the lowest amount of sea ice recorded) in the Northern Hemisphere have been lower in the past 13 years than at any other point since records began in 1979. The average SIE minimum decreased by 31% in the period 2009-2018 compared to 1979-1988.

The Arctic is not the only region experiencing dramatic change. The September 2020 issue of National Geographic revealed that there was also extensive ice loss in the Great Lakes region in North America. Maximum ice coverage on Lake Erie decreased from an average of 83% between 1973-2019 to a mere 16% in 2020. The dramatic ice loss that is being seen to the south of the polar region shows just how dire the situation in the Arctic truly is.

However, ice extent is not the only variable for measuring climate change in the region, nor does it have the most immediate consequences for many Arctic populations. For the animals who live there now, the biggest concern is rainfall. The research further looked at the number of precipitation days that occurred in the Arctic. Oceanic areas were linked to a higher number of rainy days than land areas, with up to 90 more predicted days of rain per year by the end of the century. The increasing number of rain-on-snow days profoundly affects the animals that inhabit the region. The Arctic is home to low-growing vegetation, and the rain can cause ice layers to grow over these food sources, creating what are known as ‘ice-locked pastures’. With no access to food supplies, many of the animals in the Arctic could starve, especially when unable to migrate. A study conducted in the Svalbard region of Norway investigated the effect of increased rainfall on the local populations, and found that increased rainfall was linked with an ‘increasing risk for natural disasters’, such as avalanches with the potential to destroy buildings and hurt the local tourist trade. Furthermore, locked pastures caused crashes in the local reindeer population, which could spiral to affect the local ecosystem.

But what about an aspect of global warming already recognised by the general population: rising sea temperatures? As sea ice reflects much of the solar radiation received during the summer months, the decrease in ice coverage has resulted in rising sea temperatures. This has far-reaching consequences for local ecosystems, such as species migration. This is something we have already begun to observe in the Arctic region, and it is widely accepted that ‘climate change is triggering a global reorganisation of marine life’. Species that previously remained further south have begun to travel northwards into the polar regions, forcing a change in the structure of the food chain. The Arctic does not have ecosystems or food chains as extensive as the majority of aquatic regions on the planet, so even the slightest disturbance could have profound impacts on its populations.

The transition of the Arctic into a ‘new climate phase’ indicates consequences that go beyond the region itself. The changes show that we are already experiencing the effects of global warming beyond expected atmospheric change. Whilst some species may be able to adapt to the new marine climate and ecosystems, there will undoubtedly be many more that fail to. Extinction of species is one of the biggest environmental problems we are currently facing, and we may be running out of time to solve it.

Lara Watson is a third year undergraduate student studying history with an interest in all areas of science, particularly conservation and environmental issues




Building Back Better - New beginnings for the construction industry

Emily Oliver examines new materials revolutionising the construction industry

Illustration by Yen Peng (Apple) Chew

The recent global slowdown in the construction industry, caused by various lockdowns and restrictions, has made many in and outside the industry take a step back and question if the methods currently being used are fitting for today’s political and environmental climate. With the construction industry producing 40% of all carbon emissions world-wide, this is surely one industry that needs to address its environmental impact if we are to see a decrease in global emissions and revolutionise the future of construction. The building materials that have dominated the last 100 years, concrete and steel, have hugely polluting production processes and are fundamentally non-renewable. One tonne of carbon dioxide (CO2) is released for every tonne of concrete made. Additionally, the architecture and infrastructure of the past is often unsuited to the climate of today; energy-inefficient and often unhealthy designs are still commonplace around the world. A new era of construction using more environmentally friendly materials and efficient designs is on the horizon, with Covid-19 hopefully an opportunity to speed up this transition.

At the forefront of these new technologies is the reworking of one of the oldest building materials – wood. Cross-laminated timber (CLT) is a form of wood panelling, made by gluing together lengths of wood in perpendicular layers. Consequently, this forms a solid, wooden panel that is substantially lighter than a concrete block of the same size. These panels can be used almost like Lego to prefabricate buildings in a much shorter amount of time than previous construction processes. They also act as a massive carbon store, locking up CO2 absorbed by trees over the tens, if not hundreds, of years of their lifetime. Not only is this carbon locked away for much longer than if a tree were allowed to degrade on its own, but harvesting also frees up space for more trees to grow. This assumption does rely on the timber used in construction coming from a sustainably managed forest, something that can be trickier to ensure than it seems. The outcome of these benefits is that CLT buildings have a significantly lower carbon footprint and environmental impact than a traditional concrete and steel construction.

However, CLT cannot be used alone because it does have its drawbacks. For example, its acoustic properties are comparatively poor; suitable insulation must be used for soundproofing and to increase CLT’s thermal mass (the ability to capture and slowly release heat). Luckily, there is a sustainable material that can combat just these issues: hempcrete. Hempcrete is a bio-composite made up of sand or lime and the coarse fibres of the hemp plant. It is also a carbon store and non-toxic, making it much preferable to traditional insulation materials such as fibreglass.

Discussion of bio-based materials may make some people curious about the lifespan of these materials in buildings. Surely, they would naturally biodegrade and compromise the structural integrity of the building? However, this is not the case. Bio-based materials using hemp and straw are generally prepared with lime or other naturally anti-pest and antimicrobial components. This stops any destruction by bio-degrading microbes or animals such as termites and rats. When dealing with CLT, preparation and maintenance are the key. Generally, if CLT is used as an exterior, it is encapsulated with more weatherproof materials, such as brick. Similar problems are faced when using steel and concrete – particularly reinforced concrete, which can begin to degrade in as little as ten years if not protected. When exposed to dampness, changes in temperature, salt water, and other environmental factors, these materials can rust, undergo freeze-thaw deterioration, and crack.

“With the construction industry producing 40% of all carbon emissions world-wide, this is surely one industry that needs to address its environmental impact” Of course, it is not just the materials that have to be taken into consideration in construction; the design of a building also has a massive impact on its energy efficiency, cost, environmental impact and, ultimately, sustainability. Too often, the appearance of a building is given precedence over its environmental impact, with the run-of-the-mill glass skyscraper, present in every major city the world over, being notoriously inefficient and requiring massive amounts of air-conditioning. That’s not to say sustainable buildings are ugly – far from it, but the priorities of architects need to be realigned to produce buildings that are both aesthetically beautiful and environmentally friendly. Although still in its relative infancy,

Illustration by Yen Peng (Apple) Chew

CLT has already been used in multiple high-profile building projects. For example, the UK-based company Waugh Thistleton Architects has been involved in the construction of multiple high-rise CLT buildings, such as Dalston Works in London, currently the biggest CLT building in the world. This change in attitude towards timber construction has not been limited to the architectural sphere; government policy has also begun to embrace and accommodate these new materials. In February 2020, France announced that all new public buildings must be constructed from at least 50% wood or other bio-based materials like hemp or straw. This policy will push architects and construction companies to think creatively when considering materials, forcing them to consider thoroughly the environmental aspect of what they are building, and providing a boost to these often-overlooked materials. Unfortunately, in the UK, government policy looks set to curtail this new beginning in construction. In May 2020, the UK government set out to ban combustible cladding (the outer layer of a building) on high-rise buildings in the wake of the Grenfell fire, a long-awaited and much-needed law to regulate the cladding sector. However, despite years of evidence showing the superior fire resistance of engineered wood when compared to other materials, this ban included structural and mass timber. Timber has been subject to significant scrutiny over the years for its perceived performance in fire compared to other materials. Although wood does indeed burn, its lack of the inherent non-combustibility of concrete and steel means that fire safety is never taken for granted

when constructing a CLT building. Fire risks are calculated in great detail, potentially providing safer and more suitable fire prevention and escape strategies. Additionally, a main issue when constructing a building centres on how predictably and quickly a material will lose its structural integrity. CLT performs well here, burning slowly in a highly predictable manner: charring occurs on the outside, which protects the inside of the panel from ignition. In comparison, steel loses its structural integrity very quickly once a critical temperature (around 550°C) is reached. Concrete is slightly more complicated, as it is a complex material and behaves unpredictably: it can spall explosively at temperatures of around 200°C and has the potential to lose structural integrity catastrophically if reinforced with steel. Although it is entirely possible to build a CLT building and use another external cladding, this much-publicised ban may discourage construction companies, developers, and architects from using CLT. The effects of such a decision may be damaging for CLT's uptake and, potentially, its long-term economic viability as a construction material. Despite this setback, CLT and bio-based constructions in the UK and other parts of the world continue at an ever-increasing rate. Their environmental benefits, flexibility, and decreasing cost will hopefully win out over more polluting materials in the long term, creating urban landscapes that are healthier, more beautiful, and better suited to our changing climate. Emily Oliver is a third-year Biological Sciences (Biotechnology) student
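That predictability is what lets engineers design for fire with simple arithmetic. As a rough illustration, here is a minimal sketch assuming the charring rate of about 0.65 mm per minute often quoted for softwood (an assumption on our part; real design values depend on the product and the applicable fire code):

```python
# Rough char-depth estimate for a timber panel exposed to fire on one face.
# Assumes a constant charring rate of ~0.65 mm/min, a figure often quoted
# for softwood; real design values come from the applicable fire code.
CHAR_RATE_MM_PER_MIN = 0.65

def residual_thickness(panel_mm: float, minutes: float) -> float:
    """Thickness of uncharred, load-bearing wood after a given fire duration."""
    char_depth = CHAR_RATE_MM_PER_MIN * minutes
    return max(panel_mm - char_depth, 0.0)

# A 100 mm panel after 60 minutes of exposure: ~61 mm still intact,
# with the char layer insulating the core from ignition.
print(residual_thickness(100, 60))
```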


The phrase artificial intelligence (AI) should not sound futuristic. It is here. It is now. It is very likely in your pocket or on the other side of the computer screen. But it is not imminently going to morph into a killer robot and take over human civilisation. In fact, AI is helping us to tackle some of today's biggest research problems in healthcare and climate change, amongst many other fields of science. Let us take a moment to reflect on what AI actually is, and how it can be a force for good. Data analysis has long been used by scientists to aid their work and has become an integral part of the scientific method. There are examples of statistical methods used to aid scientific progress throughout history. In the early seventeenth century, Johannes Kepler formulated the laws of planetary motion by analysing the astronomical measurements of his former mentor, Tycho Brahe. These laws helped to describe the rules of the solar system as we know them today. Machine Learning (ML) – a subset of AI – is akin to a computerised version of data analysis, in which algorithms 'learn' by analysing large amounts of data to create 'models'. The growing power of ML has been aided by an explosion in the availability of large datasets, new algorithms, and the exponential growth of computing power. In medical research, the staggering growth in data has helped to uncover unknown disease risk factors, leading to more accurate diagnostic and prognostic predictions. Many claim that advances in data generation infrastructure and data analysis methodologies will revolutionise healthcare. The application of data science to health data is already transforming medical research. Accurate predictive models for clinical outcomes can help with early diagnosis and, subsequently, with the prevention or early treatment of disease. One recent example is the prediction of inpatient episodes of acute kidney injury. Another is the use of ML-based sepsis prediction algorithms, which, through randomised clinical trials, have proven beneficial in decreasing hospitalization time and mortality.


This predictive algorithm is helpful, since early administration of antibiotics and intravenous fluids is considered crucial for the management of sepsis. AI is also expected to facilitate cheaper, quicker and more effective drug discovery. Developing a single treatment is estimated to cost $2.6bn, and much of this is spent on the nine that fail in clinical trials for every one that is successful. AI algorithms are capable of sifting through millions of compounds to narrow down potential options for a particular drug target. Some innovations in healthcare AI go beyond the confines of supercomputers. Smart contact lenses, developed by a joint research team from the Ulsan National Institute of Science and Technology and Sungkyunkwan University, are being engineered to pick up early indicators of cancer, and to measure blood sugar values in the tears of diabetic people, to help them manage their diet and medications. At Google, researchers have used AI to train diagnostic tools that read tissue samples and radiologic scans. They used training data from more than 250,000 patients' retinal scans. The algorithm learned to spot the patterns that predict whether a patient has high blood pressure or an increased risk of heart attack or stroke. In some cases, the accuracy of digital tools has outperformed human pathologists, dermatologists, and radiologists. However, AI is helping us to tackle many problems beyond those in medicine. Not least of these is understanding the causes and effects of climate change (see climate on page 22). In addition to this, we are seeing leaps forward in physics, astronomy and cognitive science. Astronomy in particular produces staggering volumes of observations: modern instruments generate terabytes (10¹² bytes) of data every day. AI systems,

such as artificial neural networks, can help in processing these mountains of data, detecting patterns and anomalies with minimal human interaction. A key challenge in astronomy is to distinguish features and phenomena from noise in observational data. For example, NASA's Kepler mission aimed to observe the Orion Spur (a spiral arm of the Milky Way) and beyond, seeking Earth-sized planets orbiting other stars. An ML algorithm was used on the mission to identify noise generated by on-board thrusters and stellar activity, thereby cleaning the data for later analysis.
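A toy version of this kind of noise flagging can be sketched in a few lines (purely illustrative, and far simpler than the mission's actual detrending pipeline): samples are flagged when they deviate strongly from a running median of the brightness series.

```python
import numpy as np

def flag_outliers(flux, window=11, threshold=4.0):
    """Mark samples that deviate strongly from a running median.

    A crude stand-in for the detrending a real mission pipeline performs.
    """
    padded = np.pad(flux, window // 2, mode="edge")
    medians = np.array([np.median(padded[i:i + window]) for i in range(len(flux))])
    residuals = flux - medians
    return np.abs(residuals) > threshold * np.std(residuals)

# Toy light curve: steady star brightness plus noise and two injected
# glitches, e.g. a thruster firing. Indices 50 and 120 should be flagged.
rng = np.random.default_rng(1)
flux = 1.0 + rng.normal(0, 0.01, 200)
flux[[50, 120]] += 0.2
print(np.where(flag_outliers(flux))[0])
```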

“In some cases, the accuracy of digital tools has outperformed human pathologists, dermatologists, and radiologists” The growing interaction between AI and cognitive science is equally exciting. In the past decade, AI has performed remarkably well at identifying patterns in large, complex data sets. This ability has been extensively used in image recognition technology. AI can now easily differentiate between images of cats and coconuts, identify if there is a zebra in a picture, or spot pedestrians accurately enough to be able to direct a self-driving car. It has also been used to recognize and respond to speech. Cognitive science itself is benefiting from the growth of AI by having a model for testing and developing ideas on how the brain functions, and also as a tool for processing complex data sets. Chethan Pandarinath, a biomedical engineer at Emory University and the Georgia Institute of Technology, Atlanta, wants to use AI to “identify

the patterns of electrical activity in neurons that correspond to a person’s attempts to move their arm in a particular way, so that the instruction can then be fed to a prosthesis”, built for people with paralysis. The impact of AI on scientific discovery and research has become ubiquitous. So where do we go from here? Professor Andrew Briggs at the University of Oxford believes that “silicon-based computers may only have 10-20 years of advances ahead and so we need to accelerate work on new materials and on the next breakthroughs that will come from quantum computing or eventually from molecular computing”. Bringing quantum computing and ML together is expected to have advantages, allowing complex algorithms to be run significantly faster than on even the most powerful classical computer. This combination could help in new drug discovery, enhanced natural language processing, weather prediction, and so much more. While it is still too early to suggest that quantum computing will be the panacea for all informatic and scientific problems, we can expect to apply it to tasks with a vast number of variables and permutations, such as calculating the optimal path through traffic congestion or the best delivery routes, variants of ‘the travelling salesman problem’. Companies such as Mitsubishi and Volkswagen have already explored solutions to similar issues by deploying quantum computing with AI. The future is not likely to be a world dominated by AI, outdoing humans at every step. However, doctors might start consulting AI models for diagnosis, climate models might be further assisted and made more accurate by AI, and advances in natural language processing and speech recognition might result in smart assistants that are able to hold fully-fledged conversations with us. It is much more probable that instead of replacing humans as doctors or scientists or friends, AI will augment and supplement human abilities, helping us achieve things that seem impossible today. Hrichika Nag is a 3rd year Computer Science (BEng) student
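A tiny classical sketch shows why the routing problems mentioned above explode so quickly: brute force has to examine every ordering of stops (illustrative only, and nothing here is quantum).

```python
from itertools import permutations
from math import dist

def tour_length(route):
    """Total length of a closed tour visiting the points in order."""
    return sum(dist(route[i], route[i + 1]) for i in range(len(route) - 1))

def shortest_tour(stops):
    """Brute force: fix the first stop, then try every ordering of the rest."""
    start, rest = stops[0], stops[1:]
    candidates = ((start,) + perm + (start,) for perm in permutations(rest))
    return min(candidates, key=tour_length)

depots = [(0, 0), (2, 1), (1, 3), (4, 2), (3, 0)]  # five delivery stops
print(shortest_tour(depots))
# Fixing the first stop, 10 stops leave 9! = 362,880 orderings to check, and
# 20 stops about 1.2e17; this factorial explosion is why heuristics, and
# perhaps one day quantum computers, are attractive for routing problems.
```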



The latest IPCC assessment estimated climate sensitivity – the global temperature rise due to a doubling of CO2 levels – to be within a range of 1.5–4.5°C. Predicting the climate entails a lot of uncertainty. This is partly because of natural fluctuations in variables such as rainfall, sea surface temperature, or wind speed. Some fluctuations have global effects, like the El Niño–Southern Oscillation. Part of the uncertainty also arises from the difficulty of simulating complex processes such as cloud formation. Climate scientists have recently brought machine learning (ML) to their aid, with the aim of reducing such uncertainties in their models. Climate models have to be not only accurate but also fast; they must run through thousands of years' worth of paleoclimate data to bring them to the current climate state before predicting future climates. Therefore, they cannot include all atmospheric processes down to the millimetre scale. Instead, they make use of empirical formulae, or parameterizations. An emerging idea has been to use ML to more accurately simulate small-scale aspects of the ocean and atmosphere. However, generalising ML algorithms from one climate situation or region to another is a complicated task. Training an ML algorithm on the current climate

often means it then fails to simulate warmer climates, since current climate conditions are not analogous to those found at higher temperatures. However, training on the current climate and then simulating colder conditions works better. This is because the ML model finds examples at higher latitudes to help simulate the tropics in colder climates. There is a possibility for models to learn from events like El Niño, when the global atmosphere gets warmer on average. Although not a perfect analogy to global warming, similar physics might be operating at higher temperatures. Certain parts of climate models are more successful than others, owing to their level of complexity and heterogeneity. For example, we have fairly accurate simulations of atmospheric convection – the vertical transport of heat and moisture in the atmosphere. But interactions between the atmosphere and the land surface are much more difficult to simulate, due to the unpredictable responses of various plant and soil types. ML algorithms are therefore being deployed in these areas. Hrichika Nag is a 3rd year Computer Science (BEng) student
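The parameterization-emulation idea is simple enough to sketch: train a cheap statistical model offline to reproduce an expensive sub-grid calculation, then call the cheap model inside the simulation. A minimal, purely illustrative sketch with invented stand-in physics (none of the functions or numbers below come from a real climate model):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def expensive_subgrid_scheme(humidity, temperature):
    """Invented stand-in for a costly sub-grid physics calculation."""
    return np.tanh(3.0 * humidity) * (temperature - 250.0) / 60.0

# Step 1: run the expensive scheme offline to generate training data.
rng = np.random.default_rng(0)
X = rng.uniform([0.0, 250.0], [1.0, 310.0], size=(5000, 2))
y = expensive_subgrid_scheme(X[:, 0], X[:, 1])

# Step 2: fit a small neural-network emulator to those input/output pairs.
emulator = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
emulator.fit(X, y)

# Step 3: inside the climate model, the fast emulator replaces the full
# scheme, but it is only trustworthy for conditions resembling its training
# data, which is exactly the generalisation problem described above.
print(emulator.predict([[0.5, 280.0]]))  # true value is ~0.45
```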



You are probably aware that the fast fashion industry has environmental problems. It contributes 10% of the world’s carbon emissions and uses an estimated 1.5 trillion litres of water each year. That is hard to miss. However, not many people are aware of the link between the fashion industry and the spread of disease. Around 75% of new infectious diseases are zoonotic, which means they spread from animals to humans. Due to its links with industries such as wildlife trade and trafficking, fashion has led to increased interactions between humans and wildlife. These increased interactions are the perfect conditions for the spread of zoonotic diseases, and consequently the emergence of a pandemic.

“These increased interactions are the perfect conditions for the spread of zoonotic diseases, and consequently the emergence of a pandemic” Thomas Lovejoy, president of the Amazon Biodiversity Center, stated that every year, two to four viruses “emerge as a result of human interference in the natural world” and that “any one of those has the potential to turn into a pandemic”. The outbreak of SARS in 2003 was linked to a wet market in Shenzhen, China, that was selling Himalayan palm civets. This year, Covid-19 has

further highlighted the connections between wildlife trading and disease. Fur plays a particularly large role in the wildlife trade, accounting for 75% of the industry in China, and it is therefore heavily linked with the spread of emerging zoonotic diseases. The emergence of Covid-19, that is, the transmission of the virus to humans, has been associated with live animal markets in Wuhan, China. In these markets, animals like civet cats and raccoon dogs are sold for their fur. The virus has indeed been found in these animals, but their role in the spread of the disease is unknown. The appalling conditions of wet markets allow diseases to spread easily. In these unsanitary and crowded markets, faecal matter, blood and saliva pool together, in close contact with humans and animals, facilitating the transmission of viruses. Moreover, breeding animals for the sole purpose of their fur and keeping them in cages can cause them to develop psychological disorders, leading to cannibalism and self-mutilation. Furthermore, scientists have found a link between animals kept in stressful environments and the transmission of infectious diseases: corticosteroids, stress hormones, can suppress the ability of the immune system to fight infection, so animals under stress are more susceptible to infection. However, wet markets are not the only places where the virus can spread easily. The Netherlands, Denmark

and Spain have all reported Covid-19 infections in fur farms, with 87% of mink testing positive in infected fur farms in Spain. In Denmark, 207 mink farms have had infections. What is more, mutated virus strains have been discovered in these mink farms. This has led to the culling of approximately seventeen million mink, as well as a lockdown in the north of the country to prevent the spread of the new strains. Mutant virus strains have been identified on five different mink farms, with over two hundred humans now infected with the new strain. Multiple mutant viruses have been isolated from mink, seven of which have mutations in the spike protein, raising concerns that new vaccines would be ineffective against them. Thankfully, it has been found that the new strains are no more contagious or lethal than the original strain. It is too soon to say whether this event will pressure other European countries into closing down their fur farms. Attitudes towards the fur trade and the use of animal products in the fashion industry are changing, with brands such as Gucci and Prada announcing plans to become fur-free and use more sustainable and humane materials. Furthermore, countries are becoming more aware of the effects of fur farms, with the UK becoming the first country to ban fur farming in 2000 and many countries following since. However, is it enough that as a country we do not farm fur? Our commercial demand for wildlife products still drives wildlife exploitation and trade in countries with less stringent regulations. Will fast fashion be boycotted, will wildlife trade restrictions be more tightly regulated, and is it realistic that the fur industry as a whole will be banned? The answers to these questions are uncertain. However, the current pandemic has shown that strong action is needed to prevent the emergence and spread of new diseases, and that part of that action will need to come from the fashion industry. Kasi Patel is a third-year BSc Biological Sciences (Hons Biochemistry) student



“The fact is that no species has ever had such wholesale control over everything on earth, living or dead, as we now have... In our hands now lies not only our own future, but that of all other living creatures with whom we share the earth.” These words were written by David Attenborough, and they have never been more truthful or more important. Covid-19 is a disease of a magnitude not seen since the Spanish flu of 1918. As we grapple with the pandemic, many have paused to reflect: what if we did things differently after this? It has become clear that things cannot carry on as they have been. We cannot continue to destroy the natural world, streaming greenhouse gases into the atmosphere as if we have some other planet to go to.

“We cannot continue to destroy the natural world, to keep streaming greenhouse gases into the atmosphere, to act as if we have some other planet to go to” But, to quote an eminent science journalist (it's me – check out the last issue of EUSci magazine), “with every crisis comes an opportunity”. The world will, of course, have to rebuild from the ashes of the pandemic, but we have a choice: a choice of how we rebuild. As governments begin to put together plans for a post-Covid-19 economic bounceback, the idea of a green recovery is gaining traction. In a bid to rebuild the economy in a clean and sustainable way, nations could invest in renewable technologies and energy efficiency measures, instead of funding the dirty, polluting industries of yesterday. The David Attenborough quote at the start of this article is actually from 1979. We knew then that things had to change, but we didn't act. Now, we know that things have to change, and fast. Will we act this time around? In 2008, a bank that was seemingly 'too big to fail' well...failed. The financial institution's collapse sparked

a chain reaction that sent the world's economy into freefall. What followed was the worst economic recession since the Great Depression (though the current pandemic-induced recession, or 'pancession' as some have called it, has now taken this crown – thanks to Covid-19). Similar to the situation today, governments needed to bring the economy back up and off its knees by investing – and investing a lot. Huge stimulus packages were deployed to build back the affected industries, and they worked. But this came at a price. The buildback was so energy- and carbon-intensive that, despite a dip in emissions in 2009 (caused by the crash), in 2010 the world saw the biggest single-year emissions increase in history. Looking just at carbon dioxide (CO2), this was a fall of 1.4% in 2009, followed by an increase of 5.1% in 2010. The 2020 decline in CO2 is predicted to lie somewhere between 4 and 7%. Considering that we release far more CO2 now than we did then, this is a substantial decrease, and we can only expect the rebound to be even greater, as the economic crisis is much more severe this time around. Scientists have already worked out what the impact of a huge fossil-fueled economic recovery would look like. In a recent multi-institution study, published in Nature Climate Change and led by Professor Piers Forster of the Priestley

International Centre for Climate at the University of Leeds, researchers modelled the climate response to four different scenarios of economic recovery. These consisted of a baseline scenario in which emissions are the same as before the pandemic, a fossil-fueled recovery, and two green recovery scenarios – moderate and strong. The scenarios are based on differing levels of government investment into low carbon technologies, measured as a percentage of global GDP. The authors found that a fossil-fueled recovery would bring us perilously close to an increase of 2°C from pre-industrial levels. This may not sound like much, but 2°C of warming is considered by scientists to be the limit beyond which we tip the balance towards irreversible, catastrophic climate change. The words 'irreversible' and 'catastrophic' have also been used to describe the experiences of those unfortunate enough to hear me singing in the shower. Thankfully climate change is different, because we can actually do something about it (whereas I will never stop singing in the shower). Both green scenarios put a healthy distance between us and dangerous levels of climate change, and a strong green recovery would give us the best chance of limiting warming to 1.5°C. A strong green recovery would require governments to channel 1.2%


of global GDP into green investments, which equates to roughly $1.7 trillion. Unfortunately, even the most promising stimulus packages from Germany and the EU don't meet the necessary investment targets, and many of the world's big emitters haven't even released fully-formed packages yet. In times of recession, people tend to care less about the environment as economic recovery moves to the top of the agenda. I can't say I blame them. If you've been made unemployed, or are feeling the pinch from the recession, the environment isn't exactly the first thing on your mind. So politicians can easily convince voters to forgo the environment in favour of rebuilding the economy. But they needn't be mutually exclusive: this time around, a green recovery can protect the environment and boost the economy. Stimulus packages are judged on two things. Firstly, how long is it going to take the package to boost the economy? Secondly, is it going to have a positive, long-term impact? In practice, this favours government investments that will get a large number of people employed quickly, in industries that will provide value in the long term. In a recent study, researchers surveyed 231 economic experts from across the G20 countries about the best investment and policy decisions for a post-Covid-19 response. They found that stimulus packages “can act to decouple economic growth from GHG emissions and reduce existing welfare inequalities that will be exacerbated by the pandemic in the short-term and climate change in the long-term”. Translation: green stimulus packages will be better for the economy. In the paper they identified five areas that will help rebuild the economy and hit our climate targets: investment in clean infrastructure (such as electric car charge points); energy efficiency retrofitting; education and employment training; investment in ecosystem services to promote climate resilience; and investment in clean research and development. I want to highlight this last point, for which I will take you back to the 2008 crash. Despite the damage caused by the carbon-intensive rebuild, two technologies that are now mainstream cost-competitors in the energy market - wind and solar - were beneficiaries of the investment windfall. In fact, the growth of wind and solar over the last decade can be traced back to the stimulus packages of

the post-crash aftermath. Now, they are mature technologies ready to be deployed at a large scale. Furthermore, there are technologies today that are in the same place wind and solar were then. Given the necessary boost, these technologies could grow to be mainstream and cost-competitive and, along with further massive-scale deployment of wind and solar, become a cornerstone of a green economy.

“The economy and the environment needn’t be mutually exclusive. A green recovery can protect the environment and boost the economy” There is a strange circularity to all of this. Human-induced climate change and habitat destruction are pushing animals into areas where they increasingly come into contact with humans. More human-animal contact is predicted to significantly increase disease spillover from mammals to humans, as happened with the novel coronavirus. That virus has created a pandemic which might just be our last opportunity to fight climate change, which would in turn reduce the likelihood of future animal-borne pandemics. The crux of the matter is this: we need to halt climate change and environmental

destruction to avoid the devastation they could wreak, and we won't get a better chance than this. Despite the drop in emissions from Covid-19, the consequences for the climate will be insignificant – it is predicted to prevent only 0.01°C of warming. This highlights how even with huge behavioural changes, like those that accompanied the global lockdown, we won't reduce emissions enough to hit our climate targets. What we need is huge, structural change in the form of a green recovery. If we return to our old ways, the path to a carbon-free future will be sealed off irreversibly. The pandemic has given society the chance for a do-over – quite literally a once-in-a-lifetime opportunity. Though in this case, a lifetime isn't measured in terms of any one human being, but of our collective presence on the planet and the history of our species on earth. How we choose to rebuild from the ashes of this pandemic will decide the fate of humankind. Tom (he/him) is an ecology graduate from the University of Edinburgh, and the host of EUSci's Not Another Science Podcast




To mask, or not to mask? That is the $28tn question Ellie Bennett explores the psychology behind people who refuse to follow public health measures during a pandemic that is costing the world dearly

Image by congerdesign on Pixabay

The 2020 Covid-19 pandemic has been a wake-up call for those of us who took our healthcare systems for granted. Infectious diseases in the modern era have caused some concern in the Western world, but have largely been seen as a problem for developing countries. But now there's one on our doorstep. It's on everyone's doorstep, whether the mat says welcome, bienvenue, or 'ahlaan bik. The coronavirus doesn't care about borders, US elections, or how entitled we feel to a pint at the pub. This reality has been accepted, for the most part, by governments and people across the world. Whilst initially we saw mass compliance with preventative public health measures, fatigue is setting in. People are fed up with lockdowns and curfews, and have watched governments and health officials tie themselves in knots over which measures are effective. Some leaders, such as Boris Johnson, appear at times to be confused by their own lockdown measures. The result is that more people are choosing to ignore the public health measures put in place to curb the spread of Covid-19.

From a psychological perspective, an interesting question crops up as to why some people follow preventative measures whilst others don't. 'Selfish' might be a word that comes to mind, especially when watching clips of spring breakers on Florida's beaches and British party-goers packed into East London streets. Young people have been the focus of contempt for the way some have publicly ignored lockdown and social distancing rules. A recent study found that this lack of compliance with public health measures could be attributed to the risk-taking, impulsive traits seen more often in young people. This, tied in with the fact that they are less likely to suffer severely from Covid-19, makes them less inclined to follow measures which severely restrict their social freedoms. Othering rule-breakers and dismissing them as delinquent or careless, whether young or old, is not going to get us any further in understanding and persuading these groups to join efforts to control the virus. Nor will it inform our response to future public health crises. Instead, psychologist David B. Abrams, from the

School of Global Public Health at New York University, says that those who fervently deny and those who stridently comply with Covid-19 measures are coming from the same place, emotionally and psychologically. He uses the mask debate to demonstrate this. The anti-mask and pro-mask movements are perhaps the most visible and vocal across the pond. In the US, the wearing of a face mask has become a highly politicised act. Those on the pro-mask side are confused by the fact that people are making a fuss over a piece of cloth which has the capacity to save lives. Meanwhile, the anti-maskers are outraged that others can't see how such a measure is a gross act of social control and an infringement upon our liberties. Abrams says that these extreme reactions to a seemingly innocuous, apolitical health measure are the result of an embedded human instinct: 'fight or flight'. This is a common fear response amongst primates that causes people to either act out against the threat or try to escape it. But in the case of the Covid-19 pandemic, a further complicating factor is that


the threat is unknown. It is a novel virus that we have had precious little time to study and understand. According to Abrams, the unfamiliarity of the Covid-19 virus has led people’s normal fear response to become ‘hyperstimulated’. He says that we are seeing people react to the pandemic “with a strong and powerful set of emotions that completely override and erase the usual rational cool thinking”. To put it simply, everyone is scared, but the difference lies in the way we respond to that fear — and our responses aren’t necessarily going to be the most rational. People are grasping at whatever makes them feel safe in this context, whether that be wearing a mask or rallying against the government. Joseph J. Trunzo, a professor and chair of the department of psychology at Bryant University, Rhode Island, explains what factors can influence our behavioural response: “Any human behaviour — even seemingly simple behaviour, such as wearing a mask or not — is determined by multiple factors: political beliefs, ideology, social factors, and education.” This means that the decision not to wear a mask may not be as flippant or uncaring as we think. Instead, it is influenced by a number of complex social and individualistic factors that the anti-masker has used to inform their behaviour consciously and subconsciously. In their view, they have made an informed, rational decision according to their past experiences and world view. A crude example of this might be someone who hasn’t had any personal

experience with the virus – they haven’t fallen ill and neither have those close to them – and they traditionally see authorities as corrupt and dishonest, including those in government and healthcare. Another recent study described the strong link between the lack of trust in authorities and poor uptake of public health measures. Without a visible threat, people aren’t convinced that the virus is real and believe instead that it is a crafty way for the government to do what they do best: control us. This may cause them to believe that they are privy to a reality that others are not, giving them a sense of control over a situation that the rest of us are struggling to overcome.

“those who fervently deny and those who stridently comply with Covid-19 measures are coming from the same place, emotionally and psychologically” Advice to governments, in the event of future public health crises, might be to take an exceptionally honest approach when faced with the level of uncertainty seen during the Covid-19 pandemic. They must admit when they don't have all of the answers and avoid making definitive statements and promises that they will later have to go back on. This transparency has to be present throughout the crisis because many preventative public health measures must be implemented in

the long term to be effective. This relies on continued public faith in the government and its messages. Whilst the UK government rallied collective action with the mantra 'Stay home, protect the NHS, save lives', the message has since lost its power, as the authority that voiced it in the first place has become inconsistent and confusing in its subsequent calls to action and implementation of further public health measures. Two stand-out examples of leaders who demonstrated candour and consistency in their messaging were the German Chancellor Angela Merkel and New Zealand's Prime Minister Jacinda Ardern. Merkel's messages at the very beginning of the pandemic were extremely frank and did not sugarcoat the seriousness of the virus, stating plainly that up to 70% of the German population could become infected. Similarly, Prime Minister Ardern was praised not only for the clarity of her messaging but also her empathetic approach. Both have been lauded for their swift response to the pandemic and transparency over the following months. Both countries were successful at keeping their case numbers relatively low compared to other nations, and their leaders have seen a surge in their public approval ratings. In such fraught times, it is tempting to shame those who don't follow the rules. Understandably, they make us angry because their actions are prolonging the suffering of everyone else. But taking an aggressive approach backs them into a corner where they feel they must defend their behaviour, which, as we learnt from Trunzo, is also a defence of their personal experience and ideologies. If we want to convince anti-maskers to wear a mask, we are going to have to try and empathise with them. This means that we need to appeal to their way of thinking, even when putting our own arguments across. For example, we might want to acknowledge that the government has been ineffective and bewildering in its response to the pandemic, yet appeal to the fact that those in charge are humans like us, dealing with a virus previously unknown to science. As for mask-wearing in particular, we can reiterate that whilst wearing a mask feels restrictive, its overall purpose is to eventually restore the freedoms that Covid-19 has taken away. Ellie Bennett is a Medical Law and Ethics Masters student with a background in Biological Sciences

Illustration by Linta Nasim



Science and technology depend upon each other, and this symbiotic relationship only strengthens with each new wave of discoveries and technological advancements. The 1970s saw the onset of the digital age, which forced science to evolve and rely on the innovations of the Information Technology industry. Since then, the adoption of digital technology has been rapid, not least since the start of the Covid-19 pandemic. Across Europe, the concept of 'open science', which stresses the benefits of collaboration and accessibility, has gained prominence. However, there are growing concerns over data security, plagiarism, and preserving the authenticity of scientific research. An expanded use of digital platforms in scientific research and education during the pandemic will be pivotal. But, in the longer term, the emergence of a hybrid model - a balance between digital formats and one-to-one interactions - is inevitable. Education is at the forefront of change. It was feared that the lockdown caused by the coronavirus pandemic would render millions of students unable to


complete their schoolwork due to the temporary closure of schools and universities. In fact, the abrupt pause in teaching was followed by a rapid shift to online learning. But the switch to online lectures, teleconferencing, digital open books, online examinations, and interaction in virtual environments does not, unfortunately, cater to all students.

“the emergence of a hybrid model - a balance between digital formats and one-to-one interactions - is inevitable” Firstly, students have unequal access to computers and internet connectivity. Moreover, many science students have the additional disadvantage of restricted access to laboratories. In order to bridge this gap, various educational institutions have incorporated computer simulations to minimize the disruption to research-based learning. Furthermore, many universities,

including the University of Edinburgh, have integrated programming languages for statistics and data handling into some courses and promoted online programming courses to further encourage digital literacy. There have been other adverse effects of the shift to online learning. The lack of interaction between students and teachers has reduced engagement. Online learning is especially detrimental to conversational activities such as the clarification of doubts after lectures and social exchanges with peers. These chance encounters simply cannot be reproduced by pre-recorded lectures. On the other hand, the need for cultivating digital literacy has been acknowledged within the scientific community, and the pandemic has provided an opportunity to accelerate this process. Digitalisation has allowed for the conduct of meta-analyses of scientific collections and the promotion of collaborations, allowing the concept of open science to flourish and pushing the boundaries of research and development across geographic borders and scientific disciplines. Moreover, the endeavour for


openness also enhances the value of research by increasing its uptake, relevance, and potential downstream value. For example, the search for an effective Covid-19 vaccine was born out of previously conducted research into a different virus. The accessibility of that older research gave scientists a head start in tackling the pandemic.

“the need for cultivating digital literacy has been acknowledged within the scientific community, and the pandemic has provided an opportunity to accelerate this process” Openness also fosters innovation by providing a platform for effective collaboration between research groups across the globe. Digital interaction between researchers is both economical and an inclusive form of knowledge

exchange. Although digital formats cannot completely substitute for social interactions, they can complement them. One of the biggest concerns about digitalisation is data privacy. Protection rights for data used in scientific research are not always clear, as research generally falls under the umbrella of intellectual property. In Europe, the GDPR (General Data Protection Regulation) protects any personal data that is collected for research purposes, but the guidelines for pooled data from meta-analysis platforms are vague. Furthermore, digitalisation has entailed a shift from the traditional method of publishing scientific research papers in peer-reviewed journals to uploading research papers onto open-access, preprint digital platforms. While this guarantees greater accessibility, it also makes the data presented in these papers vulnerable due to the current insufficiency of cyberinfrastructure. Digitalisation has the power to

reshape scientific education, research, and technology. There is, however, a need for a reliable hybrid model that combines digital formats with the fundamentals of in-person teaching and research. This can be achieved by balancing our reliance on innovation against the age-old ideals behind invention. With better cyberinfrastructure and data protection laws, the future of learning will be bright and digital. Simar Mann is a 3rd Year Biomedical Sciences student




Dial-a-Doctor: a digital movement of healthcare Belén Hackel and Tara Wagner-Gamble discuss the digital future of healthcare

Illustration by Yen Peng (Apple) Chew

Due to Covid-19, there has been a huge movement to online services, including in the healthcare system. Seeing your doctor in this day and age now involves shouting your symptoms down the phone or oversharing on Zoom. The advances we have seen in recent months have revolutionised how we interact with each other. Zoom, Microsoft Teams, and Google Meet have soared in popularity, supporting online pub quizzes, family catch-ups, and work meetings. So why not our healthcare system too?

“The ‘Save the NHS’ campaign has led to many people letting their ailments go unseen and untreated” Hospitals, as well as GP clinics, are Covid-19 infection hotspots. People working in or visiting these facilities could be interacting with a large number of potentially vulnerable individuals or be Covid-19 carriers themselves. Healthcare professionals are also at risk of exposure. Therefore, an online screening system ensures that only patients really in need of face-to-face appointments

receive them, avoiding unnecessary Covid-19 exposure. When feeling ill or needing medical advice, you can book a phone call appointment, after which you can collect any prescriptions from your local pharmacy or come into the clinic if required. This allows appointments to be carried out from the comfort of your home, wherever you are. Additionally, virtual appointments for less urgent matters, such as discussing contraception, save time as patients do not need to travel to the clinic – nor do they need to wait in the waiting room. Although online appointments and screening come with many advantages, there are some disadvantages which need to be considered. Most important is the lack of face-to-face interaction. Doctors unable to see their patients during the initial screening process may miss vital symptoms, as their diagnosis is heavily dependent on description. This loss of personal connection can also make it harder for patients to express themselves or feel comfortable when discussing their concerns. The NHS has tried to combat this by providing video consultations for GP and hospital appointments, mental health services, and community care services. However, communicating

more severe matters, such as sensitive diagnoses, may be challenging for both the healthcare professional and patient. Although this may be the case, there are instances where remote appointments are a preferred alternative to in-person hospital meetings due to the stressful clinical environment. Patients are already concerned about their health, and the prospect of catching Covid-19 intensifies this. Communication is also a major challenge for in-person appointments, as both doctors and patients are wearing masks and have to shout to be heard, as highlighted to us by Dr Sarah Carstairs, an ophthalmologist with NHS Tayside. “I think because of all the masking in hospitals, speaking to somebody in the comfort of their own home - when patients are not wearing a mask and are not in a scary hospital where they think they’re going to catch Covid-19 - they’re more relaxed. Additionally, patients can’t see very well (as this is why they are in the eye clinic) or they’re elderly and they can’t hear very well... Communicating between masks is much less effective and particularly our generation and older are used to talking on the telephone without video.” - Dr Sarah Carstairs Dr Carstairs does, however, highlight that this only works for some cases, as conditions such as glaucoma require in-person investigation. “I think the remote appointments are going very well, but it depends [on] what the problem is. A condition like [macular degeneration], which is easily picked up by a scan that is widely available in the community, lends itself very well to [telephone referrals], whereas for [more nuanced conditions] somebody physically needs to actually see it. You can’t do that remotely.” - Dr Sarah Carstairs Privacy is another major, multifaceted issue. For example, a patient may not want other household members to hear or know about an appointment, or they may not have a safe space to make the phone call. In an interview with Emily Ross, a


Senior Medic on placement at the University of Aberdeen, she also addressed a concern for privacy. “I worry a lot about confidentiality with telephone and video call appointments. You can never really be sure who else is in the room with your patient. This is obviously a big problem for patients in abusive home situations who may not feel safe to tell the truth. It also means that people are more reluctant to make appointments if they think their family members will overhear them, and if they do get an appointment, they may not feel comfortable being open and honest. All this means that fewer patients are being seen and information is missed so they’re not receiving the treatment they need. Inevitably there will be some major things that are not picked up which could have massive consequences.” - Emily Ross Another pitfall is that although more healthcare providers can be reached, older people may struggle or feel less comfortable with the online transition. This is crucial as they are also in the more ‘at-risk’ age bracket and should avoid Covid-19 exposure as much as possible. This, along with the ‘Save the NHS’ campaign, has led to many people letting their ailments go unseen and untreated.

“Combining AI with conventional practices will result in a more efficient use of healthcare resources” A recent paper published in the Lancet estimates that between 3291 and 3621 avoidable deaths will occur due to late cancer diagnosis of breast, colorectal, lung, and oesophageal tumours in the Covid-19 lockdown. This cannot be completely attributed to remote consultations, as essential diagnostic services were cancelled. There was a 90% drop in endoscopies in April 2020 in comparison to those carried out in January, February, and March 2020. This highlights that in-person examinations and treatments will always be necessary. “There’s this misconception that all the GPs have been redeployed to fight Covid-19 and so people are reluctant to make appointments because they see it as ‘putting strain on the NHS’. This is compounded, I think, with the ‘Protect the NHS’ government message. There’s been so much discussion in the last months about helping/saving/donating to the NHS that people are beginning to

see it as a struggling charity that they shouldn’t take advantage of. I don’t want to think about how many serious conditions will be missed because people feel that they are not worth the time and money of the NHS. The NHS is struggling right now but that’s nothing that individual patients will solve by keeping themselves away from care they are entitled to.” - Emily Ross It has been a bumpy transition, but have we glimpsed the future of healthcare? Instead of GPs wasting precious time on the screening process, artificial intelligence such as machine learning could be the solution. This could involve being asked questions by an automated programme, where your answers would determine what type of doctor you should see and whether you should come into the practice for a face-to-face appointment (a toy sketch of such a flow follows at the end of this article). Furthermore, machine learning could also be used in diagnostics to some extent where, for example, images of a patient’s rash may be compared to those in a pre-existing and growing database. Combining AI with conventional practice will result in a more efficient use of healthcare resources. In an interview, Kelly Tan, a final-year master’s student in speech pathology at the University of Melbourne, highlighted that the move to online healthcare and therapy is just the beginning and that we need to embrace it and adapt. “Virtual therapy has made a

prominent mark in our field... I think [virtual therapy] will continue to exist as an option, or even as a consistent method, of providing speech pathology services if it is proved to be the most efficient and preferred way by clients. There is evidence available that supports the effectiveness of online speech therapy. Nonetheless, speech pathologists aim to help adults and children to improve their swallowing and communication disorders, be it through in-person, face-to-face sessions, video calls, or even over the telephone - we have to adapt with the times!” - Kelly Tan Choi Yin Although AI may be the future of healthcare, we are still far away from implementing it. However, the online healthcare transition, facilitated by the Covid-19 pandemic, has pushed us closer to this reality. The unforeseen nature of the pandemic forced the transition and, given the circumstances, virtual appointments are a good short-term alternative as people will always continue to get ill. From a long-term perspective, a lot needs to be done to improve these services and come up with a more effective and efficient hybrid model, combining AI and conventional practices. Tara Wagner-Gamble is a final year Biological Sciences student specialising in Immunology. Belén Hackel is a final year Biomedical Sciences student specialising in Neuroscience
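As promised above, here is a purely hypothetical sketch of the automated screening idea (the symptom lists and routing rules are invented for illustration, not clinical guidance):

```python
# Hypothetical, highly simplified triage questionnaire. A real system would
# be clinically validated and far more nuanced; this only illustrates the flow.
RED_FLAGS = {"chest pain", "difficulty breathing", "severe bleeding"}
GP_SYMPTOMS = {"rash", "persistent cough", "joint pain"}

def triage(symptoms: set[str]) -> str:
    """Route a patient based on answers to an automated questionnaire."""
    if symptoms & RED_FLAGS:        # any overlap with the red-flag set
        return "urgent: book an in-person appointment today"
    if symptoms & GP_SYMPTOMS:
        return "book a telephone appointment with a GP"
    return "self-care advice; call back if symptoms worsen"

print(triage({"rash"}))                # -> telephone appointment
print(triage({"chest pain", "rash"}))  # -> urgent in-person appointment
```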

Illustration by Yen Peng (Apple) Chew




A new era of plastic pollution Ian Yang explores microplastic and other newly emergent forms of plastic pollution, alongside ways to manage them In 1907, the world's first synthetic plastic, Bakelite, was invented. Throughout the years, plastic has brought much convenience to our lives. In chemistry, plastic is referred to as a polymer (poly = many), a long-chain molecule made up of many repeating subunits. For example, polyethylene, the plastic used for plastic bags and films, is composed of many ethylene units. As a material, plastic is cheap and lightweight, yet durable and resistant to destructive environmental factors, and can be readily disposed of. An example of its benefits is improving hygiene in food and drinks through food packaging.
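To make 'many repeating subunits' concrete, here is a quick sketch of how the mass of a polyethylene chain scales with its length (the chain lengths below are illustrative assumptions; real polyethylene grades vary widely):

```python
# Back-of-the-envelope molar mass of a polyethylene chain, (C2H4)n.
C, H = 12.011, 1.008                 # atomic masses in g/mol
ETHYLENE_UNIT = 2 * C + 4 * H        # one -CH2-CH2- repeat unit, ~28 g/mol

# Chain lengths below are illustrative; real grades vary widely.
for n in (1_000, 10_000, 100_000):
    print(f"{n:>7} repeat units -> ~{n * ETHYLENE_UNIT:,.0f} g/mol")
```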

Polyethylene: a polymer containing a large number of repeating ethylene units

It is this durability and ease of disposal that has also escalated the issue of persistent plastic waste in our environment. Moreover, in recent years, and especially during the Covid-19 pandemic, new plastic waste issues have emerged. Poor waste management practices have already caused great harm to the environment and cannot be swept under the carpet for much longer.

Background: Current Waste Management Practices

Currently, there are four ways plastic waste is handled: recycling, biodegradation, incineration and landfill. Different regions of the world may have different waste management policies, and not all waste is properly managed. Especially during this pandemic, many city councils have halted their recycling activities over fear of Covid-19 infection. Unmanaged plastic waste often ends up in the ocean, causing devastating effects to marine life. Even in the 1960s, people believed that oceans had self-cleansing

abilities and could break down any waste we dispose of; many authorities even accepted this as a scientific fact. The problems brought about by marine debris were only just starting to be noticed by the general public. Since plastic was not yet so widely used, the problem was not as serious. However, as more and more non-biodegradable plastics were brought onto the global stage, they persisted in the marine environment and now make up as much as 80% of marine debris. Current estimates only address the tip of the iceberg, as 99% of plastic waste lies deep under the ocean, where it is hard to quantify.

Serving Microplastic Realness

Plastic's negative impact is most prominent in the marine environment, most notably with marine animals ingesting plastics or becoming entangled and suffocating, which often leads to death. In recent years, the public eye turned to microplastics as researchers found plastics smaller than 5 mm in length inside marine animals. To understand the issue, we can examine the two main sources of microplastics: primary and secondary microplastics. Primary microplastics are deliberately manufactured to meet human needs, such as the microbeads in cosmetic products like exfoliating body cleansers. Secondary microplastics are created by the degradation of larger plastics. In the marine environment especially, plastic is susceptible to photodegradation when exposed to the sun (UV light), forming smaller plastic fragments, which are often mistaken for food by animals or sometimes entangle them.

“To achieve visible impact, countries must work cooperatively to support a sustainable economy against the backdrop of globalisation” The problem with microplastic is that it persists in the environment and can accumulate in animal bodies, passing up the food chain and possibly to humans. For instance, fish are a significant source of protein for humans,

and when fish ingest microplastics, these accumulate in their tissues before the fish become our 'delicious' meal. Researchers have also found through laboratory tests that microplastics can bind to heavy metals and other toxic contaminants, with potentially undesirable health implications. Heavy metals like lead, copper, and cadmium, as well as polycyclic aromatic hydrocarbons (PAHs) such as naphthalene, a chemical that destroys red blood cells and causes anaemia, can easily stick to microplastic surfaces, especially polyvinyl chloride (PVC) and polypropylene (PP). Such toxic contaminants are often carcinogenic (cancer-causing) and can have long-term implications when accumulated in biological systems. This is because PAHs like naphthalene are highly lipophilic, meaning that even small amounts in water selectively diffuse into fish gills and remain in their fatty flesh for a long period of time. This process is referred to as bioaccumulation, and it happens in other organisms as well. The average concentration of such organic compounds increases dramatically up the food chain, as these toxic chemicals are stored in animal fat rather than broken down; over time, organisms simply retain them. This is known as biomagnification (a toy calculation at the end of this article illustrates the arithmetic). In recent history, we have seen the effects of a chemical called dichlorodiphenyltrichloroethane (DDT), originally developed as an insecticide, on bird eggs, whose shells became too fragile to hatch. It is not too hard, then, to imagine the harmfulness of these toxic contaminants bound to microplastics. "Protecting nature means protecting ourselves", and the reverse holds true: humans created plastics without a concrete waste management plan, causing harm to the environment, and nature bites back and puts microplastics on our plates.

A New Form of Plastic Pollution

Since Covid-19, many people have become more aware of their impact on the environment. However, the continuing public health crisis has exponentially increased the use of plastic and created new forms of plastic waste. For example, personal protective



Illustration by Yen Peng (Apple) Chew

equipment (PPE) like surgical face masks and hand sanitisers are essential tools to limit the spread of Covid-19, but they also pose a threat to the environment, as most masks consist of a thin layer of plastic such as polypropylene (PP). As always, unmanaged waste ends up in the marine environment, and Covid-19-related waste has been found washed up on shores. This has created a new form of plastic pollution whose impact is already visible as marine animals become entangled, but what is more worrying are its long-term effects when degraded into microplastics. The urgent need for PPE amongst the public and healthcare workers has driven a 40% increase in production, and more than 240 tonnes of medical waste (such as disposable face masks, gloves and gowns) were produced in hospitals in Wuhan per day at the peak of the Covid-19 pandemic, six times more than the pre-pandemic average. If the world's population used one disposable face mask per day, 129 billion masks, alongside an estimated 65 billion gloves, could be consumed every month. Aside from Covid-19-related waste, personal life choices during lockdown have also increased the use of plastic. For example, during Singapore's eight-week lockdown, takeout meals and online shopping contributed 1,400 tonnes of plastic waste.

What now?

It is clear that the global health

What now?

It is clear that the global health crisis stresses regular waste management practices and leads to inappropriate disposal of waste. The negative impact of this is profound: if even 1% of Covid-19-related waste goes unmanaged, that translates to more than 10 million items, weighing nearly 40,000 kg. For a brighter future with less plastic waste, scientific invention may be the key to saving the day. There are many ways science could help: environmentally friendly alternatives to plastic, technological improvements to the recycling process, and ocean-cleaning technologies. Biodegradable plastic, for example, is already widely available, though not all of it is fully biodegradable. If a novel biodegradable material could be invented that retains plastic’s useful qualities, it would go a long way towards solving the world’s many plastic-related problems. As for reducing Covid-19-related waste, a team of engineers has already designed a special face mask intended to filter out and kill coronaviruses: it incorporates a heated copper mesh, which slows and inactivates viral particles as air passes through it. The invention has not yet been scientifically or medically peer reviewed, but it could greatly reduce the use of single-use masks if proven effective. Policy makers, for their part, should implement stricter and more transparent trade policies to limit and manage plastic waste. Examples include, but are not restricted to: removing subsidies that promote plastic production and trade; improving environmental standards and plastics labelling requirements; reducing the use of single-use plastics; and offering tax incentives for industries to recycle and to invest in waste management technologies. To achieve visible impact, countries must work cooperatively to support the scientific innovations that can help solve such challenging global issues.

Conclusion

Our planet is facing unprecedented challenges posed by irresponsible exploitation, and the consequences will be serious if we continue to neglect them. The pandemic has certainly made us realise the importance of respecting nature, but overwhelmingly, it has also added new forms of persistent waste to the environment. Environmental issues like plastic pollution are tightly linked to world trade and global politics. Through global cooperation, it is entirely possible to relieve our planet of plastic pollution, but the key lies in policy makers’ willingness to pursue a sustainable future.

Ian Yang is a third-year Chemistry student. After learning about polymer chemistry in his degree course, he became very interested in environmental science and looks to find solutions to the most pressing environmental issues


Dear Reader,

“The future depends on what we do in the present.” – Mahatma Gandhi

You might have noticed by now that what Harry mentioned in the Present editorial stands true: “Every new beginning is incomplete without its own past, present and future”. This section is the most exciting one, and one that people always look forward to reading. The future is never certain; our authors can only wonder, imagine, and predict when they write their pieces based on present research and discoveries. Have the decades of Mars explorations and missions done enough to indicate a new beginning for the human race to expand? (p.36-37). Would this require thermonuclear missile terraforming, deliberately modifying the atmosphere, temperature, or ecology of a planet? Read some of the wackiest ideas that have been proposed for the future on page 35. Since the Covid-19 pandemic has incited digital movements in healthcare and education (p.28-31), should we be more concerned about data privacy and our personal information than ever before? Beyond the Covid-19 pandemic, the world was literally on fire in 2020: there were 57,000 wildfires, compared with 50,477 in 2019. How can we start anew and prevent wildfires from worsening in the future alongside climate change? (p.42-43). And as our world and society move towards a future of inclusivity, biology is having a crisis of identity in this transition. Harin Wijayathunga discusses whether we should have a new beginning for a more comprehensive definition of biology (p.40-41). These thought-provoking articles will keep you on your toes in thinking about potential new beginnings of the future. There’s so much more to talk about beyond the technical scope of this magazine, and we hope these articles will sow a seed in your imagination. To infinity and beyond!

Best wishes,
Yen Peng (Apple) Chew
Editor-in-Chief



Osseointegrated sonochromatic cyborg implants. Thermonuclear missile terraforming. Space-based, laser-transmitted, solar-powered electricity generation. These are not strings of words plucked at random from a scientific dictionary; they are serious solutions to serious problems, proposed by serious people. According to Ray Kurzweil, the pace of technological progress follows a Law of Accelerating Returns, whereby innovation breeds innovation. If this law holds, then the future promises to deliver many more bafflingly futuristic scientific developments. Here are some of the wilder brainchildren of scientists who refused to be bound by convention.

In recent years, biotechnology has been migrating from science fiction to reality, with commentators predicting that, along with AI and quantum computing, biotechnology will kickstart the next great technological revolution. One early adopter of human-integrated technology is Neil Harbisson. Harbisson was born colour blind, but had the good fortune to acquire perfect pitch (the ability to identify musical notes without reference to a musical instrument) while still a child. With the help of a skull-implanted antenna, he uses one ability to compensate for the lack of the other. A wireless camera at one end of the antenna detects colour, which is communicated to Harbisson in the form of musical pitches by a vibrating implant in his skull. There being no reason why this arrangement should share in the limitations of the human eye, Harbisson can now ‘hear’ or ‘feel’ colours with frequencies outside the range normally visible to humans. In 2004, after a long bureaucratic tussle with the British Passport Authority, Harbisson won the right to have the antenna included in his passport photo, and on this basis claims to be the first government-recognised cyborg.
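To get a feel for how colour might be rendered as pitch, note that visible light spans roughly one octave of frequency (about 430 to 750 THz), so it can be transposed down some forty octaves into the audible range. The Python sketch below illustrates that general idea only; it is not Harbisson's actual device mapping, and the forty-octave figure is simply a convenient choice for the illustration.

```python
# Illustrative only: transpose light frequencies down into audible pitches
# by repeated halving. Not the mapping used in any real implant.
def light_to_pitch(light_hz: float, octaves: int = 40) -> float:
    """Halve a light frequency `octaves` times to land in the audible range."""
    return light_hz / (2 ** octaves)

red, violet = 430e12, 750e12  # approximate frequencies in hertz
print(f"red    -> {light_to_pitch(red):.0f} Hz")     # ~391 Hz, close to G4
print(f"violet -> {light_to_pitch(violet):.0f} Hz")  # ~682 Hz, between E5 and F5
```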

Further afield, Elon Musk, having dealt with electric sports cars and reusable rockets, has focused his attention on high-speed satellite internet access. Today, about 3 billion people do not have access to the internet. Musk’s company, SpaceX, plans to bring the web to previously unserved areas with the Starlink satellite constellation. Starlink is a proposed network of up to 42,000 mass-produced satellites in low Earth orbit, about 14 times the number of man-made satellites currently in space. By sticking closer to the Earth’s surface, Starlink would overcome the latency and speed issues which hamper other internet-providing satellites. However, there are concerns that the massive increase in objects orbiting Earth would endanger other satellites, and that the increase in light pollution could compromise Earth-based astronomical telescopes. There were 724 Starlink satellites in orbit as of October 2020, with SpaceX hoping to launch 700 more by early 2022.
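The latency advantage follows straight from the geometry: a radio signal must travel up to the satellite and back down, so a lower orbit means a shorter round trip. Here is a quick back-of-the-envelope calculation in Python; the 550 km altitude for Starlink's early shells and the 35,786 km geostationary altitude are nominal figures, and real paths add routing overhead.

```python
# Minimum physically possible up-and-down propagation delay, ignoring
# routing, processing, and the slant angle of the actual signal path.
C_KM_PER_S = 299_792.458  # speed of light

def round_trip_ms(altitude_km: float) -> float:
    return 2 * altitude_km / C_KM_PER_S * 1000

print(f"Low Earth orbit (550 km):  {round_trip_ms(550):6.1f} ms")
print(f"Geostationary (35,786 km): {round_trip_ms(35_786):6.1f} ms")
# ~3.7 ms versus ~238.7 ms before any routing overhead is added
```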

Another possible method of solving our problems by putting things in space is the use of solar panel satellites. Solar panels in space have several advantages over solar panels on Earth: they have access to the frequencies of light which are absorbed by the atmosphere before they reach Earth’s surface, bad weather does not affect them, and they are exposed to sunlight 24 hours a day. Designs for solar panel satellites involve an array of mirrors reflecting and concentrating light onto a solar cell. The energy generated would be transmitted back to Earth in the form of microwaves or laser beams. Extraterrestrial solar panels have the usual drawbacks which plague potentially revolutionary technologies in their infancy - the cost of launching such satellites is prohibitive, and the transmission process is not as efficient as one would like - but the idea has been marked as promising by NASA and by JAXA (the Japanese space agency).

Speculating about the outlandish technologies the future will bring has been a pastime for over a century now. H.G. Wells anticipated wireless communication and the atomic bomb (spot on), but also anticipated that we would have figured out world peace and become an entirely egalitarian society by the year 2000 (not quite). J.G. Ballard anticipated that technology would replace physical human contact and that widespread alienation would cause outbreaks of hysterical irrationality (more like it). Whether the future will be utopian or dystopian is still up for debate, but that it will be weird is a fairly safe bet.

Sean Thompson is a second-year engineering student



future

Living on Mars: a new beginning for the human race?

Jessie Hammond investigates how likely it is that humans will be able to colonise Mars in the future

The Martian, starring Matt Damon, was a great success when it hit cinemas in 2015. The movie takes place in a futuristic setting in which humans have established short-term settlements to conduct research on Mars. For the most part, the author, Andy Weir, seems to have his science in check. And, with NASA planning to send humans there in the 2030s, it does seem that a trip to Mars may be plausible. So how close are we to colonising Mars in reality? Will we ever have a human race of Martians?

Mars is around 55 million km from Earth when its orbit brings it closest. Depending on the position of Mars relative to the Earth at the time of launch, it would take 7 to 8 months to get there. If such a journey were made, it would be the furthest any human had ever travelled from Earth. Sounds like an awesome achievement, right? However, it would also be incredibly stressful. Not only would you be isolated from planet Earth, but you would also face the psychological impact of close confinement with five or six people for that long journey. Many studies have investigated how serious this psychological impact would be, leading to careful consideration of who might make up the most robust team for such a task. In August 2016, a Mars simulation ended after running for a whole year. The project was called the Hawaii Space Exploration Analog and Simulation (HI-SEAS); six astronauts and scientists from around the world lived in a dome on the side of a Hawaiian volcano. They had access to solar-powered electricity and could communicate with a base. However, the signals were delayed by 20 minutes to mimic the effect of being on Mars. They were otherwise isolated for the whole year.
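That 20-minute figure is easy to sanity-check from the distances alone, since radio signals travel at the speed of light. A quick calculation in Python (the distances used are the commonly quoted orbital extremes):

```python
# One-way radio delay between Earth and Mars at the orbital extremes.
C_KM_PER_S = 299_792.458  # speed of light

for label, distance_km in [("closest approach", 54.6e6),
                           ("farthest separation", 401e6)]:
    minutes = distance_km / C_KM_PER_S / 60
    print(f"{label}: {minutes:.1f} minutes one-way")
# closest approach: ~3.0 minutes; farthest: ~22.3 minutes,
# so a 20-minute delay sits towards the pessimistic end of the real range
```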

Illustration by Clara Morriss


They found that emergency situations helped to bring the team together, and that their biggest challenge was boredom rather than social stress within the group. So, okay, being in a small spacecraft with a group for a long period of time may not be unbearable. The journey does present other problems, though. In particular, there is the challenge of feeding the crew for 7-8 months without being able to grow anything or resupply. Models based on experience aboard the International Space Station have helped plan for this, though, and it seems that it could be doable.

The real challenge is surviving on Mars and its inhospitable environment. Mars is about ten times lighter than Earth. Because of its smaller mass, Mars has less gravitational pull and has lost a significant amount of its atmosphere since it was formed. This makes it harder for many of the processes we take for granted on Earth to happen, such as an ozone layer protecting the surface from high levels of UV radiation. Additionally, Mars has no magnetic field surrounding the planet to protect the surface from solar flares and act as an added buffer against radiation. To render the planet more habitable, it has been suggested that a few centimetres of soil on top of a habitat, or living space, could block most of the radiation from penetrating to living areas. Alternatively, people could live at the north pole of the red planet in ‘balls’ of habitats cut into the ice. Due to the loss of atmosphere, the pressure on Mars has decreased to the point where surface conditions sit at the triple point of water - the temperature and pressure at which the solid, liquid and gas phases of pure water coexist in equilibrium. Under such conditions, ice sublimates (transitions directly to the gas phase, bypassing the liquid phase). Unsurprisingly, then, it is extremely hard to find any liquid water on the Martian surface.


Instead, water is found in briny ice lakes, which have a high concentration of salt but could be useful to humans if filtered. The likelihood is that the habitat humans occupy will need a water recycling system in order to provide enough of this precious resource.

“It raises the question: to what extent do humans have a right to interfere with any other planet? Are we really justified in altering a planet’s atmosphere just to meet our human needs?”

In addition to the significant challenges of high radiation and lack of water, there is also the issue of transport: how exactly do we get everything a human needs to live and survive there in the first place? Where are people going to live, and how will it all start? Several designs have been suggested to overcome these problems - most rely on the settlers continuing to live in their space shuttle for some time after reaching Mars. Solar panels are the obvious choice to provide electricity, as Mars gets plenty of sunshine. There are also designs based on 3D-printing a habitat: NASA recently ran its Centennial Challenge to design (phase 1) and fabricate (phase 3) a fully operational habitat to be used for deep space exploration. Last year, the design firm Hassell presented their design for a Mars pod, complete with chairs made of recycled packaging.


Although the precise type of structure needed for people to live sustainably on Mars is unknown, it is clear that it will push the boundaries of technology and innovative design. If we were to colonise Mars and make it habitable for a large number of humans, we would need a system for producing food. The characteristic redness of Mars is due to the iron oxide dust that coats its surface; underneath, the ground is in fact grey. This soil is not plant friendly, as it contains perchlorate, which is toxic to most life from Earth. Plants and food would have to be grown in greenhouses connected to the human habitats. Techniques such as vertical farming, now being introduced on Earth, could be adapted for these greenhouses, allowing sufficient food to be produced using the minimum area of land surface. In case of emergency evacuation, fuel would be needed for spacecraft to escape Mars and get back to planet Earth. Fuel is also essential for both heating and cooking, but from where would we derive it? Luckily, scientists have found a solution for this as well. Methane can be produced from carbon dioxide and hydrogen in a reaction using a nickel or ruthenium catalyst. This would require carbon dioxide from the atmosphere and hydrogen from the electrolysis of water, but it does mean the methane could be liquefied and stored for any number of uses.
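The reaction described here is commonly known as the Sabatier reaction. Assuming the hydrogen really is supplied by electrolysis of (filtered, melted) Martian water, the overall chemistry sketches out as follows:

```latex
% Electrolysis of water, powered by solar electricity:
2\,\mathrm{H_2O} \;\rightarrow\; 2\,\mathrm{H_2} + \mathrm{O_2}

% Sabatier reaction over a nickel or ruthenium catalyst:
\mathrm{CO_2} + 4\,\mathrm{H_2} \;\rightarrow\; \mathrm{CH_4} + 2\,\mathrm{H_2O}
```

Conveniently, the water produced by the second reaction can be fed back into the first, and the leftover oxygen is useful in its own right.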

Elon Musk has had his sights on Mars with the SpaceX Starship project for years. He believes that terraforming Mars in the long run is a possibility. In the past, he has proposed using nuclear explosives at Mars’ poles to release greenhouse gases and warm the planet - although he concedes that massive solar mirrors could also be effective. Currently, surface temperatures range from around -175.6°C to 30°C, so raising these values would be a big help in colonising Mars for human civilisation. However, changing another planet’s atmosphere so drastically is regarded as highly unethical and would be controversial down on Earth. It raises the question: to what extent do humans have a right to interfere with any other planet? Are we really justified in altering a planet’s atmosphere just to meet our human needs? Other methods explored for warming Mars include importing methane or ammonia, again to kickstart greenhouse gas effects. The downside to plans such as these is that it would take centuries for Mars to become habitable for humans, so we are still a while off this dream. And again, it may be seen as unethical to change Mars’ atmosphere just for human colonisation.

Is the idea of travelling to Mars complete fantasy? It seems not. The science is there to prove it’s possible. A recent publication in Scientific Reports found that it would take around 110 people to colonise the planet and make humans a multi-planet species. Coincidentally, Elon Musk’s Starship is planned to have a capacity of 100 passengers destined for the dusty red planet. So The Martian may be closer to becoming reality than it has ever been before.

Jessie Hammond is a fourth year Physics undergraduate student and the Head Sub-editor for EUSci magazine




Our ‘right to privacy’ was officially recognised in the Universal Declaration of Human Rights, adopted by the UN in 1948. It holds that nobody should be subjected to arbitrary interference with their privacy; essentially, you have the right to keep personal information to yourself. Some things other people just shouldn’t know about. Privacy used to be about keeping your family’s skeletons in cupboards, sharing hushed secrets between friends, or covering your nether regions from a passing stranger. But now, in an online world, a new and expansive source of information - one that paints itself as inconsequential - is harvested from our internet activity. Interpretation of this data can paint a picture of who you are, so just because no one actually sees you, does it really mean that nobody’s watching?

“Just because no one actually sees you, does it really mean that nobody’s watching?”

Data lacks the intimacy of more traditional forms of privacy, and it disappears without ever explicitly confronting the individual who gives it away. It was only in 1980 that the Organisation for Economic Cooperation and Development recognised that computers gathering transactions from individuals should preserve personal privacy through anonymity. The breakneck speed of technological development since, and our cultural ability to soak up this technological progress like a sponge, has changed the privacy landscape forever, and the future of data privacy will shift in this new age of information. Data is given away unconsciously, through card payments and cookies (web-based ones, not the ones your card just bought), in trivial information for a quiz to find out which Harry Potter house you would be in, or by downloading a comical face-swap app. It doesn’t feel like a secret, so why would you need privacy over this personal data? Because it’s valuable. So valuable that Facebook would pay $19 billion for WhatsApp, a free messaging service.

The data collected from message activity is personal information that paints a picture of who you are. Our consumption habits, cultural trends and movement data can be anonymised by name, but the trail of information leaves a distinct fingerprint. All of those questions we ask Google provide a wealth of insight into the nature of human thoughts, however insightful or depraved they may be. As our use of the internet expands, the value of our personal data grows. A former Amazon executive interviewed this year stated that “they happen to sell products, but they’re a data company.” It’s a twisted fate that the world’s most valuable company is really a data collection agency with the façade of an infinite product courier: we the customers value the products, whilst our data comes for free.

Artificial (and a dash of human) intelligence can extrapolate the detail from these personal data points to make eerily accurate predictions about an individual’s behaviour from a surprisingly small data set of previous decision-making. Integrated with machine learning capabilities, these data-interpreted behaviours will continue to improve in accuracy. If an algorithm can determine what movie you should watch next and what you want to eat, it’s not an inconceivably large jump to data-based directives for finding dating partners and getting careers advice, or to medical decisions informed by biologically integrated systems.


The manifestation of data harvesting feels benign through its predominant application for marketing purposes. It’s great when a new book is perfectly selected, or when targeted adverts present just the thing you ‘need’. Personalised products and ever-increasing convenience; what’s not to like? The catch is that the distribution of information influences our decision-making. We can only act on what we know, and our innate psychology sometimes makes us more likely to behave in a certain manner without our even being aware that we possess the knowledge we are acting upon. The threat of diminished data privacy raises the question: if we neglect to value it, will our personal control over intimate decisions be subtly and gradually eroded? We are unique and often individually unpredictable, but population-wide data suggests our decisions perhaps aren’t as independent as we would like to consider them. Like darting eyes that blow your cover at the poker table, predictions about our next action can be pre-empted through signals that hold true for many people. It’s important that corporate data surveillance isn’t considered the only threat to data privacy. Other human rights, such as free speech, can be undermined if government agencies that do not anonymise information are able to collect and exploit your data.

The combination of poor legislation and a blasé public attitude towards data privacy could put democracy at risk. Mismanaged state data collection screams of a descent into an eerily Orwellian scenario.

“Democracy could be put at risk by poor legislation and a blasé public attitude towards data privacy”

The corporate intention of gathering data about you to influence your consumption behaviour at least lets you feel like you’re making independent decisions, rather than feeling like you’re in a Big Brother scenario. But providing tools for us to make independent decisions has drawbacks, due to the limited value that an individual is likely to place in their own data. It’s only your data, but when collected and analysed en masse, the patterns begin to emerge.

Take the General Data Protection Regulation (GDPR), the privacy legislation most recently implemented by the European Union. GDPR made it legally necessary that the user at least has the choice to decline their cookies being collected and sold to third parties. Cookies are the data a website keeps on how you behave while there: the email you use to access it, the time you spend on it, the website that took you there, and where you go next. This personalises the internet for you, and allows for that handy feature of your shopping cart remembering items while you dither for days over whether to buy. The idea wasn’t half-baked, but people either don’t know about the value of data privacy, or apparently aren’t as concerned about it as the legislative cyber gurus in the EU thought we might be. Such a tasty and innocent word carries no harm, so the convenient option of accepting those cookies and continuing to surf unabated is difficult to turn down. It’s nice to have the choice though, right?

Despite the concerns, it’s difficult to foresee data privacy becoming a paramount concern for most people in the immediate future. Generation Y have grown up embedded in the World Wide Web. Our conspicuous presence on social media, and its omnipresence in our lives, shatters barriers of personal privacy that seemed ‘normal’ 25 years ago. There’s often a comment reflecting that all our personal data has already been gifted, so why bother trying to protect ourselves now? Having fallen down the White Rabbit’s hole to surf the web, we have forgotten what privacy felt like at the top. Unfortunately, this seems to mean that until our lives are materially affected, through behavioural influence or surveillance beyond a societal threshold of disturbance, the status quo will persist. Our gifting of personal tidbits and acceptance of internet surveillance sets a bleak precedent for the future of personal data privacy. We will cherish the privacy we had, but only once it’s gone. It might not feel like the moment Adam and Eve awoke to the embarrassment of their ‘private parts’, but data privacy in an age of information is a far more seminal moment in the cultural evolution of humankind.

Hamish Salvesen is a 3rd year PhD student in Developmental Biology



It is a little clichéd to spend one’s time lamenting the passing ‘glory days’ of any subject. Certainly, the historical image of our black-and-white scientific messiahs, sitting in dingy wood-panelled libraries contemplating the nature of the universe, is completely outdated. Nevertheless, I wish to make the point that we in biology do find ourselves at a stalemate: a war waged between our rose-tinted scientific hearts and the corporate demands of the modern state of the field. As the number of people vying for money from the holy grant-panel gods increases, longer-term projects with fewer product-based aims seem to lose out greatly. It is becoming an increasingly pressing necessity that we overhaul the way we do biology, with a shift in focus towards theoretical, non-reductionist biology.

“It is becoming an increasingly pressing necessity that we overhaul the way we do biology, with a shift in focus towards theoretical, non-reductionist biology”

In a lecture delivered in 1951, Erwin Schrödinger stated that science should not focus on producing technological, medicinal, or industrial advancements, as it was dubious that these actually improved the human condition. This is a somewhat outdated viewpoint because, as with any industry, supply-and-demand logic holds in the scientific world; it is the job of academia to serve the needs of the population, whatever they may be. Despite this concession, I would like to bemoan that the way we currently think about biology is outdated and based on irrelevant philosophical underpinnings. In his 2004 paper “A New Biology for A New Century”, the late, eminent microbiologist and biophysicist Carl Woese lamented that our level of biological understanding is currently on par with that of physics at the start of the 20th century. This is shamefully primitive: even if we are to prioritise the production of technology and medicine over advancement of the field for the sake of knowledge, this level is not enough, as advancement in fundamental theory has downstream material effects (see quantum computers, or the inheritance-based models used in disease modelling).

To understand the form a revolution will need to take, we first must understand the fundamental nature of the science. To do this, I will call on two philosophical notions that have been seen to underpin 20th century biology: reductionism and essentialism. Reductionism is the belief that a system can be completely explained by understanding the constituent parts of that system, and those parts by their constituent parts, and so on. A classic example would be the notion that the workings of the cell can be understood completely by understanding the ‘central dogma’ of molecular biology (the relationship between DNA, RNA and proteins). There is value in this approach; it resulted in the delineation of the genetic code, after all. However, as Woese writes in his paper, the genetic code “seemed to be merely an arbitrary correspondence table between the amino acids and corresponding trinucleotides”. While this description may seem condescending, it does highlight that reductionism can be valid methodologically, but relying on it completely forces one to make metaphysical concessions that are less in keeping with reality. Ogres may be like onions, but the fundamental nature of reality is not.

We often think of science in layers of explanation, with mathematics the most fundamental. This description doesn’t account for the notion of laws of nature, which transcend and link theories pertaining to each individual science. The strictly layered depiction of science is a metaphysical implication of reductionism, and it is dangerous because it legitimises the mockery of biology as merely an application of chemistry. This is, of course, not the case; the theory of evolution set biology apart from chemistry, as it accounted for the populistic and non-deterministic nature of living organisms and systems. It was in 1969 that theoretical physicist David Bohm wrote, in “Towards a Theoretical Biology”: “It does seem odd… that just when physics is… moving away from mechanism, biology and psychology are moving closer to it. If the trend continues… scientists will be regarding living and intelligent beings as mechanical, while they suppose that inanimate matter is too complex and subtle to fit into the limited categories of mechanism”. In contrast, holism (the opposite of reductionism) states that the parts of a whole are interconnected, and that we must understand these connections in order to fully understand the whole. Here I must admit that the nifty wording of “biology’s crisis of identity” is not my own. It came from chemist Addy Pross’s book “What is Life? How Chemistry Becomes Biology”, a remodelling of Schrödinger’s 1944 classic. In it, he describes his theory of “Dynamic Kinetic Stability”, an adaptation of the second law of thermodynamics under which the success of self-replicating entities is determined by how well they balance their reproduction and death. Through this theory, he is able to propose a tangible link between the fundamental workings of chemistry and those of biology; a holistic theory that has implications for both fields.


Reductionism is intimately connected with essentialism, the other philosophical underpinning of biology that needs to be dealt with. This idea originates in the dialogues of Plato, and it describes various eide, or essences. These are non-physical, objective forms that pertain to various objects, or aspects of reality. For example, the form of a circle exists. We may try to draw a circle, but our attempt will fall short due to our inability; still, the ubiquitous notion of the circle persists. Roger Penrose, this year’s Nobel Laureate for physics, discusses Plato’s mathematical realm of forms in depth in his seminal book “The Emperor’s New Mind”. He paints the picture of scientists not as artists, creating theories out of the evidence or stimulus presented to them, but as explorers, trying to discover the universe’s mysteries from within the realm of ignorance (I recommend the reader look up his exploration of the Mandelbrot Set as an example of this). However, the opponents of essentialism are a vocal bunch. The iconic neo-Darwinian biologist Ernst Mayr highlights the fundamental incompatibility of essentialism with science. In his argument, put forward in his 1982 book “The Growth of Biological Thought”, he states that essentialism, as a notion, is in direct contrast with evolution. Essentialism asserts that there is a pure and objective form of each object and, by extension, each species.

Evolution, by contrast, holds not only that there is necessary variation within organisms of the same species, but also that notions of what constitutes each species will change over time as the species evolves. This is a non-negotiable tenet of evolution, and one that has been rigorously demonstrated. Mayr suggests that population-thinking, rather than essentialism, should be used to describe ideas of biological entities. The critical notion that sets biology apart from the physical sciences is that natural variation is a property that has to be not only accounted for, but held at the centre of any theory that will take hold. Where does all this discussion of philosophy lead us, and how does it link to the way in which biology is being carried out in the 21st century? When we examine the two old theories of science, essentialism and reductionism, we can start to see the grey area that biology needs to expand into. Holism describes the level of transcendent interdisciplinarity that will be required in order for general theories, like that of Addy Pross, to develop. This nature is evident in the theories of evolution, relativity, and thermodynamics (examples of holistic theories relevant today). Furthermore, population-thinking delineates the biological context that will need to be maintained in order to preserve the inherent variation, and the systems-level view of biology, that is so fundamental.

These two ideas make up the new philosophical underpinnings that we will have to adopt if we are to bring about a revolution in biology. In a paper published at the turn of the millennium, the influential biologist Sydney Brenner forecast a shift in biology away from reductionism, back towards our 19th century theoretical roots, albeit with the leaps and bounds made in technology (particularly in data) making the approach even more effective. Despite the optimistic projections of Brenner and Woese, biologists have been failed: rather than being put in a position to do meaningful, fundamentally conceptual biological research, they have remained subservient to the product-focused reductionists. If we are to truly advance our science, this will have to change, and scientists will have to be allowed the time and funding to explore interdisciplinary, abstract ideas, rather than just spending time in the lab, blindly publishing work. If we can do this, we will be able to make biology relevant in a post-neo-Darwinian era.

Harin Wijayathunga is a 3rd Year MBChB student intercalating in Neuroscience



future

Feeding the fire: a fresh start for forestry

Fire and forests have a complex relationship, which human interference has turned sour, but Heather Jones explores how we might be able to make things right again

With the undeniable whirlwind of 2020 still very much ongoing, it can be easy to forget that we began the year with the news that Australia was, quite literally, on fire. Then, we heard that the Amazon was burning, again. Then it was Borneo. Then Argentina, and then California, and then Siberia and the Arctic. Fiercer and larger than ever before, forest fires have been raging across the world, leaving devastation in their wake. The question is, why? And what can we do? Fossil charcoal records date the emergence of wildfires to just after that of terrestrial plants, around 420 million years ago. Fires have been a regular, natural part of forest ecology since then - a force that, while destructive, is also essential for regeneration and is even, to some species (like the giant sequoia, Sequoiadendron giganteum), an integral part of the plant’s life cycle. Fires can clear forests of debris, thinning canopies and undergrowth, making a path for sunlight to hit the forest floor and enabling the next generation to grow. And yet, today, they have become synonymous with disaster. To understand why recent forest fires have been so devastating, for both people and forests, we need to turn to our own history.

“Sustainable management alone is not enough to turn the tide on wildfire surges. The other half of the problem is one that threatens all of the world in equal measure: climate change”

Indigenous people have utilised small, controlled fires for thousands of years. These are low-intensity, highly regulated, and imitate natural, small-scale wildfires. They were traditionally prescribed to maintain the health and balance of local ecosystems, and used to generate materials, clear areas for grazing, spur food production, and create natural barriers to wildfire. By burning specific plants and landscapes in certain seasons, they could minimise the dangers of uncontrolled fire, and the landscape could reap the benefits - nutrients would return to the soils, and the forests would regrow. And yet these prescribed burn practices were discarded by settlers in the following centuries, and later even banned.

Illustration by Eve Miller


The bans became particularly widespread in the 1900s, after the western United States saw a megafire raze to the ground an area the size of Connecticut in 1910. The US Forest Service began a policy of complete fire suppression, which disrupted the natural and integral role that fire plays in shaping the forest. After over a century of these policies, forests are looking very different, and we are only now understanding the true consequences.

Fires need only three things to survive: fuel, heat, and oxygen. A forest can provide these things in dangerous abundance; without fire to speed up the removal of debris, dry biomass blankets the forest floor, and ample oxygen is readily available in the air. All it needs is a spark, and most forest fires - discounting those caused by direct human activity - are started by lightning. According to the National Severe Storms Laboratory, an estimated 100 lightning strikes hit the planet every second, totalling around 8 million per day. Of course, a fire also requires suitable conditions: prolonged periods of dry, hot weather to dry out its fuel. The reason that forest fires have been so often in the news, and raging so widely, is that humans have exacerbated these conditions to an unprecedented degree. Dry seasons are longer and hotter than ever before, and forests are, simply put, full. Due to wildfire suppression, dead trees and undergrowth accumulate, making a dense bed of fuel on the forest floor. For the majority of the year, this is not a problem, but come the dry season? This is the perfect fuel for a wildfire. This thick bed of dead trees and grasses becomes a conveyor, spreading the flames in every direction, without the defences that natural patchwork forests afforded themselves. Since the mid-1980s, there has been a marked increase in large-wildfire frequency, duration, and intensity, as a response to changes in climate. What does this mean for the world? Primarily, fires cause damage to most things in their path. When they occur near human settlements, they endanger the lives of residents and firefighters alike. When they occur in niche habitats, they can have enormous impacts on biodiversity, with many species losing their homes, if not their lives.


Even where there are no casualties, fire is incredibly dangerous in ways that one might not expect - for example, the burning of older buildings can release hazardous compounds like asbestos. Burning trees release smoke into the atmosphere, which has both immediate and long-term effects on air quality, and particles can even become lodged within people’s lungs, causing severe health problems. What’s worse, there exists a vicious cycle. Forests are natural stores of carbon, and are estimated to absorb around 25% of yearly fossil-fuel emissions. With each year that the wildfires rage longer and fiercer than before, more and more of this carbon quite literally goes up in smoke: the 2019-2020 Australian bushfires alone emitted 343 million tonnes of carbon dioxide into the atmosphere. The more carbon released into the atmosphere, the worse the effects of global warming become, and so the more likely forest fires are to occur; the cycle continues. Is there a way to break it? There is no simple solution. Better management going forward involves many complex steps and the integration of several practices, and is, ultimately, unique to each forest and people. But put simply, it is about balance.

“Fire is a force that, while destructive, is also essential for regeneration and even, to some species, an integral part of a plant’s life cycle”

One step towards a sustainable future with our forests is to use natural forest management practices. These practices include reduced-impact logging (minimising the damage to surrounding trees), extending harvest cycles (allowing trees to grow larger before felling, which increases their carbon stock), thinning of competing vegetation, and avoiding logging completely in sensitive areas. It means an end to unsustainable slash-and-burn agriculture, in which large areas of forest are cut down and burned to create a field for growing crops that can only be used for a few years. It means better prepared landscapes, with the reality of fires, and the need for firefighters to access them, at the heart of their planning. It means local authorities educating and preparing citizens. It means expecting fires, but working to minimise their harm. It means preparing the land and investing in management before the fires begin.

However, as in traditional indigenous practices, fire can be used for the good of a forest. Around the world, people are rediscovering this wisdom, even after, in some cases, banning it only a century ago. Prescribed burning in the wet season, using small, relatively low-intensity fires, ensures that forests do not become clogged with undergrowth and overrun by dead trees. While it may seem counter-intuitive, controlled burning means avoiding large, out-of-control fires later in the year, and it contributes to the overall health of the forest. Many areas around the world, including both California and Australia, are once again beginning to use prescribed fires at larger and larger scales in order to minimise wildfire risks. These are intentionally set by ecologists and experienced fire managers under extremely controlled conditions. Such burns thin the vegetation, and work both to lessen the risk of wildfires, by reducing their potential fuel, and to improve forest health, by opening up soil and increasing the nutrients available to the plants already there. The key to sustainable forest management is planning and balance. Each landscape is unique and must be planned for accordingly, with fires prepared for and used in a way that maximises the health of the forest and minimises the damage. Forest management practices that focus on the health of the forest, rather than the maximum profit we can make from it, are readily available, and are already employed at varying scales around the world. The deeper problem is that sustainable management alone is not enough to turn the tide on wildfire surges. The other half of the problem is one that threatens all of the world in equal measure: climate change. Without a significant reduction in emissions, and stabilisation of global temperatures, things are not going to get better. Fires will continue to blaze, destroying the habitats that we so desperately need to protect, and releasing yet more carbon into our atmosphere. Animals will be displaced, species lost, communities threatened, and landscapes left devastated. Utilised correctly, fires can be part of a healthy, balanced ecosystem, integral to promoting new growth and regeneration. But if we do nothing to counteract what we have done to our atmosphere, and let climate change continue unabated, they will not be disappearing any time soon.

Heather Jones is a second-year biological sciences student




regulars: infographic

Infographic by Katharin Balbirnie-Cumming



regulars: technology

Top 10 tricks to wipe clean your digital footprint

In an era of almost exclusively online interpersonal communication, many have come to view data privacy with a growing sense of futility - Cerys Walsh shares 10 ways you can take back control of your information online and start anew

1 – Practice browser hygiene
Most standard internet browsers store a cache of websites we have visited in the past, and store cookies to monitor our online activity and tailor web content accordingly. Clearing this information regularly is very easy. The benefits are a reduced susceptibility to targeted advertisements and improved efficiency of your web browser. AVG (the antivirus software company) recommends cache clearance every two weeks.
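If you are curious what you are clearing, a few lines of Python will show the cookies a site asks your browser to store on first contact. This is just an illustrative sketch: it assumes the third-party requests package is installed, and the site queried is an arbitrary example.

```python
# Peek at the cookies a site sets before you have even logged in.
# Requires the third-party 'requests' package: pip install requests
import requests

response = requests.get("https://www.bbc.co.uk")
for cookie in response.cookies:
    # each cookie has a name, a value, and an expiry timestamp
    print(cookie.name, "expires:", cookie.expires)
```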

2 – Delete unused accounts
If you’ve been an active internet user for any prolonged period of time, you’ve probably signed up to many online services that you don’t currently use. They are likely still hoarding your data. If you have trouble remembering all of these accounts, AccountKiller.com has a handy alphabetised list of direct deletion links to jog your memory.

3 – Know your rights
Internet users have two core legal rights regarding data disclosure enshrined in law thanks to GDPR: the ‘right to access’ and the ‘right to be forgotten’. You can exercise these rights by liberally issuing data requests on your main online accounts to see what information has been collected about you.

4 – Survey online databases
So-called ‘people search’ or ‘people finder’ websites make their business by aggregating personal information, then selling it on to the highest bidder - be that advertisers, potential employers, or just curious individuals. Checking these websites will give you an idea of what personal information about you is publicly available. Deletion of this information upon request is legally mandated under GDPR.

5 – Customise your privacy settings and permissions
If a default feature isn’t designed to enhance the user experience, the chances are that it serves an alternate purpose of data collection that only assists advertisers or training algorithms. Neither of these things benefits you, so - for example - maybe reconsider giving Twitter microphone permissions and full access to your geo-location.

6 – Spoof your personal details
You are almost always required to provide some personal information in order to use online services, such as your full name, date of birth, and address. There is usually, however, nothing to stop you from lying about most or all of these things. Companies and advertisers communicate with one another, and consolidate their information to construct demographic and interest profiles of their users. By offering inconsistent information, you make your various online profiles difficult to link, and the information they provide becomes mostly useless.

7 – Get a new search engine
Most default search engines monitor and store your movements across the internet. For instance, Google has activity trackers present on over 70% of the top one million websites, and can provide all of this logged information to companies, internet service providers, and governments. Consider switching to an anonymised search engine - two good options are DuckDuckGo and Startpage.

Illustration by Alyssa Brandt

8 – Use a VPN
To prevent similar tracking of your activity via your IP address, you should also think about using a VPN service. This comes with the added benefit of letting you redirect your web traffic anywhere in the world, giving you access to region-locked content.

9 – Use an ad blocker
Targeting algorithms are becoming increasingly sophisticated. Everything you do on the internet, be it hovering your mouse over a certain item, slowly scrolling through a page, or pausing as you browse an online shop, is valuable data for advertisers. The only real way to avoid the perverse data-collecting incentives of advertisers is to avoid ads altogether.

10 – Embrace encryption
Encryption keeps your online communications private. While most services like Gmail offer email encryption, Google and other service providers still have access to the contents of your emails, and scan them both for security and to deliver more relevant ads and search results. True ‘end-to-end’ encryption in email means that the service provider does not have access to any of the information you send, which you can get from services like Protonmail or Hushmail. The same goes for instant messengers; Viber and WhatsApp are both end-to-end encrypted services, and so is Signal (which is also endorsed by Edward Snowden). Lastly, HTTPS is the encrypted version of HTTP, but websites often default to unencrypted HTTP, or fill encrypted pages with links that go back to the HTTP site. You can circumvent this with a browser extension like ‘HTTPS Everywhere’, which makes sure that the HTTPS version is loaded automatically on sites that are known to support it, even if you type URLs or follow links that omit the ‘https:’ prefix.

Cerys Walsh is an undergraduate student from the School of Chemistry


regulars: inclusivity

We need diverse representation in STEM, and social media can help

Under-representation is chronic in science, but we can tackle this through visibility campaigns and positive discrimination, argues Zandile Nare

There is an overwhelming and undeniable lack of diversity in academic institutions. In the US, the murders of Ahmaud Arbery, Breonna Taylor, and George Floyd, all of whom were unarmed and Black, acted as catalysts to the global #BlackLivesMatter (#BLM) movement. Perhaps the straw that broke the camel’s back was the killing of George Floyd, undoubtedly because of the cruelty of his murder, and because the world witnessed on social media a Black man die at the hands of a white police officer as he screamed for his mother - all as two other police officers stood by and did nothing.

Illustration by Karolina Zieba


More than ever, 2020 has laid bare the hideous reality of racism in all of its forms. Black people in the UK are also subjected to violence from the state. Many British people hold the false belief that police brutality against Black people is only an issue in the US. This belief ignores the fact that while Black people make up only 3% of the UK population, we account for 8% of all recorded deaths in police custody. Too often, when Black people are killed by British police, their cases receive little to no media attention, allowing this false narrative to persist. The British police are not innocent.

In 2015, Sheku Bayoh, an unarmed Sierra Leonean man, died from asphyxiation after he was held face down by at least four Police Scotland officers in Kirkcaldy, Fife. Trevor Smith, a father of two, was shot dead by an armed police officer in his own bedroom in March 2019. In May 2019, Simeon Francis died in a police cell in Torquay following his arrest, during which he had been held down by multiple officers from Devon and Cornwall Police. In a video released after his death, Mr Francis is heard saying “I can’t breathe”.

Following the death of George Floyd, the #BLM protests sparked conversations about race, the history of racism in the UK, and the role of academic institutions. Many universities were forced to reflect on their individual roles in historically racist practices and to renew their commitment to improving diversity and equality in academia. Although the subsequent flurry of statistics revealing the stark lack of diversity in academia was shocking, it was also unsurprising to me in a deeply disturbing way. Having grown up in Scotland, I was never once taught by a Black teacher in primary school or high school, and I did not attend a single lecture, workshop, or tutorial given by a Black person during my undergraduate or master’s degree. I am almost ashamed to admit it, but I had not even considered these facts until the conversations surrounding equality and diversity in academia erupted on social media. In her Nature article, Diversity in science: next steps for research group leaders, Nikki Forrester shows that Asian, Black and mixed-race people make up 9.4%, 1.9% and 2% of total academic staff in the UK, respectively. However, only 2.8% of managers, directors, and senior officials in UK universities are Asian, and only 0.9% are mixed-race. Astonishingly, there are no Black people in these roles at all. How can anyone aspire to a particular role if they have never seen anyone who looks like them (be it in race, ethnicity, able-bodiedness, gender, sexual orientation, etc.) in those roles? Representation matters.


One of the biggest positives to come out of the #BLM movement has been the increased social media visibility of minorities in the fields of science, technology, engineering, and mathematics (STEM). Perhaps the biggest example of this has been the trend of virtual events that highlight Black scientists in a particular discipline (for example #BlackInParasitology, #BlackInMicrobiology, and #BlackInNeuro). These events, often hosted on Twitter and Instagram, feature Zoom presentations and panel discussions surrounding the discipline of the week. As well as highlighting innovative research by scientists of colour, these events also act as an excellent vehicle for addressing imbalances in STEM fields and improving representation. Social media has proven to be a powerful tool that can be used to reach everyone from teenagers to research scientists, policy makers, and heads of state all over the world. In the same way that the “#BlackIn...” movement has broadened the general understanding of who can be a scientist and what scientists look like, the same approach can be applied to highlighting the representation and visibility of scientists from other minority groups, including Asian, Muslim, and LGBTQ+ scientists, and scientists living with visible or invisible disabilities. STEM careers must be accessible to everyone.

“There will always be people who argue that positive discrimination is a “tick-box” action that is only performed as an act of political correctness and does not result in real progress. I disagree”

Diversity can be defined simply as differences between people. However, when it comes to discussions about race and diversity, precise language is important. According to Dr Kenneth Gibbs Jr. (@KennyGibbsPhD), “Diversity in science refers to cultivating talent, and promoting the full inclusion of excellence across the social spectrum. This includes people from backgrounds that are traditionally underrepresented and those from backgrounds that are traditionally well represented.” Diversity is vital in STEM because it fosters better problem-solving and innovation, expands the talent pool, increases competitiveness, and is critical to excellence.


Society does not always show the same tolerance for different people. Looking to the future and to new beginnings, society needs to change its views and invest more in diversity and inclusion. Mechanisms that exclude certain groups of people based on their gender, race, ethnicity, sexual orientation, religion, or disability lead to the exclusion of the valuable ideas and expertise possessed by these people. Ultimately, inclusivity increases success for everyone. In her article Inclusivity for all: how to make your research group accessible, Alaina Levine (@AlainaGLevine) shows that an inclusive and accessible research culture leads to a productive and successful workforce. For example, Dr Mona Minkara (@mona_minkara), a bioengineer at Northeastern University in Boston, Massachusetts, who happens to be blind, devised her own way of working with protein molecular dynamics data, allowing her to mathematically analyse protein movements and make observations that were missed by sighted scientists. Levine states, “To produce the best science, to tackle the toughest problems, and to create and innovate interventions that advance our understanding of the universe, everyone’s voice, view, and brain needs to be at the table.” So, what’s necessary to change the status quo? In 2008, the European Parliament’s Committee on Women’s Rights and Gender Equality published a report for the European Commission that highlighted the under-representation of women in STEM research environments and the “leaking pipeline” as female scientists advance further in their careers.

In an interview with euractiv.com, the report’s author, Britta Thomsen (@BrittaThomsen), stated that “we should not shy away” from positive discrimination. She went on to state that “it shouldn’t be an end in itself, but it is necessary to take measures that counteract the current systems and traditions, because these obviously in some way positively discriminate men”. There will always be people who argue that positive discrimination is a “tick-box” action that is only performed as an act of political correctness and does not result in real progress. I disagree. Increased representation and visibility of minority groups will always increase diversity, leading to inclusive excellence and successes for all. The issue of representation is broad, extending from race to religion, gender, ethnicity, sexual orientation, and disability. Therefore, it is important to recognise that what is true for Black people in STEM is also true for other minorities. How, then, are we celebrating Asian, Muslim, and LGBTQ+ people and people living with disabilities, and how do we encourage people in these groups to pursue careers in STEM? Although social media will not be the silver bullet that solves this problem, it seems clear to me that it is an important tool with which to start addressing it. I would also argue that positive discrimination should be used to counteract the status quo, leading to a new beginning in STEM with more diversity, inclusivity, and success.

Zandile Nare (@znzan92) is a final year PhD student at the Institute of Immunology and Infection Research, where she studies target-based drug discovery for kinetoplastid diseases


It too often seems that unnecessarily elaborate language is celebrated with “ooh”s and “ahh”s, praised for its sophisticated vocabulary and long, meandering sentences. I agree that this type of writing holds an important place in literature, but somehow it has crossed over into the realm of science. Scientific journals, the main source of new ideas and research, are strewn with winding sentences that seem to tail off without making a point, and the obligation to write in the passive voice causes convolutions that leave me completely lost in a maze of words. I wish I could walk up to the author, sit them down and ask, “please, just tell me what this means!” When this happens, I seriously question the need to write in formal language - is it truly more important to adhere to societal standards of the ‘correct’ style of writing than to get your point across? In case it wasn’t entirely clear yet, I would argue no. I understand how these requirements hold all writing to a high standard, but if they keep an article from making a clear point, they seem impractical.

Aside from the elaborate language, scientific writing will always contain complex jargon. Every field has its own precise language, and there is almost no way to understand it without having extensively studied the field. Of course, jargon will always be an integral part of scientific writing, needed to fully explain new ideas and arguments. The issue arises when there is no link from the ‘high circle’ of academia to the rest of us. After a certain point, there are no dictionary definitions or YouTube science teachers to break down the topics. Not everyone has a friendly neighbourhood scientist to help out. The only way to fully understand the jargon of an article is to read the papers that came before or are linked to it, which in turn contain more words that have no definitions outside other complicated articles; it’s a vicious cycle.

If you did muster the courage to dig through piles of papers to try to understand all the foreign words, you might come across an awful realisation. Most articles you try to read will greet you with the option to “read full article for this much money!”. There are pay-per-article or pay-monthly unlimited options for any prominent science journal, both of which are exorbitant. I will admit I have found an increase in the number of open-access articles, which is great, but typically you’ll still have to pay for knowledge - for example, the average article on Elsevier is $31.50! As a university student, I get some access through the university, but there are still articles I can’t read, presumably because the university hasn’t paid for a subscription. Overall, trying to read through all these papers to understand the jargon takes up a massive amount of time and - more often than not - money.

Luckily, there are some brilliant people working on bridging the science communication gap. Probably one of the most famous examples is the Nature Communications journal - articles from all areas of science are written in straightforward language and the entire journal is open access. I’ve also heard that the Science podcast is a great source of information - they invite lead authors to talk about their latest published research. There are many other incredible free resources, in the form of websites, documentaries, and articles, but these tend mainly to comment on matters of public interest. It would be difficult to find very specific information from these sources, and since they are secondary sources, they might not contain the full details of what the original authors were arguing.

Believe it or not, a solution already exists. I recently came across something called a ‘plain-language summary’. This is a synopsis of the paper, written by the author, but in very simple language. As perfect as this could be, I’ve never seen one in practice; very few journals require them. And that is unfortunate, because if they were commonplace, I could probably stop ranting. I have no delusions about changing the face of science communication and journal writing with a couple of silly paragraphs, but maybe one day I’ll start a journal of my own dedicated to translating scientific writing for the masses. Text me if you want in!

Isha Prabhu is a perpetually confused fourth-year chemist, annoyed that she can’t always learn cool new things in easy bite-sized YouTube videos


Angela Saini’s meticulous investigation into intellectual racism - the “toxic little seed at the heart of academia” - reveals the astonishing persistence with which the concept of biological race has been maintained, despite centuries of research barely delivering a scrap of evidence in its favour. In today’s turbulent political climate, with the Black Lives Matter movement bringing race firmly back into focus, it is more important than ever to understand what racism truly is, why it exists, and who benefits from its insidious lies.

If one were to select a spiritual home of modern-day racism, perhaps it would be the British Museum in London. It is here that Saini goes to set the scene. Here, the bones and possessions of people from all over the globe are reduced, alongside the crown jewels of ancient civilisations, to prized curiosities behind glass. The easy way to make sense of this bizarre reality is to accept the colossal injustice and crimes of colonialism. However, it is those in power who write history, and so a second narrative exists - one that provides a ‘natural’ explanation in order to let the Western European murderers, rapists, thieves, and torturers off the hook. They were simply superior.

Saini traces the idea of race to these colonial beginnings, exploring the disturbing ways in which people who looked and lived differently to white Europeans were treated variously as vermin, circus amusements, lab rats, and savages. Importantly, she notes that even those regarded as the greatest thinkers of the time were not immune to such prejudice: David Hume, the Scottish philosopher, “saw no contradiction between the values of liberty and fraternity and the belief that non-whites were innately inferior to whites”. From these entirely ignorant foundations, the notion of race found its way into mainstream science, always morphing and adapting to the political requirements of the day, yet always clinging to the idea that a biological explanation would eventually emerge.

With academic rigour, but also with a personal and engaging tone, Superior connects the dots. Saini interviews an impressive range of people, from an indigenous activist in Perth, Australia, whose mother and grandmother were forcibly taken from their parents and put to work by the colonial government, through to the editor-in-chief of Mankind Quarterly, a journal dedicated to white supremacy. It is hard not to feel reassured that Saini has made an exceptional effort to include a variety of opinions. Equally impressive is her ability to weave them together into a captivating narrative in which she reveals how racist theories lingered and disguised themselves after the events of the Second World War had forced them into the shadows, and indeed how they have evolved up to the present day.

The core argument of Superior is that despite countless studies, the scouring of genetic sequences, and millions spent on researching race, there is still no good evidence that it is anything more than a social construct designed to maintain old power balances and satisfy a cultural desire for identity. It is only through magnificent extrapolation, contortion of data, and ignorance of social factors that any difference can be demonstrated between races. While genetic variations between humans of course exist, they are much greater within racial groups than between them.

Worryingly, it appears that race science is becoming more mainstream again. Saini cites examples of how modern medical research blindly uses race, often running into statistical blunders despite scientists’ best intentions. Furthermore, in 2005 research was published that implied a link between genetics, brain size, and intelligence. However, it wasn’t published in Mankind Quarterly, but in Science, one of the world’s most prestigious journals. Despite this inflammatory research having since been fully debunked, the lead author still insists to Saini that he “follows the scientific method and data, not politics”.

If you are a scientist, perhaps this quote contains the most important lesson in Superior. You are biased, whether you like it or not. If you choose to be ignorant of history and politics, if you choose to ignore who is funding your research, if you choose not to concern yourself with who might misinterpret your findings, then it could be costing society a better future. As Saini puts it, “the stories we’re raised on, the tales, myths, legends, beliefs, even the old scientific orthodoxies, are how we frame everything that we learn.”

Harry Carstairs is a PhD student in the School of Geosciences. His research focuses on developing remote sensing techniques to detect forest degradation in the tropics


regulars: innovation

Cometh the hour, cometh the mRNA Farrie Nzvere follows the ups and downs of vaccine development, which culminate in the timely arrival of a successful RNA vaccine for SARS-CoV-2

Illustration by Joyce Wang

Kairos is an Ancient Greek word meaning the “right time” or an “opportune moment”. As scientists worldwide have been plunged into a modern-day ‘space race’ in search of a life-saving vaccine against SARS-CoV-2 (the virus responsible for the Covid-19 pandemic), a new contender has risen to prominence: the RNA vaccine. Unlike conventional vaccines, which use modified live viruses or denatured (inactivated) viral particles to elicit an immune response, RNA vaccines use the cell’s machinery to produce antigens which in turn elicit an immune response. To fully understand how we got to this new dawn of RNA vaccines, a trip down memory lane is in order.

Prior to its eradication in 1980, smallpox devastated human populations worldwide, leaving death and blindness in its wake. Variolation, an early form of prophylaxis, was widely used in China to protect individuals against smallpox from as far back as the 16th century. The process involved inserting or rubbing dried, powdered smallpox scabs, or fluid from the pustules of an infected individual, up the nose or into superficial cuts made in the skin. Variolation slowly made its way across to India, the Middle East, and Africa. It was only in the early 1700s that Lady Mary Montagu, while living in Turkey, was enthralled by the local inoculation practice carried out by elderly Turkish women and sought to introduce it back home in England. This was the breakthrough moment for variolation in the Western world. Several decades later, its success led to the development of the smallpox vaccine. As a result, variolation was slowly phased out, and centuries later, smallpox was wholly eradicated. Now, in the year 2020, groundbreaking scientific research has seen the successful development of vaccines for over 27 vaccine-preventable diseases, including influenza, measles, and Ebola. Smallpox has been eradicated worldwide, wild-type polio survives only in Pakistan and Afghanistan, and the World Health Organisation (WHO) estimates that vaccines prevent 2.5 million deaths per year worldwide.

The Rise and Fall of RNA Vaccines

This success story has been mainly attributable to modified live or inactivated virus antigen vaccines, as opposed to DNA and messenger RNA (mRNA)-based vaccines. Spurred on by the prospect of developing novel vaccine technology, scientists set out to determine whether DNA and RNA fragments could be injected directly into the body to elicit an immune response. In the late 1980s, scientists demonstrated that mRNA encapsulated within a liposomal nanoparticle - a cell-membrane-like structure - could successfully introduce RNA into a variety of cells. Shortly afterwards, Wolff and colleagues published innovative research showing that injecting ‘naked’ RNA - RNA not encapsulated by any proteins, lipids, or other structures to protect it - directly into the muscles of mice resulted in expression of the encoded protein. These RNA studies laid the foundation for Martinon and colleagues’ 1993 proof-of-concept study, in which mRNA encoding influenza viral particles elicited a virus-specific cellular immune response. Further studies demonstrated that mRNA vaccines induced not only a cellular immune response but also an antibody response. They boasted the promise of naturally producing viral antigens inside cells without the need to introduce the actual virus or viral proteins. The prophylactic (preventative) and therapeutic capabilities of this type of vaccine were full of promise, and excitement could not have been higher.

63 - the number of days between the identification of the SARS-CoV-2 surface spike protein and the first patient being injected with an mRNA vaccine

However, as with most cutting-edge innovations, mRNA vaccines soon encountered major stumbling blocks. The first was the relative instability of mRNA. Once injected into the human body, the mRNAs were rapidly broken down by naturally occurring enzymes called RNases before a response could be elicited. Furthermore, the production process for single-stranded mRNA was often contaminated by the formation of double-stranded RNA (dsRNA). Together, the mRNA and dsRNA had a multiplier effect, stimulating inflammatory immune cells and triggering inflammation and an autoimmune response. With these challenges unresolved, mRNA vaccines were sidelined and, not faring much better, DNA vaccines were relegated to veterinary medicine, where they are used to this day. By the early 2000s, the mRNA hype had all but faded.

Breakthroughs

Behind the scenes, scientists had not given up on this novel type of vaccine. In 2005 the tide began to shift, as two research developments significantly altered the future of mRNA vaccines. Researchers at the University of Pennsylvania demonstrated that modifying the basic building blocks of RNA made mRNA vaccines less likely to induce an adverse immune response. A few years later, in 2008, a follow-up study introduced further modifications, enhancing the stability of mRNA vaccines and the capacity of cells to decode the mRNA and build viral proteins. Together, these improvements unlocked the potential for mRNA vaccines with increased stability, enhanced production of the encoded protein, and a reduced inflammatory response. The formerly insurmountable obstacles were overcome, and mRNA vaccines rose from the ashes.

2.5 million - the number of lives saved each year by vaccinations (WHO)

Following the breakthroughs of 2005 and 2008, a burst of basic and clinical research saw mRNA vaccines make significant strides forward. The ability of these vaccines to induce a cytotoxic T-cell response, which detects and kills cancer cells, led scientists to explore the possibility of therapeutic mRNA vaccines in cancer therapy - a notion proved feasible by proof-of-concept studies over 20 years ago. Unlike a vaccine for an infectious disease, a cancer vaccine introduced into the body codes for tumour-specific proteins that are preferentially expressed in cancerous cells.

This then stimulates the immune system to destroy only the tumour cells and not healthy ones. Successes in this field have resulted in over 50 clinical trials of vaccines against cancers including melanoma, prostate cancer, and acute myeloid leukaemia.

Conventional vaccines have failed to provide immunity against challenging viruses that cause chronic or repeated acute infections, such as HIV-1, herpes simplex virus, respiratory syncytial virus, and cytomegalovirus. Moreover, they are still incapable of responding swiftly to emerging outbreaks such as those caused by influenza, Ebola, and Zika virus. The recent Ebola and Zika virus outbreaks emphasised the need to mass-produce cheap vaccines rapidly, and about a year after the emergence of Zika, human trials began for an mRNA vaccine. Extensive basic research into RNA, lipid, and polymer biochemistry made it possible to move mRNA vaccine development into clinical trials. To date, multiple clinical trials have been completed or are ongoing for influenza, cytomegalovirus, HIV-1, rabies, and Zika virus.

These clinical trials would not have been possible without the massive financial investment made in this vaccine technology. Moderna, a biotechnology company formed in 2010 on the back of recent mRNA nucleoside modification research, raised $2 billion to commercialise its mRNA-based vaccines, and many other companies, including CureVac AG and BioNTech, have expanded their therapeutic targets for cancer and infectious diseases. Furthermore, the Coalition for Epidemic Preparedness Innovations (CEPI), a billion-dollar multinational public, private, and philanthropic partnership, was formed to support the development of vaccines, such as mRNA vaccines, for emerging outbreaks.

A New Dawn

Fast forward to the present day, when SARS-CoV-2 has brought the world to a standstill and once again we are in need of a vaccine. Of the ten Covid-19 vaccines in final-phase clinical trials by November 2020, two are RNA vaccines. Unlike conventional vaccines, RNA vaccines are not made from pathogen particles or attenuated viruses. Thus, they are non-infectious and carry no risk of causing the very infection that scientists are trying to prevent. Natural cellular processes rapidly degrade mRNA after its code has been used to make the protein, and mRNA does not enter the cell’s nucleus; hence, the risk of integration into the genome is low. RNA vaccines are also highly efficacious: studies show that because the mRNA uses the body’s natural cellular machinery to produce viral proteins, rather than having viral particles injected directly into the body, it induces a more robust and reliable immune response. Lastly, and most notably in the case of Covid-19, mRNA vaccines can be rapidly produced in a standardised, inexpensive, and highly scalable process.

Illustration by Joyce Wang


On 16 March 2020, the National Institutes of Health (NIH) announced that the first human trial participant had been injected with mRNA-1273, just 63 days after the SARS-CoV-2 surface spike protein sequence was first identified. Meanwhile, on 29 April 2020, a mere 43 days after Pfizer and BioNTech announced plans to jointly develop a Covid-19 vaccine, their BNT162 vaccines began human testing. Recent positive developments indicate that Pfizer’s BNT162b2 and Moderna’s mRNA-1273 SARS-CoV-2 mRNA vaccines may become the first-ever RNA vaccines approved for human use by the time this is published. Novel research is invariably full of surprises, and these vaccines could still stumble at the final hurdle, but one thing is for sure: RNA vaccines are in their Kairos moment, and the future looks bright.

Farrie Nzvere is a first-year Master of Public Health student


regulars: inclusivity

New beginnings by underrated women in STEM Kate Summerson explores how influential women in STEM have contributed to “new beginnings” in the industry today

Once, the most famous scientists were all men. But now that is changing. For many years, women have made significant contributions to science, yet their roles in discovery and invention have often been minimised or underappreciated. We are now in a new era of STEM: the scientific community is beginning to recognise how important it is to promote equality and diversity. Here, we look at some of the women whose achievements and advocacy have paved the way for underrepresented groups in the industry today.

Jane Goodall

Jane Goodall (b. 1934) is an English primatologist and anthropologist. She is considered the world’s foremost expert on the social interactions of wild chimpanzees, and is best known for her 60-year study of the social and family lives of chimpanzees at the Gombe Stream Game Reserve on Lake Tanganyika, Tanzania. The culmination of Goodall’s career, The Chimpanzees of Gombe: Patterns of Behaviour, is regarded as the definitive scientific work on chimpanzees. By discovering that chimpanzees make and use tools, Goodall changed the scientific community’s understanding of what separates humans from other animals: she proved that certain behaviours were not exclusive to humans, as had previously been thought. Goodall also discovered that primates use non-verbal language as a way of expressing emotions.

Illustration by Karolina Zieba

From a young age, Goodall had a passion for animals. Her mother encouraged her to pursue a career in primatology, a predominantly male field at the time. In 1960, Goodall began her study without any degree or formal training. Despite criticism of her research methods, such as naming the chimpanzees and attributing emotions to them, Goodall was recognised for her discoveries in the field. She refused to accept being discredited for being a woman.

She later went on to earn her PhD from the University of Cambridge, despite having no undergraduate degree. Through her conservation organisation, the Jane Goodall Institute, her work continues to influence people around the world by spreading a message of peace and sustainability. In addition to representing women in her field, she uses her position to inspire the next generation of scientists. In 1991, the institute founded the Roots & Shoots programme, which aims to inspire children to implement practical, positive change for people, animals, and the environment by providing teachers with free resources and activities. The programme has empowered young people for almost 30 years and continues to expand into more countries each year.

Virginia Apgar

Virginia Apgar (1909–1974) was a groundbreaking American obstetrical anaesthesiologist who specialised in pain relief before, during, and after childbirth. She is best known for her simple, rapid method for assessing the viability of newborn babies: the Apgar score. Each newborn is given a score of 0, 1, or 2 in each of five categories - heart rate, respiration, colour, muscle tone, and reflex irritability - and the compiled score can range from 0 to 10, with 10 being the best possible condition for a newborn. In 1953, the Apgar score became the first standardised method for assessing newborns, and it is now credited with saving the lives of thousands, if not millions, of babies. In addition, while researching the effectiveness of the method, Apgar discovered that the anaesthetic cyclopropane had a negative effect on the infants she was testing; its use during labour was therefore discontinued.

Illustration by Karolina Zieba

Apgar had to overcome financial problems, but managed to earn her medical degree from Columbia University in 1933. Even after graduation, she encountered bias from surgeons towards anaesthesiologists. By consistently pressing onward into new realms within anaesthesiology, Apgar pushed through contemporary barriers for women and became the first woman at Columbia University College of Physicians and Surgeons to be named a full professor. Towards the end of her career she travelled widely, speaking to audiences about the importance of early detection of birth defects. By doing so, she inspired many, and stood as an outstanding representative for women in her industry.
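For readers who like to see the arithmetic written out, here is a minimal sketch in Python of how a compiled Apgar score works. The five categories and the 0-2 ratings come from the description above; the function name and the example ratings are hypothetical illustrations, not part of any clinical software.

```python
# Illustrative sketch of the Apgar scoring rule described above.
# The categories and 0-2 ratings are from the article; the example
# newborn's ratings are hypothetical.

CATEGORIES = (
    "heart rate",
    "respiration",
    "colour",
    "muscle tone",
    "reflex irritability",
)

def apgar_score(ratings: dict) -> int:
    """Compile an Apgar score: the sum of five 0-2 ratings, giving 0-10."""
    for category in CATEGORIES:
        if ratings.get(category) not in (0, 1, 2):
            raise ValueError(f"'{category}' must be rated 0, 1 or 2")
    return sum(ratings[category] for category in CATEGORIES)

# A hypothetical newborn assessed one minute after birth:
example = {
    "heart rate": 2,
    "respiration": 2,
    "colour": 1,
    "muscle tone": 2,
    "reflex irritability": 1,
}
print(apgar_score(example))  # prints 8 - a reassuring score out of 10
```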

Jocelyn Bell Burnell

Jocelyn Bell Burnell (b. 1943) is an astrophysicist and astronomer from Northern Ireland. In 1967, as a postgraduate student, Bell Burnell discovered radio pulsars: rapidly rotating neutron stars left behind when massive stars die. At first, the pulsing signals were thought to be communications from an extraterrestrial civilisation. Now, scientists use pulsars to test general relativity - Einstein’s theory of gravitation - under strong-gravity conditions. Astronomers are also using pulsars in the Milky Way to detect gravitational waves and extra-solar planets. The discovery of pulsars was one of the most significant scientific achievements of the 20th century. However, in 1974 it was not Bell Burnell who was awarded the Nobel Prize for it; instead, it was given to her male supervisor, Antony Hewish.

During her career, she served as president of the Royal Astronomical Society from 2002 to 2004 and president of the Institute of Physics from 2008 to 2010. In 2018, after being awarded a Special Breakthrough Prize in Fundamental Physics, she donated the entire £2.3 million to fund studentships for women and other underrepresented groups in physics. The resulting bursary scheme was named the Bell Burnell Graduate Scholarship Fund. This incredible donation is being used to give physics PhD students in Britain and Ireland any additional money they may need alongside their studies, making PhDs more accessible for people who have young children, disabilities, or visa difficulties. Furthermore, Bell Burnell is one of the founders of the Athena SWAN scheme to support diversity in universities. She believes that increasing diversity will strengthen the scientific community and may lead to more breakthrough discoveries like her own.

Lynn Conway

Lynn Conway (b. 1938) is an American computer scientist, electrical engineer, inventor, and transgender activist. In 1964, Conway joined International Business Machines (IBM) Research in New York, where, working on IBM’s Advanced Computing Systems project, she made fundamental contributions to computer architecture. Conway was one of the first scientists to undergo a surgical gender transition. However, IBM fired her when they discovered she was planning to transition, and she was forced to start her career all over again with a new identity.

Her career was nevertheless extremely successful. She and her colleague Carver Mead are widely known for the Mead & Conway revolution in microchip design. Together they developed a way to combine tens of thousands of transistor circuits on a single chip, giving chips the ability to perform more complicated functions. Today, this technology is still being adapted for modern computers, allowing processing speeds to continue increasing. Conway worked as a computer architect at Memorex (1969-1972) before joining Xerox PARC, where she developed new types of integrated circuits. She then joined the University of Michigan as a professor of electrical engineering and computer science. However, she lived in constant fear of being ‘outed’ and possibly losing her career.

To help women like her in the industry, and to provide encouragement and hope to others in the same situation, Conway started a website with information about gender transition. Her Transsexual Women’s Successes pages were an important early source of inspiration for those in the industry. The website has since evolved into a platform through which she personally contacts and supports people struggling with gender identity. Today, Conway continues to open doors for transgender men and women in the scientific community.


Marjorie Lee Browne

Marjorie Lee Browne (1914–1979) was the third African-American woman to receive a PhD in mathematics, which she earned at the University of Michigan. Her research was in the field of topology, a branch of mathematics that studies the properties of shapes. Her most notable contribution, A Note on the Classical Groups, sets out mathematical proofs about the geometry of fundamental algebraic objects. She also lectured at North Carolina Central University (NCCU), a historically black university, from 1949 to 1979. At NCCU, she was known to give her own money to students for food, tuition, and conferences. During her time there, the university was awarded a prestigious National Science Foundation grant - the first time it had ever been awarded to a predominantly black institution - which enabled Browne to set up the first computer centre at the university. After she died, NCCU established the Marjorie Lee Browne Trust, which sponsors a scholarship and a distinguished alumni lecture series.

Browne recognised that the under-representation of black women in STEM fields was a systemic problem that needed to be addressed with long-term, sustainable solutions. Her selflessness, advocacy, and passion for mathematics inspired an entire generation of mathematicians who would otherwise not have had the opportunity to pursue their dreams.

These five women, and many others like them, deserve to be publicly celebrated for their advocacy and personal achievements in STEM. By providing representation, fighting for equality, and inspiring women of all backgrounds, they helped to diversify the scientific community and shape it into what it is today, with more women studying or employed in STEM than ever thought possible. Although there is still some way to go, the beginning of a diverse, inclusive era in STEM started with them. We owe them a debt; let’s repay it by building the most connected and richly diverse scientific community we’ve ever had.

Kate Summerson is a recent MSc Science Communication and Public Engagement graduate from the University of Edinburgh

Illustration by Karolina Zieba


comic strip and crossword

Comic strip by Marie-Louise Wohrle

Crossword by Sonia Dahal

[Crossword grid]

Across
1. Don’t Know? (4)
3. You Raise Me Up (10)
7. Tools and Techniques (10)
8. 8 (8)
9. Your Surroundings (11)
10. Parasitic Flower (9)
11. Finches (9)
14. Elementary, my dear... (6)
15. A Magical Bird (7)
17. A Drop of Golden Sun (4)
18. Japanese Blossoms (6)

Down
1. Cupid’s Lover (6)
2. Scottish Inventor (4)
4. Island Cluster (11)
5. The Sun (4)
6. Birds do this (7)
11. Always Conserved (6)
12. The Snake Got its Tail (9)
13. Vivaldi’s Concerto (6)
15. A Seabird (6)
16. Music Genre (4)


