The Oxford Scientist: Impact (#3)


MICHAELMAS 2018 • ISSUE 3

IMPACT

A New Eye for an Old Eye
New Oxford research aims to help patients affected by eyesight loss.

Welcome to the Smart City
A tour around the city of the future.

Mirror, Mirror
A reflection on one of the most important innovations in human history.


HAVE YOU THOUGHT ABOUT...

A CAREER AS A PATENT ATTORNEY?

An intellectually challenging and rewarding career option

What Does It Involve? Training as a Patent Attorney is a career path that will enable you to combine your understanding of science with legal expertise. You will leave the lab environment yet remain at the cutting edge of science and technology, applying your knowledge and skill in a commercial context. You will help to protect intellectual property assets and grow businesses.

Jenny Soderman
MChem in Chemistry, University of Oxford (2018)

Sound Interesting? J A Kemp is a leading firm of UK and European Patent and Trade Mark Attorneys with offices in London, Oxford, Cambridge, Paris and Munich. Deadline for our Autumn 2019 intake: 21 December 2018.

Hiro Shimazaki
MBiochem in Biochemistry, University of Oxford (2018)

www.jakemp.com/careers


OXSCI STAFF

EDITORS-IN-CHIEF
Hannah Ralph
Ludo Fraser-Taliente

PRINT EDITOR
Jade Kinton

WEB EDITOR
Jack Holland

CREATIVE DIRECTOR
Jack Weatherilt

SCHOOLS OFFICER
Jacqueline Gill

SUB-EDITORIAL TEAM
Ian Foo, Eleanor Frew, Noah Hearne, Megan James, Malhar Khushu, Shakira Mahadeva, Gabriela Mancey-Jones, Gerda Mickute, Deborah McFarlane, Irene Trung

OSPL STAFF

CHAIRMAN
India Barrett

MANAGING DIRECTOR
Polly Halladay

FINANCE DIRECTOR
Bryce Ning

TECH DIRECTOR
Antonia Siu

STRATEGIC DIRECTOR
Harry Gosling

NON-EXEC DIRECTOR
Katie Birnie

BUSINESS TEAM
Hung-Jen Wu, Gresa Rustemi

Corrections for the TT18 issue: Lewis Fry served as a Sub-Editor, and Jacqueline Gill was Schools Officer.

Copyright © The Oxford Scientist 2018

CONTENTS

4 EDITORIAL
5 iGEM BLOG: Using Synthetic Biology to Cure Autoimmune Diseases
7 HIVictory, KIRSTEN LEE
9 INTERVIEW: Wildcat in the Country: Dr Kerry Kilshaw
10 Cannibals, Cattle and Controversy, MICHAEL ORRELL
11 The Impossible Burger, PANDORA DEWAN
12 Catalyst for Destruction, SEB ELMES
13 Full Steam Ahead, MING JIE (JOHNNY) LI
14 A New Eye For An Old Eye, SAMUEL SUSSMES
16 Keeping Your Cool, IAN FOO
17 Just My Type, RYAN ELLISON
19 SCHOOLS COMPETITION WINNER: Vivien T. Thomas: From Poverty to Pioneer
20 Print On Demand, CARLA V. FUENTESLÓPEZ
21 Dinosaur Mania: The Victorian Edition, MARIE-CLAIRE KOSCHOWITZ
22 Welcome to the Smart City, CHARIG YANG
24 Hitting a Nerve, CARLA V. FUENTESLÓPEZ
25 The Day the Atom Stood Still, MARIA VIOLARIS
26 Don’t Fear the Reaper, JACK FELTHAM
27 Fresh Fingers in Forensics, SHAKIRA MAHADEVA
28 Mirror, Mirror, OWEN YUNAPUTRA KOSMAN
30 40 Years of IVF, JESS CHERRY


Editorial

There can be no doubt that scientific development has had, and continues to have, a profound impact on us. From the way we travel to revolutionary healthcare treatments, how we live has changed. Here, in this Michaelmas 2018 issue of The Oxford Scientist, we have chosen to celebrate the transformative impact of science. We move from historical discoveries (the development of the steam engine, breakthroughs in blood transfusions), through the innovations that we use every day without a second thought (mirrors, refrigeration technology), to the research paving the way for the innovations of tomorrow. The ‘smart city’, once a fantastical concept from science fiction, is becoming reality. Research into artificial nerves offers hope to those who rely on prosthetics. On a planet with limited resources, burdened by the weight of an ever-growing human population, scientific endeavour is our salvation. Will the Impossible Burger, a burger made of plant protein rather than meat, offer a solution to the destructive nature of livestock farming?

Yet it is important to remember that scientific progress is not always without controversy. Nowhere is this more obvious than in Fritz Haber’s invention of an industrial method to synthesise ammonia. His invention allowed large-scale production of fertilisers, feeding billions of people, but was also used in the production of munitions by Germany during World War I, leading to the deaths of millions. Ethical questions continue to surround the ways in which we use scientific innovations. From genetically modified crops to the recent use of DNA ancestry databases to solve criminal cases, it is for all of us to discuss how science will continue to impact our lives.

We would like to take this opportunity to thank all who have contributed to the production of the magazine, without whom this issue would not have been possible. We hope you enjoy reading this issue!

The Editors-in-Chief Hannah Ralph & Ludo Fraser-Taliente



ELLIE BEARD

Using Synthetic Biology to Cure Autoimmune Diseases

iGEM is an international competition in which teams of university students design a genetically engineered product to tackle a world problem. Oxford’s team are engineering E. coli with the aim of creating a new therapeutic treatment for autoimmune diseases by altering the activity of a subset of T cells.

The role of T cells in the immune response
T cells are lymphocytes (white blood cells) involved in the adaptive immune response. T helper (Th) cells are a subset of T cells characterised by the presence of the CD4 protein on the cell surface. CD4+ cells interact selectively with antigen-presenting cells (APCs) that express specific antigens – molecules capable of initiating an immune response – on their cell surface. Upon activation by a complementary APC, Th cells release small proteins – cytokines – that influence the activity of other immune cell types. Specifically, activated Th cells can enable the maturation of B cells to upregulate the production of antibodies, as well as the proliferation of cytotoxic T cells (cells which kill infected body cells). Our project focuses on enabling the detection of a particular subset of Th cells: Th17 cells. Th17 cells produce IL-17, a proinflammatory cytokine, and the incorrect balance and functioning of these cells is associated with the development of multiple autoimmune diseases, such as Crohn’s disease or lupus.

Treg cells, another subset of T cells, can prevent the immune system from becoming overactive. They do this by suppressing the action of Th and cytotoxic T cells, preventing the development of an excessive inflammatory state and limiting self-reactivity, and thus the development of autoimmune diseases. A fine balance must exist between these two types of T cell to avoid the development of immunodeficient and autoimmune states.

Autoimmune diseases
Autoimmune diseases encompass around 100 known diseases that carry a huge clinical burden and are characterised by the following features:
• Autoimmune diseases result when the body’s immune system incorrectly targets and attacks the body’s own tissues.
• The diseases have a broad range of symptoms and complications, with examples including Crohn’s disease, type 1 diabetes and multiple sclerosis. They have a long-term effect on the health and quality of life of patients, as well as imposing a huge cost and burden on healthcare services.
• The prevalence of autoimmune diseases is rising in both the developed and developing world, and there are significant global inequalities in treatment outcomes.
• Current treatments, such as corticosteroids, are associated with a range of negative side effects and often need to be taken on a regular basis to manage disease symptoms.
• Common treatments focus on symptom management and frequently achieve only sub-optimal control, as well as being less accessible in the developing world.

Autoimmune diseases can result from an imbalance in the populations of the subtypes of immune cells within the body, such as the balance of Treg and Th cells. We have paid particular attention to autoimmune diseases in developing countries, where bacteria such as segmented filamentous bacteria (SFB) are highly prevalent in water sources or in areas close to livestock. SFB tightly adhere to the gut lining and can modulate host immune responses. For example, SFB have been shown to promote the development of lymphocytes and the differentiation of Th17 populations, resulting in the production of IL-17a. As IL-17a is a proinflammatory cytokine, an elevated level due to SFB infection is associated with the development of autoimmune diseases, such as autoimmune epilepsy and multiple sclerosis. The aim of our project is to address the notable yet unmet global need for better treatments for autoimmune diseases by creating a probiotic that will alter the immune system’s activity to prevent the self-destruction of tissues.

The microbiome and probiotics
The microbiome refers to the population of symbiotic microorganisms that normally reside within the human gut and play critical roles in maintaining human health. It is thought that there are around 100 different species of microorganism in the gut in infants, and this number steadily increases to over 1,000 different species in adults. The composition of the microbiome is highly variable between individuals, and is determined by multiple environmental factors such as diet, exercise levels, exposure to antibiotics and age. Changes in microbiome composition, most notably a reduced diversity of the component species, are associated with an increased likelihood of developing inflammatory bowel diseases, autoimmune diseases and obesity. The aim of our iGEM project is to design a probiotic – a live bacterial product with beneficial effects in the human body – that can be used to treat autoimmune diseases. The idea is that the genetically engineered bacteria will be taken orally – either in a pill or a yoghurt formulation – and will reside within the normal microbiome. Using probiotics to target specific diseases is a relatively new approach, but existing probiotic-based treatments include those for infectious diarrhoea and C. difficile infections.

Our project
Our aim is to engineer E. coli so that, when ingested as a probiotic, it can reside within the microbiome and regulate the relative populations of Treg and Th cells within the body. Our engineered E. coli will contain a system that detects an increased concentration of Th17 cells and responds by secreting an appropriate amount of IL-10 (an anti-inflammatory cytokine) to prevent overactivity of the immune system, excessive inflammation and self-destruction. A negative feedback loop will indirectly sense the concentration of Treg cells and, when this is high, will prevent the release of IL-10 to avoid excessive suppression of the immune response and the resulting immunodeficiency.

We are engineering E. coli to detect nitric oxide (NO), which is released continuously by Th cells. As NO is a small molecule that is permeable through the outer cell membrane of E. coli, NO can be used as a suitable biomarker for the size of the Th cell population. An excessively high concentration of NO is indicative of high Th cell activity, and therefore signifies an overactive immune system. Our engineered bacteria will use the SoxR/SoxS promoter system to detect NO and induce the transcription of IL-10. In this system, NO binds to the SoxR protein to activate it, and the active NO-SoxR complex then binds to the SoxS promoter region. The transcription of IL-10 will be under the control of the SoxS promoter, meaning that activation of the promoter by the NO-SoxR complex will result in the transcription of IL-10. The transcribed IL-10 should be secreted into the surrounding environment via a secretion system. IL-10 is an anti-inflammatory cytokine, meaning that its release should suppress the inflammatory, autoimmune state in the surrounding tissue.

Our engineered bacteria will also contain an adenine-induced negative feedback loop to inhibit the overexpression of IL-10 and the resulting immunosuppression. Adenosine is a small molecule synthesised by a series of reactions involving the metabolism of ATP by Treg cell surface enzymes, meaning that adenosine is a suitable biomarker of Treg population size. Our engineered bacteria will express an outer membrane-anchored hydrolase enzyme that converts adenosine to adenine, a signal that our E. coli system can detect. In this manner, adenine can act as an effective biomarker for Treg activity, representing a situation in which there is sufficient control of the immune response (and therefore one in which releasing IL-10 is no longer appropriate). Our feedback loop consists of an adenine riboswitch that controls the transcription of an sRNA that binds IL-10 mRNA and inhibits the translation and release of IL-10. In the absence of adenine the riboswitch remains inactive, no sRNA is produced, and IL-10 is translated unhindered. However, when adenine is present at sufficient concentrations, the riboswitch is activated; sRNA is transcribed and inhibits the translation of IL-10. In this manner, excessive production of IL-10 is prevented, enabling an optimal, personalised dose of IL-10 to be delivered to the patient. We propose that cultures of the probiotic bacteria would be taken as a live yoghurt or a tablet, since this would avoid the need for injections, training for appropriate use and the associated risk of infections. The personalised dose of IL-10, along with the adenine-induced negative feedback loop, means that the treatment should not be associated with the side effects of common treatments for autoimmune diseases, such as immunodeficiency. We believe that our design has the potential to be a successful treatment that could overcome the common problems associated with current treatment options for autoimmune diseases.

The 2018 Oxford iGEM team won a gold medal and the award for Best Therapeutics Project. Illustrated by Jhanna Kryukova.
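The push-pull logic of this circuit (NO switches IL-10 production on; adenine throttles it back down) can be seen in a toy simulation. The sketch below is our own illustration, not the team's model: it treats promoter and riboswitch activation as simple Hill functions, and every parameter value is invented.

```python
# Toy dynamics of the sensor/feedback circuit described above.
# Illustrative only: all parameters are invented, not the iGEM team's model.

def hill(x, k=0.5, n=2):
    """Fraction of a promoter/riboswitch activated at signal level x."""
    return x**n / (k**n + x**n)

def steady_il10(no_level, adenine_level, t_end=200.0, dt=0.01):
    """Integrate the circuit to steady state for constant NO and adenine."""
    il10 = srna = 0.0
    for _ in range(int(t_end / dt)):
        # NO-SoxR activates the SoxS promoter, driving IL-10 production...
        production = hill(no_level)
        # ...while adenine flips the riboswitch, producing sRNA...
        srna += (hill(adenine_level) - 0.2 * srna) * dt
        # ...which represses IL-10 translation (the negative feedback arm).
        il10 += (production / (1.0 + 5.0 * srna) - 0.1 * il10) * dt
    return il10

# High Th activity (lots of NO), little Treg signal: IL-10 is secreted.
print(steady_il10(no_level=2.0, adenine_level=0.1))  # ~4.8
# Same NO, but a strong Treg (adenine) signal: the feedback throttles the dose.
print(steady_il10(no_level=2.0, adenine_level=2.0))  # ~0.4
```

In this toy model the same inflammatory input yields an IL-10 dose over ten times smaller once the Treg biomarker is present, which is the self-limiting behaviour the design aims for.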



KIRSTEN LEE

HIVictory

What’s next in the fight against HIV? Since 1983, advancements in drugs and technology have made a huge impact in controlling the devastating HIV epidemic. It is thought that the virus spread from primates to humans in the 1920s and has since been transmitted through unprotected sex, the sharing of unsterile needles between drug users, and breastfeeding. The human immunodeficiency virus (HIV) reproduces within immune cells using its own reverse transcriptase enzyme and can outsmart and suppress the immune system, leading to acquired immune deficiency syndrome (AIDS). Antiretroviral therapy (ART) was introduced in 1996 as a single pill, taken daily. It acts by targeting different stages of the viral replication cycle. This is the most common and, so far, the most effective treatment, with at least 900,000 lives saved as estimated by the World Health Organisation.

Weekly injections and implants of ART are being researched to provide a more convenient and long-lasting solution. As a preventative measure, pre-exposure prophylaxis (a form of antiretroviral drug) can be taken by high-risk HIV-negative individuals, reducing the risk of infection by an incredible 86%. Since the epidemic began, 77.3 million people have become infected with HIV. The prevailing stigma around HIV is rooted in its association with promiscuity and with men who have sex with men (MSM). Low attendance at sexual health clinics can be partially overcome with testing kits available online and on the high street. However, these face criticism for being too expensive, with some costing £33.95. Local charities, for instance in Brighton, can fund these testing kits, which are dispensed through vending machines. Now, the aim is to implement such discreet and accessible testing nationally or even globally.

The United Nations aim to eliminate AIDS by 2030 and have set a global 90-90-90 target: 90% of people with HIV diagnosed, 90% of those diagnosed on treatment, and 90% of those treated with undetectable viral loads (fewer than 50 copies of HIV per millilitre of blood). This was met for the first time in London in 2016, but globally it has yet to be achieved. Today, HIV-positive patients can lead a largely normal life thanks to breakthrough treatment. The challenge now is drug resistance, caused by the virus’ ability to continuously mutate, which keeps the infection incurable. Such resistance threatens to halt the success of ART. Mutations, alongside the virus hiding inside immune cells, have meant that a successful vaccine has yet to be created. HIV infection in the UK is not the death sentence that it once was, but without more research and resource implementation, it will continue as a global killer.
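Since each 90 applies only to the group captured by the previous one, the target is more modest than it first sounds; a quick back-of-envelope calculation makes the point.

```python
# What 90-90-90 means for the whole HIV-positive population: each 90%
# applies only to the previous stage of the cascade.
diagnosed = 0.90                  # know their status
on_treatment = 0.90 * diagnosed   # of everyone with HIV, on ART
suppressed = 0.90 * on_treatment  # of everyone with HIV, virally suppressed
print(f"{suppressed:.1%}")        # 72.9% of all people living with HIV
```

Even a region hitting all three targets therefore leaves roughly a quarter of people living with HIV without viral suppression.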

Study with us at the Nuffield Department of Population Health
MSc in Global Health Science and Epidemiology (1-year full-time Masters)
DPhil in Population Health (3-year full-time or 6-year part-time)
Open to students with a background in mathematics, medicine, biology or statistics

Automatically considered for studentships if you apply by 11 January.
More information at: https://www.ndph.ox.ac.uk/study-with-us/


Wildcat in the Country

In Conversation with Dr Kerry Kilshaw

Dr Kerry Kilshaw works for the Wildlife Conservation Research Unit (the WildCRU), which is part of the University of Oxford’s Zoology department (www.wildcru.org). She has carried out research on several different species, ranging from bats and badgers in the UK, to jaguars and ocelots in Belize, and small rodents and brown hyenas in Botswana. Although primarily a small carnivore researcher, over the past 8 years she has focussed her attention on the Scottish wildcat, Britain’s only remaining native felid.

How did you get involved in protecting the Scottish wildcat?
Although the WildCRU has been involved in research on the Scottish wildcat for over 20 years, when I was first introduced to the plight of this cat in 2008 I had, rather embarrassingly, never heard of this small, endangered species. But I have been hooked ever since.

Why is the Scottish wildcat endangered?
The Scottish wildcat is a small cat, slightly larger than our domestic cat, and it was once found across Britain. But as a result of intense persecution for its fur and its status as “vermin” on farms and grouse moors, the wildcat had disappeared from most of its former range by the late 1800s; it is currently found only in the Scottish Highlands, north of Glasgow and Edinburgh.

Extensive habitat loss during WWI and WWII as a result of deforestation pushed the wildcat to the edge of extinction in Scotland in the early 1900s, but changes in land management allowed it to slowly recolonise parts of the country. Unfortunately, wildcat populations were heavily affected by the spread of myxomatosis through the rabbit populations across Scotland in the 1950s. Combined with the fact that wildcats did not receive legal protection until 1988, and were up until then allowed to be controlled as a pest species, this meant the population never fully recovered from the brink of extinction of the early 1900s. However, potentially the final nail in the coffin is the presence of one of our most beloved family pets, the domestic cat…

How has the domestic cat affected the Scottish wildcat?
The Scottish wildcat forms the British population of the European wildcat (Felis silvestris silvestris) and easily interbreeds with the domestic cat (descended from the African wildcat, Felis silvestris lybica), which was introduced into Britain by the Romans 2000-3000 years ago. Unfortunately, this cross-breeding produces fertile offspring, and it is this hybridization and the production of “wildcat x domestic hybrids” that is currently thought to be the biggest threat facing the Scottish wildcat population. Cross-breeding between wildcats and domestic cats was observed as early as the 1800s by Darwin and fellow biologists. However, hybridization appears to have increased in the past 30 years or so, probably due to a complicated combination of factors, including habitat loss, persecution and prey densities, that has led to a dramatic decrease in the local number of wildcats and a resulting lack of available mating partners. As a result, wildcats end up mating with feral domestic cats and hybrids instead of other wildcats.


BELOW: Showing how difficult it is to tell a wildcat apart from a hybrid and a domestic tabby cat. Copyright: K. Kilshaw

This is leading to, or may already have resulted in, the genetic extinction of this species. Current estimates put the wildcat population somewhere between 100 and 400 individuals, making it our most endangered carnivore.

Why is cross-breeding such a problem?
Extensive hybridisation makes it difficult to distinguish wildcats from wildcat x domestic hybrids and even feral tabby cats. This lack of clear-cut identification has caused problems in collecting ecological data on genetically pure wildcats, as well as in enforcing conservation legislation, because wildcats are protected but hybrids and feral domestic cats are not. Much of WildCRU’s past research has focused on how best to identify the Scottish wildcat, with detailed work on ecological and behavioural differences, evolution, morphological differences (including skull and bone measurements) and genetic differences. The results of this work led to the production of the first Action Plan for conservation of the Scottish wildcat in 2004. When I joined the WildCRU’s wildcat project in 2009, a new identification tool based on 7 specific pelage (fur marking) traits had been developed, which allows us to tell wildcats from hybrids and tabby domestic cats using their coat markings. Using this in combination with emerging new technology meant that I was able to trial the first use of remote cameras (known as camera traps) to survey for the Scottish wildcat. This pilot study was hugely successful, much more so than anyone had anticipated.

Not only did we actually manage to photograph some wildcats, we could also tell individuals apart. With the use of some fancy statistical analysis we could start to generate population size estimates and get a better idea of how many cats were out there. Camera trapping is now the standard method used across different parts of Scotland for surveying for wildcats. It’s not without its limitations, of course: the cameras are expensive and time-consuming to run, and cheaper models often malfunction, especially if they are not set up correctly. But they have enabled large areas of Scotland to be surveyed for the Scottish wildcat and given us a much better idea of where they can be found and how many are left.

What are WildCRU currently working on?
Because hybridization has caused so many issues in wildcat conservation, there are still many gaps in our knowledge about how this special cat lives and what it needs to survive. Our current project aims to contribute to our understanding of wildcat ecology and behaviour, and how this can inform their conservation management, using GPS radio tracking. This project started in 2018 and, with the help of the Scottish Wildcat Conservation Action Plan team (www.scottishwildcataction.org), several wildcats and hybrids were successfully collared. The GPS collars used here were already being used on European wildcats in Germany and Switzerland, so I knew that they would suit our Scottish wildcat. They are also extremely lightweight, and I can download the data in the field every 2-4 weeks instead of having to wait for the collar to drop off. This is a relatively new feature for a collar of this size and, importantly, enables us to help reactively manage the wildcat on the ground.

What does this data tell us?
The data collected is shared with relevant landowners to help them avoid disturbing or accidentally harming the wildcat.

One of the cats I had collared spent the winter in the forest but then moved up into the grouse moors in the summer, so the local gamekeepers were contacted, advised that there was a wildcat using their land, and reminded of what pelage features to look out for when carrying out their predator control activities. He has happily made it through the summer months! So far the collars have generated 7-9 months’ worth of data, and more wildcats will hopefully be collared this winter. This wealth of data has already improved and expanded on the existing information available and helps us understand how these cats use the landscape and what habitat features are important for them. It is also helping us to identify areas of potential risk so that we can try to mitigate those risks.

How can we help?
You can help by making sure your pet cat is neutered if you live in Scotland. If you own or live on a farm, please be aware that wildcats may be using your farm buildings and may be interacting with feral farm cats there. Also, any sightings of wildcats, hybrids and feral cats can be reported on the Scottish Wildcat Action Plan website; more information about the species and the current work being done to protect it can be found there (www.scottishwildcataction.org) and on the WildCRU’s website (www.wildcru.org/research/scottish-wildcat-project/).

ABOVE: Scottish wildcat photographed using a camera trap. Copyright: K. Kilshaw



MICHAEL ORRELL

Cannibals, Cattle, and Controversy
How prions changed the rules of biology.

The major pathogens that spring to the minds of those who haven’t played Plague Inc. are bacteria, viruses, fungi and parasites. These all contain nucleic acids (DNA or RNA) that provide the blueprints for their survival and replication. So, imagine everyone’s surprise when it was suggested that there was something that ignored these blueprints and was responsible for a set of devastating neurodegenerative diseases. Enter the prion. Prions (pronounced pry-ons) are infectious, misfolded proteins that can cause similar protein molecules to misfold, starting a chain reaction that leads to the destruction of nervous tissue. Prion diseases are unusual in that they can occur spontaneously, be caused by mutations in the prion protein (PrP) gene, or be due to the acquisition of misfolded PrP from infectious material. Prion diseases remain incurable and fatal to this day, with examples including Creutzfeldt-Jakob disease (CJD) and fatal familial insomnia. Prion diseases have been recognised since at least the 18th century, but it wasn’t until the 20th century that they began to be understood.

“They suggested the cause of scrapie was resistant to nuclease enzymes which break down nucleic acids, but was not resistant to proteases which break down proteins.”


In the 1940s, the veterinarian W.S. Gordon developed a vaccine against louping ill, a viral disease, using brain tissue from sheep infected with the virus. Within a couple of years, there was a spike in the incidence of a prion disease called scrapie in the vaccinated sheep. Importantly, formalin had been applied to these tissues to inactivate louping ill viral particles, but it had failed to neutralise the scrapie agent lurking in the brain tissue used to make the vaccine. In the 1950s, an obscure disease called kuru amongst the Fore (pronounced For-ay) people in Papua New Guinea started to gain attention. Kuru is a prion disease of humans which caused symptoms including tremors, sporadic outbursts of laughter and loss of motor function. It was suspected that some Fore people participated in cannibalism, but how this might link to the disease took some time to determine. The fact that women and children seemed more at risk than men couldn’t be explained by any known genetic mechanism, but it was realised that Fore women and children ate the brains of the dead more frequently than men, and were more involved in the cleaning of bodies, something which could have assisted transmission. The scientist William Hadlow spotted a possible link to scrapie, a crucial insight. Stanley Prusiner and his colleagues carried out painstaking bioassays involving the scrapie agent. They suggested the cause of scrapie was resistant to nuclease enzymes, which break down nucleic acids, but not to proteases, which break down proteins. Prusiner coined the term “prion” (proteinaceous infectious particle) and accumulated evidence for the protein-only hypothesis of prion disease, leading to his receipt of the 1997 Nobel Prize in Physiology or Medicine.

What should have thrilled scientists enraged some instead. Prusiner had critics who were convinced that the scrapie agent had to be some sort of virus, perhaps a slow virus or a tiny atypical virus called a virino. The protein-only model has some flaws, but there is insufficient convincing evidence for viral involvement in prion diseases, though some still disagree. Interestingly, many students were fascinated when Prusiner’s work gained prominence in the 1980s, while faculty members seemed more resistant. Perhaps there is hope for the future if today’s students remain open to new concepts as they progress in their careers. Prion biology suddenly came to the forefront in the UK during the BSE (bovine spongiform encephalopathy, a.k.a. mad cow disease) outbreak. It is thought that contaminated meat and bone meal in animal feed led to an increase in the number of cattle with the disease, and that around 200 cases of variant CJD in humans occurred mainly through consumption of contaminated beef products. Drastic measures were taken: millions of cattle were slaughtered and destroyed, and agricultural law was changed, helping to prevent a human catastrophe. Since that dark period, prion biology has become relevant to the study of less obscure neurodegenerative disorders such as Parkinson’s disease and ALS. Exciting research possibilities lie ahead, but it all comes back to the importance of challenging dogma and accepting new ideas with a strong evidence base. A true paradigm shift.


PANDORA DEWAN

The Impossible Burger
Plant-based meat could be the solution to reducing the environmental impact of livestock farming.

Animal agriculture has a more destructive environmental impact than any other human technology. It is responsible for over 18% of our greenhouse gas emissions, and many of the hazardous chemicals it utilises escape into local ecosystems. Species may even be driven to extinction by the destruction of their natural habitat to accommodate livestock. Yet despite increasing recognition of the destructive impact of eating meat, the global demand for animal-derived foods continues to grow.

In 2011, a company called Impossible Foods was created with an ambitious goal in mind: to completely replace the use of animals in food production. Its founder, Dr Patrick O. Brown, considers farmed animals to be inefficient ‘factories’ for converting energy from plants into meat. His solution for more efficient meat production is to utilise plants directly. The team’s first mission was to create the Impossible Burger: a plant-based meat indistinguishable from the real thing. To do this, they needed to identify the individual compounds that give beef its distinctive properties, such as taste, texture and smell. The team adapted analytical tools, including gas chromatography, mass spectrometry and texture probes, to identify these molecules, with the aim of replicating them in the lab using plant-based proteins. To simulate texture, the team found wheat proteins that provide chewy firmness, and potato proteins that allow the patty to hold water and transform from a soft raw state to something more solid once cooked. Eventually, they discovered the secret ingredient for meat’s unique flavour and scent: an iron-containing molecule called haem.

ABOVE: The Impossible Burger, made entirely without meat. Credit: T. Tseng, Flickr.

“Yet despite increasing recognition of the destructive impact of eating meat, the global demand for animal-derived foods continues to grow.”


Haem occurs naturally in all plant and animal cells but is particularly abundant in animal muscle. It can also be isolated from leghaemoglobin, a protein found in soy plants. Even so, a massive area of land would be needed to grow enough plants for industrial haem extraction. To solve this, the team inserted the leghaemoglobin gene from soy into yeast cells. By using genetically modified yeasts instead of plants, the land required for haem production can be greatly reduced. After years of experimentation, today’s Impossible Burger deceives meat lovers 47% of the time (although the team are aiming to exceed 50%). This burger requires approximately 95% less land and 75% less water, and produces 87% less greenhouse gas emissions, than burgers from cows. Humans are unlikely to abandon meat entirely, but by using plants as meat producers instead of animals we could greatly reduce our devastating impact on the Earth.


SEB ELMES

When science goes wrong... and right
The Haber process killed millions. The Haber process fed billions. How do we handle scientific innovation that both benefits and harms?

After winning the 1918 Nobel Prize for chemistry, Fritz Haber emphasised that the ultimate aim of science “must be bound up in the moulding influence which it exerts at the right time upon life in general.” The work that earned him the prize - the development of the Haber process - was likened to “turning air into bread” and has had a staggering impact on human life. It was estimated in 2008 that 48% of the world’s population depend upon fertilisers produced by the Haber process for food. His Nobel Prize was controversial, however; many scientists boycotted the ceremony, and his story raises questions about how we should judge scientific progress.

Haber’s work centred on the production of fertilisers, a precious, finite resource in the 19th century. Nitrogen, the crucial element in fertilisers, is abundant in the earth’s atmosphere. The challenge was harnessing it. We had known for over 100 years that nitrogen, when reacted with hydrogen, produces ammonia, and ammonia can then be used to make fertilisers. Haber’s innovation was calculating the conditions required to scale this nitrogen-hydrogen reaction up to an industrial level. Since then the global population has soared from 1.6 to 7.6 billion. If there was any doubt about what fed this growth, 50% of the nitrogen molecules in your body got there through a Haber process factory.

A second use of ammonia, however, is in the production of explosives like TNT. This is why, at the conclusion of World War One, many Allied scientists were outraged that a man who would go on to be listed as a war criminal was awarded a Nobel Prize. Staunchly nationalistic, Haber had volunteered to oversee the production of chemical weapons and explosives for the Germans, and did so with devastating effect. On 22 April 1915, 168 tonnes of poisonous chlorine gas were released across the battlefield at Ypres. The Allied troops were left clawing at their faces and throats, fighting for breath. Haber, who frequently visited the front lines, believed these chemical attacks to be no worse than conventional weaponry. During the attack at Ypres, many soldiers chose to shoot themselves rather than continue suffering. 5,000 men were killed and 10,000 injured. Some historians claim that World War One would have been two years shorter without the Haber process, as increased food production allowed Germany to sidestep the Allied naval blockade, and additional resources for munitions resulted in increased military strength and endurance. Since the development of effective gas masks, chemical weapons have seen limited use; explosives made using the Haber process, however, continue to be used. One estimate puts the number of deaths from these explosives at over 100 million.

The question of whether the Haber process - in Haber’s view - fulfilled one of the main aims of science by coming “at the right time” is a tricky one. The ability to produce larger quantities of explosives made World War One longer and more destructive. Despite an estimated 424,000 German civilians starving during the war, the Haber process didn’t ease their suffering, as ammonia was diverted for military use. Alternatively, a case can be made for the process arriving just in time to avert a global food crisis. By 1913, wheat production was plateauing but the population was increasing exponentially. In an 1898 speech, the eminent chemist Sir William Crookes grimly made the case that “all civilised nations stand in deadly peril of not having enough to eat” and that “it is the chemist that must come to the rescue.” Through this lens, the development of the Haber process couldn’t have come sooner.

“Perhaps a better question is whether the ways in which we have used our new knowledge are positive or not”

It’s tempting to question whether scientific developments like the Haber process are positive or not. The development of the Haber process as the human population grew, however, was inevitable. Haber concedes in his Nobel Prize acceptance speech that science was “bound to reach” his discovery and lists four other scientists who were close to getting there first. Perhaps a better question is whether the ways in which we have used our new knowledge are positive or not. To quote Wernher von Braun: “Science does not have a moral dimension. It is like a knife. If you give it to a surgeon or a murderer, each will use it differently.”
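(For reference, the reaction Haber scaled up is the reversible synthesis of ammonia from nitrogen and hydrogen, N₂ + 3H₂ ⇌ 2NH₃, run over an iron catalyst at high pressure and a few hundred degrees Celsius.)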



MING JIE (JOHNNY) LI

Full Steam Ahead
One of our greatest scientific inventions, the steam engine, ushered in the industrial revolution.

The start of a revolution
Imagine being confined to small villages, your transportation limited to horses. The steam engine helped forge modern civilisation and made a profound contribution to the history of mankind. To credit any one person with the invention would be to steal credit from another; the engine was not created by an individual but gradually developed over a century. Fittingly, the first recorded use of steam power was by Hero of Alexandria, a first-century Greek engineer. Hero’s engine demonstrated pressurised steam producing motion, but practical applications were not developed. Major advancements by three eighteenth-century British engineers propelled the nation into one of the most innovative periods of history: the Industrial Revolution.

To raise water by fire
At the turn of the eighteenth century, factories became hungrier for resources such as coal and iron, meaning workers had to mine deeper for them. However, a major problem appeared: flooding. Manpower was initially employed to drain the underground water, but it soon became apparent that this was insufficient. A more powerful machine was needed. An attempted rescue by Thomas Savery in 1698 utilised the principle that a partial vacuum is created when steam condenses in a closed vessel. His device released steam into a cylinder and condensed it, creating a lower pressure. Atmospheric pressure then pushed water from the mine into the cylinder through one-way valves, after which steam was released again to force the water to the surface. Savery’s method was said to ‘raise water by fire’. However, this preliminary device could only draw water from a few metres deep and used up enormous amounts of coal. To call it an ‘engine’ would be a stretch, but Savery was nonetheless the first person to put steam to practical, useful work.

Team Thomas
Due to its shortcomings, Thomas Savery’s steam engine could not prevent the mine floods. In 1712, another Thomas (Newcomen) independently built on Savery’s design. Newcomen added a movable piston to the steam cylinder, so the vacuum pulled the piston instead of water. The piston was connected to a pivoted beam, with the other end connected to a rod which pulled on a pump handle in the mine.

As steam entering the cylinder on the induction stroke pushed up the piston, the other end of the beam went down. Cold water was then sprayed into the cylinder to condense the steam and create a vacuum. Due to the resulting imbalance, atmospheric pressure pushed the piston down on the ‘power’ stroke, and the other end of the pivoted beam pulled up like a seesaw. Savery later joined Newcomen’s venture and together, the Thomas duo successfully pumped out the first steam engine.

Watt’s next
The next step of the journey was arguably the most revolutionary. Up until this point, steam engines were limited to pumping water from mines and were nowhere near capable of sparking an industrial revolution. James Watt is regarded as the father of the steam engine, having made some extraordinary additions to Newcomen’s design. In 1765, Watt added a separate condenser to the engine, which boosted its efficiency: the cylinder could function at a constant temperature instead of being cooled each cycle. But Watt still wasn’t content and worked tirelessly to improve the engine. Sealing the top of the cylinder and using steam to perform both the induction and power strokes, instead of relying on atmospheric pressure, was a radical change, and Watt’s engine was much more powerful. Factories swarmed into cities; they no longer needed rivers for power generation, as the steam engine, rather than fast-moving water, now drove the mill wheels continuously. This dramatically shifted the workforce and the organisation of people, though it raised many ethical questions. Watt endeavoured further, transforming the reciprocating motion produced by the engine into rotary motion by connecting a crankshaft to a flywheel. Rotary motion was a huge leap forward, leading to the development of steam locomotives. Manufactured goods no longer relied on horse-drawn wagons as locomotives powered across the country.

A Golden Age
Thus, the Industrial Revolution dawned. Production became advanced, efficient and profitable as factories had the freedom to set up near large workforces and automate repetitive tasks. Goods, resources and raw materials were supplied and traded quickly via steam locomotives. This daring development story shows that ingenious inventions often require critical minds working together to constantly improve on great ideas. The steam engine had profound and long-lasting effects on the way we live and can surely be considered one of humanity’s greatest inventions.



SAMUEL SUSSMES

A New Eye For An Old Eye

A 2016 survey of 2000 US adults revealed that the public rank blindness as the worst possible condition to develop, even above conditions such as Alzheimer’s disease, AIDS, cancer, and heart disease. The loss of eyesight was voted to have the greatest effect on quality of life compared to the loss of memory, a limb, speech, or hearing. Here in Oxford and elsewhere, a huge research effort is underway with the aim of transforming the lives of patients battling daily with blindness, the condition which the public seem to fear the most. An estimated 36 million people worldwide are blind, with around six times this number suffering from moderate to severe vision impairment and on their way to becoming fully blind. Blindness has a wide range of possible causes and can result from damage anywhere along the visual pathway. This pathway starts with the eyes, which operate by refracting light through the transparent cornea and lens to be focused onto the retina. The retina contains the light-detecting photoreceptor cells called rods and cones, which convert light signals into electrical impulses sent along the neurons making up the optic nerve. The optic nerve travels to various regions within the brain, which decipher what our eyes are observing.

Blindness can strike anywhere along this pathway: from clouding of the lens (cataracts), to retinal degeneration from general ageing or underlying diabetes, to damage to the optic nerve itself, as in glaucoma, in which the pressure of the fluid inside the eye damages the nerve. It is estimated that over 80% of all blindness is preventable or curable. Despite this, there are still individuals whose blindness cannot be prevented or treated, including those with inherited conditions. These people have been the focus of Oxford researchers developing a range of novel treatments, with innovations spanning genetic engineering techniques, electronic implants into the retina, and stem cell approaches.

Retinal dystrophies are conditions of the eye that can lead to blindness and are caused by the inheritance of a faulty gene. They have the potential to be treated using gene therapies, which involve adding healthy copies of the gene to the eye. This happens via a vector such as a harmless but useful virus which has evolved to infect human cells and integrate its DNA into ours. The viruses are modified so that, firstly, they can’t cause disease in humans like a pathogenic virus could, and secondly, enzymes are used to incorporate the functional working copy of the human gene into the virus’ DNA.




The virus can then infect cells and insert its genetic material, which can be read alongside the cell’s normal DNA to produce functional proteins that were previously missing.

In 2011, Oxford scientists, led by Professor Robert MacLaren of the Nuffield Department of Clinical Neurosciences, began a clinical trial exploring the use of gene therapy to treat patients suffering from choroideremia. This is a disease in which patients lack a functional copy of the CHM gene, which encodes a specific protein called REP1. It is not known how the deficiency of this protein leads to disease, but in affected individuals the loss of the functional protein leads to the death of light-detecting cells in the eye and gradual deterioration of the retina, eventually progressing to complete blindness. The scientists injected harmless viruses containing billions of working copies of this missing gene into patients’ retinas to infect such cells and thus restore functioning proteins, preventing the cells from dying. The trial was promising, with long-lasting results showing improved vision five years after the initial virus injection.

Choroideremia is, however, a very rare X-linked recessive disease, affecting mostly young boys at a frequency of around one in 50,000 males. There are other, more common causes of inherited blindness, involving multiple genes, to which gene therapy has since been applied. Oxford researchers have demonstrated the success of gene therapy in mouse models of retinitis pigmentosa, for which there is currently no cure in humans. This is one of the most common causes of blindness worldwide; it can be caused by a mutation in one of more than 50 genes leading to a loss of the light-detecting photoreceptors in the retina, and is thought to affect around one in 4000 individuals. In 2017, this work progressed to clinical trials: a man with retinitis pigmentosa experienced significantly improved vision after being infected with a virus containing the particular functional gene he was missing.

Gene therapy is not the only method available to tackle retinitis pigmentosa, though. Other trials involve patients receiving tiny electronic microchips to replace the function of their damaged retinas. These implants, three square millimetres in area, carry around 1500 light sensors and are connected to tiny computers that sit underneath the skin behind the ear, much like a cochlear hearing implant. The microchip acts like a camera, capturing light and converting it to electrical signals that are sent directly down the patient’s intact optic nerve to the brain.

“Electronic implants — or bionic eyes — could become a standard treatment not only for patients with retinitis pigmentosa but other disorders involving damaged and non-functional retinas”

Upon initial activation, patients report flashes of light, but their brains gradually learn to convert these flashes into visual images of shapes and objects. Since most of these patients have suffered from blindness for decades or longer, the specific areas of the brain concerned with vision (the visual cortices) take time to reactivate and acclimatise. The images produced by the chip are currently in black and white and can be grainy and low-resolution, but this is still a transformative change to the patients’ lives. Trials began in 2012, and it is thought that these electronic implants, or bionic eyes, could become a standard treatment not only for patients with retinitis pigmentosa but for other disorders involving damaged and non-functional retinas.

Oxford scientists have also suggested another potential treatment option for retinitis pigmentosa patients: stem cells. Studies transplanted human rod precursor stem cells into mice that were completely lacking a layer of rod cells. After several weeks, the developing cells had completely reconstructed this previously missing layer. This gives hope that stem cells could one day be used to replace the light-sensitive retinal cells absent in humans with this inherited disorder.

TOP: The image as viewed with normal vision.
BOTTOM: The image as viewed by a person suffering from retinitis pigmentosa.
Credit: National Eye Institute, National Institutes of Health.

Despite this technique looking promising in mice, it is a long way from being used in human clinical practice. Besides the risk of tumour formation from uncontrolled division of the transplanted cells, other challenges like infection and inflammation currently prevent cell therapies from being a simple treatment option for humans. Much of the work in Oxford has centred on rare, inherited conditions of the eye, but the success of these trials indicates that gene therapies, electronic implants and stem cells are attractive treatment options for the future. They have the potential to extend beyond choroideremia and retinitis pigmentosa to treat more common causes of blindness, like glaucoma and age-related macular degeneration (AMD). Although these novel techniques have not yet evolved into standard clinical treatments, the research in Oxford over the last decade or so makes clear that the goal of curing incurable blindness is well within sight.
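As a footnote to the implant described above, some rough arithmetic (ours, assuming the 1500 sensors sit in a square grid on the 3 mm² chip) shows why the images are grainy:

```python
# Back-of-envelope resolution of the retinal implant described above,
# assuming (our assumption) a square grid of sensors on a square chip.
import math

sensors = 1500          # light sensors on the chip
area_mm2 = 3.0          # chip area in square millimetres

side_mm = math.sqrt(area_mm2)                    # ~1.73 mm per side
pitch_um = side_mm / math.sqrt(sensors) * 1000   # spacing between sensors

print(f"~{math.isqrt(sensors)} x {math.isqrt(sensors)} pixel image, "
      f"sensors ~{pitch_um:.0f} µm apart")       # ~38 x 38 pixels, ~45 µm
```

A grid of roughly 38 × 38 pixels amounts to under 1,500 points of light, against the millions of photoreceptors in a healthy retina; coarse shapes are recoverable, fine detail is not.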


IAN FOO

Keeping Your Cool
The journey to today’s refrigeration technology.

The next time you enjoy a cold treat to beat the heat, spare a moment to celebrate the transformative power of refrigeration. This technology saves us daily supermarket trips for meal ingredients, makes crops and meats available year-round, and lets us enjoy chilled drinks and desserts. Such convenience wasn’t always the case. Ancient Greco-Roman methods of cooling involved loading ice and snow into naturally-insulated underground pits until needed. In drier climates, the Egyptians and Indians left boiled jars of water out overnight to cool via evaporation of moisture from the jars’ surfaces. By the 1600s there were new methods: dissolving saltpetre in water was found to absorb heat (an endothermic reaction), giving a cooling effect.

The standard method of refrigeration, the vapour-compression refrigeration cycle, was designed in 1805. A liquid coolant enters an expansion valve and is abruptly cooled and vaporised by the reduction in pressure (the Joule-Kelvin effect, the same one that chills a discharging aerosol can). The cold vapour absorbs heat from the fridge contents, and a pump then compresses it to high pressure and temperature. The hot vapour passes through a radiator, expelling heat into the surroundings and condensing back into a liquid, and the cycle repeats.

Due to expense, ice remained the main cooling method through the early 1800s. Ice provision became a booming industry: large ice blocks sawn from lakes in winter were stored in insulated “ice houses” that kept well into September. Natural ice gradually became polluted by the Industrial Revolution, prompting a switch by food suppliers to refrigeration technology.

The refrigerated railroad car revolutionised the transport of produce in 1867, and the wider distribution of fresh foods led to healthier diets. Over time, a broad variety of industries, from metalworkers to morgues, adopted refrigeration for the fine control it gave over their various operations. General Electric released the first household refrigeration units powered by electricity in 1927. However, it wasn’t until the invention of cheaper refrigerants, chlorofluorocarbons (CFCs), in 1928 that fridges became widespread in households. CFCs have a big flaw, however: in the 1980s they were found to deplete the ozone layer, and their replacements, hydrofluorocarbons, are potent greenhouse gases. It remains to be seen what will be done to reduce their impact. Refrigeration has revolutionised food consumption. Casual, everyday use lets us take it for granted, but make no mistake: without refrigeration, our way of life would be much more tepid.
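A footnote on how good such a cycle can ever be: thermodynamics caps a refrigerator's efficiency using only the two temperatures it works between. The sketch below computes that Carnot limit; the fridge and kitchen temperatures are merely illustrative.

```python
# Upper bound on refrigerator efficiency between two temperatures (kelvin):
# the Carnot coefficient of performance, COP = T_cold / (T_hot - T_cold).
def carnot_cop(t_cold_k: float, t_hot_k: float) -> float:
    """Maximum heat moved per unit of work input."""
    return t_cold_k / (t_hot_k - t_cold_k)

# A fridge interior at 2 °C (275 K) in a 22 °C (295 K) kitchen:
print(carnot_cop(275.0, 295.0))  # 13.75; real vapour-compression
                                 # cycles manage roughly 2-4
```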


RYAN ELLISON

Just My Type
Breakthroughs in blood type identification led to the transformative treatment of blood transfusions.

Have you ever given or received blood? The NHS needs 1.6 million pints of blood each year to treat patients in England alone, and 3 blood transfusions are given around the world every second. Blood transfusions are now (at least for the medical staff giving them) a normal affair, and the science behind the components of blood is well understood. However, this was not always the case; before 1901, receiving a blood transfusion could often be lethal, doing more harm than good. One discovery in that year, however, made the procedure much safer: the identification of blood types.

Troubled Beginnings
Throughout ancient history, mankind has been fascinated by blood, believing that it held the force of life within it. The first to perform a transfusion from animal to man was a Frenchman by the name of Jean-Baptiste Denys, who transfused blood from lambs into mentally-ill people, with the idea that infusing them with the blood of a docile animal would calm them. However, it wasn’t long before things weren’t looking good for transfusion – the number of negative reactions and deaths stemming from the procedure (not to mention political pressure from doctors who believed that bloodletting was a superior treatment) led to the Royal Society banning transfusions, and the French government outlawing them in 1668. The Vatican then spoke out condemning the procedure in 1670.

Blundell’s Breakthrough
The trail then went cold for 150 years until James Blundell breathed new life back into the field in 1818. He was the first to perform a human-to-human transfusion and strongly advocated that only human blood should be given to humans – a fact revealed to him through his experiments, in which treating dogs with human blood invariably proved fatal. Blundell was also the first to use blood transfusion to treat loss of blood, rather than mental illness.

Compatibility and clotting
Karl Landsteiner discovered human blood types in 1900, identifying types A, B, and O (AB was later discovered by his assistants). In addition, he correctly theorised about the presence of antibodies in the blood which would burst cells of differently-typed blood and release haemoglobin into the bloodstream, where it is toxic. This discovery explained the symptoms of a negative transfusion reaction, while the variation of blood types within Europe explained why it had essentially been a lottery as to whether you had an adverse reaction to a transfusion. However, cross-matching of donor and recipient blood types would not become standard until the 1920s. Another breakthrough was Richard Lewisohn’s 1915 realisation that diluting sodium citrate (already used as an anticoagulant in labs) to 0.2% made it non-toxic. With the use of citrate, a donor no longer had to be present at the transfusion, as blood could be kept for longer. This opened the doors to the modern age of blood transfusion and brought services like blood banks one step closer to being feasible.

Dawn of the Modern Era
Following the development of anticoagulants, the stage was set for the development of what we now consider blood donation. The first volunteer blood donor scheme was conceived by Percy Lane Oliver in 1921. Then in 1936, enabled by the practice of electrical refrigeration, the first blood bank was opened in Barcelona. The positive impact of blood donation and transfusion cannot be overstated, enabling a range of vital surgical and medical treatments. To find out more about how you can donate blood, visit www.blood.co.uk/
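The antibody logic Landsteiner uncovered is simple enough to state in a few lines of code. The sketch below is our illustration of ABO red-cell compatibility only; real cross-matching also considers RhD and dozens of other blood group systems.

```python
# ABO red-cell compatibility, as implied by Landsteiner's antibody logic.
# Illustrative only: real cross-matching also covers RhD and other systems.
ANTIGENS = {"A": {"A"}, "B": {"B"}, "AB": {"A", "B"}, "O": set()}

def is_compatible(donor: str, recipient: str) -> bool:
    """Safe if the donor's red cells carry no antigen foreign to the recipient."""
    return ANTIGENS[donor] <= ANTIGENS[recipient]

print(is_compatible("O", "AB"))   # True: group O is the universal donor
print(is_compatible("AB", "O"))   # False: the recipient's anti-A and anti-B
                                  # antibodies would burst the donated cells
```

Group O red cells carry neither antigen, which is why O is the universal red-cell donor, while group AB recipients, carrying both, can receive from anyone.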

1667

Dr Jean-Baptiste Denys performs the first blood transfusion from an animal to a human, transfusing blood from a sheep into a 15-year-old boy. Denys went on to perform several more transfusions, this time using more blood, and the patients subsequently died from immune reactions.

1667

Richard Lower performs the first transfusion from an animal to a human in Britain, using sheep’s blood.

1668

The French government and Royal Society ban blood transfusions. Research on the subject falls into obscurity for the next 150 years.

1670

The Vatican condemns blood transfusion experiments.

1818

James Blundell performs the first human-to-human transfusion.

1865

Louis Pasteur discovers that bacterial contamination causes putrefaction.

1867

Joseph Lister discovers antiseptics. As a result, transfusion instruments are now sterilised as standard protocol.

1900

Karl Landsteiner discovers the ABO blood types.

1915

Richard Lewisohn publishes results showing that 0.2% sodium citrate is a non-toxic anticoagulant.



SCHOOLS COMPETITION In January 2018 we launched our first UK-wide school writing competition. The theme for our most recent competition was ‘Inspirational Young Scientists’. We received 171 entries, all of an extremely high standard. The winning article, as selected by our panel of judges, was Vivien T. Thomas: From Poverty to Pioneer by Emma Baker, Queen Elizabeth Sixth Form College, Durham. Our winners and runners-up all receive prizes kindly provided by Eppendorf. The articles can be read on our website: oxsci.org WINNERS YEAR 13 WINNER (AND OVERALL WINNER) Vivien T. Thomas: From Poverty to Pioneer, by Emma Baker, Queen Elizabeth Sixth Form College, Durham. YEAR 12 WINNER Teachers, We Salute You, by Jemima Longcake, Kirkbie Kendal School, Cumbria. YEAR 11 WINNER A Neat Solution to a Messy Problem, by Zain Ali, King Edward VI Aston School, West Midlands. RUNNERS-UP YEAR 13 RUNNER-UP Treasure Islets: a Revolution in Diabetes Therapy, by Max Peacock, Colyton Grammar School, Devon. YEAR 12 RUNNER-UP Mary Anning: a Teenage Scientist who Defied Gender and Cultural Norms, by Phoebe Hall, Ripon Grammar School, Yorkshire. YEAR 11 RUNNER-UP Daniel Burd: 17 Year Old Eco-Expert, by Anna Grube, Alleyn’s School, London.

JUDGES Dr PRIYANKA DHOPADE is a Senior Research Associate at the Oxford Thermofluids Institute (Department of Engineering Science) at Oxford University. Her research expertise is in the field of jet engine thermodynamics and fluid mechanics. She currently develops computational models for cooling systems in modern jet engines, in collaboration with industry and academia. In 2017, she was chosen as one of the UK’s Top 50 Women in Engineering under 35. She also leads various initiatives to promote diversity in STEM, including the Women in Engineering network at Oxford University. Dr ADAM HARGREAVES is an Evolutionary Biologist in the Oxford Department of Zoology, who loves weird animals. He has worked on sharks, gerbils, lizards and snakes, and tries to understand how changes in an animal’s DNA can lead to the evolution of new “traits” (things like wings, longer necks, venom systems). His current research tries to piece together how venomous snakes protect themselves from their own venom, and to use that knowledge to develop better treatments for snakebite. Dr CHICO CAMARGO is a Postdoctoral Researcher in Data Science at the Oxford Internet Institute, where he uses tools from theoretical physics and data science to develop new approaches to questions in the social and biological sciences. He is also part of the Portuguese-speaking YouTube educational channel BláBláLogia, and has spoken at FameLab and at Oxford’s Science Cabaret. JACQUELINE GILL is a DPhil student in Evolutionary Microbiology in the Department of Zoology, University of Oxford. She was a co-founder of The Oxford Scientist magazine, established the first national Oxford Scientist school science writing competition, and has managed all aspects of the competition since its launch.

Interested in being the next winner of our Schools Writing Competition? Our theme for this term is ‘Impact’. Write about a way in which science impacts your everyday life. This could be anything from the smartphone you use every day, to the washing powder you use in your washing machine, to the insulin injections you might require as a type-1 diabetic. Upload your article to oxsci.org/schools by January 18th 2019 for your chance to win a prize and see your article online and in print. If you have any questions please email competition@oxsci.org. If your school, sixth form or college would like to subscribe to The Oxford Scientist for just £15 a year, please contact editor@oxsci.org.


EMMA BAKER, QUEEN ELIZABETH SIXTH FORM COLLEGE, DURHAM

SCHOOLS COMPETITION WINNER

Vivien T. Thomas: From Poverty to Pioneer Vivien T. Thomas never had much more than a high school diploma, yet he remains one of the most significant pioneers of cardiac surgery to date, having helped save the lives of countless children with congenital heart defects through surgical techniques he invented. His is not a household name; yet his portrait hangs on the walls of one of the most prestigious medical schools in America. The reason? Vivien T. Thomas was an African American, and the grandson of a slave, working in an era when institutional racism was the norm. In the wake of the stock market crash, in 1930 a case of serendipity meant that a young Vivien T. Thomas – forced to abandon his plans to attend college and become a doctor – began working as a laboratory assistant to Dr. Alfred Blalock at Vanderbilt University. He quickly mastered various surgical techniques and research methodology; by the mid-1930s Thomas was conducting the work of a postdoctoral researcher in Blalock’s lab, despite being paid a janitor’s wage.

The cardiac and vascular research Blalock conducted with Thomas (which would lay the foundations for a revolutionary technique they would perform a decade later) placed Blalock at the forefront of American surgery – with Thomas never acknowledged for his part. This status led to a job offer for Blalock at Johns Hopkins University in 1941, and Blalock requested that Thomas, who had proved himself invaluable, accompany him. They soon began working with the renowned paediatric cardiologist Dr. Helen Taussig on a surgical solution to a complex congenital heart defect, Tetralogy of Fallot – more commonly known as Blue Baby Syndrome – which occurs in 1 in 2000 newborns. Vivien T. Thomas was tasked with creating a blue-baby-like condition in a dog, and correcting it by means of a subclavian-to-pulmonary anastomosis (increasing blood flow to the lungs). Two years and 200 dogs later, he demonstrated that the corrective procedure was not fatal, and in turn offered a lifeline to children suffering from Blue Baby Syndrome. During the first procedure on a human patient in 1944, Thomas stood on a step stool behind Blalock (as, at the time, he was not allowed to conduct it himself), instructing his colleague step by step, until the defect was safely corrected for the first time. When the Journal of the American Medical Association published the procedure in 1945, Blalock and Taussig received sole credit for the somewhat improperly named “Blalock-Taussig Shunt”. Thomas was never even mentioned. Due to his poor salary, Thomas resorted to bartending in order to support his family. On one occasion, while

working at one of Blalock’s parties, he found himself serving students he had taught just hours earlier. Despite working two jobs (one in ground-breaking surgical research, the other bartending), Thomas devoted much of his time to mentoring a number of African-American lab technicians. In 1976, Johns Hopkins University presented Vivien T. Thomas with an honorary doctorate in an overdue acknowledgement of his contributions to medicine. However, due to the restrictions of the time, he could not receive a medical doctorate, so instead received an ‘Honorary Doctor of Laws’. For a man who couldn’t even walk down university corridors in a lab coat without turning heads, adversity never prevented him from advancing our medical understanding. Vivien T. Thomas rose above the barriers of poverty and racism and quietly changed the course of medical history, not for the credit he undoubtedly deserved, but because the work he conducted saved lives. “I’ve fixed a heart,” he would simply state, and that would be the end of it. As an aspiring young scientist myself, I hope that his resilience in the face of hardship and his undying devotion to his field (despite its restrictions on him) are things I can mirror, as these attributes define any great scientist. It is truly inspiring that, because of people like him, doors into science have opened to all over time; and that, no matter how unlikely it may seem, anyone can contribute to our understanding of the world. If Vivien T. Thomas, an impoverished college dropout, can work his way to the top of his field, why can’t we?


Vivien T. Thomas, in the lab.




Print On Demand Bio-printed human tissue structures could be used for toxicology tests and, as a result, reduce animal testing and improve the reliability of pre-clinical tests. Bioprinting is based on additive manufacturing (i.e. 3D printing) and allows for the construction of living tissue by layer-by-layer deposition. In other words, it makes the transfer and patterning of biologically relevant materials possible. This evolving tissue engineering technology can construct 3D biostructures rapidly and with high precision. It has the potential to revolutionise medicine, offering the possibility of fabricating artificial tissues and transforming the diagnosis and treatment of multiple conditions. The origins of bioprinting can be traced back to 2002, when Prof Makoto Nakamura began to investigate cell printing, successfully demonstrating a year later that cells survived printing. Five years later, he managed to produce a bio-tube resembling a blood vessel. Ever since, efforts to print different tissues have been made across the globe, with promising results. These range from bio-printed scaffolds used to replace hip bones in rabbits (Tissue Engineering & Regenerative Medicine Lab at Columbia University), to skin printers that allow faster wound healing in mice (Wake Forest School of Medicine), to functional blood vessels and cardiac tissue made using chicken cells (Organovo Inc.). Bioprinters are now available commercially, using a range of technologies (e.g. Life-Printer ‘X’ by Bio 3D Technologies, BioFactory by RegenHU, and 3D-Bioplotter by EnvisionTEC). The bioprinting industry is undergoing rapid transformation and growth: in 2012, the market was estimated at US$2.2 billion, and it is expected to reach US$10.8 billion by 2021. The Basics Before the cells are deposited, a layer of water-based bio-paper is printed. This bio-paper is a scaffold, usually made up of collagen, gelatine or other hydrogels (organic or synthetic), that is

CARLA V. FUENTESLÓPEZ

used to provide support and protection during printing. It is worth noting that some cells are able to position themselves correctly without requiring additional support. Bio-ink spheroids, which contain aggregates of thousands of cells, are then injected into this layer. These cells must be taken from the patient and cultivated until a sufficient number is available. They are then used to create the bio-ink and deposited in several layers to build the final object. The final step, perhaps the most interesting and yet still not fully understood, is the natural fusing together of the bio-ink spheroids. As this happens, the bio-paper either dissolves away or is removed, leaving only the printed tissue. Moreover, it has been demonstrated that cells are capable of rearranging themselves, making it unnecessary to print all of the tissue’s details. This was observed when attempting to manufacture blood vessels: aggregates of endothelial, smooth muscle and fibroblast cells were deposited on the bio-paper and, eventually, the cells rearranged themselves to mimic natural vessels, with endothelial cells at the inner layer of the tube, followed by smooth muscle cells and, finally, fibroblasts at the outermost layer. Organovo reported this occurrence, using cells from a single person, in 2010. While the process outlined above refers to the first technique used, 3D cellular constructs can also be manufactured by thermal transfer, modified inkjet technology, extrusion, laser, microvalves and tissue fragment printing. Advantages One of the main advantages of bioprinting is that cells are placed exactly where they are required, which allows for customisable products. This is particularly useful for ensuring a perfect fit, for example when dealing with wounds, whose shapes are irregular and vary enormously from case to case. As for the naturally-occurring cell rearrangement, cells self-assemble to form, ultimately, complex functional

configurations. This significantly simplifies design and manufacturing, and allows researchers to create biomimetic systems and to control geometry and organisation. In the future, mimicking the physiological and pathological behaviour of native tissues could be useful for testing different treatments – for cancer, for instance – and selecting the best option for a particular patient. Future Perspectives and Challenges Bioprinting is a relatively new technology still undergoing experimental work. However, applications such as drug discovery and testing, as well as bio-preservation, appear to be obtainable soon. While the long-term goal is to print functional multicellular tissues and whole organs on demand, there are quite a few shorter-term objectives currently being explored. For instance, in situ printing would allow clinicians to scan wounds and place cell layers accordingly. Bio-printed human tissue structures could be used for toxicology tests and, as a result, reduce animal testing and improve the reliability of pre-clinical tests. Also, 3D bioprinted tissues and scaffolds could be enhanced with nanosensors that detect, among other things, glucose levels, proteins, and toxins in the bloodstream, making health monitoring easier and more reliable. Bioprinting fully functional organs is the ultimate goal, but the complexity of the functions performed by these organs is key. Less complex organs, like bladders, have already been replicated (first at Wake Forest University), but more sophisticated organs, like the heart, are still at least 10 years away. Notably, efforts to 3D-print hearts have resulted in micro-physiological system devices (i.e. heart-on-a-chip with integrated sensors) that mimic the behaviour of human tissue. It is worth mentioning that the aim for 3D-printed organs is to replicate function, meaning that, visually, they could vary widely from their natural state. With all the progress in this field, there can be little doubt that the future of medicine will be revolutionised by biological printing.


MARIE-CLAIRE KOSCHOWITZ

Dinosaur Mania: the Victorian Edition The Victorian fascination with dinosaurs paved the way for popular science and still resonates with us today. “Look at him now, how eagerly he pounces upon every living thing that comes within the range of his pliant neck, how cruelly he crushes the bones of his victims, and how greedily he swallows them! We never witnessed such unhandsome conduct in a monster before. Leaving him at his disgusting banquet…” What reads like an over-the-top description of Drogon, the principal dragon on HBO’s Game of Thrones, could hardly be further removed from that TV spectacle of fantasy and full-frontal nudity. The quote in fact comes from “The Age of Monsters”, a famous children’s book of 1859 by John Cargill Brough, yet it invokes an eerily contemporary approach to entertainment. Brough wrote it as an educational book, riding the first wave of public enthusiasm for the sciences in Victorian England. With the decline of the Roman Empire, the classical academic canon faded into obscurity. By the 4th century, academic study was a privilege of the clerical and noble classes, and heavily

constrained by Christian doctrine. That is, until the industrial revolution hit and the mass manufacturing of goods was suddenly in demand. Industrialists wanted steam engines to make more goods in less time, torches that would never burn out to keep factories lit at night, radiators instead of fireplaces, and so on. In short, people had problems that they wanted solved, and scientists sorted a lot of that out. But a growing middle class wanted to be entertained too. This brings us back to Brough and his approach to education, fantastical as it seems to today’s audience. The creature he described was a reimagining of a plesiosaur. Plesiosaurs were marine reptiles, distant relatives of dinosaurs and birds, which looked like your classic Loch Ness Monster. In 1842 the term “Dinosauria” was coined by Richard Owen, palaeontologist and founder of the Natural History Museum, London. It translates to “terrible, powerful lizard”, imagery that was picked up on by writers like Brough. Other popular books included Gideon Mantell’s Wonders of Geology (1854) and The Fossil Spirit: A Boy’s Dream of Geology by John Mill. These authors simultaneously dismissed mythical creatures as children’s tales while being just as colourful in their own descriptions of dinosaurs – creatures wilder than human

Model dinosaurs in Crystal Palace, London. Credit: CGP Grey, Flickr


imagination. Using the vernacular of legends and folklore as a springboard to familiarise the public with scientific discoveries is typical of the 19th century, as pointed out in “Science in Wonderland” by Dr. Melanie Keene, a detailed treatise on science and technology in Victorian England. Around the same time, the first dinosaurs to be scientifically described featured heavily in the iconic Crystal Palace exhibition of 1854. By then, the consumption of popular science writing, illustration and sculpture as a means of entertainment had become the favourite pastime of the newly emerging middle class. It was this environment that provided the ideal stage for Charles Darwin’s theory of natural selection when he published his seminal work On the Origin of Species in 1859. Even three generations earlier he might have been chased out of town, but now his ideas found fertile ground in the court of public opinion – in no small part thanks to the weird animals of yore, whose existence had been unequivocally established in the preceding decades. It was still an uphill battle against the idea of divine creation, as exemplified by the historic debate between Thomas H. Huxley and Bishop Samuel Wilberforce in 1860 at Oxford’s Museum of Natural History. The exact words were never transcribed, but Huxley’s ingenious dig at Wilberforce reportedly went along the lines that he would not be ashamed to be related to a monkey, but would be ashamed of a man who misused his great intellect to obscure the truth. The paradigm shift from creationism to natural selection was rapid; within a generation, the fundamentals of how we as humans perceive our place in this world changed substantially, leading to advances in biology, chemistry and the medical sciences. This is a story as profound as it is amusing – one to remember next time we come across a late-night rerun of Jurassic Park, knowing that we enjoy the same thrill as the Victorians did, almost two centuries ago.


WELCOME TO THE SMART CITY

In today’s world of large-scale technological innovation and invention, the concept of smart cities is on the rise – but what exactly are smart cities? In essence, they are urban areas where an extensive exchange of data takes place, and where that data is effectively utilised. You may not have heard much about them yet, but current technologies including the Internet of Things, Edge Computing and Artificial Intelligence are working hand in hand to improve the quality of life of citizens. There are, however, several challenges involved in real-time data collection, transmission and analysis that cannot simply be solved by turning it off and on again.




CHARIG YANG

Data Collection: the Internet of Things The Internet of Things, also known as IoT, refers to devices that can collect and exchange data. This may not sound like anything new, but the concept is about allowing data collectors, from temperature sensors to weighing scales, to pass their data on to somewhere useful. One direct application of this is in the healthcare industry, where IoT can be embedded into patients’ home medical devices. This way, a comprehensive patient profile can be generated, consisting of vital signs such as the patient’s heart rate, glucose levels, physical activity and so on. These recordings are then automatically synchronised with the healthcare provider’s data, allowing healthcare providers to detect patients with abnormal results more quickly and reach out to them before conditions worsen. However, the backend technology is far more complicated than IoT alone. How is the data transmitted and analysed? Data Transmission: 5G Way before Tinder existed, our predecessors were finding dates via telephone services (also known as telepersonals), where callers paid to listen to pre-recorded messages until they found someone they were interested in. Not quite as easy as swiping right back then! As consumers, we continue to witness the transformation of the technology involved in telecommunications. When mobile phones were first invented, the technology was only able to transmit voice; text messages did not exist until the 1990s. Fast forward to the present: the amount of data we send and receive over our mobile devices has increased exponentially. Remarkably, since 2014, the global number of mobile devices has exceeded the worldwide population. Thanks to current 4G technology, we are able to support this incredible number of concurrent users and still transmit data quickly. We have indeed come very far, but the rising demand for future applications with faster, novel features means

that the tech must keep up. In the near future, the prevalent use of IoT will mean that mobile gadgets are not the only devices requiring communication; rather, the whole city infrastructure will have to transmit large volumes of information concurrently. 5G, or fifth-generation mobile networks, may be the solution. 5G utilises a much higher frequency band and upgraded antennas. The increased bandwidth promised by 5G technology is expected to handle 10 to 100 times the number of connected devices, and at a much higher speed, which in turn will support the complex mesh of interconnected devices within the city. 5G also shines when it comes to real-time communication of data. In some applications this is not required: say we have a sensor in a waste bin that alerts the city council whenever it is full; it won’t need to be updated every second. However, if we are talking about a self-driving car deciding whether or not to turn, a half-second processing delay is already considered too dangerous. 5G networks therefore lean on a technology called Edge Computing that dramatically reduces the latency involved. The idea is simple: move the data analysis tasks from the central cloud to somewhere nearer to the device (called the edge), so that data travels a much shorter distance. The latency is thus much smaller, allowing for the safer use of autonomous cars and drones, among other applications.
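To see why distance matters so much, consider a back-of-envelope propagation estimate, sketched below in Python. The distances are illustrative assumptions, and real networks add routing, queueing and processing delays on top, but the gap is already striking:

    # Round-trip time for a signal travelling to a server and back through fibre.
    # Distances are illustrative, not measurements of any real deployment.
    SPEED_IN_FIBRE = 2.0e8  # metres per second, roughly two-thirds the speed of light

    def round_trip_ms(distance_m):
        return 2 * distance_m / SPEED_IN_FIBRE * 1000

    print(round_trip_ms(1_000_000))  # distant cloud data centre, ~1000 km: 10 ms
    print(round_trip_ms(5_000))      # edge node within the city, ~5 km: 0.05 ms

Shortening the physical path removes an irreducible chunk of the latency budget, which is exactly the promise of edge computing.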

Data Analysis: Artificial Intelligence Now that we have a huge stream of data available, the challenge is making sense of it. If the volume of data is small, it can be analysed manually; however, the scale at which we are analysing data here is dramatically larger. For example, Barcelona is about 1/15 the size of London, yet it is estimated to have over 320 million sensors that cumulatively generate about 8 gigabytes of raw data per day. In addition to the challenge of storing all this data, the data needs to be analysed quickly, for example when forecasting natural disasters to allow the rapid evacuation of citizens. This is where artificial intelligence (AI) comes into play. AI can be thought of as the brain of many future technologies: autonomous vehicles, high-frequency trading, virtual assistants and many more. These machines mimic the computational abilities of the brain, and though they are capable of only a limited variety of tasks compared to humans, they perform those tasks much more quickly. The amount of data that needs to be analysed is so large that we need AI to make sense of it. Final Thoughts: Data Security The idea of smart cities sounds great, promising to make cities safer and more convenient. However, it does come with several challenges, and a major one is data security. Data leaks are not uncommon these days; every month there seems to be a news report of a big company losing or mishandling consumers’ data, from the Cambridge Analytica scandal to the NHS hack. There are several possible reasons for a leak, from the unprofessional handling of private data by corporations to the coordinated hacking of government data. In the future, when there is even more data in circulation, keeping it safe will become an even greater challenge. The law will also need to toughen up on companies that do not follow the strict data privacy guidelines necessary to prevent data breaches. Over the past few decades, we have seen an incredible rise in the kinds of connections humans are able to form: from living in small tribes and villages, to sprawling cities across every habitable continent. We lie on the cusp of a spectacular future of interconnectivity, where every aspect of our personal and professional lives will be augmented and supplemented by the technology at hand. It sounds almost fantastical, something out of a sci-fi book, but the technologies needed for this kind of mass integration and connectivity are almost at hand; we must only see how such a future is brought to reality.


CARLA V. FUENTESLÓPEZ

Hitting a Nerve Development of ‘artificial nerves’ has the potential to revolutionise the field of biorobotics and change the lives of those who rely on prosthetics.

Amputation may be needed following a variety of insults, including chronic infection (often provoked by diabetes or ulcers), peripheral artery disease, severe injury, cancerous tumours in muscle or bone, neuromas, and frostbite. Additionally, with an ageing population and longer life expectancy, more and more people face a decline in physical function. Sometimes this results in the need for prosthetics or orthotics (devices to support immobilisation, such as splints and braces) to maintain or improve quality of life. An estimated 30 million people worldwide require artificial limbs and braces, but only a fifth actually have them. Part of the solution may lie in additive manufacturing (i.e. three-dimensional printing), which can produce prosthetics rapidly and cost-efficiently. Despite significant improvements, prosthetics still struggle to recapitulate the sensing, signalling and feedback interactions our body is capable of. By developing skin-like sensory neural networks, it could be possible to restore sensation to prosthetics users, and even to provide robots with capabilities similar to human reflexes and senses. This ability would allow robots an enhanced interaction with both humans and the environment, a key aspect of performing complex tasks. It is particularly relevant to surgical robots, which would benefit greatly from haptic feedback, as surgeons would be able

An early design for a prosthetic mechanical arm, designed by Ambroise Paré, 16th century. Credit: Wellcome Images



to improve their precision and their use of instruments based on the softness, hardness, and texture of the tissue. A group from Stanford University recently created an artificial mechanoreceptor which detects pressure, emulating the function of a real nerve. The sensor detects minuscule forces and sends signals through a flexible electronic neuron (called a ring oscillator). The signals stimulate a transistor, which stores information in order to make simple decisions. This model replicates what occurs in the body. For instance, a tap on the knee makes the surrounding muscles stretch; the muscle sensors send an impulse through a neuron, which transmits a series of signals to the relevant synapses. These recognise the pattern and emit two simultaneous signals: one that makes the knee muscles contract reflexively, and another that registers the sensation in the brain. This artificial nerve can even detect simple shapes and movement, translating into a massive improvement for prosthetics. Now that the basis of artificial nerves has been established, what’s next for prosthetics? There is demand for devices that can detect heat and transmit a variety of sensations, be embedded into flexible circuits, and even act as an interface to the brain. Artificial nerves have the potential to revolutionise biorobotics and prosthetics and, more importantly, to improve the quality of life of users.
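The two-output logic of the reflex arc described above is easy to caricature in code. The sketch below is a toy model only – the threshold and the encoding are invented for illustration, not taken from the Stanford device:

    # Toy reflex arc: one pressure input produces two simultaneous outputs,
    # a fast reflex decision and a sensation registered 'in the brain'.
    REFLEX_THRESHOLD = 0.5  # arbitrary units, an assumed value

    def artificial_nerve(pressure):
        reflex = pressure > REFLEX_THRESHOLD            # contract the muscle?
        sensation = f"pressure of {pressure:.2f} felt"  # always registered
        return reflex, sensation

    print(artificial_nerve(0.2))  # light touch: no reflex, but still felt
    print(artificial_nerve(0.9))  # sharp tap: reflex fires and is felt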


MARIA VIOLARIS

The Day the Atom Stood Still Laser cooling technology opened the door to a new quantum world. Cooling and trapping atoms founded an ultracold adventure playground for physicists and beyond: from studying superfluidity and superconductivity to probing quantum entanglement and computing, the techniques have even seeped into biology – potentially saving lives. Between 1988 and 1995, Steven Chu of Stanford University, Claude Cohen-Tannoudji of École Normale Supérieure and William D. Phillips of the National Institute of Standards and Technology managed to laser-cool atoms to only several microkelvin above absolute zero. Their technique of trapping atoms between six lasers was coined ‘optical molasses’, effectively immersing each atom in a treacle made of light. This slowed atoms to roughly 1 km/hr, compared to around 4000 km/hr in air. As speed is a measure of kinetic energy, and lower energy corresponds to lower temperature, slowing is equivalent to cooling. Therefore, trapped by optical molasses, individual atoms could be studied not only with unprecedented accuracy, but at record low temperatures: work worthy of the 1997 Nobel Prize.
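The equivalence between speed and temperature can be made concrete with the thermal-physics relation ½m⟨v²⟩ = (3/2)k_BT. The sketch below assumes sodium atoms (one of the species used in the early experiments) and treats the quoted speeds as typical thermal speeds, so the outputs are order-of-magnitude only:

    # Order-of-magnitude conversion from an atom's speed to a temperature,
    # via (1/2) m v^2 = (3/2) k_B T. Sodium is an assumed example atom.
    k_B = 1.380649e-23   # Boltzmann constant, J/K
    m_sodium = 3.82e-26  # mass of a sodium-23 atom, kg

    def temperature_kelvin(speed_m_per_s):
        return m_sodium * speed_m_per_s**2 / (3 * k_B)

    print(temperature_kelvin(4000 / 3.6))  # ~4000 km/hr: around a thousand kelvin
    print(temperature_kelvin(1 / 3.6))     # ~1 km/hr: tens of microkelvin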

Yet in any dystopian action movie, surely a laser is more likely to incinerate the enemy than freeze them? To transform this burning weapon into a cooling device, we must consider lasers as collections of photons, particles of light. When an isolated atom absorbs a photon, it slows down to conserve momentum, then soon emits a photon in a random direction and regains speed from the recoil. But as the incident photons always arrive from consistent directions, the cumulative effect of many absorptions slows the atom down. Soon it is stuck in the optical molasses, a nanoscale spider’s web that traps its targets. Laser cooling made the quantum world visible, unveiling a new state of matter. The Bose-Einstein condensate (BEC) was predicted in 1925 by Satyendra Nath Bose and Albert Einstein: cooled below a critical temperature, a gas of certain atoms should become a fluid with zero viscosity – a superfluid, a property later observed in Helium-4. It was 70 years later that, thanks to laser cooling, a BEC was finally created in a lab, leading to another Nobel Prize in 2001. Ticks, Twists and Tweezers Picture the depths of CERN’s electronics, the satellite guiding your Sat-Nav, or a gravitational interferometer discovering minerals underground. Inside sits an atomic clock, just one technology revolutionised by the precision of laser cooling. The caesium atoms within are bombarded with microwaves, which they subsequently re-emit, and the frequency of these emitted waves is measured. By comparing this to a known value, the precise duration of a second is found. For an accurate result the caesium atoms must be kept as still as possible, a task for which optical molasses is perfectly suited; it has improved atomic clock accuracy a hundred-fold. An exciting consequence is that precise atomic clocks could help probe time itself: gravitational interferometers could test the predictions of Einstein’s General Relativity, the theory of time and space currently irreconcilable with quantum mechanics.


More unexpected is the side-effect of laser cooling in biology. In the late 1980s, a tobacco mosaic virus was successfully held in ‘optical tweezers’. Where a laser beam narrows to form a ‘waist’, there is a strong electric field gradient, and particles in the vicinity become polarised; attracted to the centre of the waist, where the field is strongest, they become trapped in the optical tweezers. These tweezers can fashion synthetic tissues, untwist DNA molecules, and produce stunning images of molecular motors to reveal the mechanisms inside cells. More recently, in the field of medicine, laser cooling has created an ultracold operating table for new treatments. The latest gene therapies, such as CRISPR, require careful manipulation of DNA, and optical tweezers could improve current techniques by allowing individual cells to be modified. Furthermore, research using optical tweezers in 2014 revealed how malaria parasites infect red blood cells, by studying how strongly the parasites adhere to the cells. With the disease becoming increasingly resistant to our drugs, laser cooling could soon be saving lives. The impact of trapping atoms has spanned geophysics to genetic engineering, but what’s next for the field? With trapped-ion quantum computers currently holding the highest accuracy, laser cooling may one day help crack currently unbreakable encryption. Thus, controlling the atomic world has triggered developments across a remarkable range of areas, from the body to the planet and beyond.


JACK FELTHAM

Don’t Fear the Reaper New approaches to genetic modification of crops.

Genetically modified organisms (GMOs) have the potential to revolutionise agriculture, making crops more nutritious, reliable and resistant to disease. However, many members of the public express concerns about the impact of GMOs on natural habitats and human health. To alleviate some of these concerns, scientists have been investigating new ways to influence the genomes of crops more subtly. Human beings have been manipulating the genomes of plants and animals since the dawn of civilisation. For example, selective breeding has transformed wild plants into foods ranging from cabbage, sprouts, broccoli and cauliflower (all bred from a single species of wild mustard) to beetroot, turnips and swedes, rapeseed oil and, of course, mustard itself. However, in recent years the development of genetic engineering has dramatically accelerated our ability to shape biology to our own ends, opening up a world of new possibilities for agricultural development. Most GM plants fit into a category known as transgenics, meaning they have DNA from an unrelated life-form inserted randomly into their genomes, alongside a marker gene that makes them resistant to certain herbicides (chemical substances that kill weeds). This marker allows scientists to select the modified plants more easily. Transgenic plants have been developed with some truly remarkable properties. For example, golden rice and super bananas both have increased levels of beta-carotene, which the body can convert into vitamin A. Other examples include transgenic potatoes which grow better and contain more easily digestible proteins, and even edible cotton seeds, which were recently approved by the USDA and turn a


former waste product into a new and tasty food source. Despite the benefits of transgenic crops, there is still a great deal of concern about their biosafety, especially regarding the marker genes that come with them. As marker genes make plants resistant to herbicides, if they were to escape into wild populations they could easily spread through the environment, with unknown consequences. Likewise, adding DNA from a completely unrelated life-form to plants could change their behaviour in unpredictable ways, posing further ecological risks. In order to relieve some of these concerns, scientists are developing new methods to genetically engineer plants without using foreign DNA. One way of doing this is to make GMOs that are cisgenic rather than transgenic. This means adding DNA from other plants that are either the same species or close enough that they can still be bred conventionally. It effectively means that cisgenic GMOs can only gain genes that could have eventually been added to them by conventional selective breeding. For example, it would in theory be possible to transmit a single gene from a turnip to a cabbage through a careful selective breeding programme over many generations; the advantage of making a cisgenic plant is that it is much faster and can be done in only one or two steps. This allows scientists to transfer naturally-occurring pest-resistance genes from rare and even wild varieties of a crop into high-yield varieties quickly and easily. Another technology which is allowing scientists to avoid using transgenics is CRISPR, a revolutionary tool which has recently


transformed the study of genetics in many living things. By allowing scientists to precisely cut DNA within a cell at a single location, CRISPR can edit genes to turn them on or off under specific conditions, or change their functions inside cells. CRISPR is so fast and efficient that a team of scientists from Brazil, Germany and the USA were able to mutate six genes in the wild ancestor of tomatoes in one go, creating a newly domesticated super-tomato with high antioxidant levels and achieving in a single generation what took our ancestors centuries. This super-tomato isn’t just healthy and quick to make; it is a benefit to tomato breeders too.
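The ‘single location’ targeting works because the Cas9 enzyme only cuts where its guide RNA matches the DNA and a short ‘PAM’ motif (any base followed by GG, for the commonly used Cas9) sits immediately downstream. A toy Python sketch, with an invented sequence and a hypothetical guide, shows the search:

    # Toy CRISPR targeting: find where a 20-letter guide sequence is followed
    # by an NGG PAM motif. The genome and guide here are invented examples.
    import re

    genome = "ATGCCGTA" * 5 + "GACGATTACACCGGTTAACCAGG" + "TTGCA" * 6
    guide = "GACGATTACACCGGTTAACC"  # hypothetical 20-nt guide sequence

    for m in re.finditer(guide + ".GG", genome):
        cut = m.start() + len(guide) - 3  # Cas9 cuts ~3 bases upstream of the PAM
        print(f"target found at {m.start()}; cut between bases {cut} and {cut + 1}")

In a real genome, a 20-base match plus PAM is long enough to be effectively unique, which is what makes the cut so precise.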

“Human beings have been manipulating the genomes of plants and animals since the dawn of civilisation.” Centuries of selective breeding have left our crops highly inbred. Much like pedigree cats and dogs, inbred plants are very vulnerable to disease. By taking shortcuts through the domestication process, plant breeders can add genetic diversity to their breeding stock, opening up genetic bottlenecks and creating new opportunities to further improve our food supply in the future. These advances allow scientists to access the hidden potential of our crop plants and their ancestors. In doing so, they might just provide humanity with a safer and more acceptable way to secure its food supply through the 21st century and beyond.


SHAKIRA MAHADEVA

Fresh Fingerprints in Forensics The science and ethics of using DNA profiling to solve crimes. It has long been known that our fingerprints are unique to us. This knowledge can be traced back millennia, to when the Chinese used thumbprints as signatures on legal documents. And they haven’t gone out of fashion – fingerprints are still in use today, from unlocking our mobile phones to identifying suspects in forensic investigations. They’re crucial to forensics because, as well as being unique, they remain unchanged during a person’s lifetime. In fact, they have only one main flaw: only intact fingerprints are useful. Partial or smudged prints are much more difficult to identify with any acceptable degree of certainty. In 1984, Alec Jeffreys discovered a new technique called “DNA fingerprinting”. We leave our DNA everywhere we go, in the form of strands of hair and flakes of skin. He spotted that digesting particular repetitive regions of DNA with enzymes resulted in unique patterns, which he called “fingerprints”: tiny fractions (<1%) of a person’s DNA that are unique to them. This DNA lies in the non-coding regions between genes (which are not directly used to make proteins) and gives unique patterns because the number of repetitive units varies widely between individuals. Just a year after Jeffreys’ discovery, the technology was already being used in immigration and paternity cases. It also paved the way for the DNA profiling methods we use today. The technology used to measure these repetitive DNA sequences has moved on since 1984, but the principles remain the same. Instead of being digested with enzymes, the repetitive DNA regions, known as short tandem repeats (STRs), are selectively copied, and their individual lengths are measured in a technique called gel electrophoresis. This allows the number of repetitive units to be determined. By analysing multiple STR sites, the probability that another person in the population has the same STR profile quickly decreases. Sixteen STR sites are analysed in the UK, which means the probability of having a particular STR profile is one in trillions. It is extremely unlikely you will share the same DNA profile as someone else, unless you are an identical twin!
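The ‘one in trillions’ figure comes from multiplying per-site probabilities together. A toy calculation makes the point (the 1-in-10 per-site match chance is an assumed round number – real forensic values differ by site and population, and the statistics are handled far more carefully in court):

    # Toy estimate of a random STR profile match, assuming independent sites
    # and an illustrative 1-in-10 chance of matching at any single site.
    per_site_match = 0.1
    sites = 16  # number of STR sites analysed in the UK

    profile_match = per_site_match ** sites
    print(profile_match)      # 1e-16
    print(1 / profile_match)  # one in ten quadrillion

Even with a generous per-site probability, the product shrinks exponentially, which is why a full multi-site match is so compelling.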


The implications of this are clear: even when present only in trace amounts, comparisons of crime scene DNA with that of a potential suspect can reliably confirm or deny their association with the crime. Presently, DNA profiles built from a crime scene sample in the UK can be compared to the National DNA Database, which contains DNA from 5.5 million previously convicted individuals. This is an enormously powerful tool: in around 50% of cases where a crime scene DNA sample can be recovered, searching the database identifies a match, and therefore a prime suspect. Other DNA databases are on the rise. At-home DNA testing is becoming increasingly common as people try to uncover their ancestry, sending their genetic information to companies that store it in their databases for interpretation. This raises the question of whether police should be able to search these databases for the purpose of law enforcement. In April 2018, the cold case of the “Golden State Killer”, who committed 51 rapes and 13 murders between 1976 and 1986, was cracked thanks to such commercial DNA databases. US detectives searched the DNA database of a company named “GEDmatch” and found partial matches between crime scene DNA and the DNA of a relative of the killer, Joseph James DeAngelo. Together with existing evidence, this information was enough to track down and arrest him. But the use of commercial DNA databases like “GEDmatch” for these purposes raises important ethical questions. DeAngelo’s relative, whose DNA helped crack the case, never gave consent for their data to be used in this way. In the aftermath of this high-profile case, users of databases like “GEDmatch” were split over whether the genetic data contained in such databases should be used for law enforcement purposes. This highlights the need for debate between geneticists, ethicists and lawyers in order to determine where the law should stand on this issue. DNA profiling is an incredible technology with the power to convict and exonerate suspects of crimes, but we must ensure that we draw clear ethical and legal lines before we progress further.



OWEN YUNAPUTRA KOSMAN

Mirror, Mirror

My alarm rings at 7:30am, before I press snooze for another 30 minutes of shut-eye. After waking up, I browse through my phone for the daily brief of what happened whilst I was asleep. Then I brush my teeth, get changed, have breakfast and leave the house on my bicycle towards my 9am lecture at Parks Road. Did you spot it? Lying within your morning routine is one of the most important products of human ingenuity in history. It is not the bicycle – not even close. Go back millennia and you can still find it. The Ancient Egyptians used it to represent the sun-god Ra; the Greeks ornamented it with images of the gods Pan, Eros and Aphrodite. In Medieval Japan it was sacred, used to ward off evil spirits. Now, it sits on either side of our cars. Mirrors have been objects of fascination since antiquity. Traded for valuable possessions like land, and glorified to a spiritual level, mirrors shaped cultures. Perhaps part of their appeal is their singular use: the ability to project the real world onto a virtual plane. Most importantly, they have no substitute. If I ask you to imagine a mirror from Ancient Egypt, a handful of you might find it difficult. Now imagine a well-polished bronze plate: this was the main form of mirror for millennia. Bronze was the material of choice, as it was simple to produce, easily forged and lustrous. What was once an ornamental object now resides in museums all over the world as almost unrecognisable, patina-green artefacts that only make sense because ‘mirror’ is written in the description. It was not until the third century that the first glass mirrors were created. They were dim and uneven, owing to Chinese guildsmen’s excellent but imperfect glass-blowing skills. Nevertheless, these resembled the glass mirrors we see today, with thin layers of metal bonded behind the glass. The Venetians perfected the craft of creating such mirrors, to the point where Venice became Europe’s leading exporter. Once again, mirrors made a society great, at least for another 150 years. Unbeknownst to them, mirrors had been





An ancient Egyptian bronze mirror, c. 1540-1296 BC.

making headway in an unexpected faculty: science. Several scientists ventured to investigate their wonders. Diocles, a mathematician, and Ibn Sahl, a physicist, described and studied parabolic mirrors in around 200 BC and 984 AD respectively. Fast forward to the 19th century: the silvered-glass mirror, the direct precursor to modern mirrors, was brought into existence by the German chemist Justus von Liebig in 1835. The rest is history. With iterative improvements to mirror technology, scientists have been able to disembody the concept of the mirror from the usual flat, silvery, reflective form factor humans have been used to seeing since the beginning of humanity – so much so that the layman would not realise how important a mirror is to their very existence, because it has evolved into something completely unrecognisable, and for good reason. As flawless as household mirrors appear, a reflectivity of about 97% is not enough for scientific work, which often involves reflecting low-intensity light, and their thickness is microscopically non-uniform. In a way, mirrors have been cursed with the ideal of reflection that society has endowed them with, such that

scientists, with their knowledge and ingenuity, must realise perfect reflectivity. To do this, we need to think of light as photons, quantised packets of energy (without this picture, we cannot understand why metallic mirrors are inefficient). A near-ideal reflectivity approaching 100% is needed to reflect a photon without significant energy loss – a requirement for many research and commercial applications. Ironically, modern glass mirrors are too crude for this task. They rely on electrically conducting metals that absorb and re-emit photons, which always involves some energy loss, and they can only reflect certain wavelengths of light, which might not be the ones a scientific application needs. What scientists need is to redefine how the mirror works by rebuilding it from first principles. Rebuilding also means making big breakthroughs, one small mirror at a time. In the late 1990s, Henry Kwong Hin Choy of the Massachusetts Institute of Technology (MIT) published a paper on the design and fabrication of Distributed Bragg Reflectors (DBRs), a new concept that would help usher in the commercial use of Vertical-Cavity Surface-Emitting Lasers (VCSELs). Microscopically, a DBR consists of alternating stacks of GaAs and AlAs semiconductors with markedly different refractive indices – stacks that are ‘grown’ in a manner similar to fabricating microprocessors. At each interface, the incoming beam is partly reflected; given enough stacked layers, a DBR can not only reflect light at over 97% reflectivity (even 100% at some wavelengths) but can also be calibrated to reflect different wavelengths of light. This capability has wide-ranging applications, especially when coupled with VCSELs, such as Global Positioning System (GPS) satellites, aerospace communication links and other high-power, high-speed communication applications.
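The ‘more layers, more reflectivity’ behaviour can be estimated with a textbook quarter-wave-stack formula. The sketch below uses illustrative refractive indices for GaAs and AlAs and one common form of the peak-reflectivity expression (the exact form depends on layer ordering and the surrounding media), so treat the outputs as indicative:

    # Peak reflectivity of a quarter-wave Bragg mirror versus layer pairs.
    # Index values are illustrative for GaAs/AlAs in the near infrared.
    n_in, n_sub = 1.0, 3.52  # incident medium (air) and substrate (GaAs)
    n_hi, n_lo = 3.52, 2.95  # alternating high/low index layers (GaAs/AlAs)

    def reflectivity(pairs):
        a = (n_in / n_sub) * (n_lo / n_hi) ** (2 * pairs)
        return ((1 - a) / (1 + a)) ** 2

    for pairs in (5, 10, 20):
        print(pairs, round(reflectivity(pairs), 4))  # ~0.82, ~0.97, ~0.999

The factor (n_lo/n_hi) raised to the power 2N is why even a modest index contrast, repeated enough times, outperforms any metal coating.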

In fact, MIT introduced yet another breakthrough around the same time: Yoel Fink successfully devised a dielectric mirror with practically 100% reflectance from every angle over a narrow range of wavelengths. Similar in working principle to the DBR, dielectric mirrors before this were only able to reflect light arriving from a particular range of angles. This innovation enables engineers to develop reflective coatings of various geometries, such as a hollow tube for a low-loss broadband waveguide, or planar films for thermoelectric devices. Most importantly, it can be manufactured at low cost due to its relatively common material makeup: polystyrene and tellurium. These developments do not mean that metallic mirrors are becoming increasingly irrelevant. In fact, the area is so well researched that metallic mirrors can become rocket science. The Hubble Space Telescope is infamously remembered for exposing one of the most embarrassing mistakes in NASA’s history:

a testing error in optical fabrication led to the main mirror being polished into the wrong shape – 2.2 micrometres too flat, to be exact. This led to blurry images and extensive servicing in space, with corrective optics installed akin to prescription glasses. NASA has learnt its lesson and is making another bet on a new, more advanced telescope scheduled for launch in 2021: the James Webb Space Telescope (JWST). This time they have it right, with a reflectance of over 99%. The mirror, made of beryllium for low density and high rigidity, is coated with gold to efficiently reflect infrared light, the part of the spectrum on which the telescope’s observations will focus. You know there is a lot of faith in mirrors when a significant number of scientific observations depend on a single piece of human ingenuity. This is not surprising; the science behind a mirror is so obvious that almost anyone can understand it just by looking at one. Try to think of a simpler form of human ingenuity. This very simplicity is what makes the mirror such a great invention. Their universality spans beyond science and into our daily lives. Their versatility brings us to appreciate the features on our faces and to understand the stars. Their concept has been stretched and contorted beyond recognition by engineering and human creativity, yet it still serves its purpose. Mirrors have stood the test of time, beyond great civilisations and into our hands. You will never see a mirror the same way again.


JESS CHERRY

40 Years of IVF Four decades after the first IVF baby was born, the ethical dilemmas are no closer to being solved. This year, Louise Brown celebrated her 40th birthday. In Oldham, in 1978, the birth of the first ‘test tube’ baby was heralded as ‘the start of a new era in fertility medicine’. To date, over 8 million babies have been born thanks to this breakthrough. Fertility problems range from defective sperm to the failure of fertilised eggs to attach to the uterus lining. Advancements in treatment mean there is now hope for couples and individuals suffering from these problems. Solving these issues involves a combination of fertility treatments, and IVF is the basis for many of them. IVF (in vitro fertilisation) involves removing oocytes from the mother and fertilising them outside of the body. This process routinely involves the suppression of a woman’s normal menstrual cycle, the harvesting of oocytes from follicles and the collection of sperm from the father. In some circumstances, donor eggs and sperm can be used. The fertilised oocyte is incubated until the embryo reaches the morula stage (when it is composed of 10-30 cells). It is then implanted into


the mother, or a surrogate. The process of fertilisation itself is often unassisted, in that sperm are combined with the oocyte and allowed to fertilise it naturally. However, if the sperm are defective, they are inserted directly into the oocyte. Louise Brown was born following 10 years of research conducted by Patrick Steptoe, Robert Edwards and Jean Purdy. The trio conducted 457 cycles of IVF on 282 women, at a cost of over £70,000 – over £1.1 million in today’s money. Since the 1970s, advances in technology have meant that IVF has become more accessible and successful. Today, IVF costs between £3,000 and £5,000, with most couples able to access one cycle on the NHS, although there are huge regional differences. The current success rate stands at around 35% for women under 35, but drops to as little as 2% for women over 45. Whilst still low, this does represent a 10% increase since the 1980s. The advancements driving the increased success of IVF include the introduction of laparoscopic surgery, which reduces the risk of damage to mother and embryo, whilst also making the procedure more accessible. Developments in fertility drugs and more treatments for male infertility have also proved pivotal. In recent years, there have also been advancements for women who undergo aggressive cancer treatments: eggs can be removed and frozen, ready for use in the future. There is no doubt that IVF, and fertility treatment in general, is an important scientific and social development, giving hope to thousands of people who cannot have children naturally.

MICHAELMAS 2018

However, like all scientific innovation, it is not free of ethical dilemmas. Genetic screening of embryos has come under scrutiny. Its aim, for the most part, is to identify embryos carrying genes which cause life-limiting conditions, such as cystic fibrosis; any embryo with these genes is not implanted. Some people would argue that this is ethically responsible, as it prevents children being born with life-limiting diseases. However, campaigners suggest that it could become a way of ‘cleansing’ the population of people with disabilities. There are also fears that genetic screening may extend to parents being able to choose the ‘perfect’ genetic makeup of their child. The arguments over so-called ‘designer babies’ have raged for years and show no sign of being resolved soon. Is it right to provide fertility treatment when there are over 2000 children in care in England alone who need loving, stable families? Moreover, at what point do we refuse people treatment based on age? In the last two years, a 72-year-old Indian woman and her 80-year-old husband had a child with the aid of IVF. What will happen to this child when his parents die? Is it responsible to have children at an age when you know you will never see them grow up? These questions highlight the importance of treatment criteria in protecting IVF-born children. It is true that IVF has been a miraculous treatment for thousands, but its story is far from over. With success rates remaining stubbornly low, more research is required, and the ethical concerns surrounding its use have no clear-cut solutions.


JOIN OUR SENIOR EDITORIAL TEAM FOR HT19 EDITORS-IN-CHIEF

Manages the production of the magazine, liaises with the publishers, and has the final say on all editorial and creative decisions. WEB EDITOR

Runs the blog of our website and is responsible for site maintenance. PRINT EDITOR

Commissions the articles for the magazine, and leads the team of sub-editors who make sure every article meets our high print standards. NEWS EDITOR

Leads the news team, who write news stories for the print issue and for the website throughout the term. CREATIVE DIRECTOR

Designs all aspects of the magazine and commissions the illustrations from our art team. Lays out the magazine in InDesign, so experience with this software is essential. BUSINESS MANAGER

Leads the business team in securing advertisements for the magazine and website. SCHOOL LIAISON

Contacts schools to sell subscriptions to the magazine.

Email editor@oxsci.org to apply.


oxsci.org

