Page 1


Issue 22: Spring 2018

Intersections between science and pop culture



HEDY LAMARR Starlet to Scientist

PEER PRESSURE The psychology of consumerism

Contents

Focus
11 Let me take a selfie: from mad scientist to insta-famous
Bonnie Nicholson narrates how scientists have used social media to alter the public perception of science
12 The scientist’s claim to fame
Helen Feord discusses the importance of celebrity scientists and their role in humanising the discipline
14 The scientist who has it all
Genevieve Brown wonders why pop culture is obsessed with the scientific polymath
15 The scientist look
Helena Cornu contrasts the image of scientists in popular culture with life in the lab, because all scientists are not, in fact, socially challenged, workaholic geniuses
16 The star who brought us Wi-Fi
Carlos Martínez-Pérez tells the story of how a Hollywood starlet became the unlikely mother of modern communications
18 Science needs you
Clari Burrell discusses the active involvement of the public in scientific research
19 Fake news: the state of science in a post-truth world
Tom Edwick looks at what the recent return of ‘flat-earthers’ can tell us about the state of science today
20 It’s a pug’s life
Kate Downie explores the veterinary health concerns linked to brachycephalic dog breeds, such as the pug
22 The trial of a Time Lord
Andrew Bease examines the evidence to determine whether the fictitious race of time travellers could potentially exist
24 Superhero origin stories: fact or fiction?
Dianna Bautista explores the plausibility of becoming a superhero
26 The perfect song?
Alina Gukova looks for the elusive patterns in songs in a quest to find the ‘perfect song’
27 What you wear tells a story: make sure it’s a good one
Milda Lebedyte explores how science can help fashion become more sustainable
28 From folk tales to memes: why we’re obsessed with pop culture
Teodora Aldea explores the evolutionary implications of popular culture
30 Pop psychology
Maria Fernandez explores how peer pressure and changes to the science and technology of marketing influence our choices and opinions
31 Beyond Breaking Bad: chemistry and pop culture
Emma Alexander investigates the perception of chemistry in society and how it has been shaped by popular media
32 Biology for the masses
Marta Mikolajczak explores how some of the common misconceptions portrayed in pop culture shape our understanding of biological science
34 Science on the small screen
Emma Dixon explores the good, the bad, and the ugly representation of science on TV
36 An inconvenient truth
Rachel Harrington explores how sci-fi popularisation has affected the evolution of scientific research

Features
38 Is ecology the original science?
Louis Walsh explores whether early human interaction with the environment was instinct or ecology
40 Our human bodies: a story of reform, revolution and change
Reminding us of the marvel of the human form, Haris Haseeb describes the series of events which the body undergoes in the moments after birth

Cover illustration by Vivian Uhlir; featuring artwork by Alyssa Brandt, Gillian Cavoto, Andrea Chiappino, Rebecca Holloway, Milda Lebedyte, and Lucy Southen

2 Spring 2018 |


Regulars
42 Are politicians ruining the trip?
Karolina Zieba investigates the impact social stigma and political hurdles have on psychedelic research
43 A new wireless technology to help paralyzed people walk again
Alessandra Dillenburg investigates advances in brain-machine interfaces offering new hope for spinal cord injury treatment
44 Tasting colours: studying synaesthesia
Shinjini Basu explains our current understanding of why some people hear colours and taste shapes
45 Dr. Hypothesis
EUSci’s resident brainiac answers your questions
46 Review: Chance by New Scientist
James Hitchen reviews New Scientist magazine’s latest book


eu:sci Editorial

News Team Ailie McWhinnie, Chiara Herzog, Christopher Weir, Duncan McNicholl, James Hitchen, Juan Quintana, Sam Hughes, Zandile Nare Focus Team Alina Gukova, Andrew Bease, Bonnie Nicholson, Carlos Martinez-Perez, Clari Burrell, Dianna Bautista, Emma Alexander, Emma Dixon, Genevieve Brown, Helen Feord, Helena Cornu, Kate Downie, Maria Fernandez, Marta Mikolajczak, Milda Lebedyte, Rachel Harrington, Teodora Aldea, Tom Edwick Feature Authors Haris Haseeb, Louis Walsh Regulars Authors Alessandra Dillenburg, James Hitchen, Karolina Zieba, Shinjini Basu, Simone Eizagirre Copy-editors Adam McLauchlan, Aishwarya Sivakumar, Calum Strain, Catherine Lynch, Charlotte Harris, Chloe Wright, Christiana Kontaxi, Clare Mc Fadden, Ella Mercer, Emma Dixon, Gabi Olley, Hanna Landenmark, Helena Cornu, Holly Fleming, James Garforth, Katy Bladen, Ke Yuan, Konstanze Simbriger, Maria Fernandez, Monica Kim, Nikki Hall, Ozioma Kamalu, Peter Xu Sub-editors Aidan Marshall, Aishwarya Sivakumar, Angela Downie, Anna Oprandi, Asier Galarza, Cath Adam, Catherine Lynch, Chandrika Rao, Christiana Kontaxi, Clare Mc Fadden, Ella Mercer, Ellen Harper, Helen Parker, Holly Fleming, Jagoda Poniatowska, Michael Mabbott, Monica Kim , Oswah Mahmood, Ozioma Kamalu, Peter Xu, Sarah Halpin, Shruti Madhusudan, Simon Kaufmann, Swetha Kannan, Tremaine Bilham, Vicky Ware

Dear readers,

As we make our way through a millennium marked, thus far, by the ever-increasing influence of media and popular culture, we can’t help but stop to look at how this has changed the science landscape. For instance, the popular science movement, which initially gained momentum in the 70s, has now blossomed into an important part of our modern mainstream culture. Not only is science itself heavily featured in TV and film (p22-25), but media platforms are also being used to promote the image and lifestyle of scientists (p11-13) and to get the general public actively involved in research (p18).

However, the relationship between pop culture and science is not at all one-sided. By ‘flipping the lens’, so to speak, we can also put pop culture under the microscope to dissect its dynamics and observe its effects on our society. Issue 22 therefore goes on to explore the science behind pop music, how marketing and psychology play into what becomes mainstream, and why we’ve evolved to produce and consume pop culture in the first place (p26-30).

On top of this pop culture bonanza, we also welcome you, as always, to catch up with the latest scientific news and information about ongoing local research (p4). You can also delve into our insightful features, which explore whether ecology was the primordial science and how being more mindful of the changes in our bodies could lead to a more inclusive and heterogeneous society (p38-41). Our usual array of regulars is also on offer, covering whether policies on psychedelic drugs may hinder valuable research, how far our current understanding of synaesthesia stretches, and whether you should invest in Bitcoin (p42-45).

As this is our last issue at the helm, we’d like to thank the Institute for Academic Development for their continued generous support, as well as all the volunteer writers and editors who make this magazine possible. We hope it proves as popular as its subject matter.
Teodora Aldea and James Ozanne Editors

Art Team Andrea Chiappino, Rebecca Holloway, Lana Woolford, Lucy Southen, Milda Lebedyte, Rachel Berman, Gillian Cavoto

Editors Teodora Aldea, James Ozanne

Deputy Editors Haris Haseeb, Aditi Jain

News Editors Samuel Jeremy Stanfield, Bonnie Nicholson, James Hitchen

Head Copy Editors Simone Eizagirre, Shinjini Basu

Focus Editors Chiara Herzog, Isobel MacGregor

Layout Editor Dianna Bautista

Web Editor Angus Lowe

Art Editors Alyssa Brandt, Vivian Uhlir


News

DNA nanorobots as anti-cancer delivery systems Origami is the traditional Japanese art of folding paper into multiple shapes and forms. The technique has historically been used in ritual celebrations, but can it inspire new technologies that solve medical problems? This question has gained traction since it was reported that DNA molecules, the material that contains the genetic information of all living organisms, can be specifically engineered to fold into multiple two-dimensional and three-dimensional shapes, just like paper origami. This novel approach, inspired by the ancient art form, has opened many doors for biotechnological applications. In fact, these DNA-based origami structures, also called “DNA nanorobots”, have been used to visualise cells in many organisms, from worms to humans. However, until now it has been unclear whether these nanorobots can be designed to safely deliver compounds to specific tissues or tumours.

In a recent report, Suping Li and collaborators described the development of an origami DNA-based nanoscale robot able to deliver thrombin specifically to tumour cells. Thrombin is a well-known protein that induces rapid blood coagulation, potentially leading to the blockage of veins and arteries. Until now, this protein has not been considered as a therapeutic agent because, unless delivered precisely, it acts indiscriminately in the body, causing many unwanted side effects. However, the authors of this report overcame this issue by attaching thrombin molecules to cylinder-shaped DNA nanorobots coated with “protein sensors” capable of selectively binding to tumour cells. When the nanorobots encountered endothelial cells displaying malignant markers on their surface (in this case a protein called nucleolin), the protein sensors triggered the DNA nanorobots to unfold, thereby exposing the tumour cells to locally high levels of thrombin. This blocked the tumour vasculature, cutting off the tumour’s nutrient supply, and led to the death of tumour cells. Excitingly, this DNA nanorobot-mediated tumour killing was rapid and highly specific, and its safety has been demonstrated in small animals such as mice and miniature pigs. This demonstrates that origami-based DNA nanorobots could help us fight challenging cancers efficiently, whilst potentially bypassing the side effects caused by non-specific delivery of chemotherapy agents to healthy cells. Juan Quintana

Image of DNA origami by Matthias A. Fenner courtesy of Wikimedia Commons

The ageless naked mole-rat Perhaps no mammal has earned as much celebrity for its ugliness as the naked mole-rat, an animal so grotesque in appearance that the name “naked mole” was not sufficient to describe it. Yet their hideousness masks an incredible gift: a truly unique lifespan that becomes more bizarre the more we learn of it. The longevity of the naked mole-rat is not news. They are the longest-living rodent, frequently surviving for longer than 30 years in captivity: an astonishing lifespan, five-fold greater than predicted for a 40 g rodent, and three times longer than that of the 50 kg capybara. In the wild, their survival is significantly reduced, but still impressive; they have been observed to survive for over 17 years, whereas most rodents last only a season before their luck runs out. Moreover, older naked mole-rats show no menopause, no loss of cardiac function, and very rarely develop age-related diseases such as cancer. The ugliest rodent, maybe, but also the most impressive. And now, in a first for mammals, the naked mole-rat has overcome the Gompertz-Makeham model of ageing. Gompertzian ageing is simply the increased risk of death with age: the observation that, once adulthood is reached, we risk death to a greater degree with every passing year. This phenomenon is so clearly evident that it can be surprising to hear that it has a name, and yet a study of over 3,000 naked mole-rats, recently published in eLife, has demonstrated that to them, age truly is just a number. There is no more risk of death on a given day for a naked mole-rat at 40 years old than there is at 40 months, and the authors conclude that they are the only known non-ageing mammal. Indeed, based on the model developed in this study, 5% of naked mole-rats should live to be 103 years old, an age at which one would expect them to be fairly desperate for menopause. It is always worth being cautious when extrapolating to this degree – ageing may simply begin later for the naked mole-rat, and this data set does not extend beyond a thirty-year window. But it is clear that in the case of the naked mole-rat, there is more than meets the eye. Which is particularly good news when you look like that. Sam Hughes

Image of mole-rat by Trisha M Shears courtesy of Wikimedia Commons
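For readers curious about the model the naked mole-rat defies, the standard Gompertz-Makeham law (not spelled out in the article; the notation below is ours) writes the mortality rate, or hazard, at adult age t as an age-independent term plus an exponentially growing one:

```latex
% Gompertz-Makeham hazard: \lambda is the age-independent
% (Makeham) term; \alpha e^{\beta t} is the Gompertz ageing term.
h(t) = \lambda + \alpha e^{\beta t}
```

Gompertzian ageing corresponds to β > 0, so the risk of death climbs exponentially with age (in adult humans it roughly doubles every eight years). The eLife finding amounts to β ≈ 0 for the naked mole-rat: a flat hazard h(t) ≈ λ + α, which is exactly what “no more risk of death at 40 years than at 40 months” means.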


Black and British: the unexpected colour of Mesolithic Britain Until recently, it was assumed that after entering Europe 45,000 years ago, the earliest inhabitants of Britain had quickly evolved paler skin. Pale skin offers a selective advantage in climates with less sunshine because it absorbs UV light more easily than dark skin, preventing vitamin D deficiency. However, DNA analysis of the remains of a 10,000-year-old man known as Cheddar Man has revealed that, surprisingly, some of the earliest modern humans possessed genetic markers associated with the type of skin pigmentation usually found in Sub-Saharan Africa. In other words, Cheddar Man, one of the earliest modern human inhabitants of Britain, was black. Cheddar Man’s remains were found by archaeologists in 1903 in Gough’s Cave in Cheddar Gorge, Somerset. He was a hunter-gatherer who died in his twenties during the Mesolithic period. On first discovery, many thought that he was the earliest Englishman, long sought after by archaeologists. It was initially thought that he was 40-80,000 years old, but radiocarbon dating, a tool used to determine the age of an object based on the amount of carbon isotopes it contains, suggested that he lived a mere 10,000 years ago. His is the oldest complete skeleton of our species, Homo sapiens, ever found in Britain. Analysis of Cheddar Man’s DNA has helped scientists from the Natural History Museum generate a portrait of one of the oldest modern humans in Britain. Generating data from ancient DNA is made very difficult by the fact that as soon as an organism dies, its DNA starts to break down. Luckily, Cheddar Man’s DNA was well preserved, thanks to the cool conditions of Gough’s Cave and the layers of natural mineral deposits he was buried under. Scientists extracted DNA from leg bones, teeth and inner ear bones, since such dense bones often provide a protected environment for DNA. Following extraction, the DNA underwent next-generation shotgun sequencing, whereby DNA is randomly broken up into many millions of small fragments across the genome. This created a library of Cheddar Man’s DNA, which was then compared to the modern human genome. It was important that the researchers were specific about the genetic markers they were looking for (e.g. markers for skin and eye colour). Decoding Cheddar Man’s DNA has revealed that he had striking blue eyes and high levels of melanin in his skin. These findings are consistent with other Mesolithic human remains found throughout Europe. Dr Tom Booth, a postdoctoral researcher who worked closely with the Natural History Museum on this study, states that Cheddar Man “is just one person, but also indicative of the population of Europe at the time. They had dark skin and most of them had pale coloured eyes, either blue or green, and dark brown hair.” This discovery is fascinating on many levels, partly because it challenges our expectations of which genetic traits are linked together. These findings suggest that pale eyes entered Europe before pale skin or blond hair, which didn’t come along until the agricultural revolution during the Neolithic age. This is consistent with the sequencing data from Cheddar Man’s DNA, which suggest that, like other humans of his time, he was lactose intolerant and unable to digest milk, since his was the age before the arrival of farming. Ultimately, Cheddar Man is an intriguing figure; he reminds us that we cannot rely on preconceived assumptions about what people looked like thousands of years ago based on what people look like today. Equally, Homo sapiens in the future may look very different from the way we look now. Zandile Nare
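The radiocarbon dating mentioned above rests on the exponential decay of carbon-14. The standard age equation (not given in the article; symbols are ours) recovers the age t from the surviving fraction N/N₀ of carbon-14:

```latex
% Radiocarbon age from the surviving fraction N/N_0 of C-14,
% using the C-14 half-life t_{1/2} \approx 5730 years.
t = \frac{t_{1/2}}{\ln 2}\,\ln\!\left(\frac{N_0}{N}\right)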

Image of Cheddar Man by Paul Townsend courtesy of Flickr



Image of H1N1 virus by Cybercobra courtesy of Wikimedia Commons

Killing epidemics with a light touch Scientists working at the Columbia University Medical Centre have discovered a new way to suppress microbial activity in the air of public spaces, with results published this month. They suggest that far-UVC light installed in indoor public locations could be used for wide-scale inactivation of viruses and bacteria, reducing the likelihood of infection when passing through high-risk areas like hospitals and airports. Ultraviolet light has been used for many decades for its antimicrobial properties, generally in the form of germicidal lamps used to disinfect medical instruments and foodstuffs. Such lamps have not found use in public spaces because UVC light is a human health hazard, causing both cataracts and skin cancers. The team at Columbia have extended their own previous work on far-UVC light: light of around 222 nm in wavelength, as opposed to the 254 nm light used in normal UVC germicidal lamps. This far-UVC light is even more strongly absorbed by biological tissue, which at first seems like a disadvantage, but this very property lets us take advantage of our much greater size compared to germs. Far-UVC light is near-completely absorbed in the tear layer of the eye and in the stratum corneum (outer layer) of the skin of mammals. Since neither structure contains living cells, the danger of ionisation is mitigated. For microbial life, absorption of the same 222 nm light is deadly, as microbes are simply not large enough to have outer layers of non-living tissue.

Image of an airport by Michael Parzuchowski courtesy of Unsplash

Image of a UV lamp by Tarvo Metsupalu courtesy of Wikimedia Commons

The Columbia group have shown that the far-UVC light they use is capable of efficiently deactivating the H1N1 strain of the flu virus even when it is aerosolised, as it is during transmission by a cough or sneeze. They aerosolised the virus by adding a solution of it to a nebuliser and mixing the output with a carefully controlled mix of humidified and desiccated air to control the average size of the aerosol droplets. The goal of this fine control was to reproduce the particle size distribution of air expelled from human lungs, as a useful model of infected coughs and sneezes. The aerosolised virus was then pumped through an irradiation chamber, exposed to far-UVC light of variable intensity, then collected and co-cultured with canine epithelial cells to determine the infective capability remaining in the sample. A dose of only 2 millijoules of energy per square centimetre was found to inactivate 95% of the virus present in the chamber. The scientists at Columbia suggest that with an efficiency of this magnitude, it would be possible to fit far-UVC lamps to flood the public areas of airports or hospitals with antimicrobial light. With this breakthrough, the rise of epidemics at transport hubs may become a thing of the past. Duncan McNicholl

Research in Edinburgh

TEDx UoE: All about empowerment The University of Edinburgh's magnificent McEwan Hall recently played host to this year's edition of the Edinburgh TEDx talks. For the uninitiated reader, TEDx conferences are independently organised events aimed at the propagation of 'ideas worth spreading', using a format based on the original TED (Technology, Entertainment, Design) conference founded over 30 years ago. They feature typically excellent public speakers giving 10 to 20 minute talks on an astounding variety of bright ideas; so much so that even if none of the topics are of interest to you, the talks typically serve as a good guide for anyone seeking to improve their own oratory artistry. The overarching theme of this year's conference was empowerment, with the day split into four sessions grouped by connected subthemes. The tone was set with a performance from the Edinburgh University a cappella society, which served both to hush the expectant crowd and to focus attention towards the stage. To begin, environmental journalist Louise Gray (below) gave a compelling argument on how to properly educate yourself on the ethics of your diet. This is a topic she has certainly earned the right to speak on, having undergone a several-year-long journey investigating our food sources, all the while only eating animals that she had killed and prepared herself. She made many interesting points, highlighting the fact that a greater proportion of the world's greenhouse gas emissions comes from raising livestock than from all transport combined. She was followed by Edinburgh chemistry PhD student Euan Doidge, giving an extended version of his award-winning three minute thesis talk 'WEEE are golden'. He provided an overview of his current research into extracting metals from waste electronics using solvents, a step towards a more circular economy which reduces the pressure of the constant need for new raw materials.
The efficiency of this process was highlighted by the fact that a tonne of waste electronics can produce 350 g of gold, compared to between 3 g and 45 g from a tonne of ore.

Image courtesy of TEDx UoE

During the lunch break the attendees were treated to a selection of distractions and intrigues in the smaller chambers of the McEwan Hall. A personal highlight was the interactive stall in which you could try virtual reality arcade games, though always at risk of embarrassing yourself in front of the large crowd of onlookers. For the equally brave, a selection of edible insects was on offer to anybody still feeling a bit peckish after lunch. There is a strong argument that a concerted effort to remove the stigma around nibbling on such crunchy treats could go a long way towards solving such problems as ethical protein sourcing and protein availability in the developing world. The final scientific highlight of the day came from Mandan Kazzazi, another Edinburgh three minute thesis alumna, as she described her work developing a method of determining the sex of a set of human remains from its surviving teeth. This is of particular interest because teeth are much more durable than bone and as such provide a better means of identification. Her technique uses a statistical model built from thousands of CT scans of human teeth, accounting for differences in size and shape between the sexes, to allow the sex of the owner to be determined. James Hitchen

Images by Andy Coulson and Mal Brown courtesy of TEDx UoE



Brain meets Body: linking metabolism, inflammation and the brain One of the remarkable features of research undertaken in Edinburgh is the high level of collaboration and interdisciplinarity. This was highlighted once again at the Centre for Cardiovascular Science (CVS)-Edinburgh Neuroscience Symposium at the end of February this year. The event was motivated by the increasing awareness that an individual’s metabolic state (e.g. diabetes) can have considerable effects on the brain, such as making Alzheimer’s more likely. Conversely, the health of the brain can have a considerable effect on overall health; for example, lung infections are very common after stroke. “It has now finally been accepted that what happens below the neck has a significant impact on the brain, and vice versa,” joked chair of the symposium Prof. Giles Hardingham (Centre for Discovery Brain Sciences) in his introduction. Aiming to foster new collaborations and enhance current ones, the symposium was split into four sessions led by Prof. Rebecca Reynolds (CVS), Dr. Barry McColl (UK Dementia Research Institute), Prof. Anna Williams (MRC Centre for Regenerative Medicine), and Dr. Andrea Caporali (CVS). Twenty early career researchers - both from the Centre for Cardiovascular Sciences and various neuroscience groups - presented their work in five minute talks, and discussions ensued over coffee, wine and posters. Many interesting questions were addressed, such as “How does being born to an obese mother affect your later mental health?” or “Is all inflammation bad after a brain injury?”. Overall, the symposium was an excellent showcase of the outstanding work from labs performing everything from fundamental science to clinical trials. This work is facilitated by excellent facilities, such as the pre-clinical imaging facility, which allows magnetic resonance imaging of rodents, and the in-house drug discovery lab located at the Queen’s Medical Research Institute. Hopefully, the symposium will lead to exciting new ideas, collaborations, and more research investigating how the brain and body link together, further strengthening the University’s reputation as a world-class hub for inflammation, metabolism and neuroscience research. Chiara Herzog

Image by geralt courtesy of Pixabay

Test-tube babies: the new generation “All hell will break loose” were the words of James Watson on the potential of IVF treatment. Controversy and concern were rife, yet today over 5 million people are so-called ‘test tube babies’, and now it seems a second wave of the fertility revolution is breaking. A lab in Edinburgh has successfully grown human eggs to full maturity, paving the way for futuristic fertility preservation practices. Although this had been achieved before in mice, human oocytes are much more complex and little was known about the possibilities. Using ovarian tissue from caesarean sections, follicles were removed and cultured individually. Those that formed complete cumulus oocyte complexes – an egg surrounded by support cells, as seen at ovulation – were selected for further culture. Of the original isolated follicles, 10.3% fully matured. This may seem like a small percentage, but only one success was necessary to prove that in vitro oocyte development is possible. Maturation was defined as the formation of polar bodies and a metaphase II spindle (cellular structures that split the genetic content into two). Polar bodies are small cells with little content, the result of asymmetric cell division at ovulation. This is a mechanism used to maximise the size of the egg itself, to better support growth. The polar bodies of the in vitro oocytes were abnormally large, so there is a worry that necessary components were lost during division. It is expected that improving culture techniques will yield a higher success rate and normal polar body size. The impact of this research on reproductive biology is significant. “For the first time we can gain insight into the mechanism of human oocyte development from the earliest stages”, explains Prof Evelyn Telfer, co-author of the research from the University of Edinburgh. “This in vitro model system allows us to address important questions relating to human oocyte development that will impact how we tackle infertility and lead to improvements in Assisted Reproductive Techniques.” Should future work be successful, these techniques will provide a new method of fertility preservation, of particular interest to cancer patients. The disease devastates in more than one way, as many treatments cause infertility. Current fertility preservation techniques involve cryopreservation, but there is a risk that cancer cells could survive in the tissue and proliferate upon reimplantation. This technique would eliminate the risk of contamination, as only the egg itself is re-implanted. This year, Louise Brown, the first test tube baby, will celebrate her fortieth birthday. Perhaps by her fiftieth, she will see the first baby grown from an in vitro-matured egg enter the world. Ailie McWhinnie

Image by Brendan Dolan-Galvitt courtesy of Flickr


Image of areas affected by malaria as of 2014 by the US Centers for Disease Control and Prevention courtesy of Wikimedia Commons

Advances in malaria vaccine development Each year approximately 500,000 people die from malaria. Most are children. In fact, by the time you have finished reading this article, around three more people will have succumbed to this disease. On top of the cost measured in human life, there is also an economic cost, with some nations spending 1% of their GDP and up to 40% of their health budgets tackling malaria. Although antimalarial drugs are available, the evolution of drug resistance is a growing problem. Therefore, given that vaccination is the most effective way of eradicating infectious diseases, an efficacious vaccine is imperative. Such a vaccine has remained elusive ever since Charles Laveran’s discovery that Plasmodium parasites cause malaria. The difficulty is due to a number of factors, including the need for a greater understanding of parasite biology and the human immune response, and the immense complexity of the pathogen (there are numerous stages in the parasite’s life cycle, with differential gene regulation constantly changing potential targets). However, recent work by the Cowman lab and others has identified a protein complex, named PfRipr, on the outer tip of the blood stage of the parasite. This protein complex is essential for invasion of human red blood cells (the point of infection). PfRipr makes an attractive vaccine target since its genetic disruption prevents parasitic development. Antibodies against it prevent parasite growth in the lab, and there is a correlation between people who have antibodies against PfRipr and the prevention of severe clinical malaria. Dr. Christopher Haggarty-Weir’s research in Melbourne and Edinburgh involved producing various pieces of PfRipr in genetically engineered yeast, so that the part of the protein that inhibitory antibodies bind to could be identified and its structure determined. Knowing this allows vaccinologists to use an approach called rational/structural vaccinology to develop an efficacious vaccine candidate for further studies. Dr. Haggarty-Weir’s work showed that out of the ten structured domains comprising the full-length PfRipr protein, domain 7 (termed EGF7) was where the inhibitory antibody bound, implying functional importance. This is shown to the left on the structure of EGF5-7, determined using X-ray scattering and computational modelling. One of the challenges faced was developing a clone of the bioengineered yeast that would produce the protein in significant quantities. This was overcome by screening numerous clones and developing an industrially scalable method, using a fermenter to produce EGF5-7 in grams-per-litre quantities at very low cost. The research also showed that EGF5-7 was stable at high temperatures (such as those found in Africa). This work will allow scientists to further test this vaccine candidate in additional trials, with the knowledge that it can be produced affordably. This reduction in cost is a crucial step forward, as malaria is endemic in regions of the world with higher poverty levels. Dr. Christopher Haggarty-Weir

Image of the EGF5-7 domains of the PfRipr protein courtesy of Dr. Christopher Haggarty-Weir

Image of Plasmodium parasite in RBC by Steven Glenn, Laboratory and Consultation Division, USCDCP courtesy of Pixnio

Focus


Pop goes this issue! This issue is brimful with delightful reads exploring the relationship between science and popular culture, and, as a teaser: it’s an extremely dynamic inter-relationship . Starting off, Marta Mirkolajczak and Emma Alexander debunk common misconceptions in biology (p32) and chemistry (p31) that are propagated in popular culture and remain engrained in people’s minds. Continuing from this, Emma Dixon explores how science is often portrayed incorrectly on TV to add splendour to the plotline of shows (p34). Helena Cornu tackles the image the general public has of scientists (p15), Genieve Brown explores the scientific polymath (pg14) and Bonnie Nicholson investigates how selfies could help create a more realistic and ‘human’ image of scientists (p11). Can science help to enhance pop culture? Milda Lebedyte and Alina Gukova address this question by commenting on novel fabrics capable of revolutionising the currently wasteful fashion industry (p27), and a new formula for making today’s next chart-topping hit (p26). Turning the tables, Rachel Harrington explains how popular culture and sci-fi movies could reshape scientific research (p36). Dianna Bautista investigates the plausibility behind becoming a superhero (p24), and Andrew Bease looks behind the scientific curtains of time travel (p22). Finally, Carlos Martinez- Perez tells the story of how a Hollywood starlet became the unlikely mother of Wi-Fi (p16). It is undeniable that pop culture has a tremendous impact on humanity. Teodora Aldrea explores how pop culture has become more than a mirror of today’s society and is now also a powerful tool to communicate and educate via its condensed and simplified message (p28). Tom Edwick investigate what happens when the relationship between science and pop culture goes wrong, as demonstrated by the recent return of flat earthers (p19). Maybe celebrity scientists can come to the rescue? Helen Feord answers this possibility (p12). 
Maria Fernandez explains the psychology behind wanting the products dictated to us by pop culture (p30). Continuing from that, Kate Downie explains how the popularity of the internet’s favourite dog breed (hint - it’s the pug) actually causes severe problems for these animals (p20). Lastly, Clari Burrell explores the option of ‘citizen science’ - research conducted by the public via games (p18). Science wants you! Chiara Herzog & Issy MacGregor, Focus Editors

Illustration by Andrea Chiappino


Let me take a selfie: from mad scientist to insta-famous Bonnie Nicholson narrates how scientists have used social media to alter the public perception of science From the Kardashians to the US President to that all-star Oscars ceremony photo, the selfie craze swept the globe throughout the noughties and early teens. The term was officially added to the Oxford English Dictionary in 2013, and the 21st of June now marks World Selfie Day, a day when we can unashamedly selfie to our heart’s content. With entire books devoted to advice on constructing the perfect selfie, and various phone accessories available to perfect the lighting and camera angles, it’s easy to condemn the selfie as narcissistic. But selfie culture can be used for more than just showing off your new lashes. Search the hashtag #scientistswhoselfie on Instagram, and you’ll find a sea of faces of scientists: some in labcoats, some in scuba suits, some accompanied by weird and wonderful animals, some up mountains, some at their desks, and many, many more. #scientistswhoselfie is a crowdfunded social media experiment initiated in 2017 by a team of science communication researchers. Its aim is to improve public perceptions of scientists and, by extension, public trust. Using the #scientistswhoselfie Instagram images, the researchers will measure the public’s opinion of scientists in terms of their perceived warmth (related to morality, honesty, sociability, and openness) and competence. Because, as it turns out, scientists may have a public image problem. In a study published in Proceedings of the National Academy of Sciences in 2014, a sample of the population were asked to rate a range of typical US jobs in terms of both warmth and competence. Although scientists scored highly in competence, the study found they have significant room for improvement in the warmth category. To quote the study: “Scientists, in this view, may seem not so warm.
Their intent is not necessarily trusted, maybe even resented.” And, with the recurrent portrayal of scientists across media outlets as a bespectacled man with wild hair and a singed lab-coat, perhaps it comes as little surprise. Throughout history, the stereotypical 'mad' scientist has played god,

Image of Nasa scientists courtesy of Wikimedia Commons

created monsters, constructed alternate universes, and made rubber-like, chaos-causing green substances (‘flubber’, to those of you who are Robin Williams-illiterate). But never has he come across as particularly warm or likeable. Now more than ever, the role of the scientist is expanding such that communication and public outreach have become integral. It is now common belief that scientists have a responsibility to educate and raise awareness about important findings. And in this growing role as science communicators, likeability is important in building trust. It has been shown that simply conveying the bare facts to the public is an ineffective method of communicating science. So how can the humble selfie improve the process of trust-building? The hope behind #scientistswhoselfie is that by showing the faces of real scientists in their day-to-day activities, we humanise the profession. In support of this, researchers at Georgia Institute of Technology found that Instagram photos with faces are 38% more likely to receive likes, and another study from the University of Pennsylvania found that scientific discoveries referencing people were more likely to be shared. However, engaging scientists in the art of selfie-taking may be easier said than done. Scientists appear to be reluctant to share visual aspects of their work or to emphasise any human involvement in the job. The detached, impersonal manner with which science is carried out and communicated

is heavily ingrained in its very definition. For example, if you read a scientific paper, personal pronouns such as ‘I’ or ‘they’ are rarely, if ever, used. By removing any suggestion of human involvement from the language, it is easier to convince colleagues or reviewers of an unbiased, systematic approach free from human interpretation or error. Unfortunately, it is this same sense of detachment that also causes the lack of warmth in the public’s perception. Yet scientists are now actively encouraged to engage with social media. NASA shares videos and photos through over 10 different social media platforms. In 2009, NASA astronaut Mike Massimino even sent the first tweet from space. Popular hashtags such as #womeninSTEM, #realtimeresearch and #PhDlife make finding scientists’ profiles on Instagram and Twitter incredibly easy. And YouTube has seen the rise of science video blogs (vlogs) such as AsapSCIENCE, Physics Girl, and Veritasium. Perhaps such social media campaigns are the answer to finally beating the ‘mad scientist’ image. The results of the #scientistswhoselfie experiment, which will be channelled into further resources to help scientists improve their public communication skills, will shed some light on the matter. As, I hope, will the new clip-on LED selfie lamp I’ve invested in ahead of this year’s World Selfie Day. All for the cause, of course. Bonnie Nicholson is a Cardiovascular Science PhD student


The scientist’s claim to fame Helen Feord discusses the importance of celebrity scientists and their role in humanising the discipline “According to scientists…”, “Scientists find…”, “Scientists have suggested…”: we are all very familiar with these phrases. We see them as newspaper headlines, in the documentaries we watch, and in everyday conversations. However, the general use of the term ‘scientist’ can be misleading. One may infer that ‘scientists’ are a large, mysterious group of intelligent people, who work together with the goal of increasing our society’s knowledge of the natural world. However, in this scenario, they are in an inaccessible laboratory, cut off from the wider population. Scientists produce data, analyse it and provide us with a better future. They are all-knowing. But somehow, if you are not a scientist, you are fed a simplified output of the science, therefore becoming largely disconnected from the scientists themselves. Sometimes, media headlines discard

Illustration by Alyssa Brandt


the word ‘scientist’ altogether and replace it with a big name, like Stephen Hawking, Richard Feynman or Brian Cox. And everybody is expected to know who these people are, as these names are an integral part of modern culture and are often brought up in conversation in regard to their academic brilliance and their characteristic field of study. We know these people to be intelligent and intuitive, making us willing to listen to what they have to say on various topics. But, unlike other categories of famous people, such as politicians and actors, we are often only very superficially aware of the actual substance that makes them known: the science itself. So, if we are so disconnected from the reality of the world of research, why do some names and personalities emerge into the public sphere? And is it actually important that we become aware of them?

Fame for scientists can occur for a variety of reasons. Evidently, some scientists are known purely because of their outstanding scientific achievements and demonstrated brilliance. Stephen Hawking’s genius and his contributions to cosmology and black hole physics are the cornerstones of his fame; Jennifer Doudna and Emmanuelle Charpentier are two scientists who have recently been put into the public spotlight due to the development of the CRISPR-Cas9 system; and the premature death of the mathematician Maryam Mirzakhani was announced globally due to the incredible quality of her research. In addition, awards such as the Fields Medal or the Nobel Prize often play a role in publicising these scientists, making them instantly recognisable, while other scientists make a particular effort to reach out to the rest of society through popular media

and achieve fame in this manner. Physicist Brian Cox is a professor of particle physics at the University of Manchester, but is best known for authoring several popular science books and his BBC television broadcasts. The primatologist Jane Goodall has reached international fame through her various activism campaigns.

Brilliance and scientific breakthroughs are not the principal catalyst for their celebrity Other scientists become known for reasons far from their expertise. The mathematician Cédric Villani, for example, was recently elected to the French parliament, hence the public now know him essentially as “the mathematician who got involved in politics”. He has also written a book on mathematics aimed at general audiences, which was generally considered far beyond most people’s mathematical abilities. To all appearances, he is now using his clear genius for a more relatable public service. This non-exhaustive list essentially tells us that high quality scientists can be famous but that, in some cases, their brilliance and scientific breakthroughs are not the principal catalyst for their celebrity. A study conducted in 2004 by Bagrow and colleagues looked at this hypothesis more closely, aiming to quantify the relationship between the quality and quantity of research output, a marker of scientific achievement, and fame, quantified by the number of Google hits, for scientists in the field of condensed matter and statistical physics. They found a linear relationship between these two components. This data was then compared to the relationship between fame and merit for World War II ace pilots which, in contrast, was found to be exponential. Therefore, it was concluded that merit had less significance for the celebrity of scientists than for other categories of famous people. The authors of this paper postulated that, unlike celebrities such as politicians or actors, scientists were more likely to be recognised within their own research community rather than by the general population, and thus never achieve what they called ‘true fame’. True fame, therefore, is not directly associated with scientific success, as many excellent scientists who

have made exceptional breakthroughs in their field will never achieve this status. But in the grand scheme of things, scientists may not really set out in their careers with the aim of becoming famous, so simply making a worthwhile contribution to science is generally satisfying enough. Beyond the reasons behind their fame, the fact that there are celebrity scientists has an impact on society. People tend to believe everything these intellectuals have to say, as we are all compelled to agree with somebody with a track record of brilliance. These high-profile researchers are beneficial for the scientific community as they can highlight the issues that are important to them. For example, the Nobel laureate Sir Paul Nurse has played an important role in speaking out about the importance of strengthening the relationship between scientists and politicians, shedding light on the misuse of scientific data by the latter. His involvement in politics has also included a review of the research councils in the UK, which received significant media attention due to its association with this emblematic researcher. In a world where the scientific community relies heavily on public funding to function, figures with both political and public credibility considerably advance its interests.

Does our admiration prevent us from being impartial to the evidence? Famous scientists are also often role models themselves, and can usually list the names of several heroes who inspired them to pursue a career in science. When they become famous, these scientists take on the same role, inspiring future researchers around the world, and so the circle continues. This is particularly relevant in regard to the growing number of women who undertake careers in science - in a sector which has historically been dominated by men, female role models have become crucial. We are all familiar with Ada Lovelace, Marie Curie, and Rosalind Franklin: the classic icons of important women in scientific research. While these figures highlight the breathtakingly unequal prospects for women in science in their time, they do not necessarily portray the reality of female empowerment in this field today. Such roles can be attributed to scientists such as Jennifer Doudna and

Emmanuelle Charpentier. Their contribution to CRISPR-Cas9 technology irreversibly changed the world of molecular biology, but also showcased the story of two women who made a tremendous contribution to their scientific field. The profiles of these two women are available in multiple languages and media formats, making their life stories, from education to scientific breakthrough, accessible to young girls around the world. The benefits that celebrity scientists bring about are obvious. But in some cases the link between popular appeal and public interest is tenuous. Cases when scientists speak out about a cause that is far from their research area can generate unwarranted public attention. The recent publicised debate on the funding of our National Health Service between the health minister Jeremy Hunt and the late physicist Stephen Hawking exemplifies this. An objective view of this situation is that each party accused the other of not taking important evidence into account. On one side there was Hunt, whose job is to run the health service in this country, without any background in medical training. On the other was Hawking, whose credibility was underlined by his demonstrated genius and his significant personal dealings with the NHS. However, the former relates to his achievements in physics and not public policy, and the latter does not differentiate him from any other individual who has had experience with the NHS but does not have the same public platform as Hawking did. Without taking a side, this is an interesting example of a situation where it is important to think about the arguments being put forward. Should Hawking have been believed over Hunt simply because he was a celebrated genius? Does our admiration prevent us from being impartial to the evidence put forward by both men? Celebrity scientists have the very important role of broadcasting the output and functioning of the research world.
They are important not only to generate interest in the natural world but also to help steer research in the right direction and inspire the next generation of scientists. It is, however, worth remembering that scientists are experts in very niche fields of research, and are continually being constructively criticised by their peers: like any other human being, they cannot be considered intellectually flawless. Helen Feord is a first year PhD student in Molecular Plant Science


The scientist who has it all Genevieve Brown wonders why pop culture is obsessed with the scientific polymath San Diego, 1986: recruiters for the United States Navy began setting up tables outside cinemas, seeking to capitalise on the apparent influence and success of Top Gun. Sign-ups for naval aviation increased during this period, and lieutenants were recorded stating that almost all of the applicants had seen the film. Young men watched the protagonist Maverick, with his celebrated flying ability, and wanted it for themselves. People have modelled their lives on popular culture, and it should not be underestimated as a persuasive force. Image courtesy of Flickr

"I can wire anything directly into anything. I’m the professor"

From a young age, we learn what a scientist looks like. In my case: Dexter in Dexter’s Laboratory, Professor Utonium from The Powerpuff Girls, and Susan and Mary Test in Johnny Test. Usually male, often with indistinct accents, these cartoon scientists don white lab coats and are proficient in many fields of science. On screen, scientists and inventors are frequently conflated. These scientists, masters of multiple disciplines and capable of extraordinary inventions, are called polymaths. The average child watching is aware that a chemistry set can only get them so far. The ethos of children’s favourite fictional polymaths can be seen today in adult cartoons. Animation lends itself to exaggeration; paired with science’s infinite capacity for innovation, it can be very entertaining. As Futurama’s Professor Farnsworth says, “I can wire anything directly into anything. I’m the professor.” Even live-action scientists seem to possess unrealistic talents, like the single-handed invention of time travel in Back to the Future and the construction of a teleporter in The Fly. Indeed, science fiction as a genre is so successful that it could be assumed there is little appetite for it to change. Recently, however, the stories of more specialised scientists have proved popular. Breaking


Bad is one of the most prominent television programmes of recent years, and is certainly the most prominent to feature science. Its protagonist, Walter White, is a rare depiction of a multifaceted scientist character. In Arrival, a film about communicating with aliens, the protagonist, a linguistics professor, is not a typical scientist. These two fictional characters are doing the work of science – manufacturing and researching, respectively – and this is not something usually witnessed in popular culture. The reason the vast majority of scientists in pop culture have their methods described vaguely, or omitted altogether, may lie with the fact that scriptwriters do not generally have scientific backgrounds. It can mean, unfortunately, that medical dramas have their gore explained in fascinating detail, and fictional athletes can provide blow-by-blow accounts of their championships, while the methods of their scientist counterparts are nowhere to be seen. Many screenwriters of The Simpsons have degrees in mathematics, allowing them to insert jokes from their field which are both funny and accurate, and the creators of Breaking Bad consulted a chemist to ensure the accuracy of the dialogue. It could only be beneficial if more writers followed these examples. There is, of course, the uncomfortable truth that the lives of real scientists are not very cinematic. Everyone knows that scientists can’t actually make whatever they want, but few people know what a scientist’s day actually consists of. The Top Gun lifestyle is

appealing for its adventure, but also for its simplicity: flying, and being good at it.

The uncomfortable truth [is] that the lives of real scientists are not very cinematic Polymaths are not without merit, of course, and will always be revisited in pop culture. It is thus a greater challenge to form a fictional scientist who is both specialised and interesting. With television being increasingly scrutinised and lauded for its quality, there has never been a better time to create characters with more niche disciplines. Breaking Bad is one of the most critically acclaimed television shows of all time, and strikes that elusive balance between real science and science exaggerated for entertainment value. The programme accomplished this without slipping into the realm of science fiction. The polymaths need to be balanced with scientific specialists. Realistic fictional scientists remain elusive within pop culture, which seems ridiculous at a time when it sometimes feels as if no story has gone untold. Indeed, these types of narratives are already having an effect: anecdotally, Breaking Bad has inspired a few young chemists. Genevieve Brown is a third-year Chemistry student at the University of Edinburgh


The scientist look Helena Cornu contrasts the image of scientists in popular culture with life in the lab because all scientists are not, in fact, socially challenged, workaholic geniuses There is a question, inevitably raised during family gatherings, which provokes fear and anxiety in the minds of students: “So, what are you going to do after you graduate?” Thankfully, I have an answer all prepared. “Research,” I say confidently. Only recently did I realise that my perception of what life as a researcher entails came entirely from what I had seen on TV, and that actually, I hadn’t a clue what that word meant. For most of us, popular culture, and television in particular, provides an insight into environments we would otherwise know nothing about. Crime shows and legal dramas, for example, give us an idea of what the life of a detective or a lawyer might be like, but based on typical portrayals of scientists on screen, you would expect every lab to be filled with socially challenged, eccentric, workaholic white males. The common factor in the stereotypes of scientists in popular culture is their depiction as outsiders. In a recent publication aiming to break such stereotypes down, Antonio Tintori of the Sapienza University of Rome explains that this is linked to the co-evolution of the image of scientists with the words ‘geek’ and ‘nerd’. It is rare to find a scientist on television who isn’t completely defined by their work and their understanding of technical jargon, or whose function in the story isn’t related to their scientific pursuits. People usually imagine scientists in white lab coats, set apart in dress as well as in mind. This reinforces a common perception that scientists are out of touch with reality, prodigies who are isolated because of their social awkwardness, and make up for it by throwing themselves unreservedly into their work.
A prime example of this is Temperance Brennan, the hyper-rational forensic anthropologist in the television series Bones, whose commitment to facts and evidence over feelings leaves her distant and socially clueless. What about The Big Bang Theory? Here is a show where accurate scientific content meets pop culture, showing scientists in their life outside the lab. Though the show does negate some stereotypes, it also reinforces others, for instance specifically associating scientific aspirations with geek culture. The show’s

Illustration by Rachel Berman

scientists are vulnerable, awkward, and socially challenged geniuses, isolated by their knowledge and geekiness. Kaley Cuoco’s character Penny, a ‘Wendy to the Lost Boys’, is what makes the scientists approachable to the audience, with the knowledge gap between Penny and the scientists repeatedly used for comic relief.

People usually imagine scientists in white lab coats, set apart in dress as well as in mind So, in spite of all the scientists I had seen on television, I didn’t have an accurate idea of what a typical day of research looked like. I imagined there to be a flurry of activity, scientists jumping from one experiment to the next with the swish of a lab coat, fuelled by a succession of eureka moments. I imagined long days with few breaks, incompatible with family life. I was worried that I would never fit in, that my knowledge of science wasn’t encyclopaedic enough, that I wasn’t enough of a nerd. Then, as part of my year abroad in Sweden, one of the courses offered was a 10-week internship in a research lab of my choice. I decided to take the plunge.

What I found was far from what I had imagined. The lab is quiet and relaxed. At any given moment, few researchers are conducting experiments. The ones that are have earphones in, phone tucked into the back pocket of their jeans, lab coat hanging on a hook in the corner. The office always contains food that people have either baked or brought in for meetings and coffee breaks. People are friendly and eager to share their insights or discuss experimental issues, because they will readily admit that their scientific knowledge isn’t exhaustive. More importantly, the people who make up the lab are incredibly diverse, and it surprises no-one to hear Spanish, Mandarin or Hindi spoken rather than English or Swedish. Half of the researchers are women, most are young. I found people who are passionate and dedicated, but also open and funny, each with a unique set of interests. The one thing they all share, however, is the capacity to look down a microscope at a developing embryo or a stained tissue and say “wow”. Or alternatively, “Huh, that’s weird”. In that sense, at least, we are most definitely nerds. Helena is a third year Biomedical Science student, currently on a year abroad in Uppsala, Sweden



The star who brought us Wi-Fi Carlos Martínez-Pérez tells the story of how a Hollywood starlet became the unlikely mother of modern communications Have you ever heard of Hedy Lamarr? Unless you are a fan of the ‘Golden Age of Hollywood’, chances are you haven’t. Lamarr was an Austrian actress who, after a film debut so scandalous it was condemned by the Pope himself, ended up becoming a huge Hollywood star in the 1930s and 40s. Often referred to as “the most beautiful girl in the world”, Hedy’s stunning looks became so iconic that she served as the inspiration for both Disney’s Snow White and the original Catwoman. However, over the last 20 years we have learned that that was not the whole of Hedy’s story. Although she loved making movies, Hedy was rather bored by her lavish and glamorous Hollywood lifestyle. Instead, she devoted her spare time to her lifelong passion: inventing. As a child, she had spent hours dismantling and putting back together her music box and radio, and when she grew disillusioned with Hollywood she turned back to inventing, even setting up a workshop with a drafting table, equipment, and a reference library in her Beverly Hills mansion. Hedy’s one-time boyfriend, film director and aviation magnate Howard Hughes, was also a part-time inventor and once put two chemists at her disposal to help her develop an idea for a soluble bouillon-like tablet to turn water into soda. Hughes also took on Hedy’s ideas for a new airplane wing design, which took inspiration from her study of animal anatomy to improve aerodynamics. As World War II escalated, Hedy became distressed about the conflict tearing apart her homeland. In 1940, the deadly German bombing of a passenger ship evacuating children to Canada spurred Hedy to take matters into her own hands. And she had some good ideas on how to help the Allied cause.
Hedy was aware of the serious limitations of the antiquated British and American torpedo technologies: about 60% of their fired torpedoes turned out to be duds, missing their targets or even failing to go off. Hedy also had some knowledge of torpedo technology through darker connections in her past: as a teenager, and despite her Jewish heritage, Hedy had married Fritz Mandl, a very wealthy weapons manufacturer and notorious fascist sympathiser. Although

16 Spring 2018 |

Illustration by Rebecca Holloway

Hedy quickly realised her mistake in marrying him, she found herself trapped in what she called “a golden cage” and, while plotting her escape, was forced to play hostess at her husband’s parties. Dismissing Hedy as just a trophy wife, Mandl’s associates would discuss the latest advances in missile research in front of her, including the development and improvement of radio control. Of course, forever curious and a willing learner, Hedy took in all that information.

The deadly bombing by Germany of a passenger ship spurred Hedy to take matters into her own hands Half a decade later, she came up with an even better idea. Hedy knew that simply radio-controlling weapons was no longer enough: once enemies figured out what signal frequencies were being used, they were able to interfere with the frequency to hijack or stop them. She devised a way of circumventing this: the signal could continuously jump between a range of different frequencies following a unique pattern, known only to the transmitter guiding the torpedo and the receiver in the torpedo itself. Hedy called this new technology ‘frequency hopping’. It made for a

very accurate missile guidance system, whose course could be constantly updated towards a moving target. More importantly, the signal would be ‘jamming-proof’, as only someone with the exact pattern of frequency changes would be able to interfere with it. Hedy could have drawn inspiration from the Nazi idea of using different frequencies to simultaneously guide several torpedoes towards one target. In order to put her idea into practice, Hedy enlisted the help of her friend George Antheil. The New Jersey-born child of German immigrants, Antheil was a pianist and avant-garde composer who had some ‘tinkering’ experience of his own, having tried to automate and synchronise pianos for some of his performances. George also felt compelled to help the war effort, as his diplomat brother Harry had been one of the first American casualties of the war when his airplane was shot down by Soviet troops in 1940. Together, they put her ingenious idea for frequency hopping into practice. For this, George drew inspiration from his knowledge of player pianos, or pianolas. Like these, their device included small rotating rolls of perforated paper – working as a rudimentary binary code, not unlike the perforated cards used in early computing – to switch different frequencies in the transmitter and receiver on and off. The device included 88 different frequencies – coincidentally the number of keys on a piano – and was the size of a pocket watch.
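For readers who like to tinker, the core idea of frequency hopping can be sketched in a few lines of code. This is a loose, modern illustration rather than the Lamarr and Antheil mechanism itself: a shared pseudo-random seed stands in for the matching perforated paper rolls, and all names are illustrative.

```python
import random

N_CHANNELS = 88  # the patented device used 88 frequencies, one per piano key

def hop_sequence(shared_key, n_hops):
    """Derive the channel-hopping pattern from a key known only to the
    transmitter and receiver (a PRNG seed standing in for the paper roll)."""
    rng = random.Random(shared_key)
    return [rng.randrange(N_CHANNELS) for _ in range(n_hops)]

# Transmitter and receiver share the key, so their patterns line up exactly...
tx = hop_sequence(shared_key=1942, n_hops=20)
rx = hop_sequence(shared_key=1942, n_hops=20)
assert tx == rx  # every hop lands on the same channel at both ends

# ...while a jammer guessing a different key rarely finds the right channel,
# and can never block the whole transmission by sitting on one frequency.
jammer = hop_sequence(shared_key=7, n_hops=20)
overlap = sum(t == j for t, j in zip(tx, jammer))
print(f"Jammer hit {overlap} of 20 hops by chance")
```

Real spread-spectrum systems replace this toy pseudo-random generator with carefully designed hopping sequences, but the principle is the same: without the pattern, an eavesdropper or jammer sees only brief, scattered bursts on ever-changing frequencies.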

In 1942, Hedy and George obtained a patent for their ‘secret communication system’, which they donated to the US Navy. However, despite its initial interest, the Navy failed to pursue its application, deeming it too difficult. For many years, that was all: the idea was mostly forgotten and, disappointed, Hedy and George went on with their lives and artistic careers.

She experienced some of the respect she deserved not only for her glitzy years on the silver screen, but for her incomparable contributions to engineering and communication sciences

Nevertheless, the patent was kept and, over the years, naval engineers started to apply its principles. In 1954, frequency hopping was included in a new design for buoys. By the 1960s, the technology was incorporated into the USS Mt McKinley, a ship involved in the Cuban missile crisis, and in surveillance drones during the Vietnam War. It was at this stage that engineers incorporated chips to allow for digital, much faster switching of frequencies. This updated technology became known as ‘spread spectrum’ (SS). SS finally emerged from secrecy in the 1970s and, by the early 1980s, a campaign started to advocate for its use in communication technologies. Researchers had realised that frequency hopping could allow many devices to operate in the same environment without interference, as long as they used different frequency patterns. The rest is history: once the use of SS was authorised, more applications were developed, eventually leading to the advent of Wi-Fi, Bluetooth, GPS, and satellite and mobile communications. By this time Hedy had long retired from acting, having quit Hollywood in the 1950s. The industry and audiences had largely forgotten her talent and box office hits, and she was remembered only for her looks. Aware of this, Hedy often said that her beauty had also been her “curse”, which disappointed her greatly because she believed that “people’s brains are much more interesting than their looks”. That was definitely true in her case.

In the decades since quitting acting, Hedy had conserved her curiosity and interest in invention, and she was well aware of the different applications that had been developed from her frequency-hopping idea. Seeing how her Hollywood legacy had been blurred by the short memory of a shallow and sexist industry that had defined her only by her beauty, Hedy became somewhat bitter that her remarkable intellectual achievements were not being recognised. This changed in the 1990s, when David Hughes, a retired Army colonel and wireless communication enthusiast, finally made the connection between Hedy Lamarr and frequency hopping, which had largely gone unnoticed due in part to the patent being filed under her married name, Hedy Kiesler Markey. He started a campaign to gain Hedy her well-deserved recognition, and in 1997, at the age of 83, she received both the Electronic Frontier Foundation Pioneer Award and the Bulbie Gnass Achievement Award. At last, before passing away in 2000, she experienced some of the respect she deserved, not only for her glitzy years on the silver screen but for her incomparable contributions to engineering and communication sciences. She even got her own Google Doodle to commemorate what would have been her 101st birthday! Rightly, this didn’t just celebrate her years of stardom,

but rather highlighted her moonlighting as an amateur scientist and the extraordinary results of this work. In 2004, both Hedy and George Antheil were posthumously inducted into the US National Inventors Hall of Fame. The revelation of this other side of Hedy’s remarkable life has certainly transformed her legacy in recent decades. Recent years have seen the production of TV specials, plays, and books about her, with a documentary and miniseries about her life and inventions to be released this year. Hedy’s pop culture influence has gone from that of a half-forgotten beauty icon to the ultimate example of how a book should never be judged by its cover. The world has finally learned of the achievements of a woman who, in an effort to help fight the Nazis, made an invaluable contribution to technology: one that had applications for military defence but, more importantly, revolutionised the way we communicate and helps people stay connected every day. All because Hedy Lamarr decided to put her ingenuity to work.

Carlos Martínez-Pérez is a postdoctoral researcher in breast cancer

Illustration by Rebecca Holloway

Spring 2018 | 17


Science needs you

Clari Burrell discusses the active involvement of the public in scientific research

Comet hunting, penguin counting, and butterfly spotting are just some of the ways you can contribute to scientific research from the comfort of your living room or back garden. Citizen science, scientific research conducted by people other than professional scientists, is helping to power ever larger and more in-depth studies. Projects which were previously unfeasible, due to the sheer scale of observation needed or the processing power required for analysis, are now becoming possible. Scientists can upload projects to host websites, such as Zooniverse, where hundreds of thousands of volunteers then choose what they would like to be involved in, usually submitting data from first-hand observations like wildlife surveys, or sifting through pre-gathered data like photographs of outer space. Engagement varies from people who occasionally dip in to try something fun to so-called ‘super users’ who spend hours each day searching, tagging, logging data and working out puzzles. There are tasks of varying levels of difficulty within projects, but none require scientific expertise. It is quite thrilling to think that your contribution could be a small step towards a big new discovery. Encouraging people to take an active part in research also helps to demystify the scientific process and opens up lines of communication between science professionals and the wider public.

Projects which were previously unfeasible [...] are now becoming possible

To this end, scientists are increasingly using interfaces designed in computer game style to communicate with a wider audience in a process known as ‘gamification’. This approach can be used to encourage citizen scientists to contribute to research projects, or as a teaching tool to spread knowledge and understanding. You can brush up on the history of science by playing The Odyssey or explore the ecosystem of a tree in Botanicula, help map the brain by playing Eyewire or solve protein folding puzzles with Foldit.


Illustration by Lana Woolford

Craig Docherty, a postgraduate research student at the University of Stirling, works on the application of gamification to science education, outreach, and engagement with a focus on plant health. I asked him what value he sees in gamification, both for scientists and for non-specialist participants.

CD: The value for scientists would be gathering data and encouraging participation. If you look at projects such as Eyewire and Foldit, you see members of the project community delivering results. For the participants there will be a range of benefits. For some it will be a case of providing a community where they can be active, share their knowledge, interact with others. Yet for others there will be an appeal to their desire to collect things like points, badges, reputation etc. So there will be a wide range of benefits for the participants, from personal gain, sharing their expertise, and learning new information themselves.

I was also interested in whether there may be drawbacks of using gamification as a way to engage the public with scientific research.

CD: There are some pitfalls. Primarily there's the possibility of gamification being introduced to a scenario where it isn't well received by the participants. With such a wide

range of potential implementations there's always the risk of not selecting the right elements for the job. Even a successful implementation won't attract or encourage everyone. However, it is still a useful tool for driving engagement and participant satisfaction.

So, as with any new approach, there are teething issues. Scientists’ careers and reputations rely on the quality and reliability of the results they publish. With thousands of people with little to no scientific background submitting data for a study, checks and balances must be built into the design of citizen science games and tasks to ensure false data or any mistakes do not find their way into publications. Games can be a wonderful tool for learning, but they must also be both informative and engaging. These first steps towards tapping into the vast resource of human curiosity are exciting though. In the age of big data there is the potential to truly deepen our collective understanding of how our world and universe work, how we think, and how we relate to one another. Working together, we can achieve great things.

Clari Burrell is a third year PhD student in Molecular Plant Biology


Fake news: the state of science in a post-truth world

Tom Edwick looks at what the recent return of ‘flat-earthers’ can tell us about the state of science today

If there was one term that you could argue defines these modern times, it would have to be ‘fake news’. The people over at Collins dictionary would agree with you, having named it their 2017 ‘word’ of the year, and for good reason. Some claim that fake news played a deciding role in the 2016 US Presidential election, with social media and the internet providing a platform for an unprecedented spread of false information. In the run-up to the vote, fake news stories were shared more widely than real ones from the mainstream media. Reportedly, many people believed these stories, which mostly favoured Trump, leading to speculation that fake news had a hand in electing him.

In this post-truth world, there appears to be a worrying trend of anti-intellectualism

In science, the story is very much the same: the internet has provided a platform for fake science news. Whilst pseudoscience is nothing new (think Victorian-era businessmen peddling potions to cure all ailments), social media websites and online blogs provide a platform for a whole host of modern pseudoscience, ranging from phony cure-all medicines to wacky conspiracy theories. One conspiracy theory that has gained traction since the 1990s is the presence of ‘chemtrails’: chemicals supposedly deposited into the atmosphere by government aeroplanes for the purposes of geoengineering. The theory claims the deposits are polluting our environment and damaging our health. Though this theory has been thoroughly debunked by the scientific community (the trails left by planes are simply exhaust condensing in the atmosphere), you can still find endless internet forums and videos from popular YouTube commentators that claim to provide proof of the damage caused by chemtrails. As recently as 2016, there was a scientific paper published on the matter, though

Illustration by Alyssa Brandt

it was retracted only a few months later. This movement of pseudoscience to the mainstream shows the power of the internet to popularise fringe ideas, however incorrect they may be. This is nowhere more evident than in the recent resurgence of the ‘Flat Earth’ theory, which states exactly what you think it does. According to the Flat Earth Society (yes, there is an actual society), the Earth is a disc with a large ice wall at the rim, and the sun, stars, and moon all rotate above a central point over the North Pole. Since 2015, the theory has gained ground in popular culture, with Google searches for ‘flat Earth’ tripling in that time. This has been helped in part by celebrity figures getting behind the movement, showing their support publicly on social media. A rapper known as B.o.B even went so far as to set up a GoFundMe page to raise funds for sending a satellite into space to answer the question once and for all. It hasn’t taken off. At this point, you might be forgiven for thinking that pseudoscience does not matter, and that it is only caused by a bunch of conspiracy theorists who harm no-one. However, pushing fake science can lead to dangerous consequences. In 1998, a study, published in the Lancet by Dr Andrew Wakefield and 12 others, claimed that there was a link between the measles, mumps, rubella (MMR) vaccine and the development of autism. The link between autism and vaccines has since been comprehensively disproven, and it was discovered Wakefield was being funded by a lawyer associated with an anti-vaccine group. Furthermore, the

paper was the result of poor, unethical, and fraudulent practices. However, the damage was done. The paper was not fully retracted from the journal until 12 years after publication, and the consequent rise of the anti-vaccination movement citing this work has led to measles outbreaks in the UK, the US, and Canada, due to a lack of vaccination. It is worrying that dangerous ideas like these persist, despite the overwhelming contrary evidence. In this post-truth world, there appears to be a worrying trend of anti-intellectualism. As a result, strong scientific consensus is being disputed, for instance by those who reject the conclusion, shared by 97% of climate scientists, that climate change is likely human-caused. Unfortunately, climate change denialism has made its way into government, with one US senator famously bringing a snowball into the US Senate as proof against global warming. The troubling fact of scientific denialism is how it warps the debate to give both arguments equal weighting, despite evidence in clear favour of one or the other. This results in a platform for fringe ideas to become mainstream, much like the conspiracy theories mentioned earlier. Perhaps the resurgence in conspiracy theories is just an indicator of a wider distrust of intellectualism, and can tell us a lot about the state of science in this post-truth world.

Tom Edwick is a third year Biology student at the University of Edinburgh



It’s a pug’s life

Kate Downie explores the veterinary health concerns linked to brachycephalic dog breeds, such as the pug

Thunderous wheezing suddenly filled the peaceful waiting room of the vet practice. I looked around, dazed, to find the beast that was causing the stertorous noise. There sat a tiny dog, no larger than a loaf of bread, eyes bulging from the sides of his head and tongue curled up to his nose, gasping for breath. My German shepherd, totally shocked at the noise created by this pocket-sized pug, looked up and made eye contact with me. I wish I could begin to explain to him this human creation, but I can’t. There are now over 200 dog breeds in the UK, all of which originally descended from the wolf 15,000 years ago. Breed standards are set and maintained by the Kennel Club, which also acts as the governing body for canine events and the national register of pedigree dogs. Controlled breeding was first introduced by the Kennel Club during the Victorian era, when people kicked farm animals out of cities and began

Illustration by Vivian Uhlir


allowing dogs into their family homes. Initially bred for hunting, guarding, and herding, their purpose shifted towards aesthetically pleasing lap dogs. Those with short skulls and flattened faces are known as brachycephalic breeds. Brachycephalic refers to a flat and wide skull shape, as opposed to dolichocephalic breeds, which are the longer-nosed breeds, such as my German shepherd. Brachycephalic breeds include pugs, Cavalier King Charles spaniels, and English and French bulldogs, and our obsession with them is on the rise. This astonishing growth is illustrated by the tripling of the number of pugs born in 2017 compared to 10 years ago. Kennel Club figures predict that by the end of 2018, French bulldogs could be the most popular dog breed in the UK, knocking the Labrador off the top spot for the first time in 27 years. The appeal of their squashed wrinkled faces has led pugs to become social media

sensations. One popular account, ‘Doug the Pug’, has over 6 million likes on Facebook and 1.6 million followers on Instagram, receiving thousands of likes every time a photo is released of him posing in a cute outfit. Doug is simply a family pet who has gained his vast following through the public’s love for his squished face and adorable miniature outfits. Pugs are also in the spotlight thanks to celebrity owners, including Paris Hilton, Hugh Jackman, George Clooney, and even Dwayne ‘The Rock’ Johnson, who use them as fashion accessories on the streets of Hollywood and endorse them on social media. Pugs have also starred in blockbuster movies such as Men in Black (1997) and Kingsman: The Secret Service (2014), making their abnormal facial features desirable to the public eye. All this may seem like harmless fun, but many people are totally unaware of the tidal wave of problems that have been created for this ‘adorable’ breed.

One of the most significant problems is brachycephalic obstructive airway syndrome (BOAS). By shortening the skulls of these dogs through years of selective breeding, a number of anatomical abnormalities have developed. These increase resistance to airflow, making breathing and controlling body temperature difficult. One such abnormality arises because the soft tissue of the muzzle is not shortened in proportion to the underlying bone. This mismatch causes the soft palate to become too large for the oral cavities of these short-skulled breeds. As a result, the overlong soft palate obstructs the flow of air to the trachea and causes turbulent flow in the larynx. Often combined with stenotic nostrils and a hypoplastic trachea (which further increases resistance to airflow), snoring, panting, overheating, exercise intolerance, cyanosis, retching, and disturbed sleep have become characteristics of these breeds rather than clinical signs of disease. These primary disorders can then lead to further laryngeal collapse and everted laryngeal saccules. Research from the University of Cambridge has shown that 60% of pugs suffer from BOAS, and the Kennel Club claims that only 7-15% of pugs breathe like normal non-brachycephalic dogs. This not only puts these dogs at much greater risk during anaesthesia, but also makes daily life a struggle.

The appeal of their squashed wrinkled faces has led pugs to become social media sensations

Current treatment for BOAS includes an early intervention surgical correction, which reduces the incidence of laryngeal collapse. Although BOAS surgery can often cause a significant improvement in exercise tolerance and ease of breathing, it will never form a completely normal airway. Hence these dogs will remain vulnerable to heat stress and experience reduced exercise capabilities. Whilst laryngeal surgeries can also be performed, their success is limited, and it is much safer to instead address the concurrent problems of BOAS. Surgical treatment combined with increased anaesthetic risks for our flat-faced friends makes me

question how we have not yet realised how serious the problems for these breeds are. Their breathing is not the sole health concern. The characteristic protruding pug eyes increase their chances of developing corneal ulceration by 19 times and put them at higher risk of dry eye, both of which can lead to blindness. Dogs that are unfortunate enough to suffer from keratoconjunctivitis sicca are faced with a lifetime of tear replacements 4-6 times a day, or a tear stimulant (cyclosporine) for life. Overbreeding has even led to a progressively fatal inflammatory disease of the central nervous system called Pug Dog Encephalitis. Even their screw-shaped tail can lead to painful spinal conditions. The list of diseases goes on, leaving pugs and other brachycephalic dogs predisposed to a life of disease and welfare concerns which would not otherwise occur naturally. Many vets believe it is time for us to take responsibility for the health problems brought about by breeding, and make dramatic changes in the way we select dogs for breeding. The president of the British Veterinary Association, Sean Wensley, said, "Prospective owners need to consider that these dogs can suffer from a range of health problems, from eye ulcers to severe breathing difficulties. We strongly encourage people to choose a healthier breed or a crossbreed instead." These warnings have also been backed up by the Royal Society for the Prevention of Cruelty to Animals, the People’s Dispensary for Sick Animals, the Royal Veterinary College, and the Kennel Club. At a veterinary congress in Copenhagen in September 2017, it was concluded that vets should “speak out” about issues surrounding brachycephalic breeds as well as approaches to tackling them. The first approach could be to ban breeds with serious health issues. A recent petition to ban brachycephalic breeds collected around 40,000 signatures. However, in practice, banning breeds would be very difficult to do.
Nevertheless, insurance companies are already changing policies based on breeds. In Scandinavian countries, they exclude cover for respiratory and skin conditions in brachycephalic dogs, discouraging people from buying pugs. Secondly, we could create a method to carefully select dogs suitable for breeding. The Pug Breed Council recently launched a pug breed health scheme with five health tests that could be used to eliminate unhealthy dogs from breeding. Many believe this could be enforced by prosecuting those who

do not adhere to these standards. In addition, vets urge that the Kennel Club must revisit its breed standards to secure the future health of such breeds. This could help steer top-end breeders away from aesthetics and towards longer, healthier skulls. Nonetheless, figures presented at global veterinary congresses suggest that only around 3-8% of dogs worldwide are registered with kennel clubs, so a change to breed standards alone would potentially not be influential enough to make a significant difference.

Vets must stand at the forefront of this change to open the eyes and ears of the public

Perhaps the most important change would be to change public opinion on these kinds of dogs. We need to educate the public about these health problems, and help them understand why we can’t continue to breed pugs for looks without any regard for health. Furthermore, celebrities and the media must cease to endorse pugs as ‘cute and healthy’. Instead, we must use images of pugs to highlight the dangers of unethical breeding. Campaigns have been set up to stop companies using such advertisements. The Veterinary Record and In Practice no longer accept adverts that use images of bulldogs or pugs to promote non-breed specific products, and instead use them educationally to illustrate health issues for their readers. Finally, vets must stand at the forefront of this change to open the eyes and ears of the public. These breeds only exist due to human intervention in dog breeding, and with the growth in brachycephalic popularity, their breeding has reached a tipping point. There is little joy in performing the surgical operation to correct BOAS in these dogs knowing that more pugs will just come in day after day if we don’t change our ways. The future of these dogs relies on people seeing their eyes popping from their squished heads not as adorable, but as a need for change.

Kate Downie is a fourth-year Veterinary Medicine and Surgery student



The trial of a Time Lord

Andrew Bease examines the evidence to determine whether the fictitious race of time travellers could potentially exist

When Sydney Newman created Doctor Who in 1963, an educational science fiction programme about a man called The Doctor, he probably didn’t imagine that 50 years later, new episodes would still be in production and that it would become one of the most popular television shows on the planet. It has become so ingrained in British culture that one need not have even seen the show to recognise its theme song, the TARDIS (a time machine disguised as a blue police box), or Tom Baker’s iconic portrayal of the Fourth Doctor with the multi-coloured scarf. I fondly remember my first real experience of Doctor Who as a child, seeing props of the Daleks, their creator Davros, and The Doctor’s robotic dog K-9 at an exhibition in the Nairn Museum. Over the years, Doctor Who has featured some fantastic, and some equally bad (or misinformed), examples of science in its episodes. Most viewers will likely never

Illustration by Vivian Uhlir


consider whether the Time Lords, the alien race to which The Doctor belongs, could ever exist, dismissing them merely as fiction. The truth, however, is often stranger than fiction, and so here we examine the various aspects of Time Lord biology to separate real science from products of the imagination.

Experiments using frogs demonstrated that when fusion of the endocardial heart tubes was prevented, adult frogs developed two independent hearts

While Time Lords resemble humans in their outward appearance, the most obvious physical characteristic that differentiates them is that they possess two hearts instead of one. All vertebrates have only one heart, and most invertebrates do not have true ‘hearts’ since they have ‘open’ circulatory systems, meaning that their blood is not enclosed within blood vessels, as it is in vertebrates. The simplest animals, such as flatworms and jellyfish, do not have hearts at all. Cephalopods, the group of animals comprising octopuses, squids, cuttlefish, and nautiluses, are unlike other molluscs in that they have closed circulatory systems and actually have three hearts. However, unlike in the Time Lords, these three hearts perform the function of one, with two branchial hearts sending deoxygenated blood to the gills and a systemic heart that supplies the body with oxygenated blood. In vertebrate embryos, two heart

tubes are initially present, which then fuse into one. Experiments using frogs demonstrated that when fusion of the endocardial heart tubes was prevented, adult frogs developed two independent hearts. This suggests that a Time Lord’s binary cardiovascular system may develop if heart fusion was naturally prevented during embryonic development.

Another seemingly fictitious trait that Time Lords possess is an unusually extended life span. The Doctor has lived for over 2000 years, and according to the Second Doctor, Time Lords could live “practically forever, barring accidents.” Meanwhile, the oldest recorded human, Jeanne Calment, lived for only 122 years and 164 days. However, there are many examples of organisms that are capable of living far longer than humans. Tortoises are commonly known to live into their hundreds due to their sedentary lifestyle and slow metabolism and, more recently, Greenland sharks have been found to live up to 392 years, which may be the result of a combination of slow metabolism and the activation of genes involved in protein folding and DNA repair caused by the low temperatures of the Arctic Ocean. Both of these examples pale in comparison to the Great Basin bristlecone pine, of which one tree has been recorded as being over 5000 years old. Time Lords supposedly have a body temperature of only 15°C, which perhaps could result in a slower metabolism extending their life span, although The Doctor appears to be too physically active for this to be the case.

Perhaps the most well-known trait of the Time Lords is their ability to regenerate after being subjected to conditions that would kill a normal human. The concept of regeneration was introduced in Doctor Who in the serial The Tenth Planet as a way of allowing the series to continue without actor William Hartnell, who played the First Doctor, as he was suffering from ill health at the time.
While the process of regeneration is depicted as a complete change in the physical appearance and personality of a Time Lord, the ability to regenerate damaged tissue is common in all animals, although not to a great extent in mammals and birds. Remarkably, urodele amphibians, such as newts, are capable of completely regenerating lost limbs and jaws as well as damaged retinas and sections of the heart. During this process of regeneration, it has been shown that Time Lords are capable of changing sex, with The Doctor himself becoming female for the first time in the episode

Twice Upon a Time. Most animals cannot naturally change sex, but this ability is common in fish, gastropods, and some plants, and is known as sequential hermaphroditism. Organisms can be either protandrous, changing from male to female, or more commonly protogynous, changing from female to male. Clownfish species are protandrous and live in social groups with a breeding pair and non-breeding males. When the female dies, the breeding male gains weight and becomes female while the largest non-breeding male sexually matures. Many species of wrasse are protogynous and change sex depending on age rather than social hierarchy. It is thought that sequential hermaphroditism is a useful reproductive strategy as it can reduce competition for mates or allow individuals to change sex depending on the age range in which each sex is most fertile.

Considering the evidence, it would appear that Time Lords could exist to some degree in real life, although they would still be very different from their on-screen counterparts

While the above examples provide good evidence for the existence of a real-life Time Lord or some similar organism, some physical attributes of Time Lords are clearly not scientific. Perhaps one of the most controversial pieces of Doctor Who lore amongst fans is the Eighth Doctor’s claim that he is half human. Hybrids are not uncommon in nature as bacteria can transfer DNA between unrelated species and some plants, such as the genus Rhododendron, extensively hybridise. However, animal hybrids are much less common and are often infertile. This is firstly because animals must be sufficiently closely related to conceive hybrid offspring, and secondly, the parent species must have an equal number of chromosomes with a high enough degree of similarity so that meiosis, a form of cell division that produces eggs and sperm, can occur in

the offspring’s gonads, allowing them to reproduce. Although there are fertile animal hybrids, they are rare and usually do not occur naturally. Given that one of The Doctor’s first companions was his own granddaughter, this would suggest that he is not infertile, and although it is still possible that he could be a human-Time Lord hybrid, the chances of two species from different planets having an equal number of homologous chromosomes would be incredibly small in reality. In addition to this, the Time Lords’ ability to change into another species to disguise themselves in a matter of minutes is pure fiction, regardless of the fact that they use advanced technology to do so. Speciation is an evolutionary process that occurs over successive generations, and while new gene editing techniques are currently being developed for medicinal purposes, these are designed to correct or replace faulty genes, not insert an entire set of chromosomes into every cell in the body. The attributes discussed above are only a few of the main characteristics that distinguish Time Lords from humans. Considering the evidence, it would appear that Time Lords could exist in real life to some degree, although they would still be very different from their on-screen counterparts. Doctor Who has been in the hearts and minds of the British public for more than half a century now, and on many occasions the show has used good science, though not always intentionally, as part of its story and character development. However, as with all science fiction, the writers do take artistic licence from time to time. Any science fiction fan would do well to be sceptical of the science they are presented with in their favourite TV show or movie, and doing some research would certainly not go amiss. After all, you might even learn something, and perhaps in doing so we can thank a show like Doctor Who for helping to bring real science into people’s lives over the last 50 years.
I can only hope it continues to do so for the next 50 years. Allons-y! Andrew Bease is a Whovian and third year Infection and Immunity PhD student at the Roslin Institute



Superhero origin stories: fact or fiction?

Dianna Bautista explores the plausibility of becoming a superhero

From the mid-1930s to the 1950s, people around the world were treated to crime-fighting vigilantes, aliens from Krypton, and radiation-exposed scientists who become large and green when angered. This was the golden age of comic books. Like much of science fiction, the science presented in superhero adventures is a unique combination of real and imagined science. This raises a common question in the minds of comic book readers and film viewers: can someone realistically become a superhero? This question actually has a complex answer when you consider the different definitions of a ‘superhero’. According to the Merriam-Webster dictionary, a superhero can be any, otherwise normal, human being who is “exceptionally skillful” or a “successful individual”. A popular example is the characters of Batman fame, including Batgirl, Catwoman, and Batman himself: Mr. Bruce Wayne, a vigilante who uses both technology and skills to protect his city. What Wayne lacks in superpowers, he makes up for in training and gadgets.

A common question in the minds of comic book readers and film viewers [is]: can someone realistically become a superhero?

The physiology of Wayne’s transformation from normal individual to skilled crime-fighter is explored in depth in E. Paul Zehr’s book Becoming Batman: The Possibility of a Superhero. The book not only details how Wayne’s training regime prepares his body for incredible feats, but also casts a realistic light on the limitations and demands of his training. In the comic books, Wayne can lift 690 pounds via the clean and jerk method, which involves lifting the weight to the chest then jerking it over the head, but struggles to deadlift 635 pounds, which involves lifting the weight from the ground to hip level while standing up. However, the current


Olympic record for a clean and jerk for someone in Wayne’s weight category is only 580 pounds. For the deadlift, the world record is 1,000 pounds, held by strongman Andy Bolton, who is said to be five to six times stronger than the average individual. Meanwhile, maximum lifting strength is estimated to scale roughly with the square of an individual’s height, giving Wayne a theoretical maximum lift of only 494 pounds. All this training puts stress on Wayne’s body. Furthermore, a common problem athletes face is a difference in bone mass density between heavily used muscular areas and other areas of the body. For example, their skulls tend to be less dense than those of their non-athletic friends. Thus, not only does Batman’s training cross a certain stress threshold, it also causes differences in bone mass density - a problem for Batman, who is often depicted jumping off buildings. The more common definition of a superhero, however, is an individual who possesses superhuman abilities. Under this definition, one factor to consider is that many of the origin stories and explanations for an individual’s superpower(s) are rooted in real science. In the X-Men series, the explanation for the development of superhuman abilities is a change in the person’s biology due to the presence of a gene called the ‘X-gene’. Biologically, genes encode information to create macromolecules called proteins that are, in turn, responsible for the expression of one’s characteristic ‘traits’. There are many types of proteins, and each protein uniquely influences how cells, and subsequently organs and the human body, function. Sometimes a gene undergoes a mutation, resulting in a modification of the encoded information. This leads to the production of an altered protein, which can change the way the cell works in either a positive or negative way.
A similar explanation was used by the writers of the X-Men series to explain the inheritance and expression of the X-gene mutation in the characters. In the Astonishing X-Men series, one of the characters states that the X-gene is mutated to produce ‘exotic’ proteins that change the functioning of the individual’s cells. The presence of these proteins alters cell-signalling pathways, causing cells to send

chemical signals that induce mutations in other genes. This domino effect eventually has an impact on the individual, causing them to develop abilities such as telepathy or control over electromagnetic fields. The abilities conferred by the X-gene mutation differ from person to person. With some exceptions, X-Men inherit their X-gene from their parents. The mutation then manifests itself when the mutant reaches adolescence. In genetics, certain genes can be ‘latent’ or inactive until something causes the gene to become ‘activated’ and start producing proteins. In the case of the X-Men, the X-gene is activated by puberty or, in some cases, trauma or stress.

Many of the origin stories and explanations for an individual’s superpowers are rooted in real science

Ironically, despite their roots in real-world science, creator Stan Lee gave the X-Men an inherited genetic mutation precisely to avoid having to explain the origin of their superpowers as exposure to radiation. This accounts for why the biological basis of the X-Men is only explained in later editions of the X-Men saga. However, the assertion that X-Men mutants are a separate species superior to Homo sapiens, as commonly depicted in the X-Men films, is a point of contention between real science and the imagined world. Given the variety of the X-Men’s mutations and abilities, it is hard to define the unifying set of traits necessary to identify a specific species. How is one supposed to find the commonalities between a shapeshifter, a woman who can control the weather, and a man with wings? Scientifically, a more likely explanation is that mutants are simply Homo sapiens carrying the mutated X-gene. Under this explanation, the X-gene could have originated as a rare mutation that was maintained in a small population of individuals who reproduced with one another, so that this


Illustration by Lucy Southen

mutation would remain in the population. In genetics, a gene can exist in alternative forms called alleles, including forms created by mutation. For example, if a gene contained the information for the colour of a flower, one allele could encode purple, whereas another could encode pink. For the X-gene, however, the presence of different alleles still cannot account for the sheer number and extreme range of abilities conferred by its mutation.

The answer is a theoretical yes if we consider the X-gene to be a set of multiple genes [...] that collectively encode for extreme variants of otherwise normal human traits

Many of the other ‘mutant’ abilities presented in the X-Men series are extreme, defying the laws of physics and/or biology. One example is Darwin,

a mutant who can instantly adapt and evolve to fit different scenarios, such as growing gills by putting his head in a tank of water. As humans, it is possible for an individual to acclimatise to different environments, but feats such as breathing underwater by growing gills would require many generations of evolution. Furthermore, in contrast to his namesake, Darwin’s ability goes against the theory of natural selection and instead echoes Lamarck’s discredited theory that individuals change their genes by using or not using a trait to adapt to a new environment. Under this theory, a giraffe’s neck is long not because longer-necked giraffes reproduced more than shorter-necked ones, but because, by continually reaching for leaves on higher branches, the giraffe’s neck gradually lengthened, creating or altering genes for longer necks. The problem with this theory is that it lacks a genetic basis; it is not possible to actively change gene content by continuously using a body part. Returning to the question of whether or not someone could realistically become a superhero, the answer, in theory, is yes; but only if we consider the X-gene to be a set of multiple genes, each with its own set of alleles that collectively encode for extreme variants of otherwise normal human traits. Under this definition, an individual with

the respective alleles could inherit a magnified version of one of the five senses. The X-Men character Peepers can see and identify objects outside the normal range of human vision. One explanation for this ‘telescopic eyesight’ is the presence of a higher-than-normal density of cells, called photoreceptors, that convert light into signals the brain interprets as an image. The more photoreceptors an animal has in its eyes, the greater its ability to identify objects at a distance; this ability is, in fact, a feature of owls and other birds of prey. Indeed, many of the more plausible mutant abilities reflect common traits of the animal kingdom. The world of comics and cinema is full of different superheroes, and though it is not realistically feasible to become Superman or Supergirl, there is a bit more hope in training like Batman, or a glimmer of hope of inheriting a magnified version of a normal human trait. In any case, the superhero stories first presented in the 1930s comics continue to enthral us all today. Dianna Bautista is a recent MSc Science Communication graduate from the University of Edinburgh



The perfect song? Alina Gukova looks for the elusive patterns in songs in a quest to find the ‘perfect song’ There is nowhere to hide from a hit song. It plays on the radio, pops up in TV ads and sometimes even infiltrates the world of the big screen. But how does a song become a hit? Is a hit good because it’s popular, or popular because it’s good? The ‘technical’ way of determining a hit is through weekly compiled charts based on purchases and downloads. Yet despite all the means of computation and analysis available today, the exact parameters that drive a hit’s purchases and downloads remain elusive. Even more puzzling is why some songs enjoy momentary success, only to be soon forgotten, while others survive the test of time and are still played on the radio 40 years after their release, revived by a movie or a remix. The International Association for the Study of Popular Music (yes, it exists), formed in the 1980s, has yet to come up with definitive answers. There are, however, some clues as to what makes a great song.

‘Mmbop, ba duba dop’ on repeat will do

Repetitive patterns, for example, are an obvious theme when looking for a hit. As explained by Elizabeth Hellmuth Margulis, the author of On Repeat: How Music Plays the Mind, patterns invite us to participate, even if initially we do not feel like it. We welcome this offer, with its promise of connection to a wider group of like-minded people. Moreover, with repetition, an aspect of familiarity also jumps into the spotlight. Familiar things are more attractive to us; they ‘grow on us’. In support of this theory, brain scan experiments performed at the University of Porto found that the brain’s limbic system, the control centre of emotions, and the reward pathway are more activated by familiar music than by unfamiliar music. Thus, it seems a song doesn’t have to be sophisticated, or tick certain boxes for artistry and meaningfulness, to be popular. ‘Mmbop, ba duba dop’ on repeat will do, as long as we can easily recognise it.


Social aspects inevitably also influence a song’s popularity. In a study by sociologist Duncan J. Watts and colleagues, participants were asked to rate songs while being able to see other participants’ results. Under these conditions, the songs that already had votes gained even more votes. The study concluded that participants succumbed to group influence when presented with too many songs (48 in total); apparently, carefully listening and deciding for themselves wasn’t worth the effort. Following the group not only saved energy, it also allowed participants to gain more common ground with each other. However, this said little about the songs’ quality: while the effect persisted within a group, it did not always translate from one group to another. In other words, different groups showed preferences for different songs. This circles us back to the hit chart system, illustrating a potential self-amplifying feedback loop that propels a song to ‘hit’ status. Another ingredient for a popular song is the right lyrics. A study from North Carolina State University revealed the most common themes in hit songs between the 1960s and the 2000s; over the decades, topics gradually moved from nostalgia and rebellion, through loss and confusion, to inspiration, but also desperation. Accompanying these gloomy undertones, the most influential words across the whole timeline were found to be ‘love’, ‘time’ and ‘man’. These findings seem hardly surprising, as songs tap into our

Image by Spencer Imbrock courtesy of Unsplash

emotions and provide a sense of belonging and comfort in times of hardship. But the emotional response can also be detrimental in extreme cases. One case, documented in Japan, described a young woman who suffered a series of seizures (so-called ‘musicogenic epilepsy’) triggered by American pop songs, particularly Mariah Carey’s Dreamlover, because they brought back painful memories. Hence, universally meaningful lyrics are clearly a powerful aspect of popular songs.

This [...] illustrates a potential self-amplifying feedback loop that propels a song to ‘hit’ status

Thus, for a potential hit, simplicity, repetition, emotional relatability and the potential for group bonding seem like a good starting set of guidelines, even though all of that can be overridden by seemingly random group preferences. But no matter how much we would like to plot our musical tastes on a nice, clean graph, it is far from straightforward. There are simply too many factors. And what’s the point, anyway? Just listen to whatever makes your heart sing. Alina Gukova is a third year PhD student in Biochemistry and Structural Biology


What you wear tells a story: make sure it’s a good one Milda Lebedyte explores how science can help fashion become more sustainable What we wear is a direct reflection of our personal identity and expression. Just as the food industry will never be obsolete, because we depend on it for survival, there is little reason to believe the fashion industry will ever disappear. In food production, science has historically played a big role in maximising crop yields through genetically modified organisms; considering that the fashion industry uses as much as 25% of all globally produced chemicals, it is likely that here, too, science can contribute to innovation and sustainability. The emergence of fast fashion, fuelled by modern consumerist culture, has led high street brands to make cheaper and cheaper clothing at the expense of exploited workers. As the second most polluting industry after oil, fashion poses an issue to both the individual and humanity as a whole. The average British citizen throws out 30 kilograms of clothing every year. As a matter of principle, how can we continue producing this much waste? Scientifically, how can we mitigate its effects? The role of mitigation can be split in two. Firstly, the production process of garments can be redefined by shifting from a linear economy, defined by the Waste and Resources Action Programme as ‘make, use, dispose’, to a circular economy that aims to recover and regenerate products and materials after a service life. Although this is slowly gaining momentum, it is unrealistic to expect drastic changes overnight. The second and quicker option is to replace elements of the existing production process, which involves rethinking the starting materials, the dyeing process, and the recycling of old fabrics. Some companies are already working on this, albeit in the ominous shadow of fast fashion brands.
For example, the Italian start-up Orange Fiber has pioneered a technique for converting orange peels from juice-factory waste into a silk-like fabric. Similarly, haute-couture designer Issey Miyake has explored the use of plastics in creating his garments. Auria London have created swimming suits out of fishnets recovered from the ocean. Natsai Audrey Chieza of Faber Futures has investigated the use of Streptomyces coelicolor bacteria to dye fabrics, a technique that drastically reduces water pollution on an industrial scale.

Illustration by Milda Lebedyte

High-street brands are slowly catching up to this new ‘trend’. H&M, the leader of the fast fashion phenomenon, runs campaigns to recycle clothes, whilst brands such as Primark are reducing their carbon footprint by refusing to have an online store. One can’t help but be optimistic.

The average British citizen throws out 30 kilograms of clothing every year

But, ultimately, it is the mindset of the consumer that must change. We must become more aware of the choices we make. The non-profit organization Fashion Revolution is a perfect example of how this can be encouraged through social media, with campaigns such as Who Made My Clothes? and the promotion of mixed-scale efforts around the world. Even the film industry, one of the main sources of information in this day and age, is exploring the ‘true cost’ of our clothes through thought-provoking documentaries such as The True Cost. Closer to home, Scotland has a rich history of, and pride in, hand-crafting, and the University of Edinburgh is teeming with possibilities for making a change. In 2017 there was the Sustain.Ed

Festival. Now, in 2018, the Engineering for Change society has been advancing its Precious Plastic project, in which machines built by students help recycle plastic that can later be used as filament for 3D printers. The R Sustainable fashion show, to be held in March 2018, is collaborating with ASCUS Labs to explore the relationship between science and fashion under the theme of ‘Cycles’. There are countless artisanal markets, zero-waste projects, Zero Waste Scotland, vintage shops, clothes drives… So, what can we do right now? Stay informed. Clothes are part of your identity; they are how you choose to define yourself. Brands that transparently tell us the impact their products have on the environment, such as Reformation, help us make more informed choices. After all, high-street brands try to tailor themselves to our tastes in order to sell as much as possible. This means the ball is in our court: it’s up to us to decide what kind of story our clothes tell. Milda Lebedyte is a Chemistry student who is adamantly involved in sustainable fashion. She is one of the core organisers of the R Sustainable Fashion Show and a student ambassador for the Fashion Revolution organisation at the University of Edinburgh



From folk tales to memes: why we’re obsessed with pop culture Teodora Aldea explores the evolutionary implications of popular culture Like it or not, pop culture is a pervasive influence and a substantial part of our lives, whether you’re bingeing on the new season of Stranger Things, stalking Rihanna’s ‘insta’, or dressing up for Comic-Con. As with everything widespread, pop culture has been both criticized and sanctified in the quest to understand and exploit human needs and impulses. The term ‘popular culture’ was coined in the 19th century, initially to refer to lowbrow forms of entertainment and to clearly distinguish these from the intellectual cultural habitat that, ordinarily, only the higher classes had access to. At the time, these forms of entertainment might have included music performances hosted in obscure halls, magic shows, or circuses. Later, fueled by the technological advances of the 20th century, the means of producing and distributing popular culture expanded, and so did its definition. Popular culture today may refer to TV, radio, sports, games, and comic books, and its meaning has evolved to also include forms of the cultural mainstream such as early folk tales, modern slang, jokes, and urban legends. While the palpable products of popular culture are easy to observe and quantify, the ‘why’ and ‘how’ are a lot more elusive. Popular culture implies ideas, themes and motifs that spread successfully across large groups of people. Exploring its motivations and implications in a methodical, scientific manner thus requires a thorough journey into the collective human psyche. A great deal of insight into the progression of pop culture comes from the social sciences. For instance, anthropology, the study of essentially all that makes us human, is one of the social sciences that has attempted to characterize the importance of pop culture, albeit with a somewhat rocky start.
In the early days, anthropological studies were focused on isolated or ‘tribal’ cultures and carried out from an arguably elitist position, a reflection of colonialist attitudes at the time. Moreover, popular culture (as observed through folklore, song, dance, tribal body art, and so on) was widely dismissed by scholars and viewed either as cheap entertainment or as a manipulation


tool of capitalism. However, the 1970s brought a shift towards a more inclusive view of culture. In 1978, anthropologist Johannes Fabian introduced the concept of popular culture into the field of anthropology and argued that popular culture is a valuable form of expression in its own right, rather than a cheap distraction or the result of political and economic events. Moreover, Fabian argued that popular culture is a useful tool in the study of mass cultural expression, especially in an age characterized by more communication and cooperation than any previous era. Pop culture thus became a valuable resource in the study of humanity, one that was at the same time less economically and intellectually divisive.

It can be argued that the creativity that gave birth to these imagined worlds and situations is both a by-product of human evolution and a driving force for it - after all, we have shaped our world according to our imagination On the other hand, evolutionary psychology, which explores how the traits of the human mind have evolved as a result of the pressure to survive and reproduce, can bring an additional perspective to the table. This approach can be applied in order to understand not only these traits, but also their by-products, such as culture in any form. For instance, evolutionary psychologists Maryanne L. Fisher and Catherine Salmon condense the struggles and experiences of man into three main themes: survival, mating, and social living, pointing out that these themes dominate most products of pop culture. Indeed, films, shows or songs which focus on universal topics and feelings such as fear, love, or loss are the ones that spread most easily

across countries and continents. It can be argued that, since humans evolved so successfully thanks to their ability to efficiently relate to one another and cooperate, popular culture brings this cooperation to a new level, where we can tell tales and learn from one another more efficiently and with less effort. While the fundamental themes in popular culture are reflections of basic human experiences, the details, patterns, and subject matter have gradually evolved to mirror the changes in our world and in our mentalities. In terms of sociality and character development, over the last decades TV and film in particular have gradually begun to tackle themes like gender equality, gay rights, and mental illness - topics that have become increasingly prominent in our social and political discourse. For example, in the early decades of television, women were mostly portrayed as housewives or damsels in distress; this is illustrated by 1950s sitcoms such as The Donna Reed Show and I Love Lucy, where the female protagonists are constantly cooking or cleaning and entirely reliant on their male counterparts. The modern TV landscape, on the other hand, is peppered with shows featuring strong female leads - just look at popular programs like Game of Thrones or Orange is the New Black. From a more scientific point of view, at least some of the evolutionary psychology literature supports the idea that the similarities between genders or sexes outweigh the differences. In terms of reproductive behaviour, sex differences are clear cut, but both sexes have participated equally in other aspects of life such as survival, social exchange, or finding and inhabiting a particular habitat. Consequently, the current tendency of society and pop culture towards more equality does not go against evolutionary principles.
Another way in which pop culture is linked to evolution can be observed through the spaces and situations that characters inhabit and encounter. These are often imagined worlds, which may have roots in reality but have been embellished through the creative process. This


Image courtesy of Pixhere

can be seen from ancient folklore, which featured mythical creatures, to modern-day pop culture, which is dominated by science fiction, superheroes, and horrifying monsters. Although these imagined entities may arise from basic emotions such as fear, they manifest differently under the influence of the environment and personal experiences of the author. It can thus be argued that the creativity that gave birth to these imagined worlds and situations is both a by-product of human evolution and a driving force for it - after all, we have shaped our world according to our imagination.

A meme, according to Dawkins’ initial definition, is a noun that “conveys the idea of a unit of cultural transmission, or a unit of imitation”

While some become absorbed by Netflix and Jennifer Lawrence interviews, others are busy sticking warning labels to the many products of popular

culture: video games will make your child violent, tattoos will turn you into a loser, and TV will make you dumb. However, there is little consensus on whether pop culture does more to reflect or to influence human attitudes and mentalities. For pessimists, the image of pop culture as a mirror of our current society is thoroughly backed by statistics: a recent study published in the journal Psychology of Aesthetics, Creativity, and the Arts found that, over the last three decades, popular song lyrics have become more antisocial and self-obsessed, and themes like depression have become more prevalent as reported rates of depression have increased. By creating and consuming these products, we are using pop culture as a way to express struggles that are both intimate and universal - whether to internalize them, raise awareness about them, or simply blow off steam. On the other hand, ideas, lessons, and themes often seem to spread indiscriminately across large groups of people and take on a life of their own. This has been theorised in the past through the concept of the ‘meme’, which was initially developed by Richard Dawkins and later explored in more depth by the ever-criticized field of memetics. A meme, according to Dawkins’ initial definition, is a noun that

“conveys the idea of a unit of cultural transmission, or a unit of imitation” - essentially any behaviour or idea that easily spreads from person to person, often leading to a more generalized phenomenon (think catchphrases, fashion trends, or even the meme as seen on image boards like imgur and 4chan). Moreover, the meme can be considered the cultural counterpart of a gene, since both units replicate in a similar manner in order to ‘survive’. With all this in mind, it becomes clear that popular culture emerged not only as a way to create mirror images of ourselves and our society, but also as a way to communicate, teach and learn. The basic human instincts - survival, continuation of the species, and cooperation - are omnipresent in most products of popular culture, and the stories passed on this way can travel further and more successfully precisely because their message is condensed and simplified. Teodora Aldea is a second year PhD student in Cardiovascular Science



Pop psychology Maria Fernandez explores how peer pressure and changes to the science and technology of marketing influence our choices and opinions Popular culture is a powerful force. Created through shared ideas, morality, and standards, it sculpts society by dictating what is fashionable and what is ‘social suicide’. The very nature of pop culture is temperamental, stimulating both successful and detrimental outcomes. Powers beyond our control influence our seemingly personal actions. Our thoughts and feelings are largely guided by the waves of social change, affecting how we view ourselves, our peers, and those we idolise. Peer pressure plays heavily into the choices we make. Adolescents seek acceptance from friends and experience immense pressure to conform. At a time when they face natural insecurity, teens are inundated with marketing. Brands sell self-images, telling them they can find identity through materialism. Psychologist Susan Linn notes in her book Consuming Kids that US companies invest around $15 billion a year in advertising aimed at young people, a demographic that influences over $600 billion of spending. Using psychology to understand and direct marketing strategies works for adults too. An influential study in 1989 collected data from over 10,000 participants across 37 cultures to assess human mate preferences. Males overwhelmingly preferred younger partners and valued physical attractiveness more highly than females did. In contrast, females prioritised earning capacity and industriousness. While this may now be an outdated representation, it could explain why, traditionally, men have been pressured to be the main household providers, and why the beauty industry is both relentless

and successful when targeting women. Enter social media. Advancements in technology have given companies stealthier ways to reach and manipulate a larger demographic. With media outlets like Facebook and Instagram, brands are only one click away from communicating with millions. Weapons such as cookies and the ‘internet of things’ provide organisations with an avenue to collect and exploit data on individuals, leading to more personalised advertisements. Fortunately, technology also gives some power, and a means of defence, back to the individual. With almost unlimited data at our fingertips, ideals of honesty and transparency are promoted, as both company and consumer are under scrutiny. If brands are found to be false, their reputations crumble. For example, the Volkswagen ‘diesel dupe’ scandal of 2015 led to a loss of 23% of Volkswagen’s market value and the resignation of CEO Martin Winterkorn. Marketing strategies are not limited to promoting commodities. During the 2017 UK general election campaign, the Evening Standard reported that social media garnered Jeremy Corbyn a 45% increase in followers on Facebook and Twitter. Popular culture has infiltrated the political world so deeply that The Guardian published an article headlined ‘How Social Media Saved Socialism’. Equally, the presence of world leaders on social media, most notably Donald Trump, has rallied a new wave of support for the opposite end of the political spectrum. This ‘culture of proximity’ has had a profound effect, as we now feel ‘closer’ to celebrities. Their opinions become our

own, and society’s morals, from equality to sexual freedom, vary correspondingly. This closeness has encouraged brands to act like people, and people to act like brands. In influencer marketing, companies recruit prominent figures to act as buyers, endorse their products, and relay them on to target markets. This is especially noticeable on platforms like YouTube and Instagram. Just 24 hours after Kylie Jenner revealed she had lip fillers, Dr Leah Totton’s clinics saw a 70% rise in enquiries about the procedure.

Brands sell self-images and tell them [teenagers] they can find identity through materialism

Even subcultures promoting independent thinking and counter-culture are affected by mass media and the tendencies of modern society. This anti-mainstream movement has, paradoxically, become mainstream. Being different and ahead of the trends is now considered cool. Ironically, anti-mainstream culture drives mainstream consumerism, with companies exploiting the attractiveness of being unique to market the alternative, cooler choice. If new tastes and styles do not emerge, the economy will become stagnant. Perhaps we can trace our decision-making to the fact that we are driven by attempts to become more attractive to others. Maybe what we wear and the music we listen to are all evolutionary adaptations - ways to survive and be successful amongst our peers. In a culture where the line between consumer pressures and social desires is increasingly blurred, I encourage you to find, defend, and embrace your individuality. After all, it is one of your greatest possessions and a quality completely free of charge. Maria Fernandez is a second year Biology student

Image by geralt courtesy of Pixabay



Beyond Breaking Bad: chemistry and pop culture Emma Alexander investigates the perception of chemistry in society and how it has been shaped by popular media Fire. Explosions. Drugs. Medicines. This is the image you’ll probably conjure if asked to picture a chemist. It is also typically the image broadcast whenever ‘scientists’ appear on screen - a somewhat clinical, and perhaps dangerously alienating, perception of chemistry. As a final year chemist, I have observed many of the stereotypes and clichés which shroud the chaotic world of chemistry. Chemistry, however, is much more than that. Not all chemists hide away in pristine labs covered top to bottom in protective equipment. There are many fields of chemistry where that is not required: computational chemistry typically involves complex coding and calculations on state-of-the-art computers, and chemistry can often be carried out in the field, especially by environmental chemists, who typically just wear whatever is weather-appropriate. A popular television show from recent years, Breaking Bad, brought chemistry into the spotlight. The basic premise of the show involves a chemistry teacher who, recently diagnosed with cancer, decides to switch careers and start making meth. Mention chemistry to some people and Breaking Bad will be one of the first things they think of. Whenever I am asked the standard ‘what do you study’ question, “Have you seen Breaking Bad?” is often the first question that follows my response. Although some people may have gained a more negative perception of the science from the show, it has opened up chemistry to others who may never have thought about it - perhaps it has even resulted in some people taking up chemistry at university. Breaking Bad was not the first series to showcase chemistry.
In 2002, celebrity chef Heston Blumenthal hosted the TV series Kitchen Chemistry, popularising the discipline of ‘molecular gastronomy’ and using chemistry to create the perfect ice-cream. In a recent episode of Jamie & Jimmy’s Friday Night Feast, a television show hosted by celebrity chefs Jamie Oliver and Jimmy Doherty, they use a ‘scientific’ recipe for the ‘best’ Yorkshire pudding, created by the Royal Society of Chemistry (RSC) back in 2008. Food chemistry is not only a vital field of research for the creation of new and improved flavours for our favourite foods, but also contributes to developing novel methods to feed the ever-growing global population.

Illustration by Rachel Berman

Not all chemists hide away in pristine labs covered top to bottom in protective equipment

The RSC conducted a survey in 2015 to gauge public attitudes towards chemistry and found that 51% of people felt neutral towards chemistry, whereas 9-13% were either confused or bored by the subject. This suggests that the overall view of chemistry, while not positive, is not overly negative either. The survey also discovered that some people believe chemistry is the source of some of the world’s greatest issues, including the rise of bacterial resistance to antibiotics and the problem of

pollution. The media regularly produces articles on these issues which often cause alarm amongst the public. Chemistry, in fact, provides the solutions to many of these global challenges; for instance, the last few decades have seen the popularisation of sustainable and green chemistry, with an associated increase in investment in the field. The wider public’s misconceptions could perhaps be changed by reinforcing the positive aspects of the discipline. In the future, we need to keep emphasising how chemistry plays into our everyday lives, and focus on engaging the wider public with the latest developments in the field. Chemistry encompasses a great deal of valuable research, and it is vital that society becomes accurately aware of its relevance to our lives. Perhaps most importantly, we must begin to understand that not everything portrayed in the media regarding chemists and chemistry is an accurate reflection of the community.

Emma is a fifth year Chemistry student at the University of Edinburgh


focus

Biology for the masses

Marta Mikolajczak explores how some of the common misconceptions portrayed in pop culture shape our understanding of biological science

The mysteries of life around and within us have inspired scientists as well as the creators of movies and novels. A. Bowdoin Van Riper, a historian specialising in the depiction of science and technology in popular culture, has stated that nowadays mass media probably shapes the understanding of science more than formal education does. Unfortunately, scientific ideas are often misrepresented, either from a lack of understanding or with the intention to add sensation to a novel or movie. Some misconceptions are so ingrained in our society that demystifying them is very tricky. Here, I take a look at the most common ones.

Another closely related ‘brain’ concept is that [...] the size of the human brain is directly proportional to an individual’s intelligence

One of the most common misconceptions is the notion that humans use only 10% of their brains. This idea is thought to have arisen in the 19th century, spreading via flawed psychological analyses of child prodigies, such as William James Sidis, who was accepted to Harvard at age 11. The myth has long been debunked and is often laughed at by scientists, yet in 2013 the Michael J. Fox Foundation reported that almost 65% of Americans still believed it to be true. This isn’t surprising, since the misconception often appears in novels and movies. In the 2011 movie Limitless, a drug allows a writer to access the ‘full potential’ of his brain, granting him superhuman intelligence. In the 2005 romantic comedy Wedding Crashers, Owen Wilson’s character says: "You know how they say we only use ten percent of our brains? I think most people only use ten percent of their hearts". A third example is Lucy (2014), in which a woman gains psychokinetic abilities when a nootropic drug (a


substance that enhances cognitive brain function) is absorbed into her bloodstream. This ‘humans only use 10% of their brain’ plot is certainly not unique. Another closely related ‘brain’ concept is that humans, in comparison to animals, have exceptionally large brains, or that the size of the human brain is directly proportional to an individual’s intelligence. None of this is true, of course: animals such as mice, dolphins and some birds have brain-to-body proportions similar to humans, as stated by Megan Scudellari in a 2015 Nature article. Yet the myth is frequently perpetuated by cartoons such as The Simpsons, where Homer’s tiny brain is linked to his stupidity. A second topic that deserves pride of place among common misconceptions is genetics. By brushing over gene interactions and the interplay of genes with the environment, popular culture often suggests that our destiny is written in our DNA. There are countless comic books, cartoons and movies based on the idea that single genes code for anatomical and behavioural features that are in reality polygenic. According to these sources, mutations in single genes can unleash superhuman powers. Paraphrasing Van Riper, complex qualities such as personality, beauty and intelligence cannot really be created by plugging the right genes into the right places in an individual’s genome. Examples that instantly spring to mind are Marvel characters such as the X-Men, Spider-Man, the Fantastic Four, and many more. Modifying the genome in order to obtain superpowers in movies is not restricted to humans. In the 1999 movie Deep Blue Sea, a team of scientists researching Alzheimer’s disease create genetically engineered sharks with super-sized brains that become super-intelligent, and super-deadly. Another example of unrealistic mutagenesis experiments is the 2006 movie Black Sheep, where calm herbivores are turned into brutal, bloodthirsty beasts.
Misconceptions about gene manipulation are also depicted in fine art: Errol Jameson’s Genetic Engineering, for example, is a painting in which an apple with monstrous teeth bites a human hand. Although these examples add

great comedic value to movies or provoke deeper contemplation through paintings, these scenarios share little with the reality of genetic modification performed in laboratories today. Possibly the most frustrating misconception in biology is the persistent idea that free radicals generated by skin cells can be combated by consuming or applying antioxidants. The myth that antioxidants promote longevity originated in the mid-20th century, when chemist Denham Harman proposed that free radicals arising as by-products of metabolism generate cellular damage and lead to ageing. Antioxidants would supposedly neutralise these radicals and hence promote longevity. While dietary antioxidants have been shown to be beneficial for some medical conditions, for instance in macular degeneration, and some benefits of a diet rich in fruit and vegetables are thought to come from antioxidant compounds, the ‘free radical theory of ageing’ spawned a billion-dollar antioxidant market with poor evidence to support it.

The idea of an alpha male stuck and was hugely propagated in popular culture

Misconceptions about the interactions between organisms and their environment are also common. One concerns the deadliness of certain snake species. The boa constrictor has an especially bad reputation thanks to Disney’s Kaa in the 1967 Jungle Book and Sir Hiss in the 1973 Robin Hood. Boa constrictors rarely attack humans in the wild but are often confused with pythons, which are responsible for a larger number of human deaths. Fortunately, there are attempts to correct this mistake, and more modern sources refer to the Disney snakes correctly as pythons. Misconceptions about the prey of boa constrictors can also be seen in fine art. For example, James Ward’s painting titled Man Struggling with a Boa Constrictor, Study for

"Liboya Serpent Seizing its Prey”, portrays a clearly distressed man trying to fight off a snake that is tightly wrapped around his chest. A painting by Aloys Zötl, The Tiger and the Boa Constrictor, depicts the snake as an extremely powerful animal, one capable of squeezing a tiger to death. In reality, boa constrictors eat small and medium-sized mammals, not anything as large as a human. In 1944, Rudolf Schenkel was the first to use the term ‘alpha male’ to describe the most dominant animal within a wolf pack, based on studies he conducted in a zoo. Shortly after, observations of animals in the wild showed that wolves display behaviour closely resembling human family relations, rather than being based on competitive dominance. Yet the idea of an alpha male stuck and was hugely propagated in popular culture. For example, the wolf Akela in the Jungle Book shows the typical characteristics of an alpha male - he is described as a powerful animal “who led all the pack by strength and cunning”. The idea of a strict pecking order within wolf groups was pushed further in the 2011 movie The Grey, where there is not only an alpha but also an omega

animal, the weakest of the group. Wolf packs, in fact, tend to consist of a family - usually a mated pair and their offspring - or, in some cases, two to three families.

Understanding the psychology of how common misconceptions arose could help determine why they are so predominant

Popular culture undoubtedly has an enormous impact on how we perceive science. Our education does not always allow us to spot scientific inaccuracies presented in books, movies, or otherwise. Ideally, pop culture should refrain from spreading misconceptions, instead being a platform for presenting beneficial knowledge. For now, at least, we should recognise that popular culture tends to entertain, rather than provide us with facts that could be used as a base for further engagement with a given topic. As Van Riper accurately states, “popular culture can be a shared frame of reference, within which scientists and the public can discuss both science and its social implications, but it can only serve that role if scientists are willing to take it seriously.” Discrediting the myths themselves is important, but understanding the psychology of how common misconceptions arose could help determine why they are so predominant and widespread, and prevent new myths from arising.

Marta Mikolajczak is an early career neuroscience researcher at the University of Edinburgh

The most efficient progress in science happens when people cooperate, sharing ideas and experience, instead of competing with one another.

Image by Elke Karin Lugert courtesy of Unsplash



Science on the small screen

Emma Dixon explores the good, the bad, and the ugly of science representation on TV

Like many science nerds, I enjoy science-related TV shows. However, as someone with an eye for accuracy, I can’t help but pick up on scientific inaccuracies and anomalies, be they minor details or wholly inaccurate science. This article will explore the accuracy (or inaccuracy) of some of the biggest science-themed shows around, and examine how shows could improve their scientific prowess.

It seems that CSI may as well stand for Crime Scene Inaccuracies

As anyone who has found themselves aimlessly flicking through TV channels at 3pm on a weekday knows, there are a multitude of forensic crime shows out there. Whilst science may not always be at the forefront, scientific testing and analysis is often at the crux of a plotline. Unfortunately, these are more often than not a misrepresentation of forensic science. As someone with a science background, the biggest inaccuracy that jumps out at me is the speed at which countless forensic and biological tests are completed - a far cry from the many hours I recall spending in the laboratory hoping to finish but one analysis. For example, DNA testing, a staple of any forensic show, is often shown as a simple, quick test. Whilst new technologies such as ‘RapidDNA’ machines can analyse DNA samples in a matter of hours, these are not usually available to forensic scientists; the FBI, for example, is still working towards integrating such machines into its systems, with first testing expected in 2019. Even if all forensic laboratories had access to these speedy machines, they can only handle five samples at a time, which would create quite the backlog for TV plotlines with large numbers of samples to analyse. More ‘traditional’ non-rapid machines are able to take many more samples at a time, but also take many more hours, if not days, to fully process results. Another firm favourite is fingerprint analysis. Whilst fingerprints can


Image of TV Forensics courtesy of Flickr

be useful, they are hard to find, especially whole, well-preserved ones. And although they might confirm that a suspect had, at least once, visited the scene of the crime, they would probably also highlight hundreds of other people who had, for example, visited that particular bank. One of the most misleading aspects of forensic science shows, as with many other TV portrayals of scientists, is how forensic scientists jump seamlessly between disciplines, analysing blood samples one minute, then testing chemicals from gunshot residue and completing geological analyses of soil the next. Whilst forensic scientists may have a good general knowledge of a range of scientific fields, it is inaccurate to show them as experts in all areas. This presents scientists as all-knowing geniuses, reinforcing the ‘science is only for really clever people’ stereotype that is, unfortunately, still prevalent in our society. It seems that CSI may as well stand for Crime Scene Inaccuracies. Much like crime shows, medical dramas are massively popular, with the likes of Grey’s Anatomy, House M.D. and Scrubs racking up countless seasons. Whilst thrilling, and featuring seemingly realistic gory surgical scenes, how well do these programmes portray the medical world? Unfortunately, not as well as we would hope. From the very high prevalence of rare or exciting

cases, to the overdramatic personal lives of the characters, House M.D. and Grey’s Anatomy fail to accurately depict many aspects of medicine. Whilst many medical cases are built upon real-world examples and tweaked for small-screen drama, simpler and more realistic cases are often mistreated in these shows; for example, a 2010 study examining hundreds of episodes from leading medical shows found that only 29% of seizures were appropriately treated. The narrative of these TV shows usually follows junior doctors from the start of their careers as they rise through the ranks. Yet even in their early years of practice, our young medics are shown involved in big cases and assisting in operating rooms (ORs) irrespective of their speciality, when in reality young doctors are rarely seen within the OR (Grey’s Anatomy, I’m still looking at you). Instead, they are more likely to be doing ‘scut’ work: placing orders, collecting signatures from patients, and accompanying patients when they are taken to be tested. Even on the ward, it is the nurses who handle most of the day-to-day care and medication of patients. One show that does fairly well in this regard is Scrubs, in which the nursing team is shown to be paramount in the care of patients; in fact, by most accounts Scrubs rates as one of the most accurate medical shows of recent years.

But what’s the issue here? Does it matter that medical shows are a little far-fetched? Alarmingly, these shows can have quite a large impact on public health. In 2005, the US Centers for Disease Control and Prevention (CDC) highlighted that one third of those who watch medical shows took ‘some sort of action’ (including discussing the health topic and visiting a medical professional) after seeing health topics on television. Many other studies indicate that medical TV shows shape the preconceptions that people hold about healthcare and medics; the accuracy, or lack thereof, of these shows can have a real impact on public health. This potential impact should therefore be considered by the shows’ writers. The CDC suggests, for example, that storylines include prevention techniques and model behaviour in order to benefit the audience - such as a credible or favoured character opting to get a flu jab. Interestingly, some shows are being used to help medical students develop ‘soft skills’, including watching House M.D. as part of ethics discussion groups, indicating an unusual benefit from medical show inaccuracies. Not wanting to finish this article on a negative note, it seems only fair to mention a show that does particularly well at integrating accurate science into its scripts. The Big Bang Theory may have

its flaws, mainly in its crude reliance on stereotyping and misogyny for its humour, but it does score well for scientific accuracy. This is mainly thanks to its science consultant, Professor David Saltzberg of UCLA. Saltzberg advises on scripts, where he often fills in the ‘insert science joke here’ sections, and supplies the physics equations on the ever-present whiteboards, which often hark back to recent developments and research in the physics world. From Season 4 onwards, Saltzberg was no longer the sole scientist on set, thanks to the addition of Dr Mayim Bialik. Bialik, who plays neuroscientist Dr Amy Farrah Fowler, herself has a PhD in neuroscience and advises on biological topics that come up in the show’s plotlines, as these are beyond Saltzberg’s expertise - highlighting that scientists are only specialists in their own fields. Saltzberg also understands the importance of scientific accuracy on shows such as The Big Bang Theory. He says, "this has a lot more impact than anything I will ever do, [...] It's hard to fathom, when you think about 20 million viewers on the first showing”. With such a large and international audience, Saltzberg is aware that more people will watch The Big Bang Theory than will likely read his written scientific work, and he hopes that the show may inspire others to become scientists.

By most accounts, Scrubs rates as one of the most accurate medical shows of recent years

Evidently, some science-themed TV shows do better than others in terms of accuracy, but even the less accurate ones may continue to inspire generations into science professions. By increasing the accuracy of these shows, hopefully many more people will take up science, or at least feel that they understand what science involves. Whilst it may not be practical or realistic for all science-themed TV shows to cast scientists, I believe many shows could take a leaf from The Big Bang Theory’s book in taking on a science consultant of an appropriate discipline to aid both the script writers and actors in portraying science accurately on the small screen. Until then, I’ll continue to be that friend who chimes in with “Really? That’s not how it would actually happen…” every time a detective tries to make blood glow under UV light.

Emma has recently graduated from the University of Edinburgh with an MSc in Science Communication and Public Engagement

Image by DarkoStojanovic courtesy of Pixabay



An inconvenient truth

Rachel Harrington explores how sci-fi popularisation has affected the evolution of scientific research

‘An Inconvenient Truth’: a term defining a fact not suited to your comfort? The critically acclaimed documentary following Al Gore’s educational campaign on global warming? Or the encroaching influence of everything from politics to sci-fi on scientific research? Receiving standing ovations upon its debut, An Inconvenient Truth is credited with re-energising the environmental movement in the noughties. Unlike science fiction, a genre often seemingly confined to a stereotypical list of dystopian futures, alien invasions, and time travel, An Inconvenient Truth highlighted the power of addressing relatively marginalised topics, such as climate change. Whether the blame for the seemingly limited scope of scientific research falls with politicians, scientists, or fiction remains to be explored, but the solution could lie with sci-fi. Utilise the genre as a powerful platform to educate and influence, and we may find previously ignored research areas receiving more attention. Whether we like it or not, science is political. Often, those deciding what areas of science are valuable enough to warrant research are not scientists, but politicians. Much of scientific research rests on an obviously political resource: money. The bulk of scientific research across the UK is carried out by seven specialised Research Councils. Each year these Councils receive £3 billion of government funding, but the money is not divided equally: the Engineering and Physical Sciences Research Council received £790 million for the year 2018/19, whereas the Natural Environment Research Council received just £290 million. Of course, governments (supposedly) act in the public interest, and prevailing public opinion can – and should – pressurise the government to change tack.
Public opinion and government decisions regarding scientific research are, therefore, dependent upon access and exposure to scientific information, including information that is fictitious. Arguably, the biggest resource for the general population exploring scientific themes is sci-fi. With that in mind, it makes sense that the more commercially exciting and profitable sci-fi topics are the same areas most invested in by the government. Obviously, the government are not


the only body making decisions about what scientific research the UK focuses on. Looking further into the Research Councils themselves, funding is approved for individual projects via a highly competitive process: potential projects are evaluated by independent peer review, and only the most promising receive approval. It makes sense to allow scientists to make these detailed decisions, given that they are, in theory, the best equipped to understand what is being proposed. However, given the parallels between lucrative sci-fi plotlines and prominent scientific research, perhaps fiction is influencing the decisions of scientists more than previously recognised.

Utilise the genre as a powerful platform to educate and influence, and we may find previously ignored research areas receiving more attention

Historically, it is well documented that many pioneering technologies now paramount to our everyday lives have sci-fi to thank for their inspiration. Robert H. Goddard, credited with creating and building the world’s first liquid-fuelled rocket in 1926, became interested in space exploration as a teenager after reading H.G. Wells’ The War of the Worlds. His invention would go on to help the US in its successful 1969 moon landing. And we can thank Star Trek for more than its quotability: the ‘communicator’, a hand-held device that enabled characters to talk to one another, inspired inventor Martin Cooper, amongst others, to develop the first prototypes of the mobile phone. Furthermore, in a fashion similar to the ‘Universal Translator’, which explained why every character seemed to speak English regardless of planetary origin, many apps now exist to facilitate voice translation of over seventy languages (sadly, not Klingon just yet). In the hope of emulating a Goddard-style discovery, researchers may well continue to look to fictional inventions for inspiration.

In turn, with an appreciation of how sci-fi has helped science, scientists may write in the hope of sparking the next big finding. Examples range from Michael Crichton, author of Jurassic Park and a biomedicine graduate, to Muhammed Zafar Iqbal, a professor of physics and arguably the most prolific sci-fi writer in Bengali literature. Fred Hoyle, who coined the term ‘Big Bang’, noted that he wrote science fiction to publish ideas that would not fit into scientific journals, the genre seemingly offering scientists a platform for professional brainstorming. Conversely, if scientists and writers stick only to the themes that have led to the eurekas of the past, we can imagine a bottleneck in what they are exposed to and inspired by. Arguably, the reason sci-fi explores only a limited range of themes is simple: it’s a money-making business, and writers focus on the topics that will be lucrative. Sci-fi is hugely important to the film industry’s economy, with five of the top ten grossing films of all time falling into the genre. The highest, Avatar, grossed £3 billion upon its release - strikingly, the same figure as the yearly total allotted by the government to fund all scientific research in the UK. With that in mind, perhaps we ought to be investigating how to engage more of the public, and their pockets, making them as passionate about scientific research as they are about sci-fi. Sci-fi cannot exist without science, so perhaps this should be more directly acknowledged by the media industry, with sci-fi profits directly funding research. Of course, we must be careful not to rely on sci-fi alone to direct research. Funding must not be limited to themes explored in sci-fi, even if those themes do become more wide-ranging. After all, even when funding is invested into specific research areas, the discoveries that are made are often most beneficial to seemingly unrelated fields.
Nobel laureate Peter Mansfield’s physics research into polymers laid the foundations for what in time would lead to the development of the magnetic resonance imaging (MRI) scanners now found in every hospital. We must therefore appreciate that unanticipated breakthroughs are just as valuable, sometimes even


Illustration by Vivian Uhlir

more so, than agenda-driven research. Research Councils and politicians must be willing sometimes to invest in projects with uncertain returns in order to reap the rewards of serendipitous discovery. Given its profitability and awe-inspiring storylines, the question arises as to why sci-fi doesn’t instil a universal passion for science and research in its audiences. The answer may be that sometimes the concepts are too complex or abstract to be comprehensible to a non-specialist audience. If there is any hope of using sci-fi as a platform whilst allowing it to remain an entertaining experience, we must acknowledge the publications which aim to explain the fundamentals behind some of our favourite franchises. Often authored by professional scientists in the format of ‘The Science of [X Media Franchise]’, these articles use elements from films as a starting point for the discussion of factual concepts, briefly investigating their probability of occurrence and where scientific research would have to head to make them a reality. Not only is this an original and imaginative way to get the public interested in science, it

is accessible to the public and government alike, increasing exposure to scientific information in popular culture.

Given the parallel between lucrative sci-fi plotlines and prominent scientific research, perhaps fiction is influencing the decisions of scientists more than previously recognised

A stellar example of how influential sci-fi has already been, with benefits reaching beyond advances in science to restructuring social constructs, is the casting of Nichelle Nichols as Lieutenant Uhura in Star Trek: The Original Series. Ground-breaking for its time, not only was it rare for a black woman to appear on primetime TV, it was even rarer to see a black woman cast in a high-power role. While she considered quitting after the first series, Martin Luther King Jr convinced Nichols to stay, noting that her inclusion was helping to portray a future people could aspire to. Her impact even circled back to the scientific community. Nichols went on to work with NASA: after criticising their selection of all-white male astronauts, her influence led to the selection of both the first female and the first African-American astronauts. Just as Nichelle Nichols and Al Gore expanded the minds of audiences in terms of science, societal constructs, and climate change, the genre of sci-fi has even more to offer: namely, to catalyse the expansion of scientific research. Star Trek and An Inconvenient Truth prove that audience entertainment doesn’t have to be sacrificed for sci-fi to be educational and influential. It can remain a pillar of the film industry whilst continuing to inspire scientists, influence public opinion, and steer government funding. The inconvenient truth may be that science fiction is to blame for the recent narrowing of scientific research. Luckily for us, it could also be the solution.

Rachel Harrington is a third-year Physics student.

features

Is ecology the original science?

Louis Walsh explores whether early human interaction with the environment was instinct or ecology

The emergence of modern science around 500 years ago, during the Renaissance, is often described as the ‘scientific revolution’. The key hallmark of this revolution was the dramatic shift in the way humans thought about and approached the world around them, and it facilitated the extraordinary growth and development of human society, producing vast quantities of knowledge and power. The change in approach was simple, yet profound: admitting ignorance. In his book Sapiens: A Brief History of Humankind, Yuval Noah Harari describes the scientific revolution as ‘a revolution of ignorance’. Originally a professor of world and military history, Harari has recently turned his attention to macrohistorical questions, such as the relationship between biology and history and the trajectory of history. According to Harari, by admitting ignorance, new knowledge could be obtained by formulating and testing hypotheses. The information obtained can then be applied to solve a particular

Illustration by Gillian Cavoto


problem or to gain more information. Although humans had not admitted ignorance before the scientific revolution, they were already actively engaging with the world around them, making scientific observations and deductions, just not under the rigid umbrella of what we now call the scientific method. Although this produced many false theories and beliefs, it still required intricate knowledge of mathematics, astronomy, and physics. The earliest scientific field that humans would have engaged with, however, is what today is known as ecology. It is important to note that the categorisation of different disciplines is a modern practice, so the average hunter-gatherer who observed the reindeer migration would not have been any more aware that they were an ecologist than the reindeer were aware that they were primary consumers. All animals are inherently ecologically aware: they are able to find food, avoid predators, and compete with other individuals within their natural environment. This ability to occupy a specific

position within an ecosystem, termed a ‘niche’, is driven by a combination of evolution and learning. Natural selection has shaped the physiology and internal chemistry of organisms to maximise their ability to survive and reproduce. Animals also learn throughout their lives: humans learn the names of plants, which ones to avoid and which ones to eat, where to find certain prey, and how the changing seasons affect the abundance of plants and animals. Therefore all animals, including humans, develop an innate ecological awareness and knowledge that is fundamental to their survival. But can we really call this ecology? A textbook definition of ecology is the study of interactions between organisms and their environment. This suggests that humans are natural ecologists, as a detailed knowledge of local ecology is essential for finding food, gathering resources and avoiding predators. For example, the Yao people of Mozambique have a mutualistic relationship with the Greater honeyguide, a bird which leads the honey-hunters

to bee hives. The hunters make specific vocalizations which attract the honeyguides, who then lead them to a bee hive. After the honey is harvested from the hive, the beeswax is left behind, which the honeyguide then consumes. Knowledge of local ecology isn’t used solely to obtain food. In a study of the Tripogon grass species in the Western Ghats region of India, taxonomists identified seven species whereas people of the local hill tribes identified eight. Molecular analysis showed that there were indeed eight species, two of which were extremely similar to one another, but with different ecologies: one was known as ‘kattai pul’ and used for cattle feed, whereas the other was known as ‘sunai pul’ and served as a habitat for cobras. In a country where 45,000 people die annually from snakebites, it is vital that the locals are able to distinguish between the two almost indistinguishable species, and understand their specific ecologies. This is also an example of how people who live in a particular environment can have a more in-depth knowledge of the plants and animals living around them than scientists do - an unsurprising but noteworthy point in the context of the marginalisation of rural communities by multinational corporations seeking to exploit the natural resources of the environments in which they live.

Our ecological knowledge underpins the world in which we live today

The question still remains as to whether having this intrinsic understanding of one’s habitat is the same as being an ecologist. All organisms, from microscopic protozoa to the gargantuan blue whale, must be able to understand and function within their niche, demonstrating ecological aptitude in order to survive. However, modern ecology is an interdisciplinary field that encompasses a huge range of scales, from local ecosystems to vast biomes, and it shares the central principles of all branches of modern science: the acquisition of information through objective observation and the testing of hypotheses, an approach developed during the scientific revolution. Clearly, hunter-gatherer populations did not obtain information about their environment by consciously applying this rigorous way of thinking

Illustration by Gillian Cavoto

to problems and situations that they encountered. However, the basic principles of the scientific method aren’t too dissimilar to how humans instinctively solve problems. Formulating a hypothesis, testing it, and evaluating the results are processes that we carry out on a daily basis. We must therefore assume that the vast quantities of ecological knowledge possessed by hunter-gatherers were obtained through this way of thinking. The evidence for this is the birth of agriculture through domestication. Human populations used their acquired knowledge to manipulate the ecologies of a variety of plants and animals. This resulted in changes to the gene pools of these species, which altered their physical traits. These alterations resulted in species that were better suited to human consumption, such as producing larger seeds or having a greater body mass than wild varieties. The domestication process was slow, taking place over many generations. Take the domestication of wheat, for example. This first occurred in the Fertile Crescent in the Middle East, and it is estimated that this process could have taken anywhere from a few centuries up to 5000 years. Similar developments and careful experimentation occurred at agricultural centres around the world, from Mexico and Peru to China and Ethiopia. The humans that originally domesticated the crops and animals we eat and use today certainly didn’t have an awareness of the far-reaching impact that their actions would have. Neither did they have a grand design for their innovations. But the very act of planting

seeds, irrigating fields and controlling the physical environment allowed sedentary agriculture to develop and facilitated domestication. Without intuitive knowledge of ecology, domestication and agriculture simply would not have been possible. And since agriculture paved the way for complex societies, rapid population growth and technological advancement, our ecological knowledge underpins the world in which we live today. This leads to a more philosophical question about the nature of science and humanity: is the ecological knowledge demonstrated by hunter-gatherer humans just part of being a living organism with an innate capacity to survive and reproduce, or can it be defined as scientific ability, honed through years of study? Undoubtedly, ecological knowledge must have been vital for human survival, but can we really call it science or is it something more primal? This really depends on how ecology and science are defined, and goes to the heart of how we see ourselves as humans. Our technology, growth and advancements have led to the creation of a narrative where humans stand apart from the rest of the natural world, with accomplishments that have never before been seen in any other species. However, these accomplishments may simply be the product of the intrinsic scientific persuasion of the human brain that has been developing for 200,000 years.

Louis Walsh is a second-year Biology student who is interested in the intersection between culture and biology

Spring 2018 | 39

features

Our human bodies: a story of reform, revolution and change Reminding us of the marvel of the human form, Haris Haseeb describes the series of events which the body undergoes in the moments after birth We live in a society which, through a devastatingly efficient, though fundamentally immoral process, feeds off the very insecurities it creates. Yet, whilst there is something to be admired in our society’s bizarre autotrophic tendencies, what we are left with is a pervasive (and dare I say it, growing) sense of inadequacy. Our skin, depending upon which longitudinal hemisphere we live in, is either too light or too dark, our noses too flat or too long, and our bodies too large or too thin. Society has established an aesthetic precedent which, although blatantly unrealistic, has not only been maintained, but also encouraged by a miscellany of societal trends, which ultimately only affirm a tradition of deep-seated, oppressive aesthetic stigma. In turn, instead of relishing the wonder of our diverse and changing bodies, the spectacle of the human form is sadly reduced to a homogeneity of desirable and undesirable traits. Though it is certainly true that aesthetic standards do, in part, define what is both acceptable and unacceptable in the context of bodily form, the problem is not exclusively confined to that of societal trends. It is, in many ways, much larger, and partially represents a shift in intellectual discourse; it has become the concern of popular literature, and to a lesser extent - science - to bring to life the doings of the mind and the enigma of consciousness, as it is ultimately altogether less familiar. However, how we go about explaining the wholesale neglect of our bodies - whether as the consequence of our ridiculous aesthetic standards, or as a reflection of a shift in the intellectual paradigm - is, in a sense, arbitrary, as the outcome remains unchanged. We have become increasingly preoccupied with our own material inadequacies and, rather than embracing both diversity and change, we instead reject them; it is as if the phenomenon of our individual bodies, in its most authentic, scintillating sense, has with time been forgotten. But, and I am reminded of Virginia Woolf as I write this, our bodies demand

Image courtesy of Wikimedia Commons

not only attention, but also celebration. As we begin to make sense of the great secrets of our physical forms (its silent processes, dynamism and propensity for change), we can move away from a society which encourages a culture of inadequacy, and instead embrace our bodies as both heterogeneous and wondrous.

We have become increasingly preoccupied with our own material inadequacies

The physiological story of our bodies, however, cannot be read as instances or moments; it is instead a narrative of continuity, an unending procession of revolt, change, and reform. The sequence of events in utero, where we are, in the space of nine months, transformed from an unthinking unicellular zygote to a complex, multicellular organism capable of an extensive range of excretory processes, represents some of the most astonishing processes of change in the human body. But, whilst there is immense beauty in the story of embryogenesis, it is the details of the events which begin to unfold both during and immediately after birth which are testament to the strange, and quite brilliant, exceptionalism of our bodies. Birth, our traumatic introduction to extra-uterine life, demands from us a series of quiet yet dramatic physiological changes which represent, in most cases, the slow emergence of our functional independence. Though there is little in the way of entertainment, the relative safety of the mother’s uterus provides the foetus with the essential components to sustain its growth and development; the foetus is in many senses an obligate organism whose metabolic integrity is maintained by the placental unit. Oxygenation then, and the concurrent exchange of carbon dioxide, is dependent not on the foetal lungs (which remain ineffectual and fluid-filled in utero),


Image courtesy of Pixabay

but instead on the health of the placenta. However, unless I am hideously mistaken and we are all currently carrying our mothers’ placentas with us, there occurs at some point a silent, internal transformation, where oxygenation (the process upon which the entirety of our metabolism is contingent) becomes a faculty of the lungs, and no longer of the placental unit.

Birth, our traumatic introduction to extra-uterine life, demands from us a series of quiet yet dramatic physiological changes

The physiological details of what we can consider to be the ‘placental-pulmonary’ shift are extraordinary, both in orchestration and sequence. During labour, accompanying each uterine contraction is not only our slow descent through the contours of the pelvis, but also the expulsion of fluid from our rigid foetal lungs. And though, for the mother, the stress of labour is blindingly (or rather, deafeningly) obvious, it is equally as precarious for the foetus. Hormones, known collectively as catecholamines, disseminate throughout the foetal body

in response to stress and, amidst the fire and fury of the labour ward, quietly proceed to prepare our lungs for life without placentas. Upon reaching the lungs, the assault of catecholamines teases out more fluid, an important consequence of which is a decrease in both pulmonary pressure and pulmonary resistance. Though the process of labour itself contributes to the profound, internal changes in foetal circulation, as we emerge from uterine life, baptised in both blood and excrement, it is with our first breath that this quiet narrative of change is continued. Despite the mechanisms of true, independent breathing being poorly understood, as our neonatal selves inspire for the first time, our lungs, swelling with oxygen, undertake a further decline in pulmonary resistance. In doing so, and as the principles of fluid dynamics dictate, our now low-resistance and highly compliant lungs welcome the onslaught of fresh, unadulterated blood from the right heart. Upon arrival, and as it is distilled across our pulmonary vascular tree, the blood is laced with oxygen and ejected from the left heart, feeding our now growing metabolic demands; an iterative, Sisyphean sequence which continues, often without intervention, for decades to come. In the space of what is essentially 24 hours then, our bodies implicitly recognise the necessity for wholesale physiological change; a shift from obligate, intrauterine life to that of steady but profound earthly independence. And yet, of this astonishing process of reform there is little external record. Even so, concealed in the mystical recesses of our bodies, there remain small pockets of remembrance; embryological remnants which, though unspeaking, are like great archaeological sites, continuing to bear silent witness to an unending procession of change.

This is a story which, by virtue of our status as humans, nearly all of us have shared

Remarkably, this is a story which, by virtue of our status as humans, nearly all of us have shared. And, whilst the transformation from our foetal to neonatal selves is brief, when read within the wider narrative of our wondrous bodies, we are reminded not of inadequacies, but instead of our exceptionalism. Our physical form is adaptive and diverse, and it both recognises and embraces the need for change, reformation and revolt. Inadequacy then is fundamentally irreconcilable with the essence of our physical being and, just as our bodies do, as a society we too must celebrate and embrace, unapologetically, a world of heterogeneity and change.

Haris Haseeb is a fifth-year medical student at the University of Edinburgh

regulars: politics

Are politicians ruining the trip? Karolina Zięba investigates the impact social stigma and political hurdles have on psychedelic research

Not all drugs are created equal. The UK recognized this fact in the Misuse of Drugs Act of 1971 by classifying illicit substances into classes ‘A’, ‘B’, or ‘C’ in decreasing order of harm. As well intentioned as the legislation might have been, it seemed to have strengthened the pathological paradigm of drug use. Drugs like alcohol or tobacco, although gravely dangerous and addictive, did not appear in the law, while psychedelic drugs, like LSD or psilocybin, which are generally not addictive, were characterised as class ‘A’. In recent years, the definition of ‘psychedelic’ from Grinspoon and Bakalar’s book Psychedelic Drugs Reconsidered has become the most applicable: “A psychedelic drug is one which, without causing physical addiction, craving, major physiological disturbances, delirium, disorientation, or amnesia, more or less reliably produces thought, mood, and perceptual changes otherwise rarely experienced except in dreams, contemplative and religious exaltation, flashes of vivid involuntary memory and acute psychosis.” Altering the pathological paradigm is no easy task. In the past, as seen with cannabis, the only way to get testing approved was if a medical use of the substance was being researched. Despite the limitations, more and more research on the healing potential of these mind-altering drugs is being done across the UK and US. Psilocybin, the active ingredient in ‘magic mushrooms’, is being studied

Image by Joel Felippe courtesy of Unsplash


as a potential treatment for addiction, depression, and anxiety. In October 2017, Dr. Carhart-Harris and his team at Imperial College London were able to show significant improvements in all subjects treated with psilocybin for treatment-resistant depression. Another cancer research study conducted on patients at New York University in collaboration with Johns Hopkins University concluded that “psilocybin produced immediate, substantial, and sustained improvements in anxiety and depression and led to decreases in cancer-related demoralization and hopelessness, improved spiritual wellbeing, and increased quality of life.” Getting approval and funding to help individuals ‘trip’ while observed by psychiatrists is no easy feat. Most funding will go to research aimed at finding the dangers of a drug, research that fits the pathological paradigm. Psychedelic research, due to its legal classification, tends to be more costly. Universities and institutions have to face the social and political backlash that often follows. In the end, even those researchers willing to try are left without anyone to back them up. Why are there so few opportunities to conduct more research? The problem is that politicians – with the media’s help – encourage the perception of drugs as morally corrupt and present people who take them as dangerous. Stories appear in the media depicting harm and generating fear, and the government feels the need to react. Usually, this manifests in prohibition and punishment-based policies instead of more productive, evidence-based policies. Looking at history, it seems that at least part of this anti-drug sentiment is linked to former US president Nixon’s ‘War on Drugs’ which targeted anti-Vietnam war protesters and black communities. 
A report by CNN Politics showed that Nixon’s 1968 campaign aimed to discredit both of those communities and, therefore, disrupt counterculture efforts to alter social norms and the African-American struggle for civil rights. It was easier to show war protesters as mentally-unbalanced hippies and black activists as drug-addicted thugs

than to validate their concerns and tackle them diplomatically. Consequently, the US federal government outlawed psychedelics in 1970 under the Controlled Substances Act. The United Nations followed suit with a treaty signed at the Convention on Psychotropic Substances in 1971, which restricted psychoactive drugs in 182 UN member states.

There remains strong political, social, and academic conservatism when it comes to psychedelics

There remains strong political, social, and academic conservatism when it comes to psychedelics. Perhaps firm and limiting restrictions add to this fear-based sentiment. Perhaps it is the collective fear of the unknown that leads to strict regulations. Either way, what the scientific community is forced to face as a consequence is a catch-22: the potential of psychedelics cannot be explored due to tight restrictions, but these restrictions cannot be lifted until psychedelics are shown not to be dangerous. Researchers and activists hope to alter how people perceive these substances and open the door to testing other aspects of the drugs, such as the role one’s environment plays in their effects. It is not that alcohol should be entirely banned or LSD made totally accessible; what needs to change is our perception regarding the consumption of these substances, and how we can achieve a more objective view of their harms and benefits. It is understandable that laws are created to protect society from harm, but when those same laws begin to impede research with the possibility of helping thousands, it is clear that legislative bodies are failing us.

Karolina Zięba is a first-year Chemistry student at the University of Edinburgh

regulars: innovation

A new wireless technology to help paralyzed people walk again Alessandra Dillenburg investigates advances in brain-machine interfaces offering new hope for spinal cord injury treatment The brain you get at birth is what you’re stuck with. After a critical period of change and growth during childhood, the adult brain becomes immutable. The cells within your brain and spinal cord, called neurons, are fixed in place – if you injure them, they’re lost forever. Until the late 20th century, this was the depressing dogma that dominated the neuroscience field. Luckily, some scientists disagreed with this and began investigating neuroplasticity, a term used to describe the plastic, or mutable, properties of the brain. Long gone are the days when the human brain was seen as a hardwired, irreparable entity. Today, it is widely recognised that the adult brain can, in fact, change itself. While these modulations may be due to environment, experiences, or learning, one of the most intriguing aspects of plasticity is how the brain can change after injury to help us heal. This natural flexibility is what researchers hope to harness for developing spinal cord injury (SCI) treatments. SCI affects an estimated 50,000 individuals per year worldwide, often leading to debilitating and life-changing paralysis. When the spinal cord is damaged, the neuronal connections from the brain to the rest of the body are interrupted. Imagine a bundle of fibres (the spinal cord) connecting point A (the brain) to point B (the limbs). When this bundle is partly cut, some signals won’t reach their destination. Despite the brain and limbs being intact, the signals produced in the brain required for movement are unable to reach their target, resulting in loss of limb function. 
One proposed solution to this problem is creating a ‘neural bypass’: a type of brain-machine interface that can decode the signals in the brain and wirelessly transmit them to electrical stimulators on the spinal cord below the injury site, prompting neural circuits in this part of the spinal cord to activate and trigger limb movement. A primitive version of this system was developed in the 1980s. These were some of the first brain-machine interface prototypes, where patients wired into a bulky system

Image from Nature Video "The nerve bypass: how to move a paralysed hand" courtesy of CBA

could move a computer cursor simply by thinking about it. The neural bypass systems being proposed today are a huge step forward. Small, implantable wireless systems coupled with rehabilitation training paradigms can help animals relearn how to walk. Such studies could be the start of delving into the brain’s innate capacity for plasticity and prompting the formation of new connections between the brain and the spinal cord. Dr. Grégoire Courtine believes this is the dream, and recent evidence from his lab in the Center for Neuroprosthetics and Brain Mind Institute at the Swiss Federal Institute of Technology suggests it is an attainable one. Courtine’s early work was unsurprisingly heavily publicized. In rats with SCI, Courtine and colleagues electrically stimulated an area in the spinal cord responsible for leg movement. This stimulation, together with providing chemical compounds that prepare neurons to fire, allowed the previously paralyzed rat to stand, walk, and run on a moving treadmill. This movement, however, was involuntary – the brain was not involved in instructing the legs to move. This is where the team’s new rehabilitation paradigm comes in: a robotic system physically supporting the rat in a more natural manner allowed the rat to attempt movement in any which way without constraints. Importantly, the rats also needed motivation to move: being located in Switzerland, the researchers used only the finest chocolates to lure the rats into taking steps. The researchers found

that after several months of training, the rats had recovered voluntary locomotion; furthermore, they had actually managed to trigger neuroplasticity and reconnect the brain to the spinal cord. With the goal of helping people with SCI regain control of their limbs, scientists have since been working to improve aspects of this model, exploring the limits to which this recovery can be pushed. In fact, clinical trials run by GTXmedical are currently underway to test an electrical stimulation device that can be implanted directly on the spinal cord. This device, coupled with an innovative gravity-support rehabilitation platform similar to the one developed for the rats, could induce the reorganization of neurons and lead to the recovery of voluntary movements in SCI patients. Furthermore, Dr. Courtine’s recent work published in Nature used the wireless system discussed above in monkeys with spinal cord injury. Incredibly, this system allowed the injured animals to regain use of their paralyzed legs only 6 days after injury without any training. It may sound like science fiction but, given the evidence, it’s clear this idealistic dream could soon become a reality. Alessandra Dillenburg is a final year Neuroscience PhD student


regulars: scriatribe

Tasting colours: studying synaesthesia Shinjini Basu explains our current understanding of why some people hear colours and taste shapes Have you ever been stuck at a traffic signal and when the light turns green, you point at it to say ‘green’, but instead say ‘3’? Or perhaps you’re sat at home, drinking tea whilst listening to the radio, and a song comes on that makes you taste apples? For most of us, this is a chance incident, occurring once or twice in our living memory. There is, however, a very special group of people who walk among us, for whom the word ‘Phillip’ tastes like sour oranges, or the note C-sharp on the violin always looks like a brown squiggly line drawn from left to right in the lower left quarter of their vision. These special people, called synesthetes, share a condition called synaesthesia. But what exactly is synaesthesia? Although the precise definition and diagnosis of synaesthesia is highly debated, it is broadly described as a condition in which the stimulation of one ‘inducer’ sense (e.g. hearing) is simultaneously perceived by one or more ‘concurrent’ senses (e.g. taste). This can happen in all manner of combinations, with the synaesthetic experience always occurring automatically and the same inducer always triggering the same concurrent perception. Synaesthesia is currently thought to occur in only about 4% of the world’s population, yet the synaesthetic experience varies greatly between individuals, even between those from the same family. While both parent and child might have the same type of synaesthesia, they will

probably not form the same associations. For example, if the mother sees the letter ‘A’ as red, the daughter may see it as blue or green. Even within the same individual, synaesthesia might not be symmetrical in both directions: while a certain sound might look red, the colour red might give rise to a completely different sound. So how does one begin to study such a diverse phenomenon? Unsurprisingly, until 1999, synaesthesia was still widely denounced as ‘hallucinations’, ‘gimmickry’, or simply passed off as retention of childhood memories formed from looking at coloured alphabets and numbers in children’s books. However, recent advances in non-invasive imaging technologies, particularly functional-MRI scanning, have allowed physicians and scientists to observe the synaesthetic brain in action. This technique allows us to observe changes in blood-oxygen levels in different regions of the brain as a proxy for brain activity in real-time. In short, functional-MRI scanning measures the degree to which different parts of the brain light up in response to a stimulus. The use of this technique has provided evidence for a hypothesis made over 100 years ago: synaesthesia corresponds to a form of ‘cross-wiring’ in the brain such that a stimulus simultaneously elicits a response from two distinct brain regions. In addition to supporting this hypothesis, studies have

also revealed that grapheme-colour synesthetes (those who associate letters or numbers with colour) show increased connectivity, or hyper-connectivity, between the brain regions that are known to process graphemes and colour. But how does this cross-wiring arise in the first place? During the course of normal development, the brain strengthens some connections while getting rid of others in a process called synaptic pruning. The leading hypothesis in this field proposes that the pruning is incomplete in synesthetes, resulting in the presence of connections that have been removed in most individuals. Direct evidence of this phenomenon, however, has yet to be uncovered. Controversially, some researchers believe that cross-wiring is too simple an explanation, and propose that synaesthesia emerges when children are faced with a ‘semantic vacuum’ - a condition in which you learn a concept, for instance a word in a foreign language, but lack the context to place it in. Adults tend to memorise this information. Some children, however, may place this information in a sensory context such that the two memories become inseparable, eliciting a measurable brain response. Synaesthesia is indeed a fascinating phenomenon. Its unique underlying mechanism not only allows us to understand the way in which our brains make sense of the world, but also offers a glimpse of their diversity. For example, medically it offers a controlled condition to gain insights into what might be happening in patients suffering from life-like schizophrenic hallucinations. Outwith medicine, synaesthesia has helped shape society at a cultural level. Many a great artist of the past and present has shown attributes that would class them as synesthetes. From John Keats to Billy Joel to Hans Zimmer, we have this unique eliding of the senses to thank for some of our most memorable poems and tunes.

Shinjini is a third-year PhD student at the Centre for Discovery, Brain Sciences at the University of Edinburgh

Image courtesy of Pixabay


regulars: letters

Dr Hypothesis EUSci’s resident brainiac answers your questions Dear Dr Hypothesis, It seems as though everyone on the internet is talking about Bitcoin these days but I don’t really get the hype. How does Bitcoin actually work? Is it a serious currency, the latest get-rich-quick craze, or a shady market? And finally, should I invest in it? Puzzled Pia Dear Puzzled Pia, You are right to point out that Bitcoin and other cryptocurrencies have returned to the spotlight recently. All cryptocurrencies work using a technology called the blockchain. The blockchain is essentially a public ledger, a list of all transactions corresponding to a particular cryptocurrency secured using cryptographic techniques. The list consists of different ‘blocks’ that have to be validated by the majority of the network before they are added to the chain. The database is stored across all computers that use a particular cryptocurrency, meaning that it is publicly accessible and individual transactions cannot be retroactively modified without altering the whole blockchain. One of the attractive features of cryptocurrencies is this transparency: because the blockchain is readily accessible, it is incredibly difficult to corrupt the record of transactions for your own benefit.

The database is publicly accessible and individual transactions cannot be retroactively modified

You ask if Bitcoin is a serious currency; the answer depends on what you consider ‘serious’. Can you buy things with Bitcoins? Yes. There are over 100,000 merchants that accept bitcoins and other cryptocurrencies, supported by payment service providers like Coinbase or BitPay. However, the main difference between cryptocurrencies and traditional currencies is how they obtain their value. Traditional currencies are regulated through a centralised body such as a bank, a ministry, or a governmental authority. This gives their value some stability with a system of checks and balances in place to ensure that the financial sector, commercial banks, and the overall economy based around that currency, can survive and thrive. On the other hand, ‘cryptocoins’ are generated through ‘mining’: the process by which the cryptographic puzzles that add new blocks of transactions to the blockchain are solved. Large computational power is needed to solve these mathematical problems, and the algorithms are designed so that the difficulty of the problems is adapted to ensure the production of coins grows at a reasonable rate. Miners are rewarded for their work with coins, and the number of coins generated per block decreases as more coins are mined, limiting the amount of currency that can be generated over time.
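The chaining and mining mechanics described above can be sketched in a few lines of Python. This is a deliberately simplified toy for illustration, not Bitcoin's actual implementation (real mining hashes a binary block header with double SHA-256 at a vastly higher difficulty), and the function name and transaction strings are invented:

```python
import hashlib
import json

def mine_block(transactions, prev_hash, difficulty=4):
    """Toy proof-of-work: try nonces until the block's SHA-256 hash
    begins with `difficulty` hexadecimal zeros."""
    nonce = 0
    while True:
        block = {"transactions": transactions,
                 "prev_hash": prev_hash,
                 "nonce": nonce}
        digest = hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return block, digest
        nonce += 1

# Each block commits to the hash of the previous one, so altering an
# old transaction would invalidate every later block - the 'chain'.
genesis, h0 = mine_block(["Alice pays Bob 5"], prev_hash="0" * 64)
block1, h1 = mine_block(["Bob pays Carol 2"], prev_hash=h0)
```

Raising the difficulty by one hex digit multiplies the expected work by sixteen, which is essentially how real networks throttle the rate at which new coins are produced.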

Image by Andrew Francois courtesy of Unsplash

Unlike traditional currencies, cryptocurrencies are decentralised by definition and hence not tied to a specific country, government, or bank that can guarantee some stability to their value or intervene in a crisis. This means that investing in cryptocurrencies such as Bitcoin involves a great deal of risk. While their value is rising, exchanging bitcoins or investing in the currency can lead to spectacular financial gains. However, there is no assurance that the value of the currency will continue to grow, or even stay as it is. Despite the steady rise in its value during the second half of 2017, Bitcoin prices fell by up to 40% early this year. On the other hand, you could argue that the value of traditional currencies is also largely based on speculation and has long stopped corresponding to how much gold is locked away in a secret vault. In fact, most currencies today are fiat money, with no physical commodity behind their value. But we use them to purchase goods because we trust central banks to keep their value stable over time. Central banks can print money, intervene in the financial sector to combat inflation, or work with governments to produce better monetary policy, all mitigating the likelihood of a crash. This is why the lack of regulation and guarantee of value is such an issue for the stability of cryptocurrencies and their potential to become ‘serious’ currencies. Moreover, although the blockchain is transparent and publicly accessible, transactions are mostly pseudonymous and sometimes anonymised, such that tracing movements to specific individuals or groups is often impossible. Consequently, cryptocurrencies have a reputation for being used for illegal purchases, money laundering, or fraud. However, these illicit activities are also carried out using fiat money, so it seems unreasonable to ascribe this bad reputation uniquely to cryptocurrencies. 
Ultimately, cryptocurrencies are decentralised, peer-to-peer systems that ensure the integrity of transactions through the blockchain. For that very reason, however, the stability of their value cannot currently be guaranteed. Perhaps in the coming years the global market will adjust to digital currencies, and changes in policy will mitigate the risks associated with investing in them. In the meantime, loading up on cryptocoins is probably not worth the risk.

Dr. Hypothesis' alter ego is Simone Eizagirre, a fourth-year Chemical Physics student currently on placement at the Dutch Institute for Fundamental Energy Research

Spring 2018 | 45

Regulars: Reviews

Review: Chance by New Scientist

James Hitchen reviews New Scientist magazine's latest book

Chance is one of the many books published by New Scientist magazine that gather edited versions of standout articles from the print edition under a coherent theme. As the name suggests, this iteration unites ideas from across all areas of science, discussing not only the nature of probability itself, but also how randomness plays a role in subjects as varied as our own evolution and the detection of fraudulent business accounts. Though the articles are grouped into subsections, each stands alone, meaning that, should one so desire, the book may be read much like a magazine, in piecemeal fashion. As a consequence, it is equally suited to a conventional cover-to-cover read or to life as a coffee table tome, dipped into in small chunks whenever one's curiosity fancies. Each article opens with a short paragraph from editor Michael Brooks, who gives a twenty-second snapshot of the general content, allowing the reader to quickly locate articles of interest. Despite this, I soon found that the book made for much better reading cover-to-cover, as every article was sold exceptionally well. My faith in the editor proved justified as I was taken on a journey from the best strategies for winning at rock-paper-scissors to why quantum mechanics is a good way to design a universe. Commendably, Chance does not suffer from a criticism often levelled at similarly formatted books: there are no 'filler' articles padding out the material.

Whilst each article had its merits, I found myself most amused by one describing the optimal strategy for stopping people teasing you. The article explains that if you have a threshold below which you will not rise to an insult, and above which you will retaliate, people quickly locate that threshold. Should they seek consequence-free goading, they will probe just below your perceived anger threshold. By contrast, should you become rather agitated at random, with no regard for the severity of the insult, people will avoid goading you, as they can never be sure when you will blow up at them. Overall, the book should prove equally enjoyable for an interested science reader and a veteran professor, though I am sure some general familiarity with concepts such as evolution and quantum mechanics (probably not an issue for the readership of this magazine!) will enhance the reading experience. Don't leave your next Waterstones indulgence to chance.

46 | Spring 2018

James Hitchen is a second year Theoretical Physics PhD student interested in complexity and chaos

EUSci #22  