Page 1

Issue 18: Winter 2015



How soon until we reach the singularity?

What if your own defences were in league with the enemy?


NEUROSCIENCE When losing half your brain will save your life

Contents

Focus
8 Extremes: pushing the limits of science This issue’s focus explores the phenomena of the exceptional, the extraordinary, the extreme
9 Medicine on the frontline Tim Wilkinson looks at how conflict can drive new medical techniques
10 A 21st century disease Felicity Anderson-Nathan investigates the issues behind Multiple Chemical Sensitivity
11 When half a brain is better than one Jade Parker explores an extreme surgery used to treat paediatric epilepsy
12 Designer babies: an ethical debate resurfaces Chiara Herzog looks into the ethics and facts behind custom kids
14 Reaching the era of mind control Chrystalleni Vassiliou explores scientific attempts at understanding and manipulating the human brain
15 Extreme immune system Ashley Dorning explores what happens when the immune system is in league with cancer
16 Diamonds, bombs, and elephants Hannah Johnston explores how carbon polymorphism is used to study explosives
17 Weather gone wild Adelina Ivanova explores why we might have been looking at extreme weather events from the wrong perspective
18 Beyond the frontier Gitanjali Podhar guides us through the theory behind the Large Hadron Collider
19 The singularity is nearly here Jemma McClelland explores the repercussions of the technological singularity
20 Extremely energetic work: solving cosmic mysteries Simone Eizagirre explores the latest discoveries about some of the rarest high-energy particles in existence

Features
21 Edinburgh: where innovation of bioscience begins Asimina Pantazi explores Edinburgh’s supportive role in translating life science findings
22 Is artificial intelligence safe for humanity? Stefano Albrecht explains the potential risks of artificial intelligence
23 The WTF star: where’s the flux? Eleanor Spring investigates what star KIC 8462852 means for planetary exploration
24 Back to the Future: how right were they? Lisa Hilferty examines the accuracy of the predictions about the year 2015 made in the 80s blockbuster Back to the Future
26 Little worm, big impact Mia von Scheven explores the significance of C. elegans as a research model organism
27 Life Unexpected Polina Shipkova explores whether extreme environments can sustain life

Regulars
28 Politics: Big dangers of big data Nathalie Vladis embarks on a quest into the world of healthcare’s extremely big data
29 Innovation: Microbial enzymes for bioethanol production Viktoria Dome considers potential solutions to the problem of second-generation biofuel production

Cover illustration by Hari Conner

2 Winter 2015 |


30 Technology: A wizarding wardrobe wonder Kirsty Paton investigates the recent technological advances in invisibility cloaking
31 Arts: The computer as an artist Alyssa Brandt reflects on the implications of creative artificial intelligence
32 Sciatribe: Avoiding the gender bias in science: narrowing the gap Alessandra Dillenburg explores the reasons behind the glaring lack of women in higher-level scientific positions
33 Interview: Robot sunflowers: where science meets art Kerry Wolfe speaks to Dr Dave Murray-Rust about his interactive exhibit Lichtsuchende
34 Dr Hypothesis EUSci’s resident brainiac answers your questions
35 Review: The Power of Habit Chrystalleni Vassiliou reviews the new book by Charles Duhigg




News Team Dorothy Tse, Chiara Herzog, Halima Hassan, Ifigeneia Stavrou, Jemma Pilcher, Polina Shipkova, Callam Davidson, Rosie Owens

Dear Readers,

Focus Team Tim Wilkinson, Felicity Anderson-Nathan, Jade Parker, Chiara Herzog, Chrystalleni Vassiliou, Ashley Dorning, Hannah Johnston, Adelina Ivanova, Gitanjali Poddar, Jemma McClelland

Feature Authors Simone Eizagirre, Asimina Pantazi, Stefano Albrecht, Eleanor Spring, Lisa Hilferty, Mia von Scheven, Polina Shipkova

Regulars Authors Nathalie Vladis, Viktoria Dome, Kirsty Paton, Alyssa Brandt, Alessandra Dillenburg Scur, Kerry Wolfe, Liv Nathan, Chrystalleni Vassiliou

Copy-editors Nikki Graham, Meghan Maslen, Pranav Bheemsetty, Julia Turan, Brian Shaw, Clare McFadden, Owen Gwydion James, Priya Hari, Lisa Hilferty, Vivian Ho, Simone Eizagirre, Ashley Dorning, Alex Kelman, Sarah Piggott

Sub-editors Niels Menezes, Chrystalleni Vassiliou, Rosalind Brown, Adelina Ivanova, Alexandra Kelman, Catherine Lynch, Oswah Mahmood

Art Team Hari Conner, Alyssa Brandt, Jemma Pilcher, Amy Wedge, Pei-Ling Ng, Lynda-Marie Taurasi, Gabrielė Lisauskaitė, India Pearce, Nathalie Vladis, Paige Collins

Editor Alessandra Dillenburg

Editor Callam Davidson

With a new year comes a brand new and exciting issue of EUSci. Welcome to the EXTREMEly riveting Issue 18 (see what I did there?) where you’ll find everything from designer babies to extreme surgeries where half of the brain is removed. We even delve into the issue of artificial intelligence and whether or not we’ll all be ruled by robots soon. Nothing boring about this issue—turn to page 8 to dive in!

You may notice there’s an extra section in this issue. We decided it was about time we gave students and staff the chance to write about the groundbreaking research going on in their labs here in Edinburgh. To find out more about what research is happening right here at home, turn to page 6.

We have some fantastic features on offer in Issue 18. Head over to page 21 to hear about Edinburgh’s Innovation Forum, before delving into a fascinating debate on the risks Artificial Intelligence may pose for humanity. Follow this up by reading about the mysterious ‘WTF star’, before cranking the speed up to 88 miles per hour for our Back to the Future 2015 special. If it’s small-scale extremes that float your boat, turn to page 26 to hear about the tiny worm we use to study the human nervous system, and polish the section off with a read of this year’s MSc in Science Communication writing competition winner - ‘Life Unexpected’.

Our regular fixtures begin on page 28. This issue features a look at the growing use of big data in healthcare, as well as innovative new ways to produce much-needed biofuels. We examine the science behind real-life invisibility cloaks, and ask whether computers can be truly creative. In light of relatively recent events involving Professor Tim Hunt, Issue 18’s sciatribe debates gender inequality in the sciences. We also share a cuppa with Dr Dave Murray-Rust, before joining our resident Dr Hypothesis to hear about the dangers of using Teflon pans (disclaimer: only if you’re a bird).
If all this wasn’t enough, we’ve even reviewed a book for you on page 35… As usual, the EUSci team is forever changing. Shinjini Basu and Kerry Wolfe joined our team as Deputy Editors, and Jemma Pilcher has become our new Art Editor. We were sad to see Iris, Rebekah, and Hari go after Issue 17, but these incredibly talented newcomers certainly helped to ease the blow.

There’s always room for more writers, editors, designers, and artists, so please get in touch—visit our website or email us to subscribe to our mailing list.

Finally, we’d like to thank the IAD for their support in printing this issue. Many thanks also to our wonderful team of writers, editors, and illustrators for their hard work in making this issue happen. Last but certainly not least, thanks to you, our readers, for your ongoing support. We hope you enjoy the issue!

Alessandra and Callam

Deputy Editor Shinjini Basu

Deputy Editor Kerry Wolfe

Focus Editor Vicky Ware

News Editor Selene Jarrett

Layout Editor Áine Kavanagh

Art Editor Jemma Pilcher


news

ASCUS Lab: new open-access space for art and science research

In the depths of Summerhall, the organisation ASCUS Art & Science has been developing the UK’s largest publicly accessible bioscience lab. This ambitious project poses an exciting opportunity for anyone to conduct their own experimental research in art and science.

ASCUS Lab is the brainchild of Dr James Howie (ASCUS Director) and Lucy Stewart, and has come to life with the help of Miriam Walsh (Assistant Lab Manager) and enthusiastic volunteers from both scientific and artistic backgrounds, plus support from the Wellcome Trust. Dr Howie founded ASCUS Art & Science in 2008 during his PhD at the School of Geosciences at the University of Edinburgh. Since then, ASCUS has become a well-respected organisation promoting communication and collaboration between the arts and sciences.

Summerhall is no stranger to bridging the gap between art and science; it is a space for art galleries and workshops, and a venue for the Edinburgh International Science Festival. It occupies buildings formerly used by the Royal (Dick) School of Veterinary Studies. ASCUS has taken residence in a disused lab, which has been given a new lease of life and equipped with recycled bioscience appliances. However, unlike a conventional lab, artworks and easels sit amongst the microscopes, centrifuges, and fume hoods.

Normally labs have restricted admittance, which makes scientific research seem elusive—unless you are a scientist. ASCUS is striving to change this by providing access, facilities, and support for scientists, artists, and members of the public alike to engage with research in an innovative and ‘hands-on’ manner. Here, anyone can become a member of the lab and

Photo of the lab. Credit: Diego Almazan, ASCUS

conduct their own independent experimental research in art and science, facilitated by creative workshops and introductory courses in laboratory techniques. Scientists will have the freedom to communicate their research to a wider audience, whilst investigating alternative ideas they have not had the opportunity to pursue elsewhere. Additionally, artists will have access to scientific equipment and inspiration to fuel their own projects. The intention is that the Lab will create a platform for numerous possibilities, whilst fundamentally ensuring a fun and creative space in which to experience artistic and scientific research.

ASCUS Lab will be fully launched in Spring 2016. For more information or to get involved with this exciting project, join the mailing list and follow the Lab on Twitter (@_ASCUS) and Facebook.

Jemma Pilcher

Advances in disease outbreak management

The recent Ebola epidemic in West Africa has sparked discussions about how to prevent outbreaks of potentially fatal infectious diseases. Researchers from the Centre for Immunity, Infection and Evolution at the University of Edinburgh, in collaboration with the Wellcome Trust Sanger Institute near Cambridge, suggest improving global surveillance of infectious diseases as a possible solution. Surveillance in this context means collecting and analysing information directly relevant to human health, which would allow early detection of potential disease outbreaks. The authors argue that global surveillance integrating data from many sources, such as demographic, public health,

Photo of a healthcare worker wearing protective clothing at a field hospital in Monrovia, Liberia. Credit: Maj. Francis Obuseh


location, movement, and animal distribution data is key for monitoring and responding to possible outbreaks of infectious diseases. They call for open sharing of data, as well as the development of an international network to assist in such scenarios.

The recent review, published in Science Translational Medicine, emphasises how the incredible advances in science and technology over the last few decades have improved our understanding of diseases. Powerful machines allow fast genome sequencing of viruses, such as HIV and Ebola, providing information about their possible resistance to drugs. More sophisticated models of how a disease might progress could also contribute to more efficient disease outbreak management. These models can now include a number of factors, thus painting a more accurate picture of disease progression.

When dealing with a possible disease outbreak, underestimating how many people are infected could compromise the way the disease is dealt with. It is important to recognise this problem, as well as the fact that not all regions of the world have adequate resources for surveillance. Point-of-care (POC) tests allow rapid and specific detection of a pathogen and help predict the pattern of an outbreak.

The new technologies available in our modern society, and the data they allow us to gather, provide an opportunity for better management of infectious disease outbreaks.

Polina Shipkova


Aspiring science communication stars prepare for FameLab 2016

Three minutes, no PowerPoint, and one fascinating scientific topic—FameLab is back, with a new batch of keen scientists competing to be crowned public engagement champions of 2016. FameLab is the UK’s most prominent science communication competition, helping the public understand major scientific issues. The scope of the competition is wide-ranging, from important questions such as ‘why do men have nipples?’ to more trivial concerns such as ‘is nuclear energy a good or a bad thing?’

Established in 2005 by Cheltenham Festivals in collaboration with Nesta, FameLab encourages young scientists and engineers with a knack for public engagement to swap the lab for the stage. Participants have three minutes to effectively communicate a scientific concept to the general public, without the comforting backdrop of a PowerPoint presentation. Last year’s winner, Oscari Vinko, won over the panel of esteemed judges with his character Mr Malaria, who convinced the audience of his nefarious talents using only his wit and some self-fashioned headwear.

FameLab was founded originally to entertain and engage the public, whilst also nurturing the talents of young science communicators. The competition’s phenomenal success has seen it grow exponentially, with over 5000 participants across 25 countries. Since 2013, FameLab Academy has brought the FameLab format to secondary schools throughout Gloucestershire, encouraging students to begin communicating science effectively from an early age.

The FameLab journey begins with regional heats and finals. The Edinburgh regional final was held in the National Museum of Scotland in early January. Winners at this stage will progress to the national final, held in London in April 2016. The competition heats up at this stage, as finalists compete for a £1000 cash prize and a further £750 to spend on a science communication activity. It doesn’t end here—the 25 or so national winners advance to the international grand final, to be held at the Times Cheltenham Science Festival in June 2016.

FameLab’s YouTube channel hosts a series of great videos from previous finals, including the aforementioned Mr Malaria. So if you’ve ever fancied yourself as the next Brian Cox or Michael Mosley, perhaps consider getting your entry in early for the 2017 competition, and pick up some tips for success by following FameLab 2016 on Twitter (@FameLabUK).

Callam Davidson

Photo of FameLab 2015 winner Oscari Vinko presenting. Credit:

Sniffing out Parkinson’s disease

A remarkable ability to smell Parkinson’s disease has recently been uncovered by scientists at the University of Edinburgh. Joy Milne, from Perth, first recognised a musky odour on her husband Les. Several years later, Les was diagnosed with Parkinson’s disease. Scientists have since shown that this smell is also detectable on other Parkinson’s patients. Identification of the chemical responsible for the smell could enable earlier diagnosis of this debilitating disease.

Parkinson’s disease is a neurodegenerative condition affecting one in 500 people in the UK. Common symptoms include tremor and muscle rigidity. This loss of voluntary control over muscle movement is caused by the death of nerve cells deep in the brain that normally release a chemical called dopamine. Dopamine acts as a neurotransmitter, relaying messages from one nerve cell to another; this activity is vital for normal control and planning of body movement. Symptoms typically

Credit: Pixabay

appear after 50-80% of the cells in a specific brain region, called the substantia nigra, have died. Current drug treatments are only capable of slowing disease progression. This is not surprising, given we have yet to understand the reason for this nerve cell death.

Scientific testing by the University of Edinburgh has confirmed Milne’s ability to detect Parkinson’s before diagnosis. Dr Tilo Kunath led the study, which asked Milne to identify Parkinson’s sufferers by the smell of 12 shirts, six of which had been worn for a day by known Parkinson’s patients and six of which had been worn by controls. Encouragingly, Milne identified 11 of the 12 shirts correctly. It later turned out that the one shirt she identified incorrectly was from a Parkinson’s sufferer yet to be diagnosed at the time the study took place.

Scientists are now hoping to identify the chemical released by Parkinson’s sufferers. Although the research is still in its very early days, the chemical change is thought to occur in sebum, an oily substance coating the skin. Because sebum is secreted by specialised glands found all over the body, non-invasive tests such as skin swabs could provide an earlier diagnosis. A diagnosis based on the musky odour would enable treatment before symptoms even begin to show. This simple yet amazing ability to detect Parkinson’s may therefore lead to improved quality of life for sufferers in the future.

Rosie Owens

research in edinburgh

Animal models may provide insight into emotional memories in humans

Post-traumatic stress disorder (PTSD) is a mental health condition caused by stressful events, affecting anxiety levels and emotional state. The Wang laboratory at the Centre for Clinical Brain Sciences (CCBS) is interested in developing a novel animal behavioural model that may provide insights into treatment for PTSD and other memory-related disorders.

To understand PTSD-related memories, consider that retrieval of a fear memory often involves two main processes: reconsolidation and extinction. Reconsolidation is thought to be an update mechanism during which new information is incorporated into old memories. It may be possible to permanently

Credit: Pixabay

change a fear memory by introducing new non-fearful information during this reconsolidation period. Extinction occurs when a tone conditioned stimulus (CS) that predicts a shock unconditioned stimulus (US) is repeatedly presented in the absence of the US, causing conditioned fear responses to diminish. With sufficient extinction, subjects respond to the CS as if they had never been conditioned. The hypothesis posited by the Wang group is that, by combining this retrieval and extinction paradigm, fear memory could be persistently altered.

Another aspect the Wang group explores is what happens in the brain during fearful memories. To do this, functional magnetic resonance imaging (fMRI) is used in conscious animals to visualise the brain areas involved. The Wang group hopes that by combining this novel retrieval-extinction paradigm with brain imaging techniques, the neurobiology of PTSD can be better understood, potentially contributing to novel treatment strategies.

Dorothy Tse is a post-doctoral neuroscience researcher in the Wang lab at the CCBS

Early tissue reactions as the key to regeneration after traumatic brain injury

What’s the difference between humans and zebrafish? Apart from ‘minor’ physical differences and the fact that fish are better at breathing underwater than us, zebrafish show a remarkable ability to completely regenerate all kinds of tissue after injury, most interestingly their central nervous system (brain and spinal cord). This is a feature that is sadly absent in humans – traumatic events or stroke often leave us with lifelong disability and little hope of a full recovery.

By studying the events that lead to successful regeneration in zebrafish, we can begin to develop new treatments to improve recovery of the human central nervous system after injury. This is the focus of Dr Leah Herrgen’s group at the Centre for Neuroregeneration. Her hypothesis is that the very early events, in the seconds and minutes after injury, are critical for the orchestration of a successful regenerative response.

Credit: Chiara Herzog

Obviously, there are intrinsic differences between humans and zebrafish. However, the zebrafish is a vertebrate just like us and shares over 70% of our genome, which makes findings in fish more applicable to humans. With state-of-the-art live imaging and the use of genetic and pharmacological manipulation, the group aims to explore early tissue reactions occurring during zebrafish regeneration. This could provide a starting point for novel treatments for human traumatic brain injury.

Chiara Herzog just started her PhD in Leah Herrgen’s lab and is excited to study the molecular pathways of neuroregeneration


On the path to understanding endometriosis

Globally, millions of women are suffering in silence from a condition shrouded in ignorance, which physicians often fail to diagnose and scientists have yet to fully understand. This condition is endometriosis: a disease in which the lining of the uterus starts to spread and grow in inappropriate places, such as the abdomen and the ovaries. This lining sheds and bleeds every month, just like the lining of the uterus does during a regular menstrual cycle. With time, as the body responds to the spread of the uterine lining, small nerve fibres which conduct pain start to grow within the tissue. The symptoms of this phenomenon include chronic pelvic pain, heavy painful periods, and pain during sex. Further, recent studies suggest that endometriosis might be responsible for up to half of all unexplained infertility cases in women.

The Greaves lab at the Queen’s Medical Research Institute aims to examine the characteristics of the immune cells and nerve fibres involved, to understand what role they play in the disease. Previous studies have shown that these immune cells are directly involved in the pain caused by endometriosis. Specifically, they trigger the generation of nerve fibres which conduct pain signals. While this is a good start, there is still a lot left to understand about endometriosis, including the underlying mechanism through which immune cells interact with the tissue and surrounding nerve fibres.

Fortunately, endometriosis has recently garnered the attention of mainstream media, which could mean increased research funding and large strides toward better understanding and treatment. Furthermore, improved diagnoses will be made as physicians become better informed. Importantly, women can take comfort in knowing that there is a name for their suffering and, one day, hopefully, a cure.

Halima Hassan is a master’s student in Dr Erin Greaves’ lab

Photo of a woman with abdominal pain. Credit: Ohmega1982 at

Uncovering the origin of fragile skin in Kindler syndrome

The postman rings the doorbell at 9am. When you finally feel brave enough, you open the NHS envelope and discover that you have a genetic condition you have never heard of before. Countless internet searches and a doctor’s appointment later, you are now familiar with Kindler syndrome, a painful condition whose symptoms include thin, fragile, and blistered skin that easily tears and bleeds, and a predisposition to skin cancer. You find out that, while skin blisters tend to improve with age, fragile skin is one of the most persistent symptoms—unfortunately, it becomes progressively worse as patients grow older.

Later, you are reading a scientific article that explains the

Diagram of skin cells. Credit: digitalart at

origin of Kindler syndrome. Patients inherit a faulty gene from their parents, which stops the production of an important protein named Kindlin-1. As a result, Kindler syndrome patients are born lacking Kindlin-1, giving rise to the painful symptoms that accompany the disease. You keep searching online for treatment options, only to find that there are none. All that patients can do is manage their symptoms on a daily basis. The disease currently has no cure.

You suddenly come across something that makes you smile. Scientists at the University of Edinburgh are trying to uncover the molecular origin of this disease’s symptoms, which will contribute towards finding a cure. I am one of those scientists. The aim of my PhD is to discover the cause of the fragile skin, one of the most painful and persistent symptoms of Kindler syndrome.

Researchers have been trying to get to the bottom of this for nearly a decade. They have identified that Kindlin-1 facilitates the attachment between cells and a molecular scaffold that supports their structure. In Kindler syndrome, the cell-scaffold bond is weakened and many skin cells are unable to survive, resulting in thin, fragile skin. However, our study shows that there is another explanation for this symptom. After staring at hundreds of skin cells under a microscope (over multiple cups of coffee), we have discovered that Kindlin-1 is also important for cell division, and without it skin cells struggle to multiply, resulting in skin fragility. This brings us a step closer to finding a cure for Kindler syndrome. The study, led by Professor Valerie Brunton, will be published in the Journal of Molecular Cell Biology.

Ifigeneia Stavrou is a fourth year PhD student in molecular biology

focus

Extreme Science

When reading about and researching science, I find a natural gravitation towards anything representing an extreme in its field. Anything weird, epic, and difficult to believe has an extra element of interest. So, this issue was dedicated to exploring extreme science; no lab mouse too erratic, no disease too disfiguring, and no particle bent on ignoring the laws of physics could be excluded from this topic—excellent!

Our writers have taken up the gauntlet extremely well. Tim Wilkinson investigates the impact war has had, and still has, on medical innovation (p9), while Felicity Anderson-Nathan describes the experience of people for whom the world has become an extreme place to be—those suffering from multiple chemical sensitivity (p10). Jade Parker explores extreme surgery and a technique used to treat epilepsy requiring removal of portions of the brain (p11), and Chiara Herzog investigates the world of extreme ethics in the designer baby debate (p12). Chrystalleni Vassiliou looks at the science behind controlling other people’s minds (p14), and Ashley Dorning finds out what happens when the immune system loses the battle to stay balanced and takes sides with cancer (p15). Hannah Johnston takes up the challenge of investigating extreme explosions (p16), while Adelina Ivanova argues that our current extreme weather might be here to stay, and on the scale of future averages can’t be considered extreme at all (p17). Finally, Gitanjali Podhar looks at extreme physics with a guided tour of the Large Hadron Collider (p18).

Extreme science is often the spark that inspires young people to move into the world of science, and it piques the interest of non-scientists to read about a topic they’d otherwise dismiss. If you’ve got ideas for the next theme of EUSci, get in touch, and if you’d like to write an article or get involved with editing we’d love to have you on board. In the meantime, I hope you enjoy journeying through the world of extreme science over the next few pages.
Vicky Ware, Focus Editor

Illustration of deep sea animals, stalked crinoids and xenophyophores. Credit: Alyssa Brandt



Medicine on the frontline

Tim Wilkinson looks at how conflict can drive new medical techniques

War and medicine go back a long way. Ambulances, tourniquets, plastic surgery, and modern nursing all resulted from previous conflicts. Modern warfare is no different. Healthcare professionals working in the Afghanistan and Iraq wars have used these extreme working conditions to develop new techniques and processes to advance trauma medicine. Following a study published in May this year, some of these findings may soon move from the battlefield to the National Health Service (NHS).

Camp Bastion, the former British Army headquarters in the Helmand province of Afghanistan, started as a cluster of tents and ended as the largest overseas British Army base since the Second World War. At its peak, it was the size of Reading and housed one of Britain’s busiest airports. Over its lifetime, the hospital in Camp Bastion became known as one of the world’s finest and busiest trauma centres. The outcomes of a 2009 review of the military hospitals in Iraq and Afghanistan caused Mr John Black, President of the Royal College of Surgeons, to declare, “…the results achieved in the management of the injured soldier in the current conflicts are the best ever reported...this is a truly remarkable achievement.” It became clear there was something special about the medical care delivered in these war zones.

The medical staff had long realised soldiers started surviving injuries from which they previously would have died. A study published this year in the Journal of Trauma and Acute Care Surgery has now confirmed this suspicion. The researchers looked at around 2800 British combat casualties in Iraq and Afghanistan between 2003 and 2012. Each patient over the 10-year period was initially given a number on the New Injury Severity Score (NISS). The NISS, a scoring system used to measure the severity of injuries, ranges from one to 75, with higher numbers indicating a more severe combination of injuries. They found survival rates improved each year during the conflicts, so patients with high NISS scores were more likely to survive as time went on.

The average time from a patient’s arrival to the start of an operation was under an hour

It is important to work out what happened over that 10-year period that resulted in the dramatic improvement in survival rates. The sheer volume of trauma treated in these hospitals allowed army medics to gain more experience in managing serious injuries during their postings than most NHS doctors would in a lifetime. The army

Photo of The Medical Treatment Facility at Camp Bastion. Credit: Cpl Steven Peacock/ MOD, from Wikimedia Commons

introduced a new system that made continuous small improvements, and the staff learned which ideas worked best. One simple example involved changing the layout of the hospital to reduce delays in a patient’s journey from the moment they arrived. The term ‘right turn resuscitation’ was coined, referring to severely injured patients being taken through the door on the right and therefore straight to the operating theatre, without stopping in the Emergency Department. The entire trauma team would stay with the patient and carry out their tasks in the theatre whilst the surgeons worked on stopping life-threatening bleeding. This focus on reducing delays meant the average time from a patient’s arrival to the start of an operation was under an hour – an exceptionally quick time by any hospital’s standards. Other new ideas included senior doctors travelling out to the patient and administering blood transfusions during the journey to hospital.

Some of the new practices developed during the Iraq and Afghanistan wars are now being implemented in NHS hospitals. Several hospitals have reorganised their layout and introduced ‘right turn resuscitation’, ensuring the most unwell patients are taken rapidly to the operating theatre. Trauma centres in London have also learned from the military’s success in giving blood transfusions to patients earlier. These centres now use a ‘code red’ system, in which ambulances can alert the hospital trauma team to prepare blood before the patient’s arrival.

Whether it is possible to produce the same results in NHS hospitals as at Camp Bastion remains to be seen. Staff in UK hospitals do not see the same volume and types of trauma as in war zones, so there is uncertainty about which of the advances are most appropriate for non-warzone settings. Over the next few years, NHS trauma centres will need to build on these developments and work out how best to adapt them to civilian life.

Tim Wilkinson is a clinical research fellow at the Centre for Clinical Brain Sciences

Winter 2015 | 9

focus

A 21st century disease

Felicity Anderson-Nathan investigates the issues behind Multiple Chemical Sensitivity

Imagine if your stomach turned at unwashed fruit, or if you collapsed after using everyday cleaning products. What if the smell of deodorant physically repelled you? If you have Multiple Chemical Sensitivity, or MCS, you don't have to imagine—you're living it every day.

The word 'chemical' may seem vague, but it refers to the many volatile organic compounds which make up our modern world. While sceptics remain unconvinced that these can cause such powerful physical reactions, proponents of MCS argue that we have had insufficient time to adapt to these chemicals. We have all felt light-headed opening a tin of paint or wrinkled our noses at the unpleasant odour of a pungent plastic product. Pesticides widely used only a few decades ago are now banned because of their toxic effects.

MCS does not present as a traditional allergy and cannot be tested for as such. Sufferers typically find that, after exposure to chemicals in concentrations considered to be non-toxic, they experience a host of symptoms, including fatigue, headaches, nausea, shortness of breath and fainting. With triggers as ubiquitous as they are, the symptoms can be debilitating and often leave people unable to work or even leave their homes.

The advice given by MCS support groups is to remove sources of chemical emissions and install air and water filters to create a sanctuary in the home. Depending on what triggers an individual's MCS, they might need to eliminate artificial fibres from their wardrobe or even have metal fillings removed. Some sufferers cannot tolerate the dust circulated by fan-cooled electrical devices such as computers or games consoles, while others cannot bear the mould propagated by books or the ink used in newspapers. Living in cities with traffic pollution can be problematic, but so can living near fields sprayed with pesticides.
Something as simple as maintaining a social life is fraught with difficulty, as meeting with friends means enduring the effects of their chemical-filled lives. In a small town in America, a whole community emerged to accommodate the varied and highly specific needs of those with MCS. Snowflake, Arizona, has several dozen houses designed to minimise chemical use as much as possible, with people attracted by the cheapness of land, the exceptional air purity and the isolation. Residents are constantly harassed by other MCS sufferers hoping for vacancies.

However, the condition is widely disputed. The World Health Organisation does not recognise it as a physical illness, nor does the American Medical Association or the UK National Health Service. In 1996, the WHO proposed renaming the condition Idiopathic Environmental Intolerance, categorising it as a psychiatric disorder.

Illustration by Jemma Pilcher

The symptoms can be debilitating and often leave people unable to work or even leave their homes

In many clinical trials, MCS sufferers responded as strongly to placebo triggers as they did to genuine ones. Some experts believe that MCS functions as a behaviourally reinforced response: a person experiences a coincidental negative reaction to a harmless substance and consequently associates the substance with the reaction. When they are next exposed, they react even more strongly.

It is also possible that people with MCS suffer from another, undiagnosed condition. Many people with chronic migraines experience headaches triggered by overwhelming sensory inputs such as strong smells—the cause here is not the chemical components themselves but a sensory overreaction to stimulation. Chronic fatigue syndrome and fibromyalgia are also known to cause similar effects. Taking an avoidance-only approach to treating MCS puts sufferers at risk of missing a diagnosis and isolating themselves in the quest for a safe environment. And because the illness is poorly understood and sufferers are particularly desperate, they are ripe for exploitation by unscrupulous practitioners willing to offer bizarre cures in exchange for hard cash.

Regardless of the cause, people with MCS experience real symptoms and their needs are valid. It is a condition which is isolating and debilitating, forcing people to give up their hobbies, their jobs, and their friends. So, if someone asks you to avoid using bodyspray when you are with them, consider making a small adjustment in your life to make a big change in theirs.

Felicity Anderson-Nathan is a writer with an interest in chronic illness


When half a brain is better than one

Jade Parker explores an extreme surgery used to treat paediatric epilepsy

It may sound counter-intuitive, but in some extreme cases half a brain may be better than a whole one. For patients with severe medication-resistant epilepsy, removing half of their brain can make their seizures disappear. The surgical procedure, termed hemispherotomy, involves removing half of the brain and is one of the most drastic forms of neurosurgery. "This operation is used for patients with intractable seizures from large areas of the brain. These areas are usually those that did not develop correctly or were damaged early in life," says University of California neurosurgeon Gary Mathern. Surprisingly, according to Mathern, "The majority of operations occur before the age of 10 years old." Taking advantage of the brain's plasticity at this age, surgeons are able to remove large parts of the brain without affecting the child's personality or memory. Patients would risk losing their ability to speak if the same operation were performed after the age of 10, as speech is already fixed beyond this age.

For children undergoing this operation there are now high success rates, with seizure control achieved in 70-80% of patients post-surgery. However, this wasn't always the case. Like any procedure, there are major risks associated with the surgery, and the chance of adverse effects depends on a number of factors, such as the age of onset of seizures, age at surgery, and the severity of the condition itself. When this procedure was first performed, one of the major issues surgeons had to tackle was what to do with the huge empty cavity left behind. In the early days, there were reports of surgeons filling the brain cavity with ping pong balls. Thankfully, with advancements in scientific knowledge, they realised this was not necessary, as the brain's own cerebrospinal fluid would rapidly fill the cavity.
The cerebrospinal fluid itself can create even greater issues. Produced at a rate of roughly a teaspoon every 15 minutes (around half a litre a day), it can accumulate quickly. Years after the operation, doctors received reports of patients being left permanently comatose or dying due to hydrocephalus, a condition in which the pressure within the brain becomes too great because of a build-up of cerebrospinal fluid. In a normal brain, the tissue will absorb the fluid and thereby maintain a delicate equilibrium. Unfortunately, it is thought that when patients undergo a hemispherotomy, the remaining brain is not able to absorb the fluid quickly enough.

There were reports of surgeons filling the brain cavity with ping pong balls

Fortunately, surgeons have now devised ways to detect hydrocephalus in its early stages and can drain the fluid before it causes problems. Even with this improvement in care, there are still major risks associated with the procedure. The biggest comes from navigating the brain's intricate network of blood vessels: if these are damaged, patients can be sent into shock or comas from which they may never recover. Another challenge is ensuring the entirety of the hemisphere is removed. Even a small amount of remaining tissue can send damaging electrical currents into the healthy brain and reignite seizures.

To overcome these difficulties, surgeons are devising ever more ingenious ways to improve the surgical procedure and minimise risk. By mapping out the brain using electrode grids, they are able to distinguish which areas are essentially 'bad tissue' and remove them using microscopic surgical instruments. Along with removing bad tissue, they disconnect the corpus callosum, which connects the two hemispheres, to prevent damaging signals being sent to the healthy brain. As surgical techniques have improved, the operation has become an established way to treat paediatric epilepsy; since it was first carried out in 1923, it has been performed hundreds of times. Surgery continues to astound us with its advancements, but this operation still stands as one of the most extreme procedures.

Jade Parker studied Veterinary Science and is currently working as a journalist for a local newspaper

Illustration of an MRI scan of a brain which has undergone a hemispherotomy. Credit: Amy Wedge, with some components licensed from Pixabay



Designer babies: an ethical debate resurfaces

Chiara Herzog looks into the ethics and facts behind custom kids

Modern biomedical research relies heavily on the editing of genetic material—DNA—to study the function of genes by mutating them or cutting them out and seeing what happens. Furthermore, genetic disorders are commonly recreated in laboratory organisms by genome editing. Much to the dismay of animal rights activists, thousands of genetically modified animals exist and are used every day in laboratories all over the world. Despite the availability of many genetic tools developed specifically for some organisms, creating genetically modified organisms was a cumbersome, time-consuming, and expensive undertaking with a relatively low success rate—until recently.

Times are changing, however. With the steady advance of biomedical research, the need for easy-to-use genetic modification has led to the development of new techniques. A recent addition to the genome-editing arsenal, first developed in 2012 by biologists at the University of California, is the CRISPR/Cas9 system (abbreviated as CRISPR). Think of CRISPR as molecular scissors guided by a satellite, excising a desired bit of DNA and—if designed to do so—replacing it with another predefined sequence.

Think of CRISPR as molecular scissors guided by a satellite

Despite its fairly recent discovery, CRISPR is already in extensive use for generating genetically modified animals and has reinvented genetic research in little more than three years. Its ease of use and relatively high efficacy make it possible to apply the technique in virtually any organism—including one that has so far remained 'untouched' by genetic modification: humans.

Earlier this year, a group of American scientists appealed to the scientific community to refrain from editing the human genome, specifically the germline that gives rise to the cells used to procreate (egg and sperm cells). In a comment published in Nature, they argued that with current technologies the effects on future generations are unpredictable, rendering such editing dangerous and unethical. 'Germline modification' could occur when editing genetic material at the embryonic stage, including cells that will later become egg or sperm cells. In contrast, ex vivo modification is a genetic modification technique already in use: blood cells are extracted from a patient's blood, cultured in the lab, genetically modified (for example, to boost the immune system), and subsequently given back to the patient. A famous case of this is the gene therapy given to 'bubble babies' in the early 2000s—these children suffered from severe combined immunodeficiency (SCID) and had to be quarantined because their immune system was non-existent. SCID is sometimes caused by a mutation in a single gene, so the children received gene therapy to replace the defective gene with a correct copy. In 14 out of 16 children treated in the UK, this led to complete recovery. However, two of these children developed leukaemia, which was linked to the gene therapy they received: a cancer-promoting gene had accidentally been switched on by the genetic modification. The leukaemia was treated and the boys survived; nonetheless, this shows that tampering with genetic material is not without risk. While this type of gene therapy in children and adults can have adverse effects on the patient, any unwanted mutations will not be passed on to the next generation, and it is thus usually deemed 'safer' than tampering with embryonic genetic material.

Less than a month after the request for a moratorium on human embryonic genome modification, a Chinese research group published an article describing attempts to modify the genetic material of human embryos by removing a gene involved in a blood disorder.
The success rate was relatively low, and a high number of 'off-target' effects were observed in the subset of the genome they examined. Junjiu Huang, one of the authors of this controversial paper, says his team probably detected only a fraction of the unintended mutations. "If we screened the whole genome sequence, we would probably have found many more," he says, ultimately justifying the moratorium on human genome editing that scientists had demanded one month earlier. However, a Nature news article the same month reported that, according to a Chinese source familiar with the matter, at least four groups in China are pursuing gene editing in human embryos. So far, none of these attempts is intended to result in a live birth.

Even in the United States, modifying human embryonic DNA is not strictly forbidden

Despite many countries having restrictive laws on the modification of embryonic DNA, it is possible in some parts of the world, particularly in parts of Asia. "The truth is, we have guidelines but some people never follow them," says Qi Zhou, a developmental biologist at the Institute of Zoology in Beijing, in a news article published on the Nature website. Other countries, such as Russia and Argentina, have a fairly ambiguous legal situation in which tampering with the human genome is theoretically not forbidden. Even in the United States, modifying human embryonic DNA is not strictly forbidden; however, this type of work will not be funded by the National Institutes of Health, the main scientific funding body, and thus is very unlikely to happen. In the UK, a group at the Francis Crick Institute in London is currently seeking permission to modify the genetic material of human embryos for solely scientific purposes.

Illustration by Pei Ling NG

The data so far suggest that tampering with human embryonic DNA in clinical settings is probably not a good idea—yet. In the future, however, families with a predisposition to genetic disorders could benefit from the possibility of modifying their embryo's DNA. Pre-implantation genetic diagnosis is already available in many countries: egg cells are fertilised in vitro and the resulting embryos are screened for genetic mutations, such as those causing cystic fibrosis, spinal muscular atrophy, or Huntington's disease. Most of the diseases for which this is currently available are crippling disorders with severe loss of quality of life for which a single gene is responsible (making it easy to screen for mutations). An embryo that does not carry the mutation is then selected, implanted into the woman's uterus, and develops normally from that point on. The remaining embryos are currently 'discarded', and research on them is prohibited in most countries, even if the experiments would never lead to a live birth. With CRISPR, it could be possible to save embryos from being discarded by editing out the mutation, which could be particularly useful for couples who have only one embryo available. While screening appears to be the easier approach so far, CRISPR will certainly have its advantages in some situations.

Before CRISPR is used to modify human embryonic DNA in a clinical setting, there needs to be a broad discussion, involving both the general public and the scientific community, about where we are headed with this technology.

Unequal access to CRISPR could result in Gattaca levels of genetic classism

Fear looms that if scientists are able to eradicate genetic mutations from embryos, we may start using CRISPR to introduce, enhance, or eliminate traits, such as sex or eye colour, for non-medical reasons, eventually leading into the very controversial territory of transhumanism and eugenics. Unequal access to this technology could lead to Gattaca levels of genetic classism. While we might start off with stamping out diseases, will we stop there, or will we carry on to make better humans, introducing genes for stronger bones, resistance to viruses such as HIV, heightened cognitive ability, or decreased risk of Alzheimer's disease? Dana Carroll from the University of Utah says in an interview with The Guardian that he is sure there are some characteristics people will want to change. "However, presently those kind of multi-genetic traits would be difficult to edit in because we don't fully understand their basis, let alone what unintended consequences might result," he adds.

Scientists should work together with the public to prevent CRISPR, a promising candidate for therapeutic use, from being hindered by a public outcry over ethical breaches. As Isaac Asimov, the American sci-fi author and professor of biochemistry, once said, "The saddest aspect of life right now is that science gathers knowledge faster than society gathers wisdom."

Chiara Herzog is a first-year PhD student at the Centre for Neuroregeneration


Reaching the era of mind control

Chrystalleni Vassiliou explores scientific attempts at understanding and manipulating the human brain

Illustration by Jemma Pilcher

Neuroscience uses neuronal morphology, anatomy and connectivity to determine how the brain performs complex cognitive functions such as remembering, learning, and thinking. Scientists are trying to uncover the mechanisms and relationships that allow the processes of the brain to be understood, modulated, and recreated. Taking this into consideration, in the future, people may use technology to transmit thoughts from one organism to another or to control minds.

A skilled surgeon could remotely control an inexperienced assistant's hands to perform medical procedures even if not on-site

The idea of mind control first appeared decades ago, when humans started experimenting with brain activity patterns. This area of research has inspired a range of comics and movies. However, due to the absence of the requisite technology, limitations in resolution and a lack of funding, it wasn't until recently that scientists developed a serious interest in the field. Neuroimaging technology was needed to measure how brain activity relates to function in real time. One widely known non-invasive technique used in humans is electroencephalography (EEG), in which electrodes are placed on the scalp to record the electrical activity of the brain. It is used clinically to detect changes in brain activity and to


diagnose patients with neuronal dysfunction or trauma. In research, it allows us to study brain activity while the subject is doing a particular task, and the patterns of brain signals can be recognised and translated to interpret what the subject is attending to. The next step was to transfer or reproduce these signals at a desired destination. A well-known example is neuroprosthesis control, where bionic limbs that transmit sensory information to the brain and respond to motor commands are under development. A computer that can recognise and respond to brain activity in this way is aptly called a 'brain-computer interface'. However, to create a bidirectional network, brain-computer and computer-brain interfaces that would eventually lead to a 'brain-brain interface' (BBI), scientists needed tools to stimulate desired areas of the brain. One technique is transcranial magnetic stimulation (TMS), which uses magnetic fields to create electric currents through electromagnetic induction, stimulating a small brain area. A more recent tool, which uses ultrasound transducers to deliver acoustic energy to the desired area, is focused ultrasound stimulation (FUS).

We now have a way to record and, to some extent, understand what the brain is doing, and a way to modulate its activity, with computers as a relay station for BBI. Scientists from Harvard University used a form of EEG and FUS to allow human volunteers to elicit tail movements in rats, while other groups have tried to transfer sensorimotor information or spatial memories between rats. Scientists at the University of Washington used EEG and TMS to allow communication between the brains of participants in different buildings during a video game task requiring collaboration. The first participant saw the screen and controlled the hand of the second participant, who couldn't see the game, making him or her press a button on a touchpad. Other scientists from the same university investigated BBI using the same techniques in a task similar to the game '20 Questions', with the participants in separate buildings. The responder, instead of answering aloud, focused on the answer he wanted to give, and the signal was transmitted to the receiver's brain through TMS.

Although some results remain questionable, especially given the limitations of the techniques used, these are successful examples of mind and body control. As we gain more insight into the neurobiology of cognition and the neural circuitry at both the cellular and molecular levels, and as technology advances, we could transmit concepts and rules. BBI will be useful in many areas, such as teaching. For example, Dr Stocco, one of the University of Washington researchers, explains, "A skilled surgeon could remotely control an inexperienced assistant's hands to perform medical procedures even if not on-site." There is enormous potential for such a technique, but there are ethical issues we should consider. Most importantly, we must establish trust that no one will take advantage of these techniques. If they end up in the wrong hands, we won't be far from the plots of sci-fi movies becoming reality.

Chrystalleni Vassiliou is a fourth year BSc Neuroscience student


Extreme immune system

Ashley Dorning explores what happens when the immune system is in league with cancer

We are all grateful for our immune system. It scours our body every day, dealing with infection and destroying unhealthy cells. When cells turn cancerous, the immune system can recognise them and restrict their growth or kill them. Yet tumours have ways of manipulating the immune system to their own advantage: the immune system no longer recognises the cancer as a threat, and the cancer cells can turn it into an enemy in league with the cancer against the body.

The cancer cells can turn the immune system into an enemy

But how can the immune system, our ally against pathogens and cancer cells, turn against us? It's all to do with how cells of the body communicate. Macrophages are thought to be the main immune cells aiding tumour survival; in some types of cancer, patients with a higher number of macrophages are less likely to survive. The usual job of these cells is to help with injury repair and to deal with pathogens such as viruses and bacteria. When macrophages help in tissue repair, they increase blood vessel development and release factors that cause nearby cells to multiply.

Cancer cells use these macrophages to their advantage by mimicking tissue injury signals so that macrophages support tumour growth. Cancer cells release a factor called macrophage colony-stimulating factor (CSF-1), which recruits macrophages to the tumour. The macrophages, in turn, release chemical signals that promote tumour survival and expansion. One such signal is tumour necrosis factor (TNF)-alpha, which prevents cancer cells from dying. Tumour-associated macrophages also release epidermal growth factor (EGF), which promotes the migration and spread of cancer cells. These signals help the tumour grow, and as it does, more CSF-1 is released, which in turn recruits more macrophages. Macrophages have even been shown to dampen the ability of natural killer cells to eliminate cancer cells. This feedback loop steadily exacerbates the cancer.

The most dangerous role of these pro-tumour macrophages is to help the primary tumour become metastatic. A metastatic tumour is one capable of spreading to other parts of the body; this is the major cause of death from cancer and the hardest to treat medically. The risk of metastasis increases as macrophages release more signals.

This causes an increase in the growth of blood vessels in and around the cancer cells, which lets them survive and expand; consequently, the likelihood of the cancer cells entering the blood increases. When cancer cells reach a blood vessel, macrophages are thought to help guide them through the vessel wall into the circulating blood. A tumour cell is then able to circulate to another part of the body and, if it survives, produce another tumour at that location. But the story of the macrophage betrayal does not end there.

Macrophages help cancer cells leave the primary tumour and help them survive at their new locations

Not only do macrophages help cancer cells leave the primary tumour, they also help them survive at their new locations. Experiments show that removing macrophages helps prevent metastatic tumour survival, and research is ongoing to find out exactly how macrophages help metastatic tumour cells survive. Given that we still have limited therapies for metastatic tumours, this research is all the more significant. The more we find out about the changes cancer induces in the immune system, the more avenues we have for developing drugs that fight cancer by targeting the immune cells that turn against us. Future treatments may be able to target macrophages and the factors they release, providing us with a new tool to treat cancer patients and hopefully save lives.

Dr Ashley Dorning is a laboratory manager working in Bin-Zhi Qian's lab in the QMRI

Illustration of a macrophage (grey) interacting with a cancer cell (yellow). Credit: Alyssa Brandt



Diamonds, bombs, and elephants

Hannah Johnston explores how carbon polymorphism is used to study explosives

Photo of diamond anvil cell at University of Edinburgh. Credit: Hayleigh Lloyd

Diamonds, graphite and charcoal have one thing in common: they're all made up of carbon. Graphite is soft and consists of layers of carbon atoms which slide over each other easily, making it an ideal material for pencils as the lead glides smoothly off under friction with the paper. Re-jig this carbon arrangement and, hey presto, you have diamond, whose carbon atoms are arranged tetrahedrally and form a strong 3D structure which can withstand a great amount of pressure. Chemistry researchers at the University of Edinburgh are taking advantage of this quality to help them better understand the materials in bombs.

The Colin Pulham group in the School of Chemistry study energetic materials, which encompass explosives, propellants (e.g. rocket fuels) and pyrotechnics (e.g. fireworks). Their research doesn't involve chucking bombs about and measuring their blast radii; they look into the actual crystal structure of the energetic materials in order to increase their detonation velocity (to put it simply, their BOOM!). The shock wave after detonation can produce temperatures of up to 5,227°C and pressures of up to 50 gigapascals.

So, what creates the BOOM? These materials store a lot of potential energy which is quickly released as heat and/or gaseous products when bonds are broken upon stimulus. This stored potential energy can exist in three forms: nuclear, physical or chemical. It's the latter that the Pulham group are investigating. In a chemical explosion, decomposition or combination reactions occur. TNT is an example of a material that undergoes a decomposition reaction, while in gunpowder several components react together exothermically to produce hot gases.

The Pulham group are looking into 'green' explosives which produce fewer toxic by-products, have a longer lifetime, and are environmentally benign. The research group does not, however, want to compromise explosive power. This comes with its challenges, which include sensitivity to stimuli and thermal stability—you don't want these energetic materials going off or degrading when a wee bit of heat or pressure is applied!

Rubies are used as a reference material as their behaviour under pressure is known

The power of an energetic material can depend on several factors, density being one of them. Ideally the material will contain many crystals of favourable morphology and orientation. But it has been discovered that the crystal structure can change on detonation, or once a high temperature or pressure has been applied – this is called polymorphism. A polymorph has different properties to the original crystal, and this could mean a different explosive power. This is where diamonds come into the picture. To measure how much pressure can be applied before a polymorph is created, a diamond anvil cell is used, because diamond can withstand a huge amount of pressure.

This device, which is slightly bigger than a two-pence coin, comprises two opposing diamonds with their tips compressed against a 0.3-millimetre-thick metal piece known as a gasket. The gasket has a small hole containing rubies and the crystal of energetic material under investigation; the rubies serve as a reference because their behaviour under pressure is known. Metal plates are placed on the flat top surfaces of the two opposing diamonds. These plates contain screws which are turned very slightly in order to apply a great pressure to the crystal.

Diamond anvils can recreate pressures existing deep inside our planet. In fact, a slight turn of a couple of millimetres can equate to one gigapascal. To put this into context, one pascal is about the pressure exerted by a £5 note lying on a table. Three gigapascals is the pressure of 24,000 elephants standing one on top of another, but also of just 20 elephants stacked up when the bottom one is wearing stilettos. Surface area has a major influence on pressure, hence the diamond's point, rather than its flat face, is compressed against the crystal sample.

In the anvil, techniques known as Raman spectroscopy and X-ray diffraction are utilised: the sample is irradiated with visible light and X-rays respectively. These provide information on the polymorph created and the pressure at which it formed. The Pulham group are using this method to study existing energetic materials such as Research Department Formula X, which was widely used during the Second World War.

Extreme conditions have been extensively used to study materials such as metals, superconductors and minerals, but less is understood about energetic materials. Upon detonation, high pressures and temperatures are reached which can lead to polymorphism.
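The elephant comparisons are simply pressure = force divided by area. As a rough back-of-the-envelope check (assuming, purely for illustration, an elephant of about 5,000 kg and a bottom footprint of roughly 0.4 square metres, figures not given in the article):

```latex
P = \frac{F}{A} = \frac{n\,m\,g}{A}
  = \frac{24\,000 \times 5\,000\ \mathrm{kg} \times 9.8\ \mathrm{m\,s^{-2}}}{0.4\ \mathrm{m^2}}
  \approx 2.9 \times 10^{9}\ \mathrm{Pa} \approx 3\ \mathrm{GPa}
```

Swapping the footprint for stiletto heels of a few square centimetres shrinks the area by roughly a factor of a thousand, which is why 20 elephants in place of 24,000 give a comparable pressure.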
More research is needed to better understand the chemistry involved in these processes, so that firework displays can be more impressive and rockets can reach the furthest depths of space.

Hannah Johnston is a PhD chemistry student


Weather gone wild

Adelina Ivanova explores why we may be looking at extreme weather events from the wrong perspective

If you were unlucky enough to wake up in Boscastle, Cornwall in August 2004, you would have found your entire house flooded, as nearly three times the average rainfall for the month fell in a single day and flash-flooded the village. And if you were in the United Kingdom during the summer of 2003, you would have sweltered in the heat wave that struck the country and produced the highest temperature ever recorded there, 38.5°C. If you experienced either of these events, you would surely have said the weather had gone crazy—to an extreme.

Over 350,000 lives were lost on a global scale due to extreme weather events over the 10-year period

In scientific terms extreme weather is any occurrence that diverges from the usual or average local weather patterns. It includes droughts, cold spells, storms and strong winds. In its report ‘The Global Climate 2001-2010: a Decade of Climate Extremes’, the World Meteorological Organization published an extensive analysis of extreme weather, using the heat waves in the United States, the droughts in Europe’s Iberian Peninsula, and the abnormal rainfall in Australia and New Zealand as examples. According to the report, over 350,000 lives were lost globally due to extreme weather events over the 10-year period, accounting for more than a 20% increase in casualties compared with the previous decade. Most scientists agree climate change is the result of human activity, but there is a lack of solid proof about the ways in which climate change influences regional extreme weather events. However, rising temperatures have several effects on the weather-determining factors. In a globally warmer environment, the rates of evapotranspiration (total evaporation of water from soil, plants and water bodies) are increased and droughts are more likely to occur as the ground gets dehydrated. At the same time, the atmosphere gets more humid and now contains 4% more water vapour compared to the 1970s, which increases the probability of heavier rainfalls. Although these effects on weather factors do not directly correlate with extreme weather events, they do increase the odds of such events occurring. Extreme weather events occurred before we contributed to global warming. But nowadays they are much more frequent, and the subtle link to climate change seems to be the underlying reason. “My strong opinion is that these kinds of extremes are something you would expect in a warming world, and expect to happen more frequently,” says Harry McCaughey, a professor of climatology at Queen’s University. Since this warmer global climate is today’s reality, we might start asking ourselves whether extreme weather events will become the norm. Sixty years ago in the United States, the number of record high temperatures equalled the number of record low temperatures. Today, the record highs are over twice the number of record lows. This not only indicates a significant warming of the general climate of the North American continent, but also that heat waves with record high temperatures are becoming a pattern. The same can be said about the heavy rainfalls in the United Kingdom or the

heat waves striking Europe. These events are no longer isolated, but are occurring almost steadily now that the global climate is settling into its new, warmer parameters. It is therefore more useful to regard them as constituent parts of the ‘new weather’, rather than as extremes. Their appearance is no longer random or unrelated to the prevailing climate.

Extreme weather events occurred before we contributed to global warming

Extreme weather may not be regarded as extreme anymore—at least not in a scientific sense. While extreme weather events are still shocking in terms of the damage they cause, their occurrence has shifted from being isolated to being frequent in the warmer modern world. Though concerning, this general conclusion has the potential to make us less vulnerable to the impacts of extreme weather events: by expecting their more regular occurrence, we will be better prepared to deal with their effects. Adelina Ivanova is a first year chemistry student

Illustration by Hari Conner

Winter 2015 | 17


Beyond the frontier Gitanjali Poddar guides us through the theory behind the Large Hadron Collider

Photo of the Large Hadron Collider. Credit: CERN

The main goal of physics is to understand the universe around us. Over the past few decades, physicists have developed the Standard Model of particle physics. It explains the visible universe from the smallest to the largest scales and encapsulates our understanding of particle physics to date. It is experimentally well tested and describes how fundamental particles interact. However, the Standard Model still leaves us with many unanswered questions about the universe. To help unravel some of these mysteries, scientists have set up the Large Hadron Collider (LHC) at the European Organisation for Nuclear Research, known as CERN.

He compared the Higgs field to a cocktail party of political party workers spread uniformly across a room

The LHC is the world’s largest and most powerful particle collider, straddling the border between Switzerland and France. It is designed to compress the maximum number of particles into the smallest space for successful and more frequent collisions. Inside the LHC, two high-energy particle beams circulating at extremely high speeds are made to collide, creating a region of immensely high energy. At such high energies, particles that cannot be detected under ordinary conditions are created. The total collision energy obtained at


the LHC has never been reached in a laboratory. This extreme energy frontier could open doors to exciting new possibilities and shed light on some of the universe’s unknowns, such as the existing imbalance between the amounts of matter and antimatter, or what gives particles their mass. We already have a proposed theory that may explain the latter: the Higgs mechanism. According to this mechanism, particles acquire their masses by interacting with a hypothesised ‘Higgs field’ that pervades the universe. Fields, in physics, are regions where each point is affected by a force. The universe is full of fields, and what we know as particles are simply excitations of those fields, like waves in an ocean. For example, electrons are excitations of the electron field while photons are excitations of the electromagnetic field. Similarly, excitations of the Higgs field should produce a particle known as the Higgs boson. Thus, although the Higgs field may not be directly detectable, the presence of the Higgs boson would confirm its existence and therefore validate the theory of the Higgs mechanism. In 1993, the British science minister William Waldegrave observed that British taxpayers were paying a lot of money in contributions to CERN, a system very few of them understood. He therefore challenged particle physicists in the UK to invent a simple way to describe to the general public what the Higgs boson was. The winning explanation was provided by David Miller from University College London. He compared the Higgs field to a cocktail party of political party workers spread uniformly across a room. At this

party, an ordinary person would move through the crowd without facing any obstruction. The person would not interact with the crowd, in much the same way that some particles, such as photons, do not interact with the Higgs field. These particles are called ‘massless’. However, if the then Prime Minister Margaret Thatcher walked across the room, she would attract a lot of attention. Party workers would cluster around her, giving her metaphorical ‘mass’. This is a great analogy for the Higgs mechanism. Similarly, if a rumour started at the party, it would create the same sort of clustering, but among the party workers themselves. These clumps are analogous to the Higgs boson.

About 96% of the universe is invisible

On 4th July 2012, physicists at CERN announced that they had observed a particle that looked very much like the Higgs boson. This was an important milestone in particle physics, since the Higgs boson was the last missing piece of the Standard Model. Its discovery marked the culmination of decades of intellectual effort and validated a generation of scientific research on the Standard Model. For this discovery, physicists François Englert and Peter Higgs were jointly awarded the 2013 Nobel Prize in Physics. The detection of the Higgs boson filled a conspicuous hole in the Standard Model. However, our knowledge of the universe is still far from complete. The Standard Model itself is not perfect: it only accounts for the visible matter that forms 4.6% of the content of the entire universe. The other 95.4% is invisible. Ongoing work at the LHC is therefore focused on solving the mysteries of the invisible dark matter and dark energy that pervade the cosmos.

Gitanjali Poddar is a third year physics student


The singularity is nearly here Jemma McClelland explores the repercussions of the technological singularity You sit at your computer and an update appears on your browser: “Would you like to be connected to the OverMind?” You hesitantly click ‘Yes’ and then a swarm of nanobots suddenly come through the cracks in your doors and up your nose, dispersing throughout your body. You no longer need to google ‘nearest Indian restaurant’, because the intelligent bots in your brain already know you’re hungry and have booked you a table at a place that serves your favourite dish. We may have to welcome our robot overlords within the next few decades or maybe this is just an egotistical fantasy of computer scientists.

This intelligence would be so far beyond our human capabilities that we would struggle to understand it

John von Neumann coined the term ‘singularity’ in 1958 as ‘a point beyond which humanity as we know it will change’. The technological singularity is the moment when we achieve some form of superintelligence, be it biologically enhanced or Artificial Intelligence (AI). This intelligence would be so far beyond our human capabilities that we would struggle to understand it. Our world would therefore become unrecognisable, as it would be changing constantly at an exponential rate. Ray Kurzweil, one of the biggest names in the field, published his book The Singularity Is Near in 2005. In it, he predicts we will reach the singularity by 2045. While many scientists, philosophers and science-fiction writers have similar expectations, others think it is either impossible or a long way off. The singularity could come about in many ways, one of which is through biomedical engineering to create cyborgs. This is based on the theory of transhumanism, which says the human race can evolve beyond its current physical and mental limitations using science and technology. Nanorobotics is an increasingly important area of research. Nanobots could potentially enter our bloodstream or our brains and make enhancements,

such as connecting our minds to the Internet and memories to cloud storage. One hypothetically realistic way the singularity might occur is via intelligent machines. The machines could either mirror the complexity of our own brain or form intelligence through their own means. This could materialise through an intelligence explosion where humans build one intelligent machine that builds another smarter machine, and through trial and error and Darwinian principles, more capable and intelligent machines will emerge. Scientists at Cambridge have made one of these ‘parent’ robots, which can build its own ‘children’, evaluate their performance, and improve the design of the most successful offspring. This is the first time natural selection has been successfully built into a machine outside of a simulation, leading us to wonder whether machines could adapt like organic life. General machine learning is another rising field. Google DeepMind, Google’s artificial intelligence company, recently created a machine that taught itself how to successfully play classic video games. This program was inspired by human learning, yet in over half the games the machine was as good as or better than a professional human player. Computers have never before been able to master a variety of complex tasks with no instruction on how to complete them. In this instance, the computer was given only the most basic information: raw pixels on the screen and the goal to get a high score.

The machines could either mirror the complexity of our own brain or form intelligence through their own means

We have no idea what to expect from the singularity. However, many science-fiction writers love to imagine extreme ways that AI could change life on Earth. Maybe machines will be able to correctly determine our interests and volitions better than we can ourselves. A utopia could develop where robots will help to improve society and all life forms will co-exist peacefully. The AI could process Big Data and even be able to work out the meaning of ‘Life, the Universe, and Everything’.

Maybe machines will be able to correctly determine our interests and volitions better than we can ourselves

Of course, everyone loves to mention the outcome of the movie The Terminator, in which the entire human species is eradicated. For now, augmented brains connected to the ‘OverMind’ and nanobots that are able to detect and respond to signals in our brain seem far-fetched. For the time being, perhaps we should just focus on self-driving cars and improving Siri. Jemma McClelland is a first year Artificial Intelligence student

Illustration by Lynda-Marie Taurasi with modified elements licensed under Creative Commons



Extremely energetic work: solving cosmic mysteries Simone Eizagirre explores the latest discoveries about some of the rarest high-energy particles in existence This summer, the IceCube Neutrino Observatory at the South Pole reported the detection of the highest-energy neutrino ever observed. Neutrinos are neutral subatomic particles with extremely low mass that travel at almost the speed of light. They are unaffected by electromagnetic fields and are only subject to the weak nuclear force (which only works at short-range distances) and gravity (which is incredibly weak at the subatomic level), meaning that they can travel in straight lines for very large distances without interference. Most neutrinos originated soon after the Big Bang and have been floating around ever since, whereas other neutrinos are formed in nuclear reactions during events such as supernovae and galactic collisions. There are three types of neutrino: muon, electron, and tau, each associated with a different subatomic particle. Upon interaction with matter, muon neutrinos release muons (which can be thought of as heavier electrons). The neutrino found at IceCube this summer was observed

Image of the ICL and a high-energy muon neutrino (night). Credit: IceCube Collaboration


thanks to the trail of a muon that had 2600 trillion electronvolts of energy— energy so high that it could only have been produced by an ultra-high-energy neutrino (UHEN). The IceCube Neutrino Observatory is unlike any other, as it has a detector searching for particles from the most cataclysmic events in the universe, at a depth of about 2500 metres from the surface of the Antarctic ice. The cubic-kilometre detector observes around 275 million cosmic rays daily and 100,000 atmospheric neutrinos every year. The majority of observed neutrinos are atmospheric, with typical energies ranging from 1–10 trillion electronvolts (similar to the energy generated in the Large Hadron Collider). The discovery of the UHEN this summer comes two years after the surprising discovery of two equally energetic extra-terrestrial neutrinos— fondly named Bert and Ernie after the Sesame Street characters—which led to much excitement in neutrino astronomy and the hunt for additional high-energy neutrinos from outside our solar system. It is widely believed these UHENs could be coming from the same violent astrophysical sources in our universe that produce ultra-high-energy cosmic rays (UHECR). Cosmic rays are extremely high-energy radiation, but how some of them achieve energies millions of times higher than those obtained in our particle accelerators is still a mystery. Supermassive black holes, supernovae explosions of massive stars, and collisions between galaxies could be possible galactic accelerators. Learning more about UHECRs is an opportunity to discover more about the most extreme cosmic events in our universe. The first of such superenergetic rays to be discovered, dubbed the ‘Oh-My-God’ particle, was detected in October 1991 by the Fly’s Eye Cosmic Ray Detector at the University of Utah, with a shocking estimated energy of 300 million tera-electronvolts. 
This observation broke all paradigms, as the particle exceeded the Greisen-Zatsepin-Kuzmin (GZK) limit, which predicts that a particle of such high energy would lose most of its energy and slow down by interacting with background radiation. This

would mean that the cosmic ray had to originate from a recent nearby event. However, scientists were unable to locate any known astrophysical accelerators in the direction from which the particle had arrived. Last year, a supposed ‘cosmic ray hotspot’ was identified, from which 26% of the energetic particles observed over the last five years by the University of Utah’s Telescope Array appeared to originate. This is a higher density than expected if the rays were distributed randomly across the sky. Further data is needed to truly establish whether or not this hotspot actually exists; projects such as the expansion of the Telescope Array hope to speed up this process, bringing us one step closer to determining whether the current discoveries are genuine.

Learning more about UHECRs is an opportunity to discover more about the most extreme cosmic events in our universe

One problem with cosmic ray data is that cosmic rays are formed from charged particles, meaning that by the time they reach our detectors they have been deflected by galactic magnetic fields into twisted, chaotic paths. Tracking other types of particle, such as UHENs coming from the same spot in the sky, furthers our understanding of the hotspot. This is where the discovery of the neutrino this summer comes into play: because neutrinos travel in straight lines, their path can be mapped back to their source. IceCube analyses data in large quantities, each batch consisting of data collected over a couple of years. However, they hope to start announcing discoveries in real time to maximise the information obtained from each event. This influx of new data could reveal a lot about the mysteries surrounding the most extreme events in the universe. Simone Eizagirre is a second year chemical physics student at the University of Edinburgh

features

Edinburgh: where innovation of bioscience begins Asimina Pantazi explores the supportive role of Edinburgh in the critical transformation of life science findings Another afternoon at the Edinburgh Cancer Research Centre as all the clocks struck three. John was in the middle of an important experiment. He had one hour. It was enough time to focus on his first business plan. John is one of the many researchers working at the University of Edinburgh whose life plans suddenly changed. John decided to pursue a PhD in Cancer Biology at the University of Edinburgh. He knew it was amongst the top universities in the world. His dream was to improve society by devoting himself to cancer research. However, after his first year the impact he desired to create was still unrealised. John considered himself a purist. But even a purist must adapt in a vastly interconnected web of medical research. He realised that collaboration was the key; that the marriage of academia and industry through innovation would be critical to unleashing great ideas into the world.

The Scottish capital provides a first-class dynamic environment to promote groundbreaking innovation in life sciences

The city of Edinburgh has long been at the forefront of scientific advancement. Dolly the Sheep was born here. Darwin studied at the medical school and roamed the Crags with his mentor Robert Grant. Today, groundbreaking research continues to produce medical innovations and answer the most challenging questions. Despite this high potential, outstanding findings very often face constraints at the stage of translation into clinical practice. This transition is known as the ‘Valley of Death’. Promising technologies must pass through this dangerous zone before they make it into the world of commercial realisation. Even when state-of-the-art medical technologies pass through the purgatory of the Valley of Death, many still struggle to find commercial value and are not brought to market.

Illustration of Edinburgh by Lynda-Marie Taurasi with modified elements licensed under Creative Commons

This ultimately prevents patients from accessing new devices, treatments and medicines. The existing gap between academia and industry keeps the two worlds apart, impeding the exploitation of research findings. This acts as a major barrier to healthcare improvement and represents an unmet need that we all have a vested interest in fulfilling. The question is how we do this. Fortunately for John, Edinburgh is ripe for start-up formation in the life sciences. More than ever before, scientists are choosing to start up, spin out and let loose their scientific innovations in the commercial world. At the heart of this phenomenon is the University of Edinburgh’s enterprise scene. Organisations such as the Scottish Institute for Enterprise, the Entrepreneurship Club and LAUNCH.ed offer training workshops, mentorship and funding opportunities to entrepreneurial students to help them kick off their own businesses. Furthermore, governmental bodies such as Scottish Enterprise and business incubators such as Sunergos Innovations and the Edinburgh Centre for Carbon Innovation form an exceptional, sustainable entrepreneurial landscape in Edinburgh. Overall, the Scottish capital provides a first-class dynamic environment to promote groundbreaking innovation in life sciences. Last year, students of the University of Edinburgh united to form the Innovation Forum (IF), a promising initiative that aims to unleash the entrepreneurial potential of the life sciences. The mission of IF is to bring together industry, academia and policy makers to reinforce entrepreneurship in the biosciences. With 14 branches currently based at leading institutions worldwide, IF creates a large and constantly expanding network of scientists, investors and entrepreneurs aspiring to bring great ideas to life. IF could grow into the missing link between academia and entrepreneurial potential. The exceptional science and enterprise scene in Edinburgh served as a fantastic opportunity for John to build business skills. The once-purist student realised that merging academia and industry is critical if researchers are to pull great ideas out of their lab coats and use them to build a better world. The clocks are now striking four as John heads back to finish his experiment. He is one hour closer to completing his business plan. One hour closer to starting up his company. One hour closer to achieving the objective that motivated him to pursue a PhD in the first place.

Asimina Pantazi is a PhD student at the Edinburgh Cancer Research Centre and the Marketing and Communications Director


Is artificial intelligence safe for humanity? Stefano Albrecht explains the potential risks of artificial intelligence Recently, a number of prominent scientists, inventors, and entrepreneurs openly voiced their concerns regarding the potential dangers of artificial intelligence. Artificial intelligence (AI), as an academic subject, is the study of intelligent behaviour exhibited by machines, including software and robots. The precise definition of intelligence varies but it usually includes elements such as inference (drawing conclusions from evidence), planning (setting goals and constructing plans to achieve them) and learning (improving performance based on experience).

One of the major fears is that the speed of development of intelligent machines might reach a point beyond which meaningful human control is no longer feasible

What dangers and risks does artificial intelligence pose? One of the major fears is that the speed of development of intelligent machines might reach a point beyond which meaningful human control is no longer feasible. In other words, the machine could be smarter than humans and may have a will of its own. Another risk is that intelligent machines may not be constrained by the same social, ethical, and legal rules that govern human decision making, and that machines may lack common sense. As a result, an intelligent machine may attempt to achieve its goals in ways not anticipated, and possibly unintended, by its human designer. Many science fiction movies have explored the potential dangers of artificial intelligence, such as the Terminator film series and the more recent Ex Machina (Edinburgh’s School of Informatics hosts a list of films related to artificial intelligence). However, while such scenarios seem distant, there already exists a potential threat in the form of lethal autonomous weapons, or ‘killer robots’ as they are sometimes called. These terms refer to machines that can


select and engage targets with no or only limited human intervention. Examples include automatic machine gun turrets and flying drones equipped with missiles. Arguments can be made for and against such technology. Some scientists believe that as machines surpass humans in tasks such as communication, coordination, and targeting, they may potentially reduce the number of both civilian and military casualties. On the other hand, there are concerns associated with autonomous weapons, such as a possible arms race between nations as the technology becomes widely available, and a lowered threshold for entering armed conflict when machines rather than human soldiers are used. As a result, a number of organisations have called for a ban on the development and deployment of autonomous weapons. The need to understand and control the potential dangers posed by artificial intelligence has long been recognised in the scientific community and has gained more significant momentum recently when the US-based Future of Life Institute received a $10 million donation to fund research in this area. The institute’s goal is to “maximize the societal benefit of AI, explicitly focusing not on the standard goal of making AI more

Illustration by Gabrielė Lisauskaitė

capable, but on making AI more robust and/or beneficial”. As an example, Dr. Adrian Weller at Cambridge University received a grant to investigate ‘self-policing’ machines, whose purpose is to police other intelligent machines and recognise undesirable activity.

A number of organisations have called for a ban on the development and deployment of autonomous weapons

The question of whether a machine’s behaviour is desirable, when considered in the human sphere, is deeply connected to ethics and moral values. Earlier this year, Professor Benjamin Kuipers from the University of Michigan gave a guest lecture at the University of Edinburgh in which he examined how robots could make moral decisions. One point that became apparent is that some social dilemmas are hard enough for humans (such as whether to kill an innocent bystander in order to save other people), let alone for machines. Another important issue is that of legal responsibility if a machine causes an accident, such as a driverless car that accidentally kills a pedestrian—is the owner or manufacturer legally accountable? It is evident that a very substantial amount of additional research and debate is required in order to understand the potential benefits and risks of artificial intelligence. The coming decades will see a further proliferation of computer technology, and it remains a significant challenge to maximise the benefits while minimising the risks.

Stefano Albrecht holds a PhD in Artificial Intelligence from the University of Edinburgh


The WTF star: where’s the flux? Eleanor Spring investigates what star KIC8462852 means for planetary exploration Where’s the Flux? This is the subtitle of Tabetha Boyajian’s paper, published in October 2015, which sparked a media frenzy about the possibility of an alien megastructure. The name is a playful reference to how the star KIC8462852 is often referred to: the ‘WTF star’. But is the furore justified? KIC8462852 is one of over 100,000 stars investigated by NASA’s Kepler mission, which seeks habitable planets in the area of the sky between the constellations of Cygnus and Lyra. The idea is to find planets not by observing them directly, but by inferring their presence based on how much flux is received from these stars. Flux is a measurement of an object’s brightness. During a solar eclipse, the earth becomes very dark because the moon blocks the sun’s light from the Earth’s surface. Kepler works on the same principle. When a planet passes in front of a distant star, the brightness Kepler observes from that star decreases slightly, by up to 1%. As planets usually have regular orbits, those dips are periodic. Hence, a planet can be detected. The vast amount of data generated by Kepler is being processed using both complex computer algorithms and human volunteers via the online Planet Hunters project. Computers might have perfect logic, but it is extremely difficult to program them to spot patterns that are relatively easy for a trained human eye to detect. The data from KIC8462852 was flagged as anomalous, and came to the attention of Boyajian.
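The periodic-dip logic described above can be sketched in a few lines. This is only a toy illustration: the period, transit depth and noise level are invented values, and real Kepler photometry pipelines are far more sophisticated.

```python
# Toy transit detection: generate a flux series with a periodic 1% dip,
# then flag the dips by comparing each point to the baseline.
import random

random.seed(0)

PERIOD = 50      # time steps between transits (assumed)
DEPTH = 0.01     # 1% drop in flux, as for a large planet
NOISE = 0.001    # photometric noise level (assumed)

def light_curve(n: int) -> list[float]:
    """Baseline flux of 1.0 with periodic transit dips plus noise."""
    flux = []
    for t in range(n):
        dip = DEPTH if t % PERIOD < 2 else 0.0  # each transit lasts 2 steps
        flux.append(1.0 - dip + random.gauss(0, NOISE))
    return flux

def find_dips(flux: list[float], threshold: float = 0.005) -> list[int]:
    """Indices where flux drops more than `threshold` below baseline."""
    return [t for t, f in enumerate(flux) if f < 1.0 - threshold]

dips = find_dips(light_curve(200))
print(dips)  # dips cluster at t = 0, 50, 100, 150: a regular 50-step period
```

Evenly spaced clusters of low-flux points are the signature of an orbiting planet; it is precisely the absence of any such regularity in KIC8462852's aperiodic, far deeper dips that made the star so puzzling.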

It was clear that they had stumbled onto something that definitely was not a planet

KIC8462852 showed the most bizarre variation of flux with time that scientists had ever seen. Not only did the dips in flux seem to exhibit no periodicity, but some of them were enormous, with one showing an incredible 22% extinction. Bearing in mind that a really big, Jupiter-sized planet would cause a periodic extinction of about 1%, it was clear

This illustration shows a star behind a shattered comet. Credit: NASA/JPL-Caltech

that they had stumbled onto something that definitely was not a planet. In fact, the patterns suggest that the dimming was not caused by one structure, but by several. Boyajian and her team cautiously suggested the dimming might be due to debris left over from a broken-up comet. This is by no means a clear conclusion; in fact, what they did most effectively was rule out several other potential causes. The results are definitely not due to error, nor does it seem likely that they are due to a planetary collision. It would also make a lot more sense if KIC8462852 were a young star in an active area of star formation, where there are huge amounts of dust, gas and detritus flying around—yet all observations aside from the flux dips indicate that it is old. Boyajian’s paper strongly recommended further research, and she shared the results with Jason Wright, a Penn State professor of astronomy. He recently submitted a paper which reviews how alien megastructures could be observed, and he specifically recommends KIC8462852 as ideal for further research by the Search for Extraterrestrial Intelligence. The idea of an alien megastructure is not a new one. ‘Dyson spheres’ were made popular in 1960 by Freeman Dyson. He conceived that as the energy requirements of a civilisation grow, and as its technology gets more advanced, eventually it would make sense to start

building massive structures around stars to harness their energy. If the dimming of KIC8462852 is in fact due to some kind of alien technology, it has been suggested that a ‘Dyson swarm’ is most probable. This would be a flock of smaller solar panels, orbiting the star in various patterns.

The idea of an alien megastructure is not a new one

If the possibility of alien technology at KIC8462852 sounds incredibly unlikely, that’s because it is. However, we know that something fascinating is going on at KIC8462852 because the observations are so unprecedented. Boyajian and Wright are proposing to continue the investigation using a ground-based radio telescope, hoping to pick up signals from KIC8462852. Too much like sci-fi? Isaac Asimov—the author of I, Robot—coined the term ‘robotics’ before the field even existed. Now, robots are a part of everyday life. The best sci-fi provides an incredibly perceptive view of what could come to pass, so let’s not get too excited just yet—but let’s not forget to ask—why not? Eleanor Spring is a fourth year Astrophysics student

features

Back to the Future: how right were they? Lisa Hilferty examines the accuracy of the predictions about the year 2015 made in the 80s blockbuster Back to the Future

The second film in the Back to the Future trilogy sees Marty McFly and Doctor Emmett Brown travel 30 years into the future in the much-loved DeLorean. Many things had changed between October 26th 1985 and October 21st 2015. The 2015 of the film left us all wanting self-drying clothes and the ultimate hoverboard (one that hovered over water, of course). As 21st October 2015 came around, it became clear that some aspects of the movie would remain products of the brilliant imaginations behind this once-in-a-generation film. However, some of the things shown in the movie have become reality to some extent. Let’s first look at the DeLorean, which was able to use rubbish as a fuel source. Recently, a car called the Toyota Mirai has been released which runs on hydrogen that can be obtained from processing rubbish. Although this type of car cannot travel through time, the process behind its fuel usage has been used in a real setting. One of the iconic things from the Back to the Future movies is the hoverboard. Since the release of the movie people have been eager to recreate it, and three models have been successfully produced and are currently on the market. Firstly, the Hendo hover, a product that uses magnetic field architecture (MFA), had its first prototype in May 2013. A month later it was able to carry a

person, and now the Hendo 2.0, released in October 2015, is starting to look more like the hoverboard we all know and love. MFA works on the basis of Lenz’s law: the engine of the hoverboard generates a changing magnetic field which, when directed at the ground, induces eddy currents in a conductive surface; these currents create an opposing magnetic field that pushes back on the board and makes it levitate. This is why the Hendo board only works over a conducting surface.

Rotating blades below the board force air downwards to elevate the board using technology similar to helicopters

Another board that works with magnetic fields is the Lexus hoverboard, which uses superconductors. Superconductivity occurs when a material is cooled below a certain critical temperature, at which point it exhibits no electrical resistance and expels magnetic fields (the Meissner effect). When a superconductor is placed near magnets, this expulsion produces repulsion, and therefore levitation. In the Lexus hoverboard, the superconductor is in the board and the magnets are in the ground.

Photo of a hoverboard on Wikimedia Commons by LoveBoat


This board has a similar limitation to the Hendo board: it only works over a specially prepared surface, in this case one with magnets embedded in it. The final hoverboard is the Omni hoverboard which, unlike the other two boards, uses air to overcome the challenge of gravity. Rotating blades below the board force air downwards to elevate the board, using technology similar to that of helicopters. This overcomes the issue found with the Hendo and Lexus boards by allowing it to fly over any surface, including water. There are two major drawbacks to the Omni board: it has to be battery powered to drive the rotating blades, which limits flying time, and it could be dangerous to other people if it crashed into them. Even though none of these current models completely fulfil the brief of a hoverboard, they do show that the technology is available for the future development of hoverboards.

In the film, the houses of 2015 do not have door handles but are unlocked by the homeowner’s fingerprints. Fingerprint technology has come a long way since the movie’s production in 1989. Even though we have not yet reached the point at which fingerprints replace keys, fingerprints are already used to unlock mobile phones and some laptops. There is also a computer-controlled door lock made by Yale called the Linus lock. Although it uses a key code or a phone rather than fingerprints, it makes it possible to keep an eye on who enters your house and when. This is achieved by giving everyone their own key code (like a personal ID), which allows the number of times each person enters the house to be determined. Apart from the fingerprint being replaced by a digit code, it is similar to the house-entry system in the movie.

The movie also shows 3D technology in the advertisement for Jaws 19, in which the shark comes out towards Marty McFly and is seen by the character without the need for 3D glasses.
Although they were wrong about the number of sequels in the Jaws franchise, they did predict the increase in popularity of this form of entertainment, along with the redundancy


Illustration by Sara Ljeskovac

of 3D glasses. Recently there has been a leap forward, with billboards showing adverts in 3D that can be seen without special glasses.

Fingerprint technology has come a long way since the movie’s production in 1989

These displays use a laser system that sends beams in different directions at a very fine resolution. So while walking past this type of display you see views from all sides, because hundreds of different pictures are being projected. This gives the same visual signals to the brain as

walking past a real object. Another accurate prediction in the Back to the Future version of 2015 was wearable technology. In the film, Marty’s future children are seen making and receiving calls and watching TV using goggles. Internet giant Google will soon be releasing an updated version of Google Glass, originally released in 2013. The new Google Glass will not come with a wire frame but will instead use a button-and-hinge system that allows it to attach to any glasses the owner wishes. It has a faster Intel processor and better Wi-Fi connectivity, along with a longer battery life and a spare battery pack that can be magnetically attached to give a total of four hours of running time. Microsoft have created a set of goggles called HoloLens that look more like the ones used by Marty McFly Junior in the film. This is the first completely independent holographic computer to be released. As it is hands-free, you can interact with the hologram being shown in high resolution and with very good spatial sound.

Back to the Future was incorrect in some aspects of its version of 2015, including the prevalence of fax machines. The largest miss was the lack of smartphones and Wi-Fi, which is understandable as the world wide web was only invented the same year as the movie’s release. Although some of the features of the fictional 2015 may only be pipe dreams at the moment, keep hoping for them: one day soon they might not just be ideas of the past.

Lisa Hilferty is a fourth year reproductive biology student


Little worm, big impact Mia von Scheven explores the significance of C. elegans as a research model organism

Within the wide spectrum of model organisms there is a rather unassuming little worm whose scientific history can be traced back to a single person: Sydney Brenner. Brenner was searching for a multicellular, but relatively simple, organism in which to study the development of the nervous system. He introduced Caenorhabditis elegans (C. elegans) in 1962 as the ideal model for this research, as the worm has just 302 neurons (compared to about 100 billion in humans).

A rather unassuming little worm whose scientific history can be traced back to a single person

Different research requires a specific set of tools—it is all about asking the right question, choosing the method accordingly, and applying the method to the right model. Scientists therefore need to select their model carefully to succeed and make an impact in their field of research. Brenner was convinced of the advantages of this worm as

a model organism—and he was right. C. elegans made it from a compost heap in Bristol all the way to Stockholm, where it has helped researchers win Nobel Prizes since 2002, for work ranging from programmed cell death to the discovery of RNA interference to green fluorescent protein. Despite its benefits, only about a dozen research groups in Scotland use C. elegans as a model organism. The University of Edinburgh made a brilliant move by recently recruiting three young investigators who are using this worm in their neuroscience research to better understand how the nervous system works.

The fundamental building blocks of almost all nervous systems are neural circuits: groups of functionally connected neurons that take different inputs from the environment, previous experience, or other senses and turn them into behavioural output. Dr Emanuel Busch’s team within the Centre for Integrative Physiology (CIP) is studying how neural circuits produce reliable outputs; specifically, how C. elegans responds through defined neural circuits to ambient oxygen concentrations. Their findings will enrich our knowledge in this field, which

Illustration of C. elegans interacting with DNA by India Pearce


is important for understanding how humans and animals survive. Aside from studying behavioural outputs, C. elegans also allows for the development of novel methods, such as complex techniques for reprogramming the genetic code in a multicellular organism. Dr Sebastian Greiss’s lab at the CIP uses one such technique to incorporate a chemically synthesised amino acid into a specific protein. By applying this method they can give a protein different properties and study it in detail. In the future, developing this technique further could allow them to control the activity of single neurons across the entire nervous system of the worm.

C. elegans is acting as a starting point after which results and findings can be translated to humans or other higher organisms

The newest addition to the ‘worm team’ at the CIP, Dr Maria Doitsidou, is using the worm to model Parkinson’s disease, a progressive neurodegenerative disease. By introducing mutations that cause Parkinson’s in humans, Maria and her team are trying to find genetic factors that influence disease progression, to understand the molecular mechanisms that underlie neuronal loss, and to come up with new protective strategies. C. elegans is acting as a starting point, after which results and findings can be translated to humans or other higher organisms. Having this complementary hub of expertise under one roof is of great value, and it is a brilliant chance to develop a leading international research centre for work on C. elegans within the scientific community in Edinburgh.

Mia von Scheven works in administrative support for the Centre for Integrative Physiology at the University of Edinburgh


Life unexpected Polina Shipkova explores whether extreme environments can sustain life

The recent discovery of liquid water on Mars has been in the news for a good reason: we now know that, theoretically, life on Mars could exist or could have existed in the past. This possibility has challenged the traditional public perception of where life can prevail. The notion that Earth is the only place in the whole universe where there is life in any form is now shaken. The search for life beyond Earth continues, but in the meantime we can ask whether we already know of any organisms that could survive in space. The answer: yes, we do. In an exciting new experiment, scientists exposed four species of microorganisms to conditions typical of space. They used a Mars-conditions simulator to investigate whether these organisms could survive in an environment not seen anywhere on Earth. All of them were found to be resistant to temperature variations. Some of the microorganisms tested could also survive UV radiation, while others were able to grow in low-pressure conditions. This remarkable resilience is intriguing, but requires further research. Importantly, the microorganisms in this study were carefully selected to be known extremophiles: organisms that live in extreme conditions, such as very hot, cold, acidic or high-pressure environments.

The notion that Earth is the only place in the whole universe where there is life in any form is now shaken

This tells us it might be possible for some organisms to survive on another planet, but what about other space conditions? Scientists set out to understand whether extremophiles that use metals to grow can do so in a meteorite, using resources not found on our planet. The extremophile used in this experiment normally grows in very hot, acidic environments by utilising iron and sulphur. When released onto the meteorite, the microorganism not only grew using the metals found inside it, but also grew faster than when using any Earth metals. An interesting finding was that it also required a lower temperature to survive than it needed on Earth. Although these results are promising, they are only preliminary, and further research is needed to help us understand them better.

Image of the surface of Mars, captured with the High Resolution Imaging Science Experiment (HiRISE) camera on NASA’s Mars Reconnaissance Orbiter (MRO). Credit: NASA/JPL-Caltech/University of Arizona

Although this is all very exciting, we do not necessarily need to look to space to find unexpected examples of how life can prevail. Extremophiles have a few surprises for us on our very own Earth. Life has managed to find its way into every imaginable environment. Think of the Antarctic, a vast frozen land with temperatures way below 0 degrees Celsius. There is a type of midge that survives the harsh Antarctic winter using a mechanism that is nothing short of ingenious: it gets rid of the water in its body, which prevents that water turning into ice and so allows the midge to escape freezing. A recent study has suggested that the movement of water necessary for this dehydration, and the subsequent rehydration in summer, is carried out via channels that transport water in and out of the cell. The new study connects the function of these channels to the survival mechanism the Antarctic midge uses to endure the winter. This is a previously unknown mechanism for any extremophile, and it sheds light on how diverse life and its manifestations can be.

Scientists set out to understand whether extremophiles that use metals to grow can do so in a meteorite

The way we think about life now faces an unexpected turn. It would seem that even the life we know so well on our own planet holds surprises. Extremophiles are an excellent example of life in unexpected places, not only in the context of space but also on Earth. We may believe we know a lot about our planet, but we still have much to learn about it, and even more to explore beyond it. It might be safe to assume that many more discoveries will be made in this field in the next few decades. Let’s wait and see.

Polina Shipkova is our winner from the MSc Science Communication competition. She is a Masters student at the University of Edinburgh studying Science Communication and Public Engagement

regulars: politics

Big dangers of big data Nathalie Vladis embarks on a quest into the world of healthcare’s extremely big data

Big data is taking over. Every click online creates information about you: what you like, what you might buy, what your interests are, to name just a few. The rise of the internet has taken the big data revolution to a whole new level, with information being collected from practically everything, including you. Emerging big data technologies have created a growing number of opportunities for organisations. From building intelligent houses to managing energy consumption to predicting traffic, data is changing our world.

The largest cause of health-related data breaches is physical theft or loss of non-encrypted hard drives

It is often said that the last frontier for big data will be healthcare. Data from your activities on the internet or from your connected devices could contribute to efforts to improve your healthcare. This could happen through wearable electronic devices that monitor blood pressure, heart rate and hours of

Illustration by Nathalie Vladis


sleep, helping you track progress and offering more personalised treatments. Ideally this information would be collected in big databases that would help healthcare specialists identify patterns of diseases and improve disease management and prevention. A recent controversial example is the NHS-led initiative wherein GP records from across the UK will be extracted and fed into the national Health and Social Care Information Centre (HSCIC) databases. The care.data programme aims to get a better picture of the care provided in different areas of the country and to allocate resources more wisely. This may seem like an amazing initiative; however, there is a lot of concern about the control of the data and the lack of informed consent from patients. Additionally, organisations will be able to pay a fee to access this data, and may later use it for profit-making purposes. Because of these concerns, the start of the programme has been postponed several times. However, it seems that the biggest concern is that our most sensitive information will be centralised and thus more vulnerable to breaches. Although the NHS has been hacked in the past, according to Dr Kami Vaniea, a newly appointed lecturer in Cyber Security and

Privacy from the School of Informatics at the University of Edinburgh, the largest cause of health-related data breaches is physical theft or loss of non-encrypted hard drives or laptops.

Our health-related information is our most sensitive type of data and the fact that it could be exposed to all these dangers is very alarming

In October 2015, it was revealed that Pharmacy2U, the UK’s largest NHS-approved online pharmacy, had been selling the personal information of thousands of its clients, without consent, to an Australian lottery company and a health-supplement company. According to The Independent, the data was priced at £130 for a thousand records. The Information Commissioner’s Office found that the lottery company was targeting elderly patients, some of whom likely suffered financially. It is important to note that once these companies had bought the data they could sell it on to third parties, creating a snowball effect. Pharmacy2U was fined £130,000, but it is hard to put a price on the distress of these vulnerable individuals. It’s undeniable that these big data projects can revolutionise healthcare and change the way patients are treated. On one hand, information about our medical history and way of life can be part of more personalised treatment plans for everyone. On the other hand, our health-related information is our most sensitive type of data, and the fact that it could be exposed to all these dangers is very alarming.

Nathalie Vladis is a second year PhD student at the Centre for Integrative Physiology and the founder of the Model Organism Network (ModON)

regulars: innovation

Microbial enzymes for bioethanol production

Viktoria Dome considers the potential solutions to the problem of second generation biofuel production

Bioethanol production offers an alternative energy source to fossil fuel extraction. First generation bioethanol is easily obtained from starch- and sugar-based materials from crops such as wheat, sugarcane, and corn. These plants, however, are important food crops; their agricultural waste products, known as lignocellulosic biomass, can instead be converted into ethanol, creating second generation bioethanol. Lignocellulosic biomass is made up of cellulose, hemicellulose, and lignin polymers. Transformation of biomass into ethanol involves three key steps: first, cellulose is freed from lignin; second, it is broken down into glucose by degrading enzymes; and finally, the glucose is fermented into ethanol. Of these three steps, freeing cellulose from lignin is the most difficult. Lignin is a constituent of plant cell walls and is made up of precursors arranged in a highly complex molecular structure. This causes lignin to associate tightly with cellulose, making their separation difficult.

Lignin is a constituent of cell walls in plants and is made up of precursors arranged in a highly complex molecular structure

Steam explosion, acid hydrolysis, alkali washing, and ammonia fibre expansion are the most widely used pretreatments to remove cellulose from lignin. They require high temperature, high pressure, or additional chemicals to work effectively, and these chemical and physical pretreatments can have side effects that inhibit the subsequent steps of conversion into ethanol. Despite the naturally evolved complexity of lignin structure, there are microbes, such as white-rot fungi and certain bacteria, that are able to degrade lignin enzymatically, leaving the cellulose accessible. The biological degradation of lignin is thus a potential alternative to the aggressive and expensive pretreatments. Compared to fungi, bacterial systems are less powerful in lignin breakdown and degrade lignin at a lower rate. However, a synergistic combination of bacterial and fungal enzymes could make this process highly productive.

Steam explosion, acid hydrolysis, alkali washing and ammonia fibre expansion are the most widely used pretreatments

Furthermore, the exploration of bacteria from unusual environments, such as the rumen of cows and active soils, may help reveal yet unknown mechanisms or even organisms capable of efficient lignin degradation. While scientists focus on genetic manipulation of lignin content in plants, genetic modification of microbes may be a more feasible route to optimal lignin breakdown. The large number of enzymes believed to be dedicated to lignin degradation needs to be well understood, as it is possible that only a few of these enzymes are actually required. This could lead to simpler industrial processes and solve many of the problems of current fuel production and extraction, including environmental pollution and biomass disposal, as well as the economic challenges of cost and competition with food supplies.

Viktoria Dome is a second year ecological and environmental science student with a major interest in plant sciences

Illustration by Pei-Ling NG


regulars: technology

A wizarding wardrobe wonder Kirsty Paton investigates the recent technological advances in invisibility cloaking

In September, researchers at the University of California, Berkeley announced a new development in cloaking technology. The team of scientists, based at the Nanoscale Science and Engineering Center, has successfully developed an ultrathin cloak that can disguise three-dimensional objects of arbitrary shape. Particularly notable is that the cloak should, in theory, be easy to scale up. The idea of invisibility is old and seductive—who has not wished for the chance to be invisible at least once? There’s a reason Harry Potter always kept his cloak on hand. This is perhaps why, in spite of the difficulty of making a working invisibility cloak, there has been so much work in this area. Invisibility technology also has obvious military applications. The basic idea underlying the Berkeley cloak and its predecessors is relatively straightforward: to render an object invisible, light is bent around the cloaked object so that the waves of light seem to travel along a continuous, unbroken line,

Illustration by Paige Collins


thereby hiding the object. Such cloaks were first proposed in 2006, but implementation has proven difficult. Metamaterials have been key to the development of cloaking technology. These are man-made materials with a periodic, cellular structure, engineered to have properties not found in nature. In 2006, the first such cloak was built at Duke University, in collaboration with scientists from Imperial College London, and was able to hide a 25 mm copper cylinder from microwave radiation.

To render an object invisible, light is bent around the cloaked object so the waves of light seem to travel along a continuous, unbroken line

The structural features of a metamaterial that influence its interaction with light must be smaller than the wavelength of the light with which it interacts; for visible light, the structures must be nanoscale in size. Metamaterials are difficult and expensive to make, especially given the complicated physical parameters required to achieve cloaking over the whole of the visible spectrum. Attempting to reduce the complexity of the problem, many researchers have focused on ‘carpet cloaks’, which make three-dimensional objects appear flat and do not need to be made of a metamaterial. An early version of such a cloak that operated in the visible range of light was also developed at Berkeley. Carpet cloaks using other materials, or cloaks using naturally occurring crystals, tended to be bulky and difficult to scale up. The new Berkeley cloak, although also a carpet cloak, is constructed from a metasurface—an ultrathin metamaterial. This is an important difference, as developments in nanofabrication such as nano-imprinting make this cloak easier to manufacture and scale up. Also, this cloak does not cast a shadow and is

not detectable, unlike some of the earlier models. An important caveat to this light-bending cloaking technology is that no light enters the cloaked region, leaving the cloaked person blind to the outside world. A cloak developed in 2009 overcame this problem, but it had to be tailored precisely to the cloaked object, making it generally impracticable. There have been other approaches to achieving invisibility. A cloaking device built at the University of Rochester in 2014 used traditional lenses: rather than bending light around a cloaked region, the region is replaced with an image of the space behind the object. The advantage of this approach is that the components are readily available and the system can easily be scaled up. However, it is difficult to think of this as true invisibility, as part of the lens system remains visible. A coat developed in 2003 by Susumu Tachi and his team at the University of Tokyo was another attempt. The clothing worked by placing a camera behind the wearer, the video from which was then projected onto the partially reflective front surface of the garment. Technically, however, these garments made the wearer transparent rather than invisible.

The structural features of the metamaterial that influence its interaction with light must be smaller than the wavelength of light with which it interacts

The cloak developed at Berkeley only hides objects at a wavelength of 730 nm, just beyond the edge of the visible spectrum, and, as a carpet cloak, can only make objects appear flat. This is still unlike Harry Potter’s beloved cloak—you can’t just throw it on and disappear.

Kirsty Paton is a recent physics and philosophy graduate

regulars: arts

The computer as an artist Alyssa Brandt reflects on the implications of creative artificial intelligence

The past several years of popular science reporting on artificial intelligence (AI) warn us of an impending ‘robotic revolution’. Computers may replace blue-collar jobs and easily automated professions. Even creative jobs such as graphic design, writing, or painting are not exactly safe from elimination—or at the very least modification—in the face of advances in artificial intelligence. However, it is not always the fear of a shifting professional landscape that causes negative reactions to AI. There is anxiety associated with the idea that computers may be able to compete with humans in the creative realm. The question remains: can they? Ultimately, whether or not AI can be truly creative comes down to philosophical and operational definitions of creativity. Defining creativity is a challenge, and it gets more complex as the concept is expanded from human to artificial intelligence. Creativity is marked by the generation of new ideas; it does not necessarily require the creative process to be conscious, or driven by a sentient being. Cognitive scientists, researchers who study the mind, divide creativity into three main types. First, there is the new combination of old ideas, sometimes referred to as ‘improbabilistic’ creativity; at its core, this is random trial-and-error creativity. Second, there is the exploration of a conceptual space. Simply put, a conceptual space is the set of assigned parameters that define a concept. An example of conceptual-space exploration would be using purple or blue to depict an apple: bending the conceptual space does not change the defining properties of an apple (i.e. a fruit made of the appropriate materials), but changes its more flexible components, like color.
Lastly comes the transformation of such a conceptual space to allow for entirely new ideas, such as redefining what a building truly is: it no longer needs to be a rectangular box made of steel and concrete or wood, but can be in the shape of a guitar and made out of plastic. Humans have the capability to do all three. Because all three rely on defined conceptual spaces, artificial intelligence programs can be given starting definitions with which to work. Current creative AI seems to be in the rudimentary stages of the first two types (improbabilistic creativity and conceptual-space exploration) but cannot seem to break into transformational creativity.

Illustration by Alyssa Brandt

Contemporary AI-created art is often regarded as inauthentic because it is essentially randomly generated

Contemporary AI-created art is often regarded as inauthentic because it is essentially randomly generated. For example, AARON, a visual art program designed by Harold Cohen in the mid-1970s, produces ‘paintings’ that are sometimes abstract and sometimes more representative of real objects. But AARON follows the rules set by its programmer and is limited in what it can create, because it doesn’t know how to transform the concept of visual art into something different. AARON isn’t able to generate new styles or imagery on its own unless it is programmed to, meaning that everything it produces relies on a random recombination of ideas from AARON’s pool of knowledge (improbabilistic creativity, in other words). Interestingly, Margaret Boden, one of the foremost cognitive scientists studying creativity, points out that the process of randomly recombining ideas

until something works, a mechanism that many AI systems use, is not dissimilar to some parts of human creativity. She stresses that the success of creative AI depends on the definitions of creativity used. Human creativity has multiple complex layers, each requiring more computation than the last. Arguably, AARON is creative, just not as creative as some humans. Although current AI doesn’t quite reach the most complex type of creativity, it may not be long before advances like artificial neural networks allow AI to proceed to transformational creativity. When AI is creatively advanced enough, society will have to adapt. New technology inevitably changes jobs: there are machines that automatically assemble products humans used to make by hand, and such automation saw the unemployment of thousands, but society has since adapted. The beautiful thing is that creativity is more nuanced than product assembly. Even if AI can eventually produce a pleasing web layout independently, create an alluring painting, or write moving poetry, this does not subtract from the power humans have to create. We will still need artists, architects, novelists, poets, playwrights, and designers to be the mirror of human expression and culture.

Alyssa Brandt is a student in the MSc Integrative Neuroscience program

regulars: sciatribe

Avoiding the gender bias in science: narrowing the gap Alessandra Dillenburg explores the reasons behind the glaring lack of women in higher-level scientific positions

You may have heard of the internet outrage that resulted from Tim Hunt’s mishap earlier this year. In case you missed it, let’s quickly recap. “Let me tell you about my trouble with girls. Three things happen when they are in the lab: you fall in love with them, they fall in love with you, and when you criticize them they cry,” the Nobel Laureate said while speaking at the World Conference of Science Journalists. While the outcome (i.e. his swift resignation from his honorary position at University College London) may have been a little disproportionate for what was likely a misguided attempt at humor, the unfortunate event brought to light, once again, the plight of being a woman in the field of science, technology, engineering, and mathematics (STEM). Before we get too emotional and cry all over our lab books, let’s first take a look at the facts. The proportion of women among PhD graduates in STEM fields is on the rise—around 50% in most developed countries, though in the UK the figure is closer to 40%. However, when women start looking for paid positions in academia, this number declines drastically: in biology, about 26% of applicants are women, a figure which falls to a mere 12% in physics. The gap only widens as the women who do make it through try to rise through the ranks of male-dominated academia, with 33% of junior faculty and a mere 11% of senior faculty in European universities being women. This phenomenon is widely known as ‘the leaky pipeline’.

Illustration by Hari Conner

So where is the leak coming from? Is there something inherent about being a woman that makes you a lesser scientist than a man? According to both men and women, there is. This is due to something called (mostly subconscious) gender bias. In a study carried out by the Handelsman group at Yale University, both male and female senior scientists were asked to evaluate the CVs of fictional candidates John and Jennifer. Despite their CVs being identical, both men and women were more likely to hire John, showed greater enthusiasm about mentoring John, and would have paid him $4,000 more per year. Other studies exist with similar results: work done by women is rated as less impressive than work done by men, women are rated as less competent than men across various domains, and over 70% of people subconsciously view being a scientist as a man’s job. To state the obvious, research shows there is nothing in the biology of being male or female that would affect one’s ability to perform in science and mathematics. The discrepancy between girls’ and boys’ interest in science is related to the degree of gender equality in their social context. Our socially ingrained biases towards women in science and our

subconscious ideas of what constitutes a woman’s job versus a man’s job is exactly what needs to change. The first step to conquering gender bias is to acknowledge it—especially for women. It’s important to overcome your own bias and challenge your subconscious stereotypes by deliberately taking steps such as being proactive in seeking funding and mentorship, as well as creating ‘women in science’ discussion groups within your field or research center.

Despite their CVs being identical, both men and women were more likely to hire John

Making people aware that this bias exists, and challenging them to consciously suppress it, is something every individual in science can do; however, changes at the institutional level are also required. Institutions and funding bodies can create funding programmes especially for women, provide daycare facilities so women don’t feel they must choose between family and career, and impose quotas on hiring, among other initiatives. Conference organisers can take a page out of Canadian Prime Minister Justin Trudeau’s book and ensure a gender-balanced slate of speakers, just as he has ensured a gender-balanced cabinet of ministers. Those responsible for appointing scientific advisory boards or commissioning review publications can ask themselves, “Who are some suitable women for this role?” It’s time to make it clear that jokes about women crying and falling in love in the workplace are definitively not funny. It’s time to eradicate gender bias and show women and girls that they have an important place in science. Together, we can change the numbers.

Alessandra Dillenburg is a PhD student in the Miron lab at the Queen’s Medical Research Institute

regulars: interview

Robot sunflowers: where science meets art
Kerry Wolfe speaks to Dr. Dave Murray-Rust about his interactive exhibit Lichtsuchende

Lichtsuchende—a name meaning ‘light seekers’ in German—merges science, technology, and art in an interactive, international exhibition. Created by Dr. Dave Murray-Rust and Rocio von Jungenfeld, the project consists of a family of robots whose movements mimic the way sunflowers shift to face the sun in their quest for light. At the moment, nearly 50 little robots make up this photo-kinetic sculpture. Each individual entity both consumes and emits light, creating a snowball effect of illumination. Observers interact with the static creatures by carrying torches into the exhibit, which causes the machines to set off a chain reaction of light emission. The exhibit debuted at Edinburgh’s Hidden Door Festival in 2014, and has now spent the past five months in Germany. EUSci sat down with Dr. Murray-Rust to chat about the project.

Kerry Wolfe: Can you give some background info on Lichtsuchende?
Dave Murray-Rust: It started very much through playing, so putting together a few sensors and microprocessors and just starting to do something that could follow light a little bit. That evolved into creating little robotic creatures, and then we just wanted to see how far they could go and where they could end up. So we started making a society of them that could communicate with each other through exchanging light.

KW: What was it like working on this project in particular?
DMR: A lot of it was great fun. The thing that I think is going to stick in my head is that it was a change from trying to make one thing to trying to make a lot of things. And there were a lot of questions about fabrication, and how the technology can change massively, because suddenly everything has to be done 20, 30, or 40 times over, so it requires much more of an industrial design approach. It was more about learning ways of doing fabrication on that scale.

Dr. Murray-Rust demonstrates the robotic sunflower. Credit:

KW: Where did the idea for Lichtsuchende even come from?
DMR: It was one of those things that just appeared. The initial theme appeared, and then most of the stuff about how it developed came from a dialogue with the physical materials and by being responsive to what was happening. We tried not to come in with too much of a grand vision of how things should be. It was much more a sense of being responsive to the creatures that were emerging through our tinkering and trying to help them become the things they wanted to be.

KW: How has the response been?
DMR: Fantastic—much better than we’d imagined. One of the things we’re both really keen about is making art that you can really interact with and that you can play with. So the fun bit for me is seeing little kids gravitating towards and talking to these robots and getting really excited about what they do. And so that has been wonderful.

KW: Is this what you do with all of your projects, or was this one the first?
DMR: A lot of my projects go in different ways. I take different trajectories through the space of science and technology. My favourite ones start in one place and end in another. I’m really excited about where technology can illuminate things that we don’t usually get to experience or aren’t normally visible. And that’s kind of where the science comes in.

KW: Should thinking of art and science as more cohesive become more of a widespread thing to do?
DMR: Absolutely. And we are seeing there’s a lot of art/science stuff out there at the moment in all sorts of interesting places. There’s the creative technology side of things, where people are doing fun things that tie into critical design, and using design to ask questions about the wider picture in society. And things like bioart, where people do artistic projects around biohacking.

KW: Why do you think it’s important to blend art with technology and science in this way?
DMR: To me, they’re all different aspects of the same thing. It’s less about blending these things together and more about seeing them as different facets of the same kind of process. It’s easy to feel that art is all intuitive and magical stuff happens, and science is purely rational and goes according to the scientific method. But actually, neither of those is true. Art is very rigorous as a practice and science draws heavily on intuition. So really, it’s just different stories that you tell about the same kinds of activities.

Dr. Murray-Rust’s work can be followed on
Kerry Wolfe is an MSc student in environment, culture, and society

regulars: letters

Dr Hypothesis
EUSci’s resident brainiac answers your questions

My friend told me that non-stick coating on pans can be dangerous if you heat it up too much! Isn’t it meant to get hot?!
Pan-icked Pam

Dear Pan-icked Pam,
The non-stick coating on pans is most often polytetrafluoroethylene, better known to most cooks by its brand name Teflon. The compound is a fluoropolymer, made by stacking together hundreds of identical repeating chemical units to make something greater than the sum of its parts. In Teflon’s case, it’s a great insulator, has low friction, is non-stick, and is non-reactive. Since it is extremely chemically inert, Teflon also ticks all the boxes for a substance that might be going near your food. It doesn’t irritate skin, is safe to eat, and in liquid form causes minimal ill effects even when injected into mice, rabbits, and dogs. So is it the perfect candidate for a non-stick coating?

What your friend may have heard is that at temperatures above about 240° Celsius (or 513 Kelvin for you physicists in the back), Teflon can start to decompose to form various nasty chemicals. It turns out that this sort of thing can in fact kill rats, depending on how the compound is heated, the temperature reached, and how long the gases have been exposed to room air before inhalation. In one study, though, achieving a lethal concentration of Teflon fumes took four hours of heating. On top of that, ‘ageing’ the fume mixture by just one minute before the rats breathed it in reduced pathology fairly significantly. What we have to remember is that rats are very small and have very small lungs: generally, the amount of a toxin required to kill a small animal will be less than that which would even make a human dizzy. Rats are also particularly prone to respiratory infections in general, so even a carefully managed experimental population could have undetected, underlying respiratory issues that exacerbate any toxic effects.

In people, the thermal degradation products of Teflon are accused of causing respiratory problems and flu-like symptoms, albeit rarely. Similar compounds found in leather protectants can also occasionally off-gas and cause the same syndrome. Serious complications are uncommon. There is little research in humans into long-term exposure to small amounts of the gas from overheated Teflon. However, most studies agree that, practically, the dangerous compounds are unlikely to form at a high enough concentration to make humans sick—at least before you set something on fire anyway. You shouldn’t worry about your pet rats either. Unless they are close enough to the pan to be at risk of becoming a menu item themselves, any toxins are likely to dissipate sufficiently in the air.

Image created by Áine Kavanagh with some components licensed from Pixabay

In people, the thermal degradation products of Teflon are accused of causing respiratory problems and flu-like symptoms, albeit rarely

There is a particular group of apartment dwellers that might want to take heed of these warnings about overheated Teflon: pet birds. You can think of a bird as essentially a brightly decorated giant lung. Because of their incredibly sensitive respiratory systems, birds feel the effects of any toxins significantly quicker than their non-avian counterparts. This means even a low concentration of noxious or toxic fumes can prove deadly. You might have heard of miners using this trait before sensitive electronic equipment replaced an arguably cruel practice: the canary in a coal mine stops singing because it gets sick from low levels of carbon monoxide, warning miners in time to take precautions. For best practice, if you have any live birds in your home, then turf out your Teflon cookware tout de suite. However, if the closest you get to chirpy chums is your morning-person flatmate, then it’s fairly safe to continue with your non-stick pans. If you’re still worried, Good Housekeeping did an informal study on heating up pans with and without things in them. To follow their advice to minimise risk: don’t preheat empty pans for more than a couple of minutes, sear meats in a conventional pan, avoid scratching your non-stick coating, and keep your kitchen well ventilated. And just to hammer the point home: no birds!

Dr Hypothesis is also known as Liv Nathan, and when she isn’t answering your questions she studies Veterinary Medicine at the Royal (Dick) School of Veterinary Studies

regulars: reviews

The Power of Habit
Chrystalleni Vassiliou reviews the new book by Charles Duhigg

You wake up. You make your coffee as you like it, brush your teeth, make your bed, get dressed and head out. Do you think before doing any of that? Probably not. These are all habits that emerged in your life to make it easier. You don’t need to think how to make your coffee; that would take an unnecessary amount of time. Your brain stores information about what you do and how you react in different situations, and learns to repeat the same behaviours automatically to make your life easier. Unfortunately, it reacts in the same way for unhealthy behaviours, for example smoking.

Charles Duhigg uses examples drawn from a variety of behavioural and psychological research, situations, real-life stories and different points of view. He tries to explain how habits emerge and how they get stabilised, what neuronal circuits are involved in the process and how people are able to change theirs using simple tricks, but he doesn’t stop there. He expounds how big organisations and institutions create habits that allow their employees to work more efficiently, more effectively and more safely, how these habits extend to other aspects of the employees’ lives, and how companies take advantage of your habits. He goes even further, making readers understand that even societies have habits, approved and followed by members of the community, not always consciously.

This well-written book helps you understand why things work the way they do in a scientific yet simple way. The author manages to elaborate on a topic that everyone thinks about but no one analyses, and he does so in an entertaining, story-like way that is suitable for bedtime reading. At the end, you will understand that everything is a big habit loop, and the writer will then answer your final question: how much free will do we really have?

The Power of Habit: Why We Do What We Do and How to Change That by Charles Duhigg is available online and in bookshops.
Chrystalleni Vassiliou is a fourth year BSc Neuroscience student at the University of Edinburgh

student learning development

Helping students and staff succeed in their current roles and in their future careers, by providing University wide support for teaching, learning and researcher development. More information can be found at:

researcher skills development — research planning, communication skills, professional development, career management, business and enterprise, and more
continuing professional development and practice sharing in teaching, learning and supervision
support for curriculum, programme and assessment design and development


EUSci #18  

Issue 18 of Edinburgh University Science Magazine
