OXFORD’S SCIENCE MAGAZINE
2007 ~ 2017
Bang! THE PROGRESS ISSUE
Gastro-revolution The cutting edge technology that promises to transform how we eat
The Future of the LHC What’s next for the giant particle collider?
Resistance is Fatal Innovative strategies for the war against drug-resistant superbugs
HAVE YOU THOUGHT ABOUT...
A CAREER AS A PATENT ATTORNEY?
An intellectually challenging and rewarding career option
What Does It Involve? Training as a Patent Attorney is a career path that will enable you to combine your understanding of science with legal expertise. You will leave the lab environment yet remain at the cutting edge of science and technology, applying your knowledge and skill in a commercial context. You will help to protect intellectual property assets and grow businesses.
James Egleton MChem in Chemistry, University of Oxford (2011) DPhil in Organic Chemistry, University of Oxford (2015)
Sound Interesting? Patent and Trade Mark Attorneys in London, Oxford, Cambridge and Munich. We welcome applications from exceptional candidates at any time of the year. Eleanor Healey BA and MSci in Natural Sciences, University of Cambridge (2011) DPhil in Structural Biology, University of Oxford (2015)
What’s On in Oxford
Affirmative Action and its Opposite Reaction
My Other Car Drives Itself
The Automation Revolution
The Future of the LHC
CRISPR: Redesigning Life
Bang! Talks to Angela Saini
Next Generation Genetics
Eavesdropping on Ecosystems
Resistance is Fatal
Shockwaves in Spacetime
Staff List Editors-in-Chief Thomas Player Ray Williams Deputy Editors Ellen Pasternack (Print) Jacqueline Gill (Web & Blog) Claire Ramsay (News) Sub-editors Harri Ravenscroft Theodore Keeping Hannah Ralph Brianna Steiert Marianne Clemence Laura Steel Creative Director Pat Taylor Artists Eleanor Minney Tiffany Duneau Jacob Armstrong Gulnar Mimaroglu Business Team Harry Gosling (Manager) Jeff Lidgard Publicity Daniel A. Villar
OSPL Staff Chairman Louis Walker Managing Director Rebecca Iles Finance Director Katie Birnie Tech Director Utsav Popat Events Director Tess Hulton
World-leading science in Bristol – the cultural hub of the UK The Bristol Centre for Functional Nanomaterials carries out pioneering nanoscale research through interdisciplinary training and scientific exploration. Study PhD or MSc. bristol.ac.uk/bcfn //
Condensed Matter Physics Centre for Doctoral Training exposes students to the most exciting research topics in condensed matter physics, from experiment to theory. cdt-cmp.ac.uk //
Nuclear Science and Engineering MSc draws on nationally leading industrial research to teach students the science and engineering that underpins modern and future nuclear energy systems. southwestnuclearhub.ac.uk/teaching/msc //
Quantum Engineering Centre for Doctoral Training offers a unique training and development experience for those wishing to pursue a career in the emerging quantum technologies industry. bristol.ac.uk/quantum-engineering //
School of Physics explores physics at all scales from the cosmological to the sub-nuclear, including strong activities in nanoscience and condensed matter physics. Study PhD or MSc by research. bristol.ac.uk/physics/courses/postgraduate //
A very warm welcome to a very special issue of Bang! Science Magazine. When Bang! was founded ten years ago the aim was to let people ‘write about, read about, explore and be inspired by the fascinating and beautiful workings of the world around us’, and that is as true today as it was then. The world has changed so much in the past decade, and we’ve been here recording that progress. In this issue we take a look at just how far we’ve come, and get a glimpse of how far we might go. We’re on the cusp of true artificial intelligence, epigenetics is capturing people’s imaginations, and a breakthrough in gravitational waves has just won a Nobel Prize. All of this and much more is covered in this term’s magazine. Bang! has always been a team effort and isn’t possible without the hard work and creativity of so many committed individuals. We would like to thank them all for their effort and dedication, and hope that it inspires some of you to get involved next term. We want to keep on doing what we’ve always done—bringing you beautiful content from across the sciences. Next term, in the spirit of progress, we’ll be doing exactly that under a new name: The Oxford Scientist. Our name may be changing, but our core ethos isn’t. It’s what we’ve been saying for ten years: science isn’t pretty boring… it’s just pretty.
Ray Williams & Thomas Player Editors-in-Chief
News What do fruit fly body clocks, gravitational waves and cryo-electron microscopy all have in common?
October saw the announcement of the winners of the 2017 Nobel Prizes in Physics, Chemistry, and Physiology or Medicine. The Nobel Prize in Physics this year was split. Half was awarded to Rainer Weiss and the other half was shared between Barry C. Barish and Kip S. Thorne, for the contributions of all three to the discovery of gravitational waves (see page 28), predicted by Albert Einstein over a century ago. These scientists were the architects of the Laser Interferometer Gravitational-wave Observatory (LIGO), which in 2015 became the first instrument ever to record a gravitational wave when it detected those produced by a collision of two black holes a billion light years away. Since then at least four more black hole collisions have been detected. ‘An enormous amount of rich science is coming out of this,’ said Thorne. ‘For me, an amazing thing is that this has worked out just as I expected when we were starting out back in the 80s. It blows me away that it all came out as I expected.’ The Nobel Prize in Chemistry was awarded to Jacques Dubochet, Joachim Frank, and Richard Henderson for their work in developing cryo-electron microscopy, a new technique that determines the 3D structure
of complex molecules like proteins, RNA, and DNA. This technology has helped researchers to understand previously invisible processes in cells. Knowing the shape of proteins is crucial to understanding their function, and cryo-electron microscopy makes this a lot easier. Cryo-electron microscopy gets around some of the problems associated with older technologies. Regular electron microscopy dries out the sample, causing it to lose its shape, whilst X-ray crystallography only works for molecules that can be easily crystallised. However, cryo-electron microscopes are very expensive, so it is unlikely that individual labs will be able to afford them—but they may soon be a commonly commissioned service, similar to DNA sequencing. ‘I think the feeling is that the three of us who have been awarded the prize are sort of acting on behalf of the whole field,’ Henderson said. ‘It’s kind of a worldwide effort that’s just now come to fruition.’ Cryo-electron microscopy has already been used to visualize the Zika virus, revealing sites where antibodies can attach to and disable the virus, and opening up a new avenue for antiviral drug design. Last but not least, the Nobel Prize in Physiology or Medicine was awarded to Jeffrey C. Hall, Michael Rosbash, and Michael W. Young for their work on the molecular mechanisms that control circadian rhythms. Circadian rhythms are internal clocks which allow plants and animals to synchronize their activity, such as sleep-wake cycles, with the Earth’s rotation. The announcement that they had won came as a
MT17|Bang!|4 Images: Wikimedia Commons
surprise, as it was predicted that the Nobel would go to work on immunotherapy or CRISPR (see page 19), which are currently in vogue. The work of these three researchers extends back to 1984, when they isolated a gene in fruit flies, named “period”, that is involved in circadian rhythm. They later discovered that this gene encodes a protein, PER, that accumulates at night and degrades during the day, corresponding with an organism’s sleep-wake schedule. Since then, they have discovered other genes and proteins involved in the molecular mechanism. ‘We were hopeful that what we saw in the fly would pertain more widely. I don’t think we ever thought a beautiful mechanism would emerge,’ said Young. This Nobel Prize in particular highlights the divide between basic research and applied research. Basic research is for the sake of knowledge, while applied research focuses on immediate human benefits, such as developing medical treatments. The work of these scientists would be classified as basic, since its purpose was to understand the biological clock of the fruit fly. But as the Nobel committee noted, science of this nature often informs and contributes to an understanding of health and disease. This is exactly what their research has done, by informing, further down the line, our understanding of the relationship between disrupted circadian rhythms and obesity and heart disease. As Rosbash said, ‘It’s a great day for the fruit fly.’ Brianna Steiert
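The laureates’ mechanism is, at heart, a delayed negative feedback loop: PER protein, built from period mRNA, eventually shuts off its own gene, and the lag introduced by translation and nuclear entry is what keeps the cycle swinging rather than settling. A minimal sketch of such a loop (toy parameters chosen purely for illustration, not measured fly values) is:

```python
def simulate_clock(tau=8.0, dt=0.01, t_end=200.0):
    """Toy delayed negative feedback loop: mRNA production is
    repressed by the protein level from `tau` hours earlier,
    a stand-in for the slow steps between gene and repressor."""
    n_delay = int(tau / dt)       # ring buffer length for the delay
    history = [0.0] * n_delay     # protein levels, one per past step
    m = p = 0.0                   # mRNA and protein (arbitrary units)
    trace = []
    for i in range(int(t_end / dt)):
        p_past = history[i % n_delay]  # protein level tau hours ago
        # Hill-type repression of synthesis, first-order decay:
        dm = 1.0 / (1.0 + (p_past / 0.5) ** 4) - 0.3 * m
        dp = 0.5 * m - 0.3 * p         # translation and degradation
        m += dm * dt
        p += dp * dt
        history[i % n_delay] = p       # overwrite the slot just read
        trace.append(p)
    return trace
```

Run with a generous delay, the protein trace rises and falls in sustained cycles; shrink the delay towards zero and the same loop relaxes to a flat steady state, which is why the slow intermediate steps matter for keeping time.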
Cassini’s mission comes to an end After 13 long years orbiting Saturn, the Cassini spacecraft makes its final descent Just over a month has passed since the Cassini spacecraft ended its life in a dramatic plunge into the planet Saturn, aided by the gravity of Titan, Saturn’s largest moon. Despite being first observed over 350 years ago, little was known about Titan before this voyage. During the mission, Cassini flybys of Titan were able to image hydrocarbon lakes on its surface, which were chemically analysed by spectral measurements. This revealed that these lakes contain liquid methane, which drives Titan’s equivalent of our own familiar water cycle—filling lakes, condensing as clouds, and falling as rain onto the -179°C surface. Titan has an abundance of natural resources: its polar lakes alone contain hundreds of times more natural gas and other liquid hydrocarbons than all the known oil and natural gas reserves on Earth. Cassini has also found evidence that Titan boasts a huge sub-surface ocean made up of liquid water and ammonia. This exotic world is viewed by some as an analogue of a young Earth, being similar to the conditions on our planet over 4 billion years ago—around the time life first emerged. This has led to excited speculation about the possibility of living organisms in the lakes of Titan. There is still a lot to learn about this moon, and there is hope that future missions will be possible—in particular, NASA are supporting the design of a submarine to explore its intriguing oceans. Eliza Dickie
Deadly Zika mutation
Scientists have identified a single mutation that increased the virulence of the Zika virus
Researchers have discovered a new kind of optical trickery used by flowers to lure in bees
Zika virus was first isolated in the Zika forest of Uganda in 1947, but only in the past couple of years has it hit the headlines, with an epidemic sweeping across Central and South America. While this mosquito-borne virus usually has relatively mild symptoms, its ability to cause microcephaly in babies born to infected mothers is what led to the recent outbreak being declared a public health emergency. Whether this effect of Zika has come to light because of the larger scale and better reporting in the recent outbreak or because of an actual change in the virus is unknown. A new study might be bringing us closer to an answer. Researchers at the Xu and Qin labs in China compared an ancestral strain of the virus, isolated in 2010, to three strains from 2015–16. Not only did injection of the recent strains into mice cause 100% mortality, as opposed to the 17% mortality seen with the ancestral strain, but they also found that when these viral strains were injected into the brains of mouse embryos, brain damage was far more severe with the more recent strains. By creating seven versions of the ancestral strain, each with a single mutation from the newer strains, they found that one particular mutation led to increased virulence in mice and enhanced replicative capacity in human cells. However, this single mutation may not fully explain the severity of the recent outbreak of Zika. A host of factors, including the genetics of the human population, their previous exposure to related viruses, and the number of people infected, also likely play important roles. This study does, however, provide an insight into the workings of the virus and will hopefully aid in the development of better preventative and therapeutic interventions.
As immobile organisms, plants recruit the help of pollinators, such as bees, to bring the male and female gametes together for reproduction. Bees rely primarily on visual cues to discriminate between flowering plants, but are relatively insensitive to all colours except blue. Yet how often do you see the colour blue in nature? Blue pigments are difficult to synthesise, so researchers speculated that plants must use some other mechanism to attract blue-sensitive pollinators. Recent research may have revealed their secret. Scientists have discovered that unique ridges on the surface of petals scatter light when hit at certain angles to produce a “blue halo”, often invisible to the human eye. This idea was tested using synthetic flowers, some mimicking real-life petals and able to produce the blue halo, and others not. The more realistic petals were associated with a sweet reward, meaning bees let loose on them quickly learnt to respond to the blue halo and preferred these flowers over others. This confirmed that bees are in fact very sensitive to whether or not a flower has a blue halo. It was also observed that the bees moved a third more quickly between blue halo flowers, suggesting this improves the efficacy of their foraging. It seems that these ridges are found in a diverse range of species, rather than being confined to one closely related group. Whether this is an ancient trait or has evolved several times independently is not yet clear. Regardless, now that we know that the blue halo is a visual cue used by foraging bees, this opens up exciting new avenues for further research into bee populations and their role as pollinators.
Molly Weiland
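The petal ridges behave roughly like a diffraction grating, so the standard grating equation d·sin θ = mλ gives a feel for the geometry. The ridge spacing below is an assumed round number of the right order of magnitude for petal striations, not a value taken from the study:

```python
import math

def first_order_angle(wavelength_nm, spacing_nm=1500):
    """First-order (m = 1) diffraction angle, in degrees, for a
    grating with the given ridge spacing: sin(theta) = wavelength / d."""
    return math.degrees(math.asin(wavelength_nm / spacing_nm))

blue = first_order_angle(450)   # blue light, ~450 nm
red = first_order_angle(650)    # red light, ~650 nm
```

Shorter wavelengths leave the grating at shallower angles, so a bee viewing the petal away from the mirror-reflection direction sees blue enhanced relative to red. The real ridges are semi-disordered, which smears this effect into a diffuse halo rather than a sharp rainbow.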
Have you ever considered a career in IP? How about becoming a patent attorney? We are proud to be a leading Tier 1 IP firm. We only recruit the best graduates, which means that you are surrounded by other committed, hardworking, ambitious and highly capable colleagues. We act for global corporations such as Procter & Gamble, BP and Cisco, but there is also plenty of scope to work on behalf of smaller organisations, universities and start-ups. Progression is meritocratic, with a clear structure from Trainee to Associate to Partner.
We take pride in having a diverse working environment and are active members of IP Inclusive. We are keen to hear from ambitious, commercially minded individuals who would like to join us. If you want a career that offers a professional qualification, excellent prospects and a fresh challenge every day, consider joining the team at Mathys & Squire. For a confidential chat call Morwenna Scholes, our HR Director, on T: 020 3770 6127 or email firstname.lastname@example.org
Mathys & Squire LLP T +44 (0)20 7830 0000 // E email@example.com // @Mathys_Squire // www.mathys-squire.com London // Manchester // Cambridge // York // Paris // Munich // Luxembourg
What’s On in Oxford How to build a star: nuclear fusion on Earth Come along to hear about work on replicating conditions similar to those in the core of the Sun, and how this fits into the global roadmap to commercial fusion. Wednesday 15th November | 7pm | St Aldates Tavern
Talking climate in Texas: and living to tell the tale Evangelical Christian and award-winning climate scientist Katharine Hayhoe discusses the challenges of climate outreach and how we can overcome them. Wednesday 15th November | 7:30pm | University Church
Mind-altering drugs: a safer way forward? Dr Ben Sessa draws on his work with clinical addiction to consider how illegal drugs might be used in treating mental health conditions. Thursday 16th November | 6:30pm | Natural History Museum
Learning to Read: Biology to Culture Professor Kate Nation describes recent advances in our understanding of how we learn to read—whilst languages and words might be cultural creations, reading and writing are deep-rooted in our biology. Wednesday 22nd November | 6:30pm | Natural History Museum
The Element in the Room Steve Mould and Helen Arney discuss their fun new book, which aims to help you explore the science that is staring you right in the face. Thursday 23rd November | 1pm | Blackwell’s
How the quantum universe became classical Professor James Halliwell will discuss how the familiar classical physics of the world around us emerges from quantum mechanics. Thursday 29th November | 8:15pm | Clarendon Laboratory
Code[Laborate] A social coding event where you can code, meet people, discuss projects, or learn and get help. Open to all skill levels! Every Wednesday | 8pm | LSK B, Wadham College
Image: The British Library
Bang! Blogs This term our Bang! bloggers have been writing about their summer science internships, ranging from a micro-internship at a science start-up, to a summer working at the world’s largest museum, the Smithsonian Institution. On this page, two of our Bang! bloggers have summarised their blog series.
Mental Health Policy
Read more about Claire’s summer of science in Paris in her blog: Internship at École Normale Supérieure.
Read more about Jacqui’s internship at the Parliamentary Office of Science and Technology in her blog: Internship at POST.
This summer break I was able to spend two months doing science whilst living in Paris—the dream! I was looking to gain some laboratory experience in the field of epigenetics and a group at the École Normale Supérieure in Paris agreed to host me, allowing me the chance to gain some insight into their research. The lab I worked in focused on diatoms: photosynthetic algae ubiquitous in aquatic habitats, microscopic in size but giants in their ecological importance. I learnt how studying their genome architecture (that is, how the DNA is packed together within the nucleus) and how this changes under different environmental conditions can teach us a lot about how genomes function, and how organisms adapt to their surroundings by altering which genes are switched on. It was a great chance to try out some fancy-sounding techniques such as pyrosequencing, and also to experience research in what is still a very young field. Understanding how genes are regulated can be challenging to study and there’s so much we’re yet to figure out, but this just means that it’s really exciting! To find out more about epigenetics, turn to page 23. The fact that the lab is located in Paris added another dimension to my experience. It was amazing to experience living abroad for the first time and to put into practice my rusty high school French. That science is in many ways an international enterprise is one of the best things about it, and I’d urge undergraduates looking to gain some laboratory experience to take advantage of this as much as they can!
I recently spent three months interning at the Parliamentary Office of Science and Technology (POST), finding out about how science and policy interact in the UK. POST provides parliamentarians (MPs and Peers) with peer-reviewed, evidence-based information about key public policy issues related to science and technology. It aims to improve their understanding of these topics, and help them to make informed policy decisions. During my internship I produced a four-page parliamentary briefing document (called a “POSTnote”) about Young People’s Mental Health Services, to provide parliamentarians with a clear and unbiased summary of the research surrounding this important issue. I attended parliamentary meetings, and interviewed many mental health experts as part of my research. I found condensing the huge amount of information on a topic as broad and complex as children’s mental health into a clear and concise four-page POSTnote to be extremely challenging. My experience of working in Parliament was that it’s really hard to get anything done when it comes to changing policy! However, it was great to see so many people working in Parliament who were trying to make positive changes, with at least four different select committees focussing on mental health during the three months of my internship. I would definitely recommend an internship at POST for any scientist interested in getting some experience in science policy!
All Bang! blogs can be accessed at www.bangscience.org. If you are interested in blogging for Bang!, then please get in touch by emailing firstname.lastname@example.org.
Images: Wikimedia Commons (left), David P Howard (right)
Epigenetics in Paris
Affirmative Action and Its Opposite Reaction Megan Engel Illustration Eleanor Minney
‘Because of these ridiculous hiring quotas, another underqualified woman gets the position instead of me,’ someone fumes from the common room of the Rudolf Peierls Centre for Theoretical Physics. I am venturing down for a coffee when I overhear the unhappy researcher. I retreat to my office, opting to avoid conflict; after all, given that only 10 of 101 members of the department are women, he can be forgiven for assuming none were listening. Affirmative action, which seeks to rectify representational imbalances by giving special consideration to minorities—in this case, women in science—can be polarizing. The paucity of women in the scientific academy is indisputable. And the last 10 years have brought little progress: the share of UK professorships in the natural sciences held by women declined from 10.1% in 2007 to 9.0% in 2013. A 2012 study found that affirmative action improves the numbers of women in the laboratory without sacrificing efficiency, and a German government programme that preferentially funds professorships for women has increased the proportion of women professors in the country from 10% to almost 20%. Some argue, however, that affirmative action undermines academic meritocracy. Steve Balbus, head of Oxford astrophysics, feels such policies unhelpfully place ‘identity above ability,’ and James Binney, professor of theoretical physics, calls affirmative action a ‘pernicious industry’ that is ‘deeply and terribly wrong.’ Further, many believe—as evidenced by the infamous recent Google memo—that there is a biological basis for the imbalance. Balbus argues that ‘a statistical departure from strict demographic proportions is not necessarily…indicative of prejudicial behaviour.’ In other words, a lack of talent
and inclination may be responsible for the dearth of women scientists. However, there is overwhelming evidence that women’s capacity for scientific thought is equal to men’s. Furthermore, affirmative action strategies are not intended to unfairly bias the applications of less qualified individuals; they are meant to put all candidates on equal footing by accounting for hidden variables in the career of a female scientist, such as the sexual harassment and discrimination she has had to overcome. Even if, on paper, their qualifications are comparable, women who make it to the interview stage for a job are often more resilient, better scientists than their male competitors. A conversation with a fellow DPhil student illustrated this point starkly. Swapping undergraduate experiences, I detailed how multiple classmates sought romantic relationships with me and, despite firm refusals, wrote letters, called me repeatedly, and treated me scornfully in the classroom. One professor pointedly refused to answer questions asked by myself or the single other woman in class. He pulled me aside one day and said, ‘next time you feel like asking a question, close your mouth instead of wasting everyone’s time.’ We had to ask a male friend to pose questions for us. And there were no women physics professors at my university from whom I could derive the subconscious comfort that others like me had succeeded before. I hadn’t considered these issues in much depth previously. But my friend, appalled, said, ‘when I did my physics degree, all I had to worry about was understanding the physics.’ If anecdotal evidence from my own life leaves you unconvinced that women face a tougher climb, there is plenty of scientific work establishing this. Studies have shown that
it is more difficult for women to get scientific articles published; women require more publications than their male peers to advance in the scientific academy; parenthood disproportionately harms women’s academic careers; and women professors often earn less than their male counterparts. One 2012 study revealed a bias towards hiring men for a laboratory position when identical applications were submitted under male and female names. Affirmative action campaigns account for the additional obstacles that female candidates must surmount—obstacles that only a deep passion for science can overcome. Affirmative action is not about “identity trumping ability”: it’s about recognizing abilities that are not detected in traditional hiring processes. It aims to redistribute some of the help meted out to male colleagues at every step along their journey into science. Affirmative action is no panacea. Institutional barriers to women’s progression, like family-unfriendliness, must also be tackled. As part of MIT’s efforts to increase gender equality in the School of Science, a daycare centre was constructed, and the fraction of women professors subsequently increased from 8% to 19%. Going forward, a multi-pronged, evidence-based approach will be needed. A prerequisite for this to occur, however, is a shared understanding between women and their male colleagues that efforts to foster equality are not tantamount to injustice or favoritism. They’re a long-overdue acknowledgement of what CVs cannot communicate: men begin their scientific careers with a few extra weights on the scale.
Gastro-revolution The cutting edge technology that promises to transform how we eat Bramman Rajkumar Illustration Pat Taylor
A computational biologist, an organic chemist, and a head chef walk into a bar. It sounds like the start of a bad joke, but it’s actually a typical Friday night social for a number of new firms which are hoping to revolutionise the way we cook, eat, and even think about food. From the incredibly surreal recipes competing in the ‘International Contest of Note’ by Note Cooking to the large-scale commercial production of synthetic meat and
He claims that his methods will allow us to explore 10³⁰⁰⁰ new recipes dairy products, foodies and scientists are coming together to tackle issues of animal welfare, sustainability, and human health. Hervé This has long pioneered new culinary techniques, founding the Molecular Gastronomy movement in the 1980s, and by 1994 he had developed the more subversive “Note by Note” style. It started with a desire simply to improve food and drink; however, the concept quickly developed towards the idea of making entire dishes from chemical compounds. Parallels can be drawn with the birth of electronic music, when musicians started to make harmonies from pure sound waves. Chefs are instructed to use raw substances like water or ethanol and sucrose or amino acid solutions; by mixing these, they can control not just the taste, but the texture, odour, colour, shape and even nutritional value of a meal. Proponents of the technique argue that we’re already using “pure chemicals” like water, sucrose, gelatine, and sodium chloride—and that’s not to mention a smorgasbord of E numbers and additives. Such is This’ enthusiasm that he even hosts an annual competition, open to chefs of all standards—the youngest participant so far was just ten years old. For the most recent event, candidates were required to design and prepare a number of dishes using cellulose. Yes, cellulose, the main ingredient in plant cell walls and cotton T-shirts. For the very latest updates on Note by Note cooking, converts can even subscribe to an open access journal dedicated to the discourse. You and I might balk at some of the wackier ideas described here, but even prestigious schools such as Le Cordon Bleu and the Dublin Institute of Technology are incorporating the Note by Note method into some of their courses. It’s not difficult to argue that the idea has legs. Rather than entirely replacing current culinary methods, Hervé This wants Note by Note cooking to coexist with them. You can’t accuse the man of lacking ambition: he claims that his methods will allow us to explore 10³⁰⁰⁰ new recipes—to give this incomprehensibly large figure some context, he estimates that traditional cookery limits us to a “mere” 10³⁰. If humanity can overcome its tendency to favour the familiar, Note by Note cooking will have a profound effect on every aspect of the way we eat. Recipes might start to sound as if they’re lifted straight from a postapocalyptic science fiction novel—but to many, this is a small price to pay for huge gains in the efficiency of the food
supply chain. After all, concentrated chemicals can be transported far more easily than pre-packaged food, let alone frozen goods or indeed any sort of livestock. Hypothetically, we could tailor the nutritional values of a meal and remove allergens and toxins, all at a fraction of the energy cost. For now though, only a first glimmer of this future can be seen in Hervé This’ commercial counterparts in the US, perhaps helped by the multi-million pound investment they have drawn. The movement is mainstream enough that various companies have received funding from the Bill & Melinda Gates Foundation to investigate healthier and more sustainable food sources. Slightly easier to stomach than chemical snacks, some of these products are in fact already on the market. NuTek Food Science sell “Salt For Life”, in which sodium is swapped out for potassium, which is associated with reducing the risk of the high blood pressure and heart disease that excess sodium can cause. Another company, Beyond Meat—also backed by Gates—claim to have perfected the classic American
Recipes start to sound as if they’re lifted from a postapocalyptic sci-fi novel beef patty: the only catch is that their version is composed entirely of plant proteins and is totally vegetarian friendly. To question an old adage: if it looks, smells, tastes, and even bleeds like a burger, does that make it a burger? Founder and CEO Ethan Brown thinks so, and supermarket taste testers agreed. That’s why Beyond Meat products are found in the
meat section of major US chains like Whole Foods or Safeway, as opposed to lining up alongside meat substitutes like Quorn, which, for all their health and environmental credentials, simply cannot compete with the real thing in terms of taste or texture. Brown, perhaps unknowingly, subscribes to a watered-down version of the Note by Note philosophy. By considering meat in its constituent parts—’basically fat, protein, and water’—he envisions a future where animal agriculture is essentially obsolete. His vegan diet, rather than extinguishing his dream of the ideal burger, feeds his devotion. His vision has attracted a number of high-profile backers. Joining the Gates Foundation are the Twitter founders Biz Stone and Ev Williams, while a former McDonald’s CEO sits on the board. However, the real stars are the scientists: the in-house research team and various external advisors boast degrees and fellowships from a host of Ivy League colleges. They face the Herculean task of ensuring the products are indistinguishable from their animal meat counterparts, yet far superior in terms of nutritional content. Whereas Hervé This pursues cooking as an art form, Brown is driven in pursuit of a tantalising trifecta of goals: improving human health, protecting animal welfare, and saving
the environment. As you might expect, this food revolution has faced some opposition, even though US companies enjoy some of the most relaxed food regulations in the world. Impossible Foods, founded by Stanford biochemist Patrick Brown, only came under scrutiny after they voluntarily submitted their own research into their ingredients for review by the Food and Drug Administration (FDA). Having done this, they were then required to conduct further testing into one of their key ingredients, soy leghaemoglobin. This crucial “meat” flavouring component, despite having since been independently reviewed as safe, has not only drawn questions over the legitimacy of Impossible Foods’ testing, but has also caused many in the media to doubt the FDA’s willingness and ability to take action against them. For the sake of the planet, we have to hope these bumps in the road can be overcome. In its short time as a company, Impossible Foods claims to have saved as much fresh water as 50,000 average Americans drink in a year, by exchanging 22,000 kg of beef burgers for their own plant-derived version. And yet, that’s only a tiny 0.00005% bite of the market. It’s going to take an upheaval of our food habits for Ethan Brown, Patrick Brown, and their contemporaries to make a difference. If they do, we might just be one step closer to tackling the hugely destructive impact that agriculture and livestock have on the environment. MT17|Bang!|11
My Other Car Drives Itself Get ready for the driverless future, coming to a highway near you Ian Foo
It’s 2017, and we can’t help but hear about driverless cars. Some people are already lucky enough to own one of Tesla’s Autopilot-enabled vehicles, capable of autonomous lane changing, adaptive cruise control, and driving themselves on the highway. These features are just the beginning of a technological revolution that looks set to alter cities, the ways we get around them, and societies worldwide. Driverless cars use a battery of sensors to see what’s on the road. Cameras, radar, and LiDAR (which bounces
pulses of laser light off nearby entities) work together to create three-dimensional, 360-degree images of the car’s surroundings, distinguishing every object and its shape, speed, and even softness. These are complemented by machine vision algorithms, which track, bound, and label each object the car sees. Pattern recognition software is then used to refine the cars’ approach to driving. The computer is fed images and situations and makes initial guesses, iteratively modifying parts of its own code structure based
on the success of these guesses until it can identify even novel objects with high accuracy. It’s not a question of if, but when. Trials are taking place worldwide: in Singapore and Japan; Pittsburgh and San Francisco; and closer to home, in Milton Keynes and Coventry. The algorithms used work better the more data they have, and more testing time is sorely needed. Driverless cars also collect information at an astonishing one gigabyte per second, and companies like NVIDIA and Intel are devel-
oping superfast processors and data transfer networks to handle these volumes. Car manufacturers have promised to deliver fully autonomous vehicles anywhere between 2018 and 2021. Tech visionaries predict that widespread use of autonomous vehicles will bring about a dizzying number of benefits to society. Think Uber, but driverless; anyone will be able to summon an unmanned taxi from their phone. Besides providing enormous mobility, this reduces the need for private car ownership, which brings environmental benefits and frees up a huge chunk of disposable income for commuters. Taken further, the majority of cars will be owned by companies in taxi fleets, active 24/7 with no need for parking. The cars themselves will navigate more safely than any human can. They’ll communicate with each other in order to travel in tighter configurations, saving road space and reducing risk. The landscape of urban centres will change: parking spaces can be converted to other uses, kerbside parking will vanish, and roads will become narrower. All this points toward a more efficient, greener world which may be wholly unrecognisable in 20 years’ time. This utopia is attractive, but every new technology comes bundled with its own set of problems. On the implementation front, infrastructure needs to be set up to support the imaging capabilities of autonomous vehicles wherever they are used. Legal nuances also come into play: who is to blame for an accident, the manufacturer or the software developer? The complexity of the algorithms that power driverless cars’ decisions makes it difficult to discern why, and how, they make wrong decisions. Driverless cars are also dangerously susceptible to hacking. Development thus far has focused on engineering, but not on encryption. Systems security is a never-ending arms race, and protecting autonomous systems will be an indefinite and continuous effort. Additionally, it’s very possible that driverless taxis will be used to gather data on and sell ads to their passengers. But the biggest impact driverless cars will have on society is the resulting dearth of jobs. Taxi drivers and truckers will become obsolete. Car manufacturers will sell fewer vehicles. A swathe of workers will find themselves in a world where their skills are unnecessary, and yet another public utility is controlled by an oligarchy of tech companies. Much like the introduction of cars a century ago, autonomous vehicles are going to transform the way we live. The technology is sound and realisable, but that may not be the priority. As development ramps toward completion, work must be done concurrently to mitigate the social and economic consequences of what’s to come. But one thing is for sure: there’s no stopping the driverless revolution.
The Automation Revolution Automating key processes will be necessary if researchers are to keep up with the accelerating pace of scientific discovery Joseph Elliott Illustration Gulnar Mimaroglu
Automation is a powerful scientific tool and one that is often taken for granted. In almost any lab one will find machines executing precise and often repetitive tasks that generate huge swathes of data by streamlining the experimentation process. A striking example of the effects of automation is the ability to amplify DNA fragments automatically and efficiently, massively reducing the cost of genome sequencing. In 2006, the cost to produce a human genome sequence was around $14 million. With “next-generation” sequencing technology the cost has dropped to below $1500. Nowadays entire pharmacological production lines can be automated, and start-ups such as Transcriptic are building systems to automate nearly every physical task demanded of biomedical scientists. NASA even has plans to build a fully automated laboratory on its Deep Space Gateway space station, set to launch in 2020. There is no doubt that automation thus far has brought innumerable benefits to the scientific community, even simply considering its effect on cost. However, it has the scope to expand out of experimental science and threaten the human-driven fields of data interpretation and hypothesis generation. “Adam”, nicknamed the Robot Scientist, has successfully generated and experimentally tested its own hypotheses regarding the genomics of the yeast Saccharomyces cerevisiae. Automation need not be perceived as a threat but as a necessary tool; the ability of algorithms, such as brainSCANr, to analyse vast quantities of data may be necessary to keep up with the pace of scientific advances. In 2016 there were 1.2 million papers published in biomedical sciences alone. With such developments over the horizon, it would not be extreme to re-evaluate what a human scientist might be able to contribute to science that a machine or an algorithm could not. Although complete automation of scientific development is currently out of reach, it is clear that the way we do science is changing, and that automation will play a key role.
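The guess-and-correct training loop described in the driverless-car piece can be sketched, at toy scale, as a perceptron update rule. This is a deliberately simplified illustration with invented data; production self-driving systems use deep neural networks rather than anything this small:

```python
# Tiny sketch of guess-and-correct learning: a perceptron nudges its weights
# whenever its guess about a labelled example is wrong, improving iteratively.
# Data is invented: each example is ((feature1, feature2), label 0 or 1).
examples = [((1.0, 0.2), 1), ((0.9, 0.1), 1), ((0.1, 0.9), 0), ((0.2, 1.0), 0)]
w = [0.0, 0.0]  # weights, adjusted after every wrong guess
b = 0.0         # bias term

for _ in range(20):                      # repeated passes over the data
    for (x1, x2), label in examples:
        guess = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        error = label - guess            # zero if the guess was right
        w[0] += error * x1               # nudge toward the correct answer
        w[1] += error * x2
        b += error

correct = sum(
    (1 if w[0] * x1 + w[1] * x2 + b > 0 else 0) == label
    for (x1, x2), label in examples
)
print(f"{correct}/{len(examples)} training examples classified correctly")
```

On this toy, linearly separable data the loop settles on weights that classify all four examples correctly; the same update-on-error principle, scaled up enormously, underlies the pattern-recognition training described above.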
Bioethical Dilemma Tilly Ansell Illustration Thomas Player
Research ethics in the UK is founded on the Nuremberg Code, drawn up in response to horrifyingly unethical Nazi experimentation on human prisoners. The Nuremberg Code consists of ten points, including the importance of informed consent, the prevention of unnecessary suffering, and the requirement that the risk to participants be proportional to the humanitarian benefits the research could provide. However, this code has not totally prevented unethical research taking place. In the Tuskegee syphilis trials, four hundred disadvantaged black American men suffering from syphilis were actively prevented from accessing treatment until 1972, when a whistleblower told the press about the trial. Many of the men died, along with spouses who became infected and children born with syphilis as a result of the disease being left un-
treated. More recently, in 2014, Australia’s National Health and Medical Research Council was accused by Organisation Intersex International Australia of funding research which breached human rights. While ethical practices have improved dramatically in the last 50 years, it is clear that a greater degree of stringency is needed to prevent further unethical research being conducted. Research ethics standards will also need to keep pace with current medical science. For instance, the recent developments in CRISPR/Cas9 technology and the creation of three-parent babies through spindle nuclear transfer have reignited excitement and urgency within genetic research. Advances in karyotypic manipulation—the ability to alter the chromosomal composition of a cell—pose particular attrac-
tions to same-sex couples, as they could allow them to have children where the chromosomes come from both parents. While the potential applications of these technologies continue to increase, it is important that in the near future vulnerable groups are not exploited or used as a buffer for more experimental research. It is largely unavoidable that the ethical decisions made in countries reflect the particular political standing of the country, particularly if ethical bodies are privatised by government. In a time of political instability and the rise of openly homophobic, ableist and racist figures, scientists must continue to fight for inclusive, fair, humanitarian ethics in research.
Turned On Will pornography drive advances in technology? And is this a good thing? Thomas Pak Illustration Tiffany Duneau
Time and again throughout human history, new waves of technology have given rise to new forms of pornographic material. One of the oldest known prehistoric art objects is the Venus of Willendorf (circa 28,000 BCE), a figurine carved from limestone that represents a hypersexualised version of the female form with enlarged breasts, a clearly pronounced vulva, and a curvy derrière. Thirty millennia later, anyone with an internet connection can tap into an endless supply of high-definition pornographic videos in every category imaginable at a moment’s notice. Even though this relatively recent explosion of pornographic content is remarkable in many ways, internet pornography has essentially already become yesterday’s news. The pornography industry is quickly moving on to the next wave of consumer technology: virtual reality (VR). It may seem like a risky business: VR technology has gone through multiple cycles of hype and disappointment over the past decades and has still not become a mainstream technology. Pornography companies are marching ahead nevertheless, and the reason is obvious to anyone who may have tried VR pornography: the level of immersion goes above and beyond anything that conventional pornography can provide. Moreover, with the advent of VR cardboard goggles such as Google Cardboard, an entry-level VR experience has become available to anyone with a moderately recent smartphone. It is the hope of VR pornography companies like BaDoinkVR, who gave away 10,000 pairs of VR cardboard goggles in 2015, that an increased awareness of VR technology, paired with high-quality VR pornographic content, will drive widespread adoption of VR technology. Just as an insatiable demand for pornographic material was a crucial factor in popularising the internet, it could also play a key role in mainstreaming VR technology. These developments are not without criticism, however.
Although popular, pornography contributes to harmful attitudes towards women and difficulties in real life sexual relationships, especially when viewed by young people. It’s possible that by making pornography more immersive, the conditioning effect would be increased, strengthening psychological connections between unrealistic sex in which partners may be hurt and degraded, and reward for the viewer. Meanwhile, the ongoing transition from cold, inanimate sex dolls to highly realistic, interactive sex robots closely tracks current advances in artificial intelligence (AI) and robotics. The arrival of sex robots that move and
talk like actual people promises to fundamentally revolutionise the relationship between humans and technology. Whereas VR pornography provides an immersive, but ultimately passive experience, sex robots offer a physical, multisensory experience. An experience that is tailored to match the consumer’s exact tastes and preferences, is available at all times, and does not require the effort and complexity that goes into maintaining a relationship with a real human being. The technology required to build a lifelike sex robot will take many years of research and funding to perfect. Nevertheless, their potential to transform society is so immense that a debate is already raging on whether they should exist in the first place. Proponents argue that sex robots, or euphemistically “love robots”, can improve the lives of people who have difficulties connecting to others. Moreover, sex robots could be employed as alternatives to human prostitutes, potentially reducing the spread of STDs and lowering rates of human trafficking. A yet more controversial idea suggests that robots could serve as substitutes for people who harbour deviant sexual desires such as paedophilia to prevent them from acting out their impulses on real victims. However, it is not at all clear that sex robots would have these desired effects: examples from existing technology suggest that allowing people to act out violent fantasies on lifelike dolls may encourage, rather than discourage, similar behaviour towards humans. There’s a reason child pornography, for instance, is illegal in most parts of the world: it’s not just for the protection of the children in question, but also because we believe it to be wrong to indulge such fantasies and encourage them to develop. Critics say that sex robots would reinforce harmful attitudes towards women by literally reducing them to objects. 
A recent outing by the sex robot “Samantha” at the Ars Electronica Festival ended with her being shipped back to the workshop for repairs, due to the damage she suffered at the hands of festival attendees who gleefully molested her. This incident evokes a Westworld-like future, where people turn to robots to act out their violent rape and murder fantasies. As humanoid robots become more realistic, the line between real and fake is increasingly blurred, and human-robot interactions may spill over into the domain of human interactions. Whatever the future may bring, it is all but certain to change the position of sex in society. Whether we are prepared to deal with that change is less certain.
The Future of the LHC After the Higgs boson discovery, what’s next for the giant particle collider? Mark Pickering Infographic Pat Taylor
This year marks the fifth anniversary of the discovery of the Higgs boson particle, an event which brought to an end the long search for the origin of particle mass. Its discovery was one of the key aims of the Large Hadron Collider (LHC), the particle collider ring based at CERN inside which protons are accelerated to near-light speeds. The two beams of protons cross at the centre of four major particle detectors located around the ring, with two of these experiments, ATLAS and CMS, optimised for the detection of the Higgs boson. This success, however, does not mean the end of the LHC’s task. Seven years on from its first operation, tens of thousands of scientists from over 600 universities continue to meticulously collect and analyse the debris from the LHC’s proton collisions, presenting a remarkable example of
international scientific collaboration. They hope to shine a light on some of the gaps in our understanding of physics, and maybe even discover more fundamental particles. To understand the importance of the LHC mission, and why the discovery of the Higgs boson was so significant, we must first go back and look at the state of particle physics in the middle of the 20th century. Many unexpected new particles had been found, but there was no clear pattern to their properties or interactions. This was most evident in the 1936 discovery of the muon (a heavier cousin of the electron), with one frustrated physicist exclaiming ‘who ordered that?’ In many ways, the circumstances resembled the situation in which chemists found themselves a century prior, when many new elements were being discovered, before being organised into the periodic table.
However, during the 1960s and 70s, physicists homed in on a cohesive theory: the Standard Model. According to the theory, all matter is made up of a set of particles known as fermions, whilst the forces between them are mediated by particles called bosons. The fermions are further divided into the leptons, which come in the so-called “flavours” electron, muon and tau, and the quarks, with the flavours up, down, charm, strange, top and bottom. Quarks combine to form composite particles: for example, two up quarks and a down quark bind to form a proton, whilst two down quarks and an up quark come together to form a neutron. The protons and neutrons themselves combine with electrons to form atoms, and so make up the matter of our everyday existence. Around the turn of the century, all the particles predicted in the Stand-
ard Model had been observed, with the exception of a special breed of boson required for the introduction of mass to the theory, to which physicist Peter Higgs gave his name. The LHC’s ATLAS and CMS collaborations put into place the final piece of the Standard Model puzzle, the Higgs boson, on the 4th of July 2012—a date known amongst some physicists as Higgs-ipendence day—two years after the LHC’s first high-energy collisions. In addition to this fundamental particle, many composite particles and rare particle decays—predicted in the theory but previously out of reach—have also been seen by the LHC experiments. These include the observation of exotic tetraquarks and pentaquarks, discoveries which settle the debate over the existence of states of matter containing four or five quarks, as opposed to the known two- and three-quark states. The LHC experiments have also met with great success in confirming the measurements of previous experiments to a much higher degree of precision, further confirming the accuracy of the Standard Model. The story doesn’t end here though. We know that the Standard Model isn’t the final model of physics. It doesn’t include gravity and it requires the insertion of many seemingly arbitrary parameters into the theory. Additionally, it provides no explanation for the origin of dark matter or dark energy, which, at over 95% of the content of the universe, seems quite an omission. So what is next for the LHC’s mission? There are two ways we can look for signs of physics beyond the Standard Model. The first is through direct searches for new particles that do not feature in the theory. Many suggested extensions to the Standard Model require extra particles: supersymmetry, for instance, whose lightest hypothetical particle is a candidate for dark matter. Following Einstein’s famous E = mc² equation, which shows that energy can be converted into mass, the high energies of proton collisions in the LHC produce particles across a far larger mass range than was previously explorable. However, whilst there have been hints of new particles, these later turned out to be statistical fluctuations as more data was collected, and no fundamental particles beyond those predicted by the Standard Model have yet been observed. This doesn’t mean they aren’t out there though; they may be well hidden or beyond the reach of current accelerators. Either way, the LHC experiments will continue their search. New particles don’t have to be observed directly; we can also infer their existence through the tiny contributions they make to various properties as they pop in and out of existence during interactions with other particles. By measuring the interactions of known particles—now including the Higgs boson—to a high degree of precision, we can look for deviations from expected behaviour. Whilst the majority of tests of this nature have shown the Standard Model to be very precise indeed, and there are not yet any standout examples of LHC data contradicting the theory, some measurements have given enticing hints of a tension. More data will enable us to probe these results further. In its seven years of operation, the LHC has confirmed to us just how accurate the Standard Model of particle physics is, particularly in discovering its final fundamental particle, the Higgs boson. However, the most interesting task of the LHC lies in trying to poke holes in our knowledge of physics. The machine will run for at least another 15 years, and whilst no one knows what these years will bring, history tells us that nature often proves to be stranger than we could ever imagine.
CERN, the European Organization for Nuclear Research: a history in three accelerators
Super Proton Synchrotron (SPS) | 40 m underground | 7 km circumference | Max energy*: 450 GeV. In 1983, the W and Z bosons (responsible for the weak force) were discovered using the SPS.
Large Hadron Collider (LHC) | 100 m underground | 27 km circumference | Max energy: 13 TeV. The Higgs boson (which gives fundamental particles their mass) was discovered using the LHC in 2012.
Future Circular Collider (FCC) | 270 m underground | 100 km circumference | Max energy: 100 TeV. The FCC won’t be operational until at least 2035. Its goal is to further understanding of physics beyond the Standard Model.
* On units: 1 eV = 1.6×10⁻¹⁹ J. The prefixes G and T denote factors of 10⁹ and 10¹² respectively.
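To connect these energies to the E = mc² conversion mentioned above, here is a rough back-of-envelope sketch (the constants are rounded; this is an illustration, not an LHC calculation):

```python
# Back-of-envelope: convert the LHC's 13 TeV collision energy to mass via E = mc^2.
# Rounded constants: 1 eV = 1.602e-19 J; c = 3.0e8 m/s; proton mass = 1.67e-27 kg.
eV = 1.602e-19            # joules per electronvolt
c = 3.0e8                 # speed of light, m/s
E = 13e12 * eV            # 13 TeV expressed in joules
m = E / c**2              # equivalent mass in kg
protons = m / 1.67e-27    # how many proton rest masses that energy could make
print(f"{E:.2e} J -> {m:.2e} kg (~{protons:,.0f} proton masses)")
```

The answer, around 2×10⁻²³ kg, sounds tiny, but it is roughly fourteen thousand proton rest masses, which is why collisions at these energies can produce particles far heavier than the protons themselves.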
Robot Minds The development of artificial intelligence presents challenges, but could change the world for the better Adam Radford Illustration Tiffany Duneau
Few could fail to be impressed by the advances made in the development of artificial intelligence over the past decade. Since Apple released the first iPhone in 2007, it has become commonplace for smartphones to organise our diaries, tell us how to get places, and, after the launch of Siri in 2011, even hold conversations with us. In 2012 Google’s driverless cars navigated their way through traffic, and humans are now regularly defeated by computers in complex games. Science fiction is fast becoming reality. Little wonder, then, that optimism is currently high about the rewards that artificial intelligence might soon yield. The next decade will inevitably witness the continued proliferation of robots and artificial intelligence (AI) technology that increase both the efficiency and possibilities of human activity. Yet, at the same time, many scientists and AI researchers are currently working to create AI which can understand abstract, complex objectives, and then determine for itself strategies through which to meet them. Such is the pace at which both AI competence and computational power are increasing that in 2014 Stephen Hawking and others mooted that AI might even facilitate the eradication of war, disease, and poverty in coming decades. In the heady, early years of AI development researchers aimed to derive mathematical algorithms describing all aspects of human intelligence. It was hoped that a computerised machine would be able to use these algorithms to “learn” to do everything of which humans are capable.
Yet by the 1970s, with little to show from this “top-down” work, funding started to dry up. Interest was only renewed when business leaders saw potential in AI to perform very specific tasks more cost-effectively than humans. This led MIT professor Rodney Brooks to propose a new “bottom-up” route to general intelligence:
rather than trying to pin down a few “mega” algorithms, each enabling the performance of many functions, researchers could build up a computer’s competence incrementally with more and more separate algorithms, each specific to a particular task. Without the multifunctional intelligence to discern how to complete a new assignment, bottom-up AI is re-
liant on data detailing how humans accomplish the task in question. Serendipitously, just as this bottom-up approach was being pioneered, the internet was coming into its own, and has subsequently provided the vast amounts of data necessary to teach AI to perform many complex human functions. For instance, only with the internet has the comprehension and autonomous use of human language by machines finally been achieved. Virtual assistants both understand vocal commands and deliver their responses in complete, grammatical phrases. Only last month Google outstripped Skype with the release of its Pixel Buds, which allow users to translate conversations in real time while on the go. The mass exploitation of web data by tech corporations is, however, resulting in various ethical, legal and political quagmires. Is it right that our browsing history is used to determine what advertisements are shown to us online? And should corporations like Google and Skype be allowed to benefit from the work of human translators, freely available on the internet, without offering remuneration? Existing legal frameworks pertaining to privacy, ownership, and labour need to be updated and extended to the digital economy. Otherwise, there is a very real danger that the spread of data-hungry AI might fuel rising inequality, as tech-savvy elites cream off vast fortunes at the expense of the rest of society. There are also, currently, severe limits to the efficacy of bottom-up artificial intelligence. Small deficiencies
in data provision by programmers can lead to considerable mistakes, as when in 2015 a Google product labelled a photograph of two black people “gorillas”. Even equipped with a seemingly comprehensive dataset, bottom-up AI can still prove inadequate to serve its user’s interests —if, for example, required to function appropriately in a statistically atypical scenario. Simply reproducing the overall, most common response by humans to a given stimulus may not, dependent on context, always be the best course of action. Incapable of a detailed, accurate “understanding” of its own surroundings, bottom-up AI remains, for the moment at least, more a passive tool than a truly autonomous helpmate to humanity. Consequently, top-down approaches are once again receiving attention and recently have produced some impressive results. In 2015 one of Google DeepMind’s AI systems successfully defeated human competitors at various arcade games, without previously having been given
any data about how to play. The only inputs were the pixels from the game console and the score. Unencumbered by data about how humans typically perform tasks, top-down AI might benefit humanity with an astonishing ingenuity. Equally, though, it could flout any number of accepted and valued norms of human behaviour. Stuart Russell, for instance, postulates a situation where a domestic robot, charged to prepare dinner for its family, is accidentally left with no food in the fridge, and so resorts to cooking the pet cat. To avoid such disasters, Russell and others are calling for work on “value alignment”: instilling robots and AI systems with the subliminal values that we typically regard as common sense. However, distilling humanity’s varied ethics, norms, and shibboleths into one, universally accepted set of computer-friendly data will be no mean feat. Perhaps more intractable, though, are the likely ramifications of AI’s impact on human employment, as workers are inevitably replaced by robots
and algorithms that can perform the same tasks at a much reduced cost. Between 2011 and 2016, automation destroyed half a million jobs in Britain, almost 50% of all jobs lost in that period and considerably more than the 365,000 jobs that automation created. Schemes such as universal basic income might resolve the issue of how a largely non-working population gets by financially. However, far less easy to answer is the question of where humans will find direction, fulfilment, and self-worth when human labour is largely reduced to a needless leisure activity. With the appearance of legal software, computerised medical diagnosis and scientific research, and even AI-generated art, it’s not just the dull, back-breaking, low-paid jobs that are under threat. Might it be that, to ensure the true alignment of artificial intelligence with human values, we actually should temper our scientific curiosity, curb our commitment to efficiency, and resist the urge to plough on regardless in pursuit of ever more powerful, more widely competent AI?
CRISPR: Redesigning Life A versatile new tool for genetic editing with unprecedented ease
Every week there seems to be a new article about CRISPR, but what exactly is it and why is it causing such excitement? CRISPR was first discovered in 1987 by scientists in Japan working with E. coli, a well-studied bacterium. Inside the E. coli’s DNA, the scientists found five identical repeating sequences. They called these “Clustered Regularly Interspaced Short Palindromic Repeats”, or CRISPR. Researchers continued to find these regions in the DNA of more and more species of bacteria, and in 2005 some scientists realised that in between the CRISPR regions were short sequences of viral DNA. These are a part of the bacterial immune system, used to defend the bacterium against viral infections. When the bacterial cell comes into contact with the virus, it can recognise the viral DNA that it has stored in between its CRISPR regions, and cut that DNA inside the virus using proteins known as ‘CRISPR associated’ (Cas) proteins. By replacing the viral DNA inside the CRISPR regions with a sequence of their choice, researchers have developed CRISPR/Cas9 as a tool to cut DNA wherever they want (Cas9 is a particular Cas protein which can cut almost any CRISPR-associated DNA sequence). For example, researchers at Emory University have used this tool to replace the faulty gene that causes Huntington’s disease in mice, effectively curing the symptoms of the genetic disease. CRISPR/Cas9 could be used for everything from creating human embryos that are free from genetic disease to editing immune cells so they target cancers. However, with this newfound power comes a mountain of ethical responsibility. For example, changes made to embryonic DNA can be passed down to future generations, who have not given consent to being used in these experiments. Additionally, we do not fully understand the potential global consequences of altering DNA. Editing the mosquito genome to prevent carriage of the malaria parasite would save millions of lives, but could also lead to an increased number of mosquitos, causing repercussions throughout the entire food chain, or maybe even making way for a new, deadlier parasite. CRISPR/Cas9 is already being used in revolutionary science, but we still need to think carefully about how we use it, and what effect it may have on our future.
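The guide-sequence idea at the heart of Cas9 targeting can be caricatured as simple string matching: a stored sequence is located in a longer DNA string, and the strand is cut at the match. The sketch below is a toy illustration only; the sequences are invented, and real Cas9 targeting also requires an adjacent PAM motif, which is omitted here:

```python
# Toy illustration of CRISPR-style targeting: find a stored guide sequence
# in a DNA string and "cut" the strand at the match site. Sequences are
# invented; the biochemistry (PAM recognition, exact cut position) is omitted.
def cut_at_guide(dna: str, guide: str):
    """Return the two fragments produced by cutting dna at the guide match."""
    site = dna.find(guide)
    if site == -1:
        return None  # no match: the stored "memory" does not recognise this DNA
    cut = site + len(guide)  # cut just after the matched sequence
    return dna[:cut], dna[cut:]

dna = "TTACGGATCCGGAAGCTTACGTAGCAT"
left, right = cut_at_guide(dna, "GGATCC")
print(left, right)  # the two fragments rejoin to the original strand
```

Swapping in a guide of the researcher's choice changes where the cut lands, which is, in essence, what makes CRISPR/Cas9 programmable.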
Bang! Talks to Angela Saini
Angela Saini is a science journalist and author. Her first book, “Geek Nation”, which was selected as an Independent book of the year, explored India’s rich scientific history and its future as an emerging scientific superpower. Here, Ellen Pasternack asks about her most recent book, “Inferior: How Science Got Women Wrong, and the New Research That’s Rewriting the Story”, published in 2017.
Explanations like the one you mention are sometimes called “just-so stories”: they might make intuitive sense, and they’re a neat explanation for what we see around us, but there’s no or very little evidence that they’re the correct explanation. It’s easy to string together explanations that sound plausible, even if the supporting evidence is pretty thin. In certain areas of science there are a lot of speculative conclusions but not much data. When data comes along that confirms our preconceptions, we latch onto it. It’s a very human thing to do.
Image: Rainer Niermann
What inspired you to write Inferior? When I was a journalist for the BBC, I was asked to research a piece about different evolutionary explanations for the menopause. One paper I read suggested that older women had evolved to stop ovulating after a certain age because men no longer found them attractive. Being an engineering graduate, this area was totally new and fascinating to me, but this argument struck me as pretty ridiculous. To be honest, I was surprised it was published at all. In my research, I found it interesting that there were so many different explanations for the same thing. These explanations often aren’t compatible with one another: they can’t all be right. Often, these divisions are strongly politicised and cut across gender lines. I decided to write a book on the disagreements surrounding women in science.
How much do you think inaccuracies in science’s views on women have to do with the historical exclusion of women from science? I think the exclusion of women from academic conversations has had a significant impact on our body of accepted knowledge today. We all have biases, but if there is only one set of biases shaping scientific research then your perspective is much more limited.
When data confirms our preconceptions, we latch onto it
It’s easy to think that because a scientist suggested this explanation, it must be objective; however, we all bring existing biases with us into research. Exactly. I draw attention to Darwin’s overt sexism* not to demonise him, but rather to demonstrate that although he is the single most recognised figure in evolutionary biology, he too made mistakes and held prejudices. I believe that science should be humbler, and acknowledge that mistakes have been made and can still be made today.
It’s easy to be critical of research when you don’t like the political implications, but perhaps harder for things that you agree with. Do you ever wonder whether you might be presenting just-so stories yourself? I’ve tried very hard not to—I definitely don’t want to
*Examples quoted in “Inferior”: Darwin wrote in “The Descent of Man” that ‘man [attains] a higher eminence, in whatever he takes up, than women can attain’ and in a private letter in 1881 that ‘there seems to me a great difficulty from the laws of inheritance in [women] becoming the intellectual equals of men’.
present just one side of the story. I’m a feminist, but I’m not going to gloss over research that doesn’t support my personal views. The science of sex differences is a topic people are really invested in, and I think it’s particularly important to be as rigorous and objective as possible. What I find galling is scientists who hold strong opinions and don’t listen to research that contradicts them. As a journalist, I’m trying my best to maintain balance, but really their standards should be even higher.
In your book I sometimes get the impression that you’re playing devil’s advocate: challenging flawed reasoning by making provocative arguments in the other direction. Would you agree? Actually, I wasn’t trying to do that at all—I very much wanted to approach this subject in an objective way. For instance, I wouldn’t go as far as denying that there are psychological differences between men and women at all. We know that gendered differences in play behaviour emerge from a young age; that’s well established. The problem lies in overinterpretation of these observations. There are many potential explanations for behavioural differences between boys and girls; we shouldn’t leap to the conclusion that they reflect innate, evolved differences between the sexes.
Inferior covers science from a wide range of disciplines: anthropology, medicine, evolutionary theory, psychology, neuroscience. Did you find that a challenge, especially with a background in engineering rather than biological science? Yes, it was tough moving from one discipline to another. But in some ways it was helpful to come to biology as an outsider, because I could see it in a broader context and with a fresh perspective.
What was it like talking to the scientists whose views you go on to discuss in your writing? It was really enlightening. 
Published research can only tell you one side of the scientific story—it’s vital to see research in its context, so to meet the men and women behind it was fascinating. Some of these people carried
out some of the most important, widely recognised work in evolutionary biology, so it was extremely interesting to find out what they were like in person. I was especially interested to learn about what it was like to work in science as a woman a generation ago. Inferior challenges some ideas which are almost taken as axiomatic, both within the scientific community and among the general public. Have you been surprised by any of the conclusions of your research? Yes! Like everyone else, I have been exposed to gender stereotypes from a young age and hold a lot of prejudices. One example of something that surprised me: it’s considered a fact by many people that women are in general better at language and worse at maths. I was really interested to learn that actually, systematic evidence doesn’t support this idea.
I wrote this book because I think feminism is missing out by not engaging with science How has the reception to your book been? It’s been amazing! I was surprised because I was expecting resistance, but I think my effort to write in a balanced way paid off. Both the right-wing and left-wing press have received it positively—even the Mail! Who would you like to read your book? What message do you hope readers take from it? Well, everyone, of course—but I wrote it with female readers of all ages particularly in mind, and perhaps younger women most of all. I wrote this book because I think that feminism is missing out by not engaging with science. I want women to take heart, as I did, from the fact that science is on our side.
About the book
In Inferior: How Science Got Women Wrong, and the New Research That’s Rewriting the Story, Angela Saini reviews centuries of scientific thought on the female sex from a modern-day feminist perspective, with a healthy open-minded scepticism and a journalist’s sharp eye for detail. Accessible and wide-ranging, this book is sure to provoke discussion among scientists and non-scientists alike. For your chance to win a free copy, turn to our crossword competition on page 29.
Not all healthcare communications agencies are the same AMICULUM was designed to be different. In 2001 we created a plan to build an independent global healthcare communications, consulting and learning business from scratch. We now have a worldwide team of 195 healthcare and communications professionals and work with most major pharmaceutical companies in some of the most complex and exciting areas of medicine. If you would like to learn more about AMICULUM’s agencies worldwide, our business philosophy and how we continue to create opportunities for talented individuals, please contact Richard Allcorn (email@example.com) to arrange an informal and confidential conversation. altogetherdifferent.biz
Next Generation Genetics The last decade has seen us come closer to a complete understanding of how genes work Laura Steel
Epigenetics is a term that was scarcely heard of a few decades ago, but has recently made its way onto the school curriculum and is splashed across the media, claimed as the cure for cancer amongst other things. Is there any truth to this? Although some in the scientific community remain unconvinced, there is quite rightly a genuine interest in the potentially massive implications, which range from a better understanding of evolution, to uses in the treatment of disease. So what is epigenetics? It translates from Greek to “on top of genetics”, and refers to chemical modifications that change how genes are used, without altering the DNA sequence itself. Have you ever wondered why our liver cells couldn’t be more different to our neurons, or our muscle cells more different to those of our kidneys, despite every cell containing the same DNA? The reason for this is the presence of chemical compounds which attach themselves to DNA and control which genes are switched on. These are known as epigenetic marks; the most common being DNA methylation. This is where chemical compounds called methyl groups bind to the DNA molecule and block it from being “read” by the cell’s machinery, resulting in the gene being used less or not at all in that cell. This allows two genetically identical cells or organisms to appear very different. Epigenetic marks determine how the DNA functions, and in this respect one could draw a parallel to an orchestra. The DNA takes the role of the instruments, and the epigenetic marks are the musicians who control pitch, tempo and volume—producing varying melodies by using the same instruments in different ways. The combination of epigenetic marks present in a person is significantly determined by environmental conditions, thus linking the two aspects of the age-old debate “nature versus nurture”, and showing that both genetics and the environment influence our development. 
It has been shown that classical Darwinian theory may only partly explain the evolutionary process. Whilst species do evolve adaptations via natural selection, the rate at which DNA sequences mutate may be too slow to explain the rate of phenotypic divergence in some cases. For example, changes in epigenetic marks have been shown to correlate more accurately with the phylogeny of Darwin’s finches than DNA mutations themselves. Epigenetics may also provide a mechanism for the previously scorned Lamarckian hypothesis—that characteristics acquired during an individual’s life could be passed down to the next generation. The integration of Lamarck’s hypothesis with
Darwinism into a Unified Theory of Evolution may provide a more complete explanation of how adaptations arise. Epigenetics’ therapeutic potential is another exciting prospect. One useful feature of epigenetic marks is that, unlike genetic mutations, the marks are reversible. The epigenetic mechanisms themselves can also provide inspiration for treatments. An example of this is in the case of Down’s Syndrome, which is caused by a third copy of chromosome 21, and affects 1 in every 1000 babies born in the UK. This results in serious problems including risks of early onset Alzheimer’s, heart disorders, and vision problems, as well as moderate learning disability. But perhaps an epigenetic mechanism used in almost every female mammal could be used to help treat this condition. Women possess two X chromosomes; however, only one of these is needed—after all, men survive just fine with a single copy. Therefore, early in development in each somatic cell, a gene on the X chromosome is busy covering one copy in a large amount of material known as non-coding RNA, which causes the chromosome to coil up tightly. This “silences” the genes, as they are wrapped so tightly that the cell’s molecular machinery can no longer read their sequence. So, what if this mechanism could be utilised to shut down the superfluous copy in trisomy 21? Since 2013, this is just what researchers have been attempting to do, although it will take many years to work out how this mechanism can be made into a treatment. An increased understanding of mental illness in the past decade has rocketed it to the forefront of public awareness. Ongoing research has shown epigenetics to be involved, although its role and significance is still to be fully uncovered. For example, the enzyme responsible for DNA methylation has been shown to be overexpressed in the neurons of people with schizophrenia and bipolar disorder, resulting in disrupted neural transmission. 
Epigenetics is also providing hope for the problem of addiction, with alcohol and cocaine dependence being linked to epigenetic modification of certain genes in the brain. In one exciting investigation, a DNA methylation inhibitor was given to cocaine-addicted rats, which then stopped seeking out the substance. Although it will inevitably be a while before humans can pop into the doctor’s surgery and receive a similar treatment, there is cause for optimism. If this approach could be extended to other genes, it could help tackle one of society’s most pressing problems.
Eavesdropping on Ecosystems A new way of understanding the natural world: listen to it Annika Schlemm Illustration Jacob Armstrong
Take a step outside and, just for a moment, close your eyes and listen. You may hear the melodic chatter of birds, the thrum of insects, and the dry croak of a frog. The growing field of soundscape ecology investigates nature’s sound signatures, and the arrangement of vocalising organisms in their environment. This eavesdropping tool is allowing scientists and conservationists to explore the composition and the health of an ecosystem. As with shelter, food, and mates, sound is a limited and partitioned
resource for which organisms must compete. These interactions between animals have driven powerful evolutionary processes, which have resulted in organisms occupying their own space within the frequency spectrum. This allows them to vocalise clearly above the cacophony of sounds in an ecosystem, in order to communicate, court mates, defend territory, and warn when danger is approaching. This concept is encompassed in the acoustic niche hypothesis, which likens the acoustic niche of a species to a specific radio bandwidth that enables clear sound transmission.
Sound is a limited resource for which organisms must compete
Traditionally, sound recordings have been used to monitor individual bird species, which caw and cackle at a certain frequency within the sound spectrum. This reductionist approach has more recently been expanded into an all-encompassing view of the ecosystem and its vocalising organisms, in which the full acoustic environment is recorded and analysed. This provides a tool to monitor an ecosystem’s health, and therefore track the impact of environmental changes. Whilst satellite images can lack detail, and field surveys are time consuming and labour intensive, soundscape ecology can be a low-cost and non-invasive alternative. The boundless richness of tropical forests provides a key example where bioacoustic monitoring plays an integral role in understanding biodiversity and conservation. Despite occupying less than 2% of the earth’s land masses, tropical forests house an estimated 50% of all life. An understanding of bioacoustics allows for the rapid measurement of hundreds of species, where measures such as soundscape saturation provide an insight into the species which dwell within the habitat. Scientists can then use this data to identify the most diverse parts of the forest and the effectiveness of local conservation strategies. This provides vital information on the most at-risk areas, and how to optimise conservation efforts, so that we can balance the needs of the ecosystem with human use of the land. Zuzana Burivalova, a tropical forest ecologist from Princeton University, has ventured into the depths of the jungle in Borneo, to record the orchestral arrangements of the vocalising creatures. Even if the organisms within an ecosystem are swooping through the forest canopy or rummaging amongst the leaf litter, the sound recorders provide a 360-degree view, to about 50 m in every direction. This allows access into even the most obscure corners, or
towering high habitats, which would usually be hard to access. The resulting soundscape is awash with noise from all forms of life, no matter how small or hidden. As the use of sound provides a measure of the overall state of the forest, Zuzana was able to monitor the impacts of different land uses and conservation efforts on biodiversity. Changes in the soundscape after forest disturbance indicated that even relatively benign uses of the forest, such as selective logging, resulted in a more homogeneous forest with lower species richness. The recorders enable long-term monitoring, with minimal labour and a small price tag. This means large databases of sound recordings spanning many years can be built up, and used to monitor long-term changes in an ecosystem. Stepping from the terrestrial habitat and diving into the oceans, a similar scape of distinctive diversity can be found in coral reefs. These habitats house some of the greatest biodiversity on the planet, with one quarter to a third of all marine life residing in these underwater jungles. In this submerged world, animals use sound to entice mates, safeguard their homes, and coordinate spawning events between males and females. As with life on land, the complexity and saturation of sound production can be used as a signal for the variability and diversity of life within the reef, meaning scientists can also keep an ear on the coral reef, to identify changes in
Scientists can also keep an ear on the coral reef to monitor its diversity
animal abundance and distribution. They can also monitor impacts of human activity, including boat traffic, chemical spills, or fishery activities, as well as the effectiveness of conservation strategies. Low-cost and non-invasive sound monitoring plays an important role in tracking underwater ecosystems as coral reefs face a warming climate and increasingly acidic oceans. These stresses may result in coral bleaching, in which the symbiotic algae that provide food for the coral evacuate their habitat, triggering events that can cascade into biodiversity reduction in coral reefs. If the biodiversity were to decrease, the frequencies occupied would correspondingly plummet, resulting in an incoherent and patchy soundscape. Underwater noise
is thought to play a role in habitat selection for juvenile reef-dwelling animals, thus sound is essential in replenishing and maintaining healthy coral reefs. Therefore, disruptions to this process could result in a vicious cycle if the sounds produced by a coral reef become less variable and saturated, and hence less attractive to these organisms. The good news is that once we understand the underwater sound fingerprints, the role they play, and what different saturations and complexities of frequencies convey to marine organisms, it may be possible to broadcast these sounds and attract the larvae that are vital to replenish and restore the bleached coral reef. Acoustic monitoring is becoming an increasingly important tool in prioritising habitats to receive conservation efforts, balancing human needs with conservation, and preserving the biodiversity and ecosystem services that shape Earth. With increasing anthropogenic pressures, understanding the impact of our actions and how these can be minimised is fundamental to our sustainable existence on this exceptional planet. In the words of musician Bernie Krause, a picture is worth a thousand words, but with the unparalleled insights that acoustic ecology is providing, perhaps a soundscape is worth one thousand pictures.
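One crude way to see what a metric like soundscape saturation measures is to count how many frequency bins of a recording hold appreciable energy. The sketch below uses synthetic tones, an arbitrary threshold, and a naive DFT; it is an illustration of the idea, not any published field method:

```python
import cmath
import math

def soundscape_saturation(signal, threshold):
    """Fraction of positive-frequency DFT bins whose magnitude exceeds
    `threshold`; each loud bin stands in for one occupied acoustic niche."""
    n = len(signal)
    occupied = 0
    for k in range(n // 2):   # positive frequencies only
        coeff = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        if abs(coeff) / n > threshold:
            occupied += 1
    return occupied / (n // 2)

# Two synthetic "species" calling at 500 Hz and 2000 Hz, sampled at 8 kHz
rate, n = 8000, 256
recording = [math.sin(2 * math.pi * 500 * t / rate) +
             math.sin(2 * math.pi * 2000 * t / rate) for t in range(n)]
print(soundscape_saturation(recording, threshold=0.1))  # → 0.015625 (2 of 128 bins)
```

Add a third caller at a new frequency and the saturation rises; silence a species and it falls, which is exactly the signal ecologists read as a change in the community.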
Whether you are graduating in 2018 or later, you’ll soon think about life after university. You may continue your studies, or go travelling, but many will look to start their career in industry. At ecm, we partner with a variety of technology based companies who are looking for people to join them. We match technically-able people with technical based roles.
You’ll have (or be aiming for) a 1st / 2.1 class degree in computer science, physics, maths, engineering or similar. Interested in a career in software development, engineering, technical consultancy, quantitative finance, machine learning, experimental physics, AI, data science, product innovation, probability or another highly numerate area. Looking for an interesting role with a start-up, SME or specialised technical team rather than joining a graduate scheme.
Our service to you is free – no sign-up fees or costs at all. Send us your CV (rest assured your details will not be sent to any company without your prior permission) and we can let you know what roles we have that could be suitable. Too early? Get in touch and see what the job market is like and stay ahead of the curve. CV@ecmselection.co.uk
Resistance is Fatal Innovative strategies for the war against drug-resistant superbugs Marianne Clemence Illustrations Eleanor Minney
Virtually every single one of us will have used antibiotics at some point in our lifetime, either to treat an existing infection, or before a surgical procedure. They are arguably the greatest development of 20th century medicine, yet frighteningly, scientists are prophesying the end of the antibiotic era in the not too distant future. You’ve probably seen the headlines in the news: untreatable gonorrhoea, MRSA superbugs, extensively drug-resistant TB. It doesn’t make for light reading. Antibiotic resistance is not a new problem. In fact, it first emerged in the 1940s, the same decade that mass production of penicillin began. Unfortunately, it is an almost inevitable consequence of Darwinian evolution. The bacteria living inside us randomly generate mutations all the time, and some of these mutations confer antibiotic resistance. When a patient is treated with antibiotics, these mutations provide a selective advantage; resistant bugs can continue multiplying, while their susceptible counterparts are killed. This is bad enough for the patient, but is also very bad news for public health, as these so-called superbugs can then infect other people and spread around the community. Careful management of antibiotics can slow the spread of resistance, but our track record of antibiotic stewardship to date is poor. Bacteria are now developing resistance to drugs faster than we can develop new ones. To keep untreatable infections at bay over the coming years, we urgently need to get smarter about how we use the antibiotics we already have, and accelerate the development of new therapies. In the race to develop new antibiotics, microorganisms are a good place to look. Many of the antibiotics we already have were developed from secondary metabolites of bacteria and fungi. These secondary metabolites are substances that aren’t essential for normal growth of the organism, but may confer some other environmental advantage. 
A classic example is of course penicillin, which is a secondary metabolite produced by the mould Penicillium. Whilst Penicillium uses penicillin to compete with other microorganisms, scientists Florey, Chain, and Heatley famously developed it into the first antibiotic. But finding products with
antibiotic potential from the thousands of genes in millions of species is no easy feat. Scientists need to develop innovative approaches to try and find new antibiotic candidates. Bioinformatic approaches, made possible by next generation sequencing, allow researchers to mine genome sequences for genes that might be of interest. Then, it is a case of resourcefulness and persistence. Researchers at the Hans Knöll Institute in Germany did just this to identify a candidate secondary metabolite in the bacterium Clostridium cellulolyticum. They were puzzled to find that the gene’s product was not produced in standard lab conditions, even with the addition of several supplements. Nevertheless, they reasoned that, since the bacterium was first discovered in grass compost, supplementing the growth media with a soil extract might work—and it did. The mysterious metabolite, called closthioamide, turned out to have antibiotic properties. Closthioamide is now being researched as a potential new therapy against drug-resistant strains of gonorrhoea, offering new hope for keeping untreatable infections at bay. The discovery of closthioamide was a great example of investigative biology, but we need a more systematic survey of microorganisms to discover new drugs at the rate required to meet demand. This is where soil becomes important again. With up to a billion bacteria per gram, soil is teeming with diverse life forms, which may provide an extensive resource of potential new antibiotics. The only problem is, many of these species don’t even grow under standard lab conditions, let alone produce anything useful. Researchers at Northeastern University in Boston, Massachusetts, have made great strides in ‘culturing the unculturable’. They grow bacteria in a system called “iChip”, in which bacteria are grown in special chambers with semi-permeable membranes. The chambers can be surrounded with soil, to replicate the normal environment of the bacteria. Using iChip, dozens of antibiotic candidates have been identified in a short amount of time. One candidate, called teixobactin, is now being developed as a novel treatment for MRSA. Finding more antibiotics would be a relief, but is unlikely to slow the arms race against the development of antibiotic resistance. One way to gain the upper hand would be to develop drugs with a narrower spectrum of activity. Most antibiotics currently in use have a broad spectrum of activity, meaning they target many different bacterial species living inside the patient, in addition to the disease-causing target. This can disrupt the balance of bacteria inside the patient, reducing the competition faced by resistant strains, and allowing them to grow and spread more easily. At the same time, these off-target bacteria themselves are subjected to the selective pressures of the antibiotics, which can cause trouble later. Take E. coli, a bacterium that normally lives in the gut, but has been associated with carbapenem-resistant Enterobacteriaceae (CRE) infections. Resistant strains of E. coli may increase in prevalence in someone who overuses antibiotics to treat unrelated infections. If the bacteria spread to people with weak immune systems, or get somewhere in the body where they don’t belong, they can cause antibiotic-resistant disease in their own right. This could be avoided by just targeting the responsible bacteria during an acute infection. One exciting possibility for narrow-spectrum antibiotics is the development of therapeutic bacteriocins. Bacteriocins are toxins produced by certain bacteria, in order to kill other bacteria. Unlike conventional antibiotics, bacteriocins are only programmed to kill bacteria that are closely related to (but not the same as) the producer of the toxin. The producer would be susceptible as well, were it not for the fact that it also produces a so-called immunity protein, which protects it from its own toxin. 
This gives the bacteriocin producer an ecological advantage, as closely related bacteria—which might otherwise compete for the same resources—are killed off. Scientists are trying to work out how this bacterial warfare takes
place, and harness it in the development of antibiotic therapies against specific strains and species of bacteria. Encouraging results suggest that a bacteriocin targeting Pseudomonas aeruginosa, a pathogen that is problematic for patients with cystic fibrosis, protects mice from the bacteria. Another approach is to give the bacteria a lethal infection of their own, using bacteriophages. Bacteriophages are viruses which, instead of infecting human cells, like measles or influenza, infect bacteria. They are perfectly structured to attach to the surface of bacterial cells, and inject their genetic material. Once inside, they use the bacterial cell machinery to replicate themselves, until so many new bacteriophages have been produced that they burst the cell and go on to find other bacterial cells to infect. Phage therapy can be specifically targeted, since certain bacteriophages only infect certain bacteria, in the same way that viruses may be restricted to specific animals. Phage therapies also have an unusual property in that they can replicate inside the bacteria infecting the patient, meaning a lower initial dose is required. Although phage therapy has been in the scientific literature for decades, the threat of untreatable infections may result in it receiving serious attention in the next few years. Currently, an EU-funded clinical trial called PhagoBurn is underway to assess the efficacy of phage therapy in treating burn-related infections. With further research, more trials could be underway in the future. It is hard to imagine a world without antibiotics, but over the next ten years we will have to tackle the problem of antibiotic resistance to prevent such a scenario. It isn’t all doom and gloom though, and there are exciting developments in the pipeline, with those described here by no means exhaustive. 
No amount of research output will solve the problem of antibiotic resistance overnight, so in the meantime we must strive to improve our stewardship of the antibiotics we have left to prevent untreatable infections becoming the norm.
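The selection dynamic at the heart of the problem (susceptible cells killed while resistant mutants keep dividing) can be sketched with a toy population model. All the growth and kill rates below are invented for illustration; real within-host dynamics are far messier:

```python
# Toy model of antibiotic selection; every rate here is invented for illustration.
def resistant_fraction(susceptible, resistant, generations, antibiotic):
    """Grow two subpopulations per generation and return the final resistant fraction."""
    for _ in range(generations):
        susceptible *= 0.3 if antibiotic else 1.5   # mostly killed vs freely dividing
        resistant *= 1.4                            # resistance carries a small fitness cost
    return resistant / (susceptible + resistant)

# Start with one resistant mutant per 10,000 susceptible cells
print(round(resistant_fraction(10_000, 1, 20, antibiotic=True), 4))   # resistant bugs take over
print(round(resistant_fraction(10_000, 1, 20, antibiotic=False), 4))  # they stay vanishingly rare
```

The same starting population ends almost entirely resistant under treatment, and almost entirely susceptible without it, which is why stewardship (withholding the selective pressure wherever possible) buys so much time.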
Solar Power-up Thomas Hornigold
How does a solar panel work? A semiconductor like silicon has electrons grouped into two energy bands. Photons from the Sun fly in, hit an electron, and cause it to jump between the bands, driving a current from which we extract energy. But there are limits to how much of the Sun’s energy we can extract. The Sun emits photons at many energies, but only a limited range can be utilised, set by the energy difference between the low-energy and high-energy electron bands (called the “bandgap”). Less energetic photons can’t excite electrons; more energetic ones leave the electron with excess thermal energy, which is dissipated as heat. Luckily, the photons silicon is able to use are pretty close to the energies most frequently emitted in the Sun’s rays. Even so, a traditional silicon solar cell can convert at most around 32% of the energy in sunlight; this ceiling is called the Shockley–Queisser limit. Take factors like reflection and wires that block light into account and the efficiency actually achieved falls to around 24%. So: how can we cheat the Sun? One method is tandem solar cells—you have multiple solar cells joined together with different bandgaps, which can harvest different regions of the Sun’s spectrum. Currently available tandem cells can get you up to 40% efficiency under good conditions,
but they're very expensive. Other ideas being investigated include "quantum dot solar cells". Quantum dots are artificial atoms made out of semiconductors; the size and shape of the semiconductor blob determines its electrical properties. With these you can set your own bandgap to be perfect for the Sun, and approach the Shockley-Queisser limit. Another area generating a great deal of excitement is "singlet fission" materials. These absorb a high-energy photon; the resulting electron-hole pair travels along in an excited state and then splits into two pairs, each of which emits a photon. You've effectively converted the high-energy photon, whose excess energy would have been wasted, into two lower-energy photons, which could then be absorbed by a conventional solar panel underneath. This offers a dream eventual application: simply paint a layer of singlet fission material onto a traditional solar cell and watch its efficiency climb. Solar panels have been falling in price at an exponential rate. The human race needs this to keep happening; we have to push past the limit.
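A toy energy-accounting model shows why singlet fission is so attractive. This is our own hedged illustration, not the article's calculation: it assumes perfectly efficient fission and reuses the approximate 1.1 eV silicon bandgap.

```python
# Toy model: energy extracted per absorbed photon, with and without an
# idealised singlet fission layer on top of a silicon cell.
SI_BANDGAP_EV = 1.1  # approximate bandgap of crystalline silicon

def conventional_yield(photon_ev, bandgap_ev=SI_BANDGAP_EV):
    """A normal cell extracts at most one bandgap's worth of energy per
    absorbed photon; everything above the gap is lost as heat."""
    return bandgap_ev if photon_ev >= bandgap_ev else 0.0

def singlet_fission_yield(photon_ev, bandgap_ev=SI_BANDGAP_EV):
    """With ideal fission, a photon carrying at least twice the gap energy
    is split into two excitations, each contributing a full bandgap."""
    if photon_ev >= 2 * bandgap_ev:
        return 2 * bandgap_ev
    return conventional_yield(photon_ev, bandgap_ev)

blue_photon = 3.1  # eV, at the violet end of the solar spectrum
print(f"conventional   : {conventional_yield(blue_photon):.1f} eV extracted")
print(f"singlet fission: {singlet_fission_yield(blue_photon):.1f} eV extracted")
```

Under these idealised assumptions, a blue photon yields twice as much usable energy with the fission layer, which is exactly the "paint it on and watch efficiency climb" dream described above.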
Shockwaves in Spacetime Matthew Nicholson
One billion years ago, in a distant galaxy, two black holes collided. The two astronomical giants had a combined mass equivalent to 67 Suns, and their union released enough energy to light every star in the universe ten times over. Einstein predicted that such a colossal redistribution of mass would stretch and compress spacetime, the four-dimensional fabric of space and time combined, and generate waves within it. As it is the force of gravity that causes these huge masses to collide, these waves have been named "gravitational waves". In 2015, Einstein's prediction was verified, and the discovery of gravitational waves was this year awarded the Nobel Prize in Physics. The ripples created by these two black holes travelled at the speed of light until, after a time longer than our species has been on the planet, a technology in its infancy detected them. The Laser Interferometer Gravitational-Wave Observatory (LIGO) measures the time taken for a beam of light to travel 2.5 miles (4 km) in two perpendicular directions, and infers from any difference between the two directions' times whether spacetime has been warped.
Methodology such as this has existed for many decades; where LIGO comes into its own is on matters of scale. The relative stretching of space that it measures is comparable to one ten-thousandth of the width of a proton, a precision equivalent to measuring the distance to Proxima Centauri, our nearest stellar neighbour, to within the width of a human hair. But even this analogy does not do the accomplishment justice. To eliminate the possibility of the beam interacting with the air before it reaches the detector, the tunnels were pumped down to a near vacuum. The detectors are also suspended like huge pendulums to counter the effects of small, local mass redistributions, such as a medium-sized lorry driving nearby. The two LIGO detectors sit at opposite ends of the USA, recently joined by the Virgo detector in Italy; with more instruments planned across the globe and further detections now occurring every few months, sources can be pinpointed with much greater resolution. Astronomical events that were previously inaccessible to conventional telescopes are now well within our grasp. Another huge leap has been made in mankind's understanding of the universe.
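To make these scales concrete, here is a back-of-envelope Python sketch. The round numbers (proton size, arm length, stellar distance) are our own illustrative assumptions, not LIGO's published figures; only the "one ten-thousandth of a proton" figure comes from the text above.

```python
# Back-of-envelope scales for LIGO's sensitivity, using assumed round numbers.
PROTON_WIDTH_M = 0.85e-15    # rough charge-radius scale of a proton
ARM_LENGTH_M = 4_000.0       # each LIGO arm is about 4 km long
PROXIMA_DISTANCE_M = 4.0e16  # Proxima Centauri is roughly 4.25 light-years away

# The article quotes a length change of one ten-thousandth of a proton width.
delta_l = PROTON_WIDTH_M / 10_000

# Strain is the fractional stretching of space: h = delta_L / L.
strain = delta_l / ARM_LENGTH_M

# The same fractional precision applied to the distance to Proxima Centauri.
error_at_proxima_m = strain * PROXIMA_DISTANCE_M

print(f"length change : {delta_l:.1e} m")
print(f"strain h      : {strain:.1e}")
print(f"over 4.25 ly  : {error_at_proxima_m:.1e} m")
```

The resulting strain, of order 10^-23, is the dimensionless quantity in which detector sensitivities are usually quoted, and it is why isolating the instrument from passing lorries matters at all.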
Crossword Rearrange the shaded letters to form the name of a chemical element, and send it in to firstname.lastname@example.org by 26th November (Sunday of 8th week) for a chance to win! One winner chosen at random from all correct entries will receive a copy of Angela Saini’s new book Inferior: How Science Got Women Wrong, and the New Research That’s Rewriting the Story. ACROSS
8 Somebody who speaks English (10) 9 Oxford University summer school (4) 10 Extremely small amount (4) 11 Spanish bull run city (8) 15 Symbol for unit of energy used in particle accelerator measurements (2) 16 Something that has already been decided before the affected parties hear about it (4,8) 18 Not physically existing (7) 20 State of things as they actually are (7) 21 Preservation of environment, wildlife etc. (12) 23 Chemical symbol for 28D (2) 26 Indicating the existence of something by suggestion (8) 27 Home of the Large Hadron Collider (4) 30 Hearing organs (4) 31 Refusal to accept or comply (10)
DOWN
1 Found in the sea (6) 2 Banded quartz (5) 3 Snake (3) 4 Unit used to measure 31A (3) 5 Something used to improve (8) 6 Hormone which inhibits pain (9) 7 Equation showing that two quantities are the same (8) 12 Alcohol with three carbon atoms, used in hand sanitizer (8) 13 In a grown-up manner (8) 14 Chemical symbol for precious silvery metal (2) 17 To give off water through stomata in a plant or leaf (9) 18 Jabs (8) 19 Scented purple plant (8) 22 Symbol for chemical element about which Sia sang (2) 24 City in western France, the birthplace of Jules Verne (6) 25 Units of time equal to one billion years in geology and astronomy (5) 28 Name of element 23A (3) 29 International time standard (3)
Oxford Scientist HT18
The new name for Bang! magazine—launching next term Now recruiting writers, illustrators, and editors Contact email@example.com to get involved
Out with a Bang! ...coming next term... the Oxford Scientist
The Michaelmas term 2017 issue of Bang! Science Magazine on the theme of Progress. The last ever issue of Bang! before rebranding to the Oxf...
Published on Nov 10, 2017