Michaelmas 2016 Issue 37
Cambridge University Science Magazine
The Doctor Will Smell You Now
Nick Lorch sniffs out new frontiers in diagnostics 8
When Plants Cry for Help
Andrew Catherall explores how plants summon insect armies 10
Out of Sync
Daniel Fernando investigates our bodies’ natural time keepers and their role in health and disease 12
Revisiting the Test Tube
Sarah Foster considers the ethics of human embryo experimentation 14
FOCUS: Out of the CRISPR, Into the Clinic
BlueSci traces the past, present, and future of the genome editing revolution
On The Cover 3
News 4
Reviews 5
Behind the Science
Liam Wilson debunks the myth of learning styles in his winning article
Kimberley Wiggins examines the phenomenon of citizen science
Tom Hitchcock tells the story of biological polymath Conrad Waddington
Arthur Neuberger traces the development of monoclonal antibodies
Adam Barker examines the strange world of quantum computers
Synthetic Biology Special
Alex Bates and Geoff Ma tell a tale of biohacking and genetically modified algae
Weird and Wonderful
Optical Illusions, Vintage Beer and Killer Flies
BlueSci was established in 2004 to provide a student forum for science communication. As the longest-running science magazine in Cambridge, BlueSci publishes the best science writing from across the University each term. We combine high-quality writing with stunning images to provide fascinating yet accessible science to everyone. But BlueSci does not stop there. At www.bluesci.org, we have extra articles, regular news stories, podcasts and science films to inform and entertain between print issues. Produced entirely by members of the University, BlueSci draws on a diversity of expertise and talent to create a unique science experience.
President: Alexander Bates...................................................email@example.com Managing Editor: Pooja Shetye................................firstname.lastname@example.org Secretary: Sophie Protheroe................................................ email@example.com Treasurer: Zoë Carter .................................................... firstname.lastname@example.org Art Editor: Oran Maguire .................................................. email@example.com Radio: Rebecca Richmond-Smith................................................firstname.lastname@example.org News Editor: Janina Änderbar.....................................................email@example.com Web Editor: Simon Hoyte.................................................firstname.lastname@example.org Webmaster: Mrittunjoy Majumdar...................................email@example.com Social Secretary: Jenny Easley...................................firstname.lastname@example.org
Issue 37: Michaelmas 2016 Issue Editor: Amelia J. Thompson Managing Editor: Pooja Shetye/Amelia J. Thompson Second Editors: Janina Änderbar, Alex Bates, Hannah Cornwall, Camilla d’Angelo, Darcy Evans, Sarah Foster, Chiu Chai Hao, Claire King, Robin Lamboll, Stephanie Norwood, Lucy Oswald, Rebecca Richmond-Smith, Samiha Shaikh, Adina Wineman, Suyi Zhang Art Editor: Oran Maguire News Editor: Janina Änderbar News Team: Francesca L. Boughey, Stephanie Norwood, Laura Nunez-Mulder Reviews: Sarah Foster, Tom Hitchcock, Lauren McGinney Feature Writers: Andrew Catherall, Daniel Fernando, Sarah Foster, Nick Lorch Regulars Writers: Adam Barker, Alex Bates, Tom Hitchcock, Geoff Ma, Arthur Neuberger, Kimberley Wiggins, Liam Wilson Focus Team: Sophia Ho, Alba Landra, Laura-Nadine Schuhmacher, Amelia J. Thompson Weird and Wonderful: Zoe Carter, Hannah Cornwall, Joy Thompson Production Team: Alex Bates, Nick Lorch, Anna Lombardi, Stephanie Norwood, Oran Maguire, Amelia J. Thompson Advertiser: Julie Skeet Illustrators: Adam Barker, Alex Hahn, Fong Yi Khoo, Oran Maguire, Asal Moine Cover Image: Fong Yi Khoo, Asal Moine
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License (unless marked by a ©, in which case the copyright remains with the original rights holder). To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-nd/3.0/ or send a letter to Creative Commons, 444 Castro Street, Suite 900, Mountain View, California, 94041, USA.
Food for Thought EVERY SEPTEMBER Harvard University hosts the Ig Nobel Prizes, awarded for research at the very fringes of science that ‘makes people laugh, then think’. Though awarding a prize for the systematic study of bellybutton fluff may seem light-hearted, it has the same aims as science journalism at its best: to delight, to inform, and to make its audience think. In keeping with this proud tradition (albeit sans bellybutton lint) this issue’s writers explore scientific topics ranging from the cutting-edge to the strange and unusual, providing novel and thought-provoking perspectives along the way. Andrew Catherall shows us a plant’s view of the world and makes us marvel at the intricate strategies plants use for biological warfare. Nick Lorch explores the frontiers of medical diagnosis, a bizarre realm where pigeons spot abnormalities on X-ray images and rats sniff out tuberculosis, while Daniel Fernando examines the molecular machinery that makes our biological clocks tick – and the medical problems that ensue when our bodies and their environments are out of sync. Advances in biomedicine are often controversial, not just scientifically but also ethically. Earlier this year, Cambridge scientists found a way to keep human embryos alive in the laboratory for far longer than was previously possible, and here Sarah Foster discusses both the moral and legal implications of such a discovery. This work came hot on the heels of yet more studies in human embryos, which were the proof of principle for the new genome editing technique, CRISPR – a technology now often mentioned in the same breath as designer babies. In our Focus, we cut through some of the hype surrounding CRISPR and take a serious look at its potential medical applications, whether as diagnostic tool or HIV treatment.
Along the way, we trace the evolution of the modern genome editing toolbox, and later in our Synthetic Biology special we delve deeper into the world of genetic engineering and how a team of Cambridge student ‘biohackers’ is optimising CRISPR technology in plants. The theme of science in society continues in the rest of our Regulars. Kimberley Wiggins looks at the phenomenon of citizen science and shows us that scientific thinking need not be confined to a laboratory. Arthur Neuberger takes a historical approach in his discussion of monoclonal antibodies, highlighting the contributions that Cambridge scientists made to this field a few decades ago, while Tom Hitchcock steps further back in time to consider the amazing career of Conrad Waddington, developmental biologist and general polymath. Last – but definitely not least – BlueSci congratulates the winner of the inaugural Neurojournalism competition, Liam Wilson, whose winning article on ‘neuromyths’ is published on page 20. The competition is the brainchild of CamBrain, Cambridge’s student-run neuroscience society, and promotes high-quality writing on neuroscience topics. Science, whether in the laboratory or in the pages of a magazine, is sometimes strange, often thought-provoking, and always fascinating. The millennia-old beer discussed so enthusiastically on page 28 is all three – now that’s truly food for thought!
Amelia J. Thompson Issue 37 Editor
On the Cover the world of molecular genetics is a strange one. The field itself is easily defined: it is the study of gene structure and function, at the level of DNA and associated molecules. The concepts involved, however, are notoriously hard to visualise. Cells and tissues have traditionally been more accessible to the scientific and artistic imaginations; putting them under a microscope yields valuable information and, often, stunningly beautiful images. Genes, however, being a way for cells to encode information, are as much concepts as they are biological objects – and depicting them in a way that is both biologically and conceptually accurate is still no trivial task. Here, talented artists Fong Yi Khoo and Asal Moine have teamed up to take us on a visual tour of the machinery behind a now-famous concept in molecular genetics, CRISPR gene editing. Although only discovered a few years ago, the CRISPR mechanism has already transformed the study of genetics at the laboratory bench, by providing a way for scientists to edit (i.e. cut and paste) DNA sequences more accurately than ever before. Khoo and Moine zoom in from cells and tissues, in purple, to the level of individual chromosomes, and finally to the CRISPR machinery itself, which is shown caught in the act of cutting a strand of DNA. Their black and white, ‘abstract’ view of the CRISPR apparatus helps us come to grips with the concept of DNA editing, while at the same time highlighting that we have some way to go before we will have a truly ‘full colour’ view of molecular genetics.
FONG YI KHOO, ASAL MOINE
News ORAN MAGUIRE
Genes and Virginity
Check out www.bluesci.org or @BlueSci on Twitter for regular science news and updates
THE AGE at first sexual intercourse (AFS) is strongly
determined by environmental factors, such as levels of parental monitoring, religious belief, and peer pressure. It has been shown that a younger AFS is associated with negative life circumstances such as living in an unstable or socially disadvantaged family. Now, a research team led by Cambridge University scientists has found that AFS is also partly genetically determined. Using genetic and personal history data from over 380,000 people, the team investigated how AFS correlates with other key landmarks in a reproductive lifespan. They discovered that AFS is partially predictive of the onset of puberty and menopause, as well as age at first childbirth and number of children. Furthermore, the scientists found 38 specific genetic variants associated with AFS. As with most multifactorial traits or diseases, each gene alone may have a negligible effect – but cumulatively, they do influence reproductive behaviour. The scientists then went on to look into the functions of the genes most likely affected by the genetic variants they had found. Some of the genes influence reproductive traits, such as ESR1, which encodes the oestrogen receptor. Others are key in neural development and activity. CADM2 is one of these, and has previously been linked to risk-taking behaviour, a trait that is in turn associated with a younger AFS. Overall, this study begins to build bridges between the genetic and environmental factors that influence human behaviour. The lead scientists stated that they hope their findings will help plan more effective public health campaigns to positively influence reproductive behaviour. LNM
Culturing Human Embryos
OVER THE COURSE of nine months, a human embryo will
develop from a single cell into a fully formed infant. For a long time scientists have used in vitro culture methods to keep embryos in petri dishes and study how they grow and organize themselves. However, so far this has only been possible up until the seventh day of development, at which point the embryo would normally implant into the mother’s womb. Now scientists at Cambridge University and the Rockefeller University in New York have succeeded in developing a culture system that allows embryos to develop past this important stage. By providing the embryos with a surface to attach to and covering them with a special nutrient-rich solution to keep them healthy, the researchers were able to keep them alive in vitro up to thirteen days, which is close to the legal limit of 2 weeks. Using this technique, the researchers were able to show that embryos are able to organize themselves into early tissues even without external influences, such as signals from the womb. They also found that one important step in this process, the formation of the pro-amniotic cavity, is not, as was previously thought, caused by programmed cell death. Instead, the embryo’s cells actively organize themselves to allow the cavity, which is crucial for the subsequent setup of the body plan, to form. Professor Zernicka-Goetz, the lead researcher in the study, said in a press release, “This new technique provides us with a unique opportunity to get a deeper understanding of our own development… [and to] understand what happens, for example, during miscarriage.” The scientists also hope that future research using this technique will lead to increased success rates of In Vitro Fertilization (IVF) techniques. SN
Exoplanets and Ultracool Stars ORAN MAGUIRE
IS THERE LIFE outside our solar system? This question has occupied humans for a long time. Now, an international group led by astronomers from the University of Cambridge have found a new place to look in search of the answer. The researchers discovered three planets orbiting an ultracool dwarf star dubbed TRAPPIST-1, in the Aquarius constellation just 40 light years away from Earth. These exoplanets are Earth-sized with short orbits in the range of days to months and are the first to be discovered around a star so small and so dim. TRAPPIST-1 is similar in size to Jupiter, emits mainly infrared light and is much cooler than our Sun. The TRAPPIST telescope in Chile was used to observe TRAPPIST-1 for three months. During this time, regular dips in the brightness of the star were observed indicating multiple transiting objects which the scientists interpreted as three planets orbiting the star. They then
analysed the collected data in more depth. For example, they examined how the planets’ atmospheres affect the light from TRAPPIST-1 that we receive on Earth, which indicates how suitable for life the planets are. The astronomers inferred that the planets’ temperatures and sizes are similar to those of Earth and Venus. They also concluded that the two planets closest to the star are on the inner edge of the star’s habitable zone, the range of planet-star distances at which water would be liquid on the surface. However, the fact that a planet is located in the habitable zone does not necessarily mean that it is Earth-like or that water or life are found there. NASA’s James Webb Space Telescope, which is due to launch in 2018, will allow astronomers to look at the atmospheres of these planets in more detail to try and determine whether there is any water or other signs of life. FLB
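The transit method behind this discovery infers a planet’s size from how much starlight it blocks: the fractional dip in brightness equals the square of the planet-to-star radius ratio. A minimal sketch with illustrative numbers (not the actual TRAPPIST-1 measurements):

```python
import math

def planet_radius_km(transit_depth, star_radius_km):
    """Invert transit_depth = (R_planet / R_star)**2 to get the planet radius."""
    return math.sqrt(transit_depth) * star_radius_km

# Illustrative only: a Jupiter-sized star (radius ~70,000 km, roughly
# TRAPPIST-1's scale) dimming by 0.8% implies an Earth-sized planet.
print(round(planet_radius_km(0.008, 70_000)))  # prints 6261 (Earth: 6,371 km)
```

Because an ultracool dwarf is so small, even an Earth-sized planet blocks a measurable fraction of its light, which is why stars like TRAPPIST-1 are attractive targets for transit surveys.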
Reviews MICROBE WORLD
The Signal and the Noise - Nate Silver
Like a modern Oracle of Delphi, Nate Silver and his website FiveThirtyEight have built a reputation for unerringly accurate predictions, correctly calling all fifty states in the 2012 US presidential election. In The Signal and the Noise, Silver examines the art of prediction more closely, interviewing experts on earthquakes, basketball, and economics. Silver focuses not only on pinning down the key factors required to produce high quality predictions, but also on how we can learn to improve them over time. He delves into where predictions have succeeded (meteorology) and where they have most obviously failed (the 2008 financial crash), and diagnoses some of the key reasons behind these outcomes. The Signal and the Noise offers a careful explanation of statistics and probabilistic thinking, tempered with enough anecdotes, examples, and graphs to make it an approachable read for even the least mathematically inclined. Rarely do books feel as relevant and far reaching, and rarer still as refreshingly honest and modest about the problems faced in trying to understand the world. The Signal and the Noise is a perfect fit for fans of Moneyball and Freakonomics, or for anyone who wants to better understand how to see a little bit further into the future. TH
Death on Earth - Jules Howard
Following ideas from his first book, Sex on Earth, Jules Howard playfully addresses the mysteries, science and wonders of both life and its ultimate demise. Contrary to the title, this is a feel-good book that celebrates life whilst keeping the burden of death securely shrouded under the mysteries of nature. Such mysteries include the aptly named immortal jellyfish, the empathetic scrub jay and the unsightly naked mole rat (the photographs of which will be greatly appreciated). From the very first chapter (where the author suggests an eye-opening and resounding definition of the concept of ‘life’), through his personal experiences and wacky encounters with experts in the field, and on to the biological mechanisms underpinning our ultimate fate, this book is upbeat and easy to read. More autobiographical than a typical popular science book, Death on Earth is memorable and undoubtedly worth a casual read. By revealing and suggesting the direction of future scientific study of such a fundamental aspect of our lives, Howard offers an open and insightful discussion of why, when and even if ‘Death on Earth’ is inevitable, and how humanity is creating new possibilities to avoid that fate - after all, who wouldn’t want to live forever? LM
A Mathematician’s Apology - G. H. Hardy
Written in 1940, near the end of Hardy’s life, A Mathematician’s Apology is a profoundly sad reflection on an aging mathematician’s perceived loss of mental acuity, as well as a testament to the beauty of pure mathematics and an eloquent defense of its ‘uselessness’. Hardy (b. 1877) was an unquestionably brilliant mathematician – his name decorates critical theorems in fields as diverse as population ecology (Hardy-Weinberg equilibrium) and number theory (the Hardy-Ramanujan asymptotic formula), though he is perhaps best known for his mentorship of Srinivasa Ramanujan. Hardy’s ‘apology’ is an apology only in the philosophical sense of the word; his essay provides a reasoned argument in defense of pure mathematics. The more applied mathematics needs no justification: ‘the mass of mathematical truth is obvious and imposing… [its] applications…obtrude themselves on the dullest imagination.’ But pure mathematics, far removed from bridges and bombs, requires a more subtle explanation, and in the Apology Hardy seeks to provide one – complete with mathematical proofs and a delightfully caustic denouncement of popular science writers. Together with its elegant and sometimes heartbreaking introduction in the Cambridge Edition (by Hardy’s friend, the physicist-turned-fiction writer C.P. Snow), A Mathematician’s Apology offers some insight into a great mathematician’s mind. Its musings on mathematical beauty and its defense of intellectual exploration will be appreciated by mathematicians and non-mathematicians alike – and will surely resonate with those readers who already agree that ‘pure mathematics [is] a rock on which all idealism founders’. SKF
The Doctor Will Smell You Now The moisture on a dog’s nose helps it trap odourants. Its nose separates incoming airflow into breathing and scenting streams and it has a much larger olfactory bulb than humans. These factors help it detect some odourants in parts per trillion
Nick Lorch sniffs out new frontiers in medical diagnosis
DOCTORS USE EVERY SENSE at their disposal to diagnose their patients – even senses that are not their own. Ancient Greek pioneers noticed that bees would flock to sweeter urine, negating the need to taste it oneself when testing for diabetes. While diabetes is now detected with blood tests, smelling a patient can still be a test for other conditions. Smelling a patient’s breath, for example, can give a trained nose the same insight as a chemical analysis of their blood. There are many well-established reasons a doctor might want to smell you. They may be checking for ketones on the breath to spot a diabetic in danger of ketoacidosis. Or for liver failure, where chemicals from gut bacteria that are normally broken down by the liver reach the blood, and then the breath. George R.R. Martin’s A Song of Ice and Fire poses another diagnostic challenge: Reek, a character who constantly smells, despite fastidious hygiene. This is suggestive of trimethylaminuria, or other defects of metabolism. Smell has become a ground-breaking field in medical research. Joy, a retired nurse, made headlines around the world when she claimed to have noticed a change in her husband’s scent after he developed Parkinson’s disease. The odour change was subsequently confirmed in a blind trial in which subjects smelled T-shirts worn by Parkinson’s disease patients or unaffected controls. Parkinson’s disease affects movement and mood, which could of course change someone’s hygiene and therefore smell. However, other cells are subtly affected too; scientists often use white blood cells from Parkinson’s disease patients to study their metabolic or genomic abnormalities. So it is feasible that there might be a subtle change in the cells of the sweat glands, which
Joy might have detected. A larger study is planned, but nothing concrete has been published yet. Outside of medicine, NASA employs a ‘chief sniffer’ to prevent harmful volatile chemicals being released from equipment sent into space. Animals have a lot to offer smell research in science and medicine, including keener noses and the ability to be trained. Sniffer dogs already help the police and are increasingly being trialled in the clinic too; such tests would take only minutes. A major goal in smell research is detecting cancers early, and there are many studies where trained dogs have shown real results, whether detecting lung cancer from breath or bladder cancer from urine. There is also a lot of insight into which of the many, many volatile organic compounds in breath, blood, or urine they are responding to – for example, formaldehyde in the case of lung cancer. However, it is not an exact science. Pentane in the breath can be produced in a wide range of diseases: asthma, inflammatory bowel disease, liver disease, lung cancer, rheumatoid arthritis, schizophrenia, and many others. There are also technological sniffers for detecting disease, including e-noses. These contain carbon nanotubes which detect organic compounds that bind and alter their electrical properties, in a much more sophisticated way than simple breathalysers. E-noses can recognise the “smellprint” of multiple compounds specific to different diseases. In one study, e-noses could differentiate lung cancer from chronic obstructive pulmonary disease. However, for now, the dogs appear to be in the lead: they have higher sensitivity and specificity for detecting cancers. It seems quite likely that in the future we will be requesting lab reports from Labradors. Another example of animals aiding medical diagnosis involves landmines and tuberculosis (TB). Both can, fortunately, be sniffed out by the Giant African Pouched Rat, Cricetomys gambianus.
APOPO, a non-profit organisation based in Tanzania, trains “HeroRats” to detect landmines. In 2009, they also trained rats to detect TB in sputum samples. The rats do not catch TB, and can be trained just like dogs, but are cheaper and do not require a specialised handler.
Globally, there were 9.6 million new cases of TB and 1.5 million deaths in 2015, more deaths than were caused by HIV. While drug-resistant TB threatens the UK, the WHO reports that only 3.3% of cases worldwide are drug-resistant. Therefore, the majority of global deaths could be prevented with proper diagnosis and treatment. The gold standard in TB detection uses the gene-copying polymerase chain reaction (PCR) to detect active TB, and can check for antibiotic resistance genes quickly. However, in Tanzania, most diagnoses are made by examining sputum under the microscope, or by waiting weeks for a bacterial culture to grow. APOPO’s rats are certainly much faster: one rat (with three handlers) can check 140 samples in an hour, while one microscopist can only check 20-40 samples per day. However, in a head-to-head study, the rats’ results seemed less useful. Specifically, the rats gave a lower positive predictive value (0.42) than the microscopists (0.96) – this is the chance that a reported diagnosis of TB is correct. However, the rats were more sensitive: they correctly identified more of the TB-positive patients, which led the scientists to conclude that the rats could be used for second-line screening, alongside microscopists. APOPO’s website boasts that their rats have to date detected 7,445 additional cases of TB in samples already checked by a human under a microscope. Taking a brief diversion from the wonderful world of smell, another diagnostic talent animals can offer is image recognition. A recent study asked whether pigeons could do the job of lab-based doctors. The answer: sort of! An American research team, led by Richard Levenson, investigated whether pigeons could
spot breast cancer, either on a breast x-ray or on a slide of cells, stained and magnified. The pigeons were trained with a set of cancer-free and cancerous images; if they pecked on the sample images containing a cancer, they were given a food reward. After a few days of training, the pigeons were able to choose the cancer-containing slides from a new set of images they had not seen before, despite the images being standardised to remove obvious clues such as mean brightness or colour. The pigeons had apparently extrapolated rules about the shape and organisation of cells in cancers, and then applied them to the new images, just like pathology trainees. This is far beyond the simple memorisation that is normally associated with animal training. The pigeons failed to master a more difficult task: distinguishing between benign and cancerous masses on the mammograms. However, they succeeded at an easier one: spotting micro-calcifications, dense spots on a mammogram that indicate breast cancer. It is truly fascinating that such an important task, which might take a pathologist years of training to master, can be learned by a pigeon, over a week, with similar levels of accuracy. Levenson further improved their accuracy by ‘flock-sourcing’, or polling multiple pigeons on the same image. When more of the pigeons agreed, they were more accurate, rivalling human pathologists. Levenson hopes that his research could be used to improve image-classifying software to aid radiologists in detecting breast cancer. Sadly, it seems there are no plans to use pigeons’ remarkable skills in the hospital just yet. However, the remarkable work done by sniffer dogs and APOPO’s rats shows just how useful the keen noses of animals can be.
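The head-to-head screening statistics quoted above for the rats rest on two different metrics: sensitivity (the fraction of true TB cases a screener catches) and positive predictive value (the chance that a positive call is correct). A minimal sketch, using hypothetical counts chosen only so the PPV matches the 0.42 figure reported for the rats:

```python
# Sensitivity and positive predictive value (PPV) from screening counts.
# The counts below are hypothetical, chosen only to illustrate how a
# screener can be highly sensitive yet have a low PPV.

def sensitivity(true_pos, false_neg):
    """Fraction of truly positive samples that the screener flags."""
    return true_pos / (true_pos + false_neg)

def ppv(true_pos, false_pos):
    """Chance that a positive result is a true positive."""
    return true_pos / (true_pos + false_pos)

# A 'rat-like' screen: catches most cases, but also over-calls positives.
tp, fn, fp = 86, 14, 119  # hypothetical counts out of 100 true TB cases
print(f"sensitivity = {sensitivity(tp, fn):.2f}")  # prints 0.86
print(f"PPV         = {ppv(tp, fp):.2f}")          # prints 0.42
```

A screener can score highly on one metric and poorly on the other, which is why the study recommended the rats for second-line screening alongside microscopists rather than as a replacement for them.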
A pigeon’s brain is smaller than the tip of your index finger. Despite this they are able to categorise sophisticated visual information
One of APOPO’s giant pouched rats in training. Training costs only $7,300 per rat, roughly a quarter of the cost of training a dog
Nick Lorch is a 4th year medical student at Emmanuel College.
When Plants Cry For Help When attacked, plants can summon a personal insect army to protect themselves. Andrew Catherall explores how
Pseudomyrmex ferruginea, the acacia ant, will aggressively defend its patron from other insects such as leaf cutter ants, and even mammals such as goats
EACH DAY, plants are subjected to a constant, sustained, and organised assault by armies from every branch of the tree of life. Herbivores tear at their bodies, fungi squeeze into their roots and microscopic bacteria strive relentlessly to infect the plant and steal its food supplies from within. Superficially, things do not look hopeful for diseased plants: they have no brain, central nervous system, muscles or tissues, and most are literally rooted to the ground, unable to move away from their assailants. But, contrary to popular portrayal, plants do not passively accept this fate. In a remarkable twist of evolution, many plants have found a way to recruit their own hordes of willing soldiers: a phenomenon called ‘indirect defence’. When attacked by an herbivorous, sap-sucking or egg-laying animal, a plant can rapidly initiate the production of a suite of odourants specific to its attacker. These chemical distress signals are released into the environment to summon willing bands of predatory invertebrates that quickly swarm the plant, overpowering and devouring the plant’s attackers. By treating the enemy of its enemies as its friends, plants have access to a remarkable mutualistic relationship. But how do they summon a mob? How can brainless, blind and sessile organisms identify the predator they are being attacked by and then create and transmit signals that can summon specific invertebrates kilometres away? The biochemistry and physiology of indirect defence remain unclear for many plants, but are relatively well characterised for the lima bean. Using odourant signals, the lima bean can defend itself against herbivorous spider mites by recruiting carnivorous mites that prey on spider mites. The herbivorous spider mite attack triggers the rapid production of jasmonic acid, a central defensive
The flower of Phaseolus lunatus, the lima bean plant, needs protection from two-spotted spider mites which inject their saliva into the plant, dissolving it
hormone in plants. Jasmonic acid changes the electrical potential across cell membranes, leading to the synthesis and release of highly volatile carbon-based chemical distress signals (odours) into the environment. The odour concentrations released by the plant and the sensitivity of the carnivorous mites’ odour receptors allow the detection of the plant’s distress call from kilometres away. Exactly how plants like the lima bean 'recognise' the identity of their attacker and secrete odours specific to their herbivorous predator is far from fully understood. Specificity may be mediated by the lima bean’s detection of distinct chemicals in the feeding spider mite’s saliva. The way a herbivore feeds can also activate different plant hormonal pathways. Sap-sucking herbivores trigger salicylic acid pathways in the damaged plants, whilst those that chew leaves, like spider mites, trigger jasmonic acid pathways. This high-speed distress call is key to damage limitation, as Dr Marcel Dicke, the discoverer of indirect defence, suggests: “If you were a plant under threat of colonisation by a ravenous herbivore and you ‘knew’ that this herbivore has a natural enemy, then you would ensure the influx of the predator as soon as the herbivore colonises you.” How did this specific odour-receptor interaction develop in evolutionary terms? The production of plant odour compounds is metabolically complex. Dr Heil, a researcher at CINVESTAV in Mexico, thinks it unlikely that the plants “evolved the capacity to make all these compounds and then waited for tens or hundreds of millions of years until the insects had evolved the receptors”. Probably, these compounds originally had very different functions, but have since been successfully employed in indirect defence. Once the predatory mites arrive and overwhelm
Plants can tailor their distress signals to the type of predator attacking them and to the help they need
the spider mites, the lima bean tries to retain them to defend against further attacks. To this end, specialised plant glands called extrafloral nectaries secrete a nectar for the lima bean’s saviours to feed on, promoting a mutualistic relationship. Another plant, the bullhorn acacia, also produces food to retain its saviour, in this case the acacia ant. The nectar, and the nodular protein-and-lipid-filled Beltian bodies that swell on the plant surface, completely fulfil the acacia ant’s nutritional requirements. The ants are so specifically adapted to their niche that colonies have lost the ability to survive on their own, illustrating one side of a highly complex mutualism that ensures loyalty to the plant. In return, the acacia ants enthusiastically consume plant predators and remove seedlings of other plants from the acacia’s vicinity. The constant production of nutritious secretions is metabolically costly for a plant, diverting resources away from growth and repair. Why do this if an attack-triggered recruitment system, as used by the lima bean, would function just as well? One possibility may be that the bullhorn acacia’s distress call is just too weak. As a single worker ant patrols the plant, picking off opportunistic predators, it can release alarm odours that trigger whole-colony responses. This could boost plant distress calls and promote fast army recruitment. However, this odour combination is by no means detected only by the ants. Ecologists have noted that even cattle steer clear of the acacia for a few days after smelling these odours, suggesting that odourants might influence the whole community associated with the bullhorn acacia. As Dr Dicke notes, “If the plant starts to produce volatiles that attract the enemy of the enemy, this is public information that can be used by anyone, including herbivores, pollinators, [and] enemies of the enemy [e.g. acacia ant] of the enemy [herbivores]”.
So how does the biological community at large respond to these chemical cues? Do these cues, or the responses to these cues, change with drought or salt stress? The truth of the matter is that nobody
really knows. Even the fundamental biochemical mechanisms of indirect defence are poorly understood. As Dr Heil puts it, “For plants that use extrafloral nectar, the biggest question is simply how plants control its secretion: we know surprisingly little on how plants make nectar!” Dr Dicke, Dr Heil, and their coworkers are now investigating these questions in the lab.

Moreover, could we harness these interactions in agriculture to boost the productivity of crops and protect them from predation? Dr Dicke certainly thinks so: “In 1990 I laid out how we could exploit this [indirect defence] in agriculture. However, in agroecosystems, we have not selected crops for maximum interaction with biological control agents and so we may have 'whispering' crops instead of 'shouting' crops. By selecting 'shouters' we may improve biological control beyond the successes that are there already.” In collaboration with plant breeders and biological control practitioners, Dr Dicke and colleagues have been able to discriminate between whisperers and shouters, and have shown that shouters do better in an agricultural setting.

To fully exploit indirect defence, it may be necessary to selectively return some lost traits to commercial crop species. A faster solution may be to identify and synthesise compounds with direct repellent or antimicrobial properties for use in agriculture. Dr Heil and colleagues hope that knowledge of indirect defence can be applied directly. They are currently working on “transferring this knowledge to small holder farmers in Mexico, hoping that [their] observations will help [the farmers] to produce safer and healthier organic beans.” Both Dr Dicke and Dr Heil are clear in their enthusiasm for applying indirect defence in an agricultural context. Through their and others’ research, learning to speak this plant-invertebrate chemical language is fast becoming a reality.

Andrew Catherall is a 1st year Natural Sciences student at Homerton.
Out of Sync
Late nights: necessary evil or wake-up call to health problems?
The day-night cycle orchestrates patterns of activity within our bodies
SHIFT WORK IS NOW a part of our daily lives. It dictates the working hours of many industries, from security and transport to healthcare. Emergency health care services, doctors on call and nurses on night shifts in wards are necessities in the modern medical system – but ironically, shift work itself can damage our health. In the last 50 years, a wealth of evidence has accumulated linking shift work to serious diseases including heart disease, cancer and type 2 diabetes. The mechanism that underlies these conditions is the same one that causes jet lag after a long-haul flight, or that groggy feeling after an all-night party: our circadian clocks falling out of time.

Circadian clocks are a fundamental timekeeping mechanism found in every cell in the human body. They allow us to match key biological processes, like metabolism or brain activity, to the rhythms of our external environment. For humans, the most important environmental signal is light, or more specifically, changes in the day-night cycle. The retina contains a specialised receptor cell type that detects blue light and signals to the central circadian clock in the brain, a region of the hypothalamus called the suprachiasmatic nucleus (SCN). The SCN in turn sends both nervous (i.e. electrical) and hormonal signals to our ‘peripheral’ clocks, found in every cell and organ of the body, in order to synchronise them. Peripheral clocks can also respond to other specific cues, like food intake, without affecting clocks in tissues with unrelated functions.

Such careful regulation of biological clocks enables us to regulate body functions – whether absorbing nutrients, repairing damaged tissues, or clearing out cellular waste – in harmony with the external environment, across the whole body and at the scale of individual cells simultaneously. Thus, we are naturally more active during the day, working, exercising and feeding when it is bright.
At night, though, we fall asleep and consume less energy. Circadian clocks ensure that
mutually antagonistic processes, such as the build-up or consumption of stored sugars, are temporally segregated: we generate energy when we need it, and conserve it when we sleep.

Circadian clocks are not unique to humans. In fact, most organisms have biological clocks that are almost identical to our own. It makes sense, for example, for plants to have a mechanism ensuring that photosynthesis occurs only during the day, when there is light, and that the machinery is not run unnecessarily at night. Studies of genetic mutations disrupting circadian clocks – in organisms ranging from plants to worms and even single-celled algae – have shown time and time again that circadian rhythm regulation is a fundamental biological process, no matter how apparently simple or complex the organism.

What is the driving force behind biological clocks? We are now uncovering the molecular and genetic machinery that keeps our circadian rhythms ticking. For example, cell imaging and reporters of gene activity have revealed circadian ‘feedback loops’ controlling gene function. These are driven by proteins that temporarily switch off the genes encoding them, so that their levels oscillate over a roughly 24-hour
period; complementing this mechanism are other proteins that further strengthen these oscillations. Understanding how the biological machinery of circadian clocks regulates normal body function is a crucial first step towards finding out how its dysfunction leads to disease.

One example is cancer, which is characterised by uncontrolled, inappropriate cell proliferation. Circadian clocks have been shown to control the expression of a number of ‘cell cycle’ genes regulating cell division, such as cdk4, p53 and c-Myc, thus restricting the majority of cell division to certain times of the day. This makes sense: cells would do well to replicate their DNA and divide during the night, when exposure to DNA-damaging UV radiation is lower than during the day. Since disruption of cell cycle gene activity is already strongly linked with cancer, disturbing circadian clocks could also be a step towards disease. Experiments in mice completely lacking one or more clock genes support this idea. For example, several clock genes have been shown to encode tumour-suppressing proteins. When these genes were deleted, the mice had a greater incidence of tumours overall, and when they received grafts of cancerous tissue, the ‘foreign’ tumours grew faster than in control animals. This parallels the epidemiological evidence in humans of increased incidence of various cancer types, particularly breast cancer, when the human equivalents of these genes are mutated.

The story becomes more complicated when we consider that circadian clocks can have different effects across different tissue types, and that the clocks themselves can vary between organs. For example, mice with liver-specific defects in the clock gene BMAL develop low blood glucose levels; pancreas-specific defects in BMAL, however, produce a diabetic-like state of reduced insulin levels and high blood glucose.
Thus, a single clock gene may play multiple roles in metabolism depending on the tissue in which it is active. In line with this evidence, there is a higher incidence
in shift workers of the metabolic syndrome – a term for a collection of risk factors (high blood pressure, obesity, high blood sugar levels) that lead to diseases such as diabetes, stroke and cardiovascular disease. This suggests that a contributing factor to metabolic syndrome could be the collective dysfunction of clocks in the liver, pancreas and adipose tissue.

The identification of circadian clocks as regulators of the cell cycle (when working properly) and as potential cancer promoters (when disrupted) has led to the development of a field called ‘chronopharmacology’. Since several major metabolic processes in the body peak in activity at certain times of day, drugs can be administered when they will have the greatest therapeutic effect and the lowest toxicity. Animal models and clinical trials of existing anti-cancer drugs such as cisplatin have shown that ‘chronomodulating’ these drugs produces better survival rates and greater tumour reductions than standard regimes. Further research into the molecular basis of circadian clocks may identify drug targets for circadian dysregulation and enable the development of ‘chronotherapeutics’ to reduce the risk of certain diseases in shift workers. Indeed, such chronomodulation could apply not only to cancer and disorders of circadian origin, but to any drug for any disease. Even statins, a class of cholesterol-lowering drug, have a greater effect when administered in the evening.

Research into chronobiology and circadian clocks is already yielding exciting translational benefits. Given that an increasingly large proportion of our workforce is involved in some form of shift work, this work could help protect the health of shift workers – and also, more broadly, help us explore the interactions between our modern lifestyles and our genes.

Daniel Fernando is a second year medical student at Corpus Christi.
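The delayed negative feedback described in this article, a clock protein switching off its own gene so that its level rises and falls over hours, can be caricatured in a few lines of code. The sketch below is illustrative only: the equation and every parameter are invented for the example, and nothing here is fitted to a real clock gene.

```python
# Toy model of a circadian feedback loop: a protein represses its own
# gene, but repression acts only after a transcription/translation
# delay. Equation and parameters are invented for illustration.
def simulate_clock(a=1.0, K=0.5, n=4, b=0.2, tau=8.0, dt=0.01, t_end=240.0):
    lag = int(tau / dt)                   # delay expressed in simulation steps
    steps = int(t_end / dt)
    x = [0.2] * (lag + 1)                 # constant pre-history of protein level
    for i in range(lag, lag + steps):
        delayed = x[i - lag]              # protein level tau hours ago
        production = a / (1.0 + (delayed / K) ** n)    # repressible synthesis
        x.append(x[i] + dt * (production - b * x[i]))  # Euler step: make minus degrade
    return x[lag:]

levels = simulate_clock()
half = levels[len(levels) // 2:]          # discard the initial transient
print(f"oscillation range: {min(half):.2f} to {max(half):.2f}")
```

With these toy numbers the simulated protein level never settles to a steady state: purely because repression acts on a delay, it keeps overshooting and undershooting, producing sustained oscillations whose period is set by the delay.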
Understanding the circadian regulation of physiological processes could help us target drugs more efficiently
Revisiting the Test Tube
Research using human embryos must weigh the potential benefits against ethical concerns
Currently, human embryos can legally be grown in culture for no more than 14 days after fertilisation
Sarah Foster considers the ethical implications of growing human embryos in the lab for longer-term experiments

STILL REELING from a flurry of discussion and soul-searching in the wake of the first attempts at human genome editing in 2015, the world of human embryo research now faces another controversial breakthrough. Two groups of researchers, including several Cambridge scientists, have grown human embryos in culture for an unprecedented 13 days. The researchers were striving to better understand implantation, the process by which the embryo attaches to the uterine wall to gain access to maternal supplies of oxygen and nutrients. Implantation is critical for normal embryonic development, but very little is known about how it is regulated or about the changes that take place within the embryo during the process. This is because embryos at the implantation stage could not be kept alive outside the uterus – until now. In their papers, published this year in the journals Nature and Nature Cell Biology, the researchers describe how they adapted a technique initially developed for culturing mouse embryos to grow human embryos for far longer than was previously possible, imaging previously unseen stages of human embryonic development. By pushing the boundaries of human embryo research, however, they have also reawakened dormant concerns about the ethics underpinning the entire field.

To better understand the issues, let’s take a brief detour into the early development of a human embryo. It all begins with the fusion of a sperm cell and an egg cell in a woman’s fallopian tube, creating a zygote. The freshly minted zygote then starts off on a 4-5 day journey to the uterus.
During its journey, the zygote begins to divide, forming a ball of cells known as the blastocyst, which comprises three cell lineages: the epiblast, which will form the foetus, and the trophectoderm and primitive endoderm, which both support embryonic growth. By around six days after fertilisation, the blastocyst begins the process of implanting into the uterine wall, a critical step accompanied by profound molecular and cellular changes. By
around 14 days after fertilisation the blastocyst has developed further, with cells and tissues displaying more complex and clearly visible 3D organisation. One structure that features prominently in bioethical discussions of human embryo research is a cell mass known as the primitive streak. Its emergence is the basis for an internationally recognised ethical standard, known as the 14-day rule, which prohibits keeping human embryos alive in culture for longer than 14 days, or past the beginning of primitive-streak formation, whichever comes first. In the UK, the 14-day rule was proposed in 1984 by the Warnock Committee of Inquiry into Human Fertilisation and Embryology and is now a central tenet of the Human Fertilisation and Embryology Authority’s regulations on the use of human embryos in research. Pursuant to this standard, the Nature Cell Biology and Nature studies lasted only 13 days. While this guideline has existed for years – it was first proposed in the US in 1979 by the Ethics Advisory Board of the Department of Health, Education, and Welfare – until very recently it was considered impossible to keep human embryos alive in culture for anywhere near 14 days. The 2008 Cambridge Textbook of Bioethics, for example, notes that “…it is not possible to culture an embryo in vitro for more than five or six days, a limit not likely
The new simulated attachment culture technique now allows researchers to follow mammalian embryonic development right up to the legal time limit (Image: Oran Maguire)
to be exceeded soon…”

Why is the primitive streak so significant? One common argument is that after its formation, the blastocyst can no longer divide into two to form twins (as sometimes happens with pre-14-day embryos). Hence, 14 days might be considered the approximate time at which an embryo becomes, without question, a distinct individual. It is thus a point of ‘moral significance’ at which the embryo becomes worthy of greater respect. The formation of the primitive streak has been a successful public-policy guideline because it is an early enough cut-off to alleviate public concern, while not appearing overly oppressive to researchers – in large part because, at its adoption, growing embryos in vitro for 14 days seemed a very distant prospect. However, with the recent advances in human embryo research, scientists can now study human embryos in vitro up to and beyond implantation, with potentially large biomedical implications. Longer cultures might help scientists to unravel the processes behind miscarriage, increase in vitro fertilisation success rates, and better understand stem cell differentiation, for example. There is therefore an imperative need to revisit the 14-day rule.

Another reason for re-examining longer experiments is that the argument for the 14-day rule could be considered quite arbitrary. Whilst recommending the rule on the grounds described above, the 2002 report of the President’s Council on Bioethics, Human Cloning and Human Dignity, noted that “we are not persuaded by the argument that fourteen days marks a significant difference in moral status.” In fact, the repeated offering of this argument inspires more cynicism than confidence, as it seems that legislators are attempting to mask a merely practical standard behind the veneer of a strong ethical argument. Legislation is of course an amalgam of what is possible, what (hopefully, in an ethical society) is right, and what it is right to legislate (though few would dispute that human embryo research falls into this last category). It is unreasonable to expect legislation on something as complicated as the use of human embryos in research to reflect an ethical consensus, just as it is unreasonable to expect a diverse society ever to reach a uniform ethical consensus on such an issue. And there are of course myriad difficulties inherent in attempting to extract a simple moral tenet from biological facts like the progression of embryo development. However, the fact that the 14-day rule is somewhat arbitrary does not necessarily make it a bad law.

Some are concerned that if we admit that the 14-day rule was simply a useful standard – a morality of convenience, if you will – we open the door to a ‘slippery slope’. There may well be profound medical and scientific justifications for scientific exploration of ever more advanced stages of human embryonic development. On the other hand, there is concern that continuing to push the boundaries of scientific exploration by moving this law further and further back will weaken our respect for human life and open the door to, as the President’s Council on Bioethics puts it, “a coarsening of our moral sensibilities”. Yet others might argue that research into human embryonic development is something which, like the atom bomb, once seen as possible will inevitably be pursued.

From stem cell research, to human genome editing, to new proposals to synthesise a functional human genome from scratch, research is increasingly pushing the bounds of the public’s – and of many scientists’ – comfort. There is an increasing urgency in the need for comprehensive, and preferably international, guidelines on human embryo research. While it is a challenging mandate, hopefully scientists, ethicists and policy makers together will be able to address these needs with some sense of finality. Any decision must take into account two main goals: supporting research while, so far as is possible, showing sensitivity towards different ethical views.

Sarah Foster is a first year PhD student at the Department of PDN.
Illustrations: Oran Maguire, Asal Moine, Fong Yi Khoo
IN 2012, a single addition to the genetic toolbox took the bioscience community by storm. Collaborators Jennifer Doudna and Emmanuelle Charpentier showed that molecular components of a bacterial self-defence system could be used to edit the genome: just like a pair of scissors, an enzyme called Cas9 could cut DNA at unique locations, guided there by short RNAs derived from stretches of DNA called CRISPR spacers. Thus, the CRISPR system was born, and it soon became a household name, with a wave of new modifications and follow-up publications rising like a tsunami. Speculation on where this wave will break has been both enthusiastic and cautious – and returns time and time again to one question. Have we been given a tool that could edit human DNA, and if so, are we ready for it?

Despite the scientific and media frenzy surrounding CRISPR, however, the basic principles underpinning it were already well known before the seminal 2012 paper. Most importantly, the idea of cutting and pasting genetic material has been central to modern molecular biology and genetics since the 1970s, when restriction enzymes (the original molecular scissors) were first discovered – and again, bacteria were the organisms that helped us do this. Bacteria are attacked by viruses known as phage, which inject their DNA into bacterial cells in order to hijack them. Restriction enzymes defend bacteria against phage by cutting up foreign DNA at specific sequences, while sparing the bacterium’s own DNA (which is marked by easily recognisable chemical modifications). In molecular biology, this specificity means that different pieces of DNA cut by the same restriction enzyme can be joined together. As a result, researchers had the ‘scissors and paste’ they needed to add, delete, or modify genetic material. This led to many breakthroughs in our understanding of gene structure and function.
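The ‘molecular scissors’ idea is easy to state concretely: a restriction enzyme scans DNA for one fixed recognition sequence and cuts wherever it finds it. The sketch below is a toy illustration of that logic only; the sequences are made up, and a real digest would also have to handle the second DNA strand and the ‘sticky ends’ left by the cut, which this ignores.

```python
# Toy sketch of restriction-enzyme logic, not a bioinformatics tool:
# cut a DNA string wherever the recognition site occurs. EcoRI, for
# example, recognises GAATTC and cuts after the first G.
def digest(dna, site="GAATTC", cut_offset=1):
    """Return fragments after cutting cut_offset letters into each site."""
    fragments, start = [], 0
    i = dna.find(site)
    while i != -1:
        fragments.append(dna[start:i + cut_offset])  # fragment up to the cut
        start = i + cut_offset
        i = dna.find(site, i + 1)
    fragments.append(dna[start:])                    # trailing fragment
    return fragments

print(digest("TTGAATTCAAGAATTCGG"))  # → ['TTG', 'AATTCAAG', 'AATTCGG']
```

Joining the fragments back together recovers the original string: nothing is lost at a cut, which is exactly the property molecular biologists exploit when pasting together compatible ends from different DNA sources.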
As new tools were added to the molecular biologist’s kit, molecular scissors were refined into a three-module design: ‘finder’, ‘fitter’, and ‘cutter’. The ‘finder’ module recognises a sequence in the genome, while the ‘fitter’ brings the ‘cutter’ close enough to the DNA to make a cut. Crucially, the ‘cutter’ module usually cannot work without ‘fitter’ activity. This approach gave researchers greater control over every step of the genome editing process. Zinc finger nucleases and TALENs, developed over the 2000s and early 2010s, are good examples of this approach and are the immediate predecessors of CRISPR. Both share the same ‘cutter’, derived from the restriction enzyme FokI. The ‘finder’ modules of zinc finger nucleases were generated from a DNA-binding module found in another class of proteins called transcription factors,
which recognise and bind specific DNA sequences. Zinc finger nucleases come in pairs, and the two must come together for the ‘cutter’ to work; this is analogous to ‘fitter’ functionality. TALENs function similarly to zinc finger nucleases. Their ‘finder’ and ‘fitter’ is a protein first identified in the bacterium Xanthomonas (yet another bacterial contribution to the genome editing toolbox). This protein, called a TAL effector (TAL), both recognises and binds specific DNA sequences. The advantage of TALENs is that the TAL protein can be engineered to target almost any part of the genome. Unfortunately for researchers, though, both zinc finger nucleases and TALENs are often toxic to the cells being targeted. Suppressing the cutter’s activity after its job is done is also difficult; if it continues to cut uncontrollably, this can lead to random genetic alterations known as off-target effects.

Then, just a few years after the advent of TALENs and zinc finger nucleases, molecular genetics entered the CRISPR era. Just like the molecular tools that preceded it, the CRISPR system has its origins in the bacterial self-defence kit, though the mechanism is slightly more complex. After a phage attack, many bacteria can kidnap parts of the viral DNA, cut them into small pieces, and insert them within an array of DNA repeats (called CRISPR sequences) in their genome. During the next viral attack, the whole array is transcribed into a long RNA molecule, which is then cut at the CRISPR sequences into shorter RNAs, termed CRISPR targeting RNAs (crRNAs). These crRNAs bind to a DNA-cutting enzyme called Cas9 (the ‘cutter’ in the CRISPR system), and since they can align with specific sequences of the invading viral genome, they act as the ‘finder’. The final element in the system, the ‘fitter’, is a short sequence in the viral DNA called a PAM (protospacer adjacent motif); if this mutates, the finder/cutter complex cannot bind or cut the viral genome.
(Of course, this provides a route of escape for the virus, which thus continuously evolves new PAM sequences. The resulting evolutionary arms race between bacteria and viruses has given researchers a large variety of Cas-PAM combinations to exploit in experiments.)

So what exactly is the significance of the CRISPR mechanism? For a start, it means bacteria have much more sophisticated defence systems than we thought: the cell keeps both a record of past attacks (the storehouse of DNA repeats) and a method of defence against future ones (the crRNA/Cas9 complex) – the same concepts that underpin human adaptive immunity. For molecular biologists, it provides a highly specific holy trinity of finder, fitter and cutter that uses RNA molecules to guide the cutter to its target – unlike zinc
Glossary
DNA - Deoxyribonucleic acid, a molecular chain made up of building blocks called nucleotides (A, T, C, G), the ordering of which forms a code that can be deciphered by molecular machinery in the cell to make RNA, and then protein - the complex molecules that perform or underlie all the biological work our bodies conduct.
RNA - Ribonucleic acid, usually produced from DNA and made of mostly the same building blocks. RNA is less stable than DNA, and so is not typically used for long-term storage of genetic information. ‘Messenger’ RNA (mRNA) is the best known kind: it carries a message from the DNA in the nucleus to molecular machinery throughout the cell, where it is decoded to produce proteins. Other RNAs can influence DNA directly, by binding to exposed nucleotides.
Transcription - The process by which genetic information stored in stable DNA is copied into a temporary intermediate, mRNA, which can freely exit the nucleus. Because many copies can be produced, many proteins can be translated from these mRNAs in tandem.
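The finder/fitter/cutter division of labour described in the main text can be made concrete with a toy script: a cut is only allowed where the guide sequence (finder) matches the genome and the next letters form an NGG PAM (fitter). All sequences below are invented, and real Cas9 target searching, which tolerates mismatches and scans both DNA strands, is far more involved.

```python
# Toy model of CRISPR target recognition, not real Cas9 biochemistry:
# report positions where the guide matches a 20-letter protospacer
# that is immediately followed by an NGG PAM.
def find_cut_sites(genome, guide):
    sites = []
    for i in range(len(genome) - len(guide) - 2):
        protospacer = genome[i:i + len(guide)]
        pam = genome[i + len(guide):i + len(guide) + 3]
        if protospacer == guide and pam[1:] == "GG":  # 'N' may be any base
            sites.append(i)                           # finder AND fitter satisfied
    return sites

guide = "ATGCATGCATGCATGCATGC"                 # invented 20-letter guide
genome = "TT" + guide + "AGG" + "CC" + guide + "ATT"
print(find_cut_sites(genome, guide))           # → [2]
```

Note that the second copy of the guide sequence is ignored because it is not followed by a PAM, mirroring how a mutated PAM lets a virus escape Cas9 even when the protospacer still matches perfectly.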
Restriction enzymes, the original molecular scissors, have been part of the molecular geneticist’s toolbox since the 1970s
Bacterial immunity: restriction enzymes cut foreign DNA at viral motifs
finger nucleases and TALENs, both of which used proteins. In their landmark 2012 paper, Doudna and Charpentier used artificially synthesised versions of these guide RNAs, pioneering a process that is efficient, cheap and easily scaled up. So much so, in fact, that the applications made possible by CRISPR gene editing might be limited only by our imagination.

CRISPR has already been taken up by fields of research as diverse as medicine, biotechnology, and environmental and agricultural science. It is also a valuable laboratory tool for probing the role of virtually any gene we wish to study. Given the mass enthusiasm surrounding CRISPR since its discovery and development as a gene editing tool, evidence of such achievements grows rapidly each year: indeed, the paper that first described the mechanism behind CRISPR/Cas9 has since been cited by almost 800 publications.

One of the earliest applications of CRISPR was improving the efficiency of generating animal models for studying specific diseases. Until recently, this was a notoriously laborious, time-consuming and expensive process, involving interbreeding across several generations to produce mutant offspring; it often took up to a year and could easily cost thousands of pounds. CRISPR has since been used to knock out genes in yeast, flies, fish, rats, monkeys and human stem cells – to name just a few. Researchers have also begun using CRISPR to study disease transmission: for example, CRISPR can introduce antimalarial genes into mosquitoes, preventing them from carrying the malaria parasite. Of course, human diseases are very rarely simple, and may be influenced by many genes at once. Even more important in this context, therefore, is CRISPR’s ability to produce animals with multiple mutations. It
does this by targeting multiple genes simultaneously (simply by providing several different guide RNAs at once) – in stark contrast to the multiple rounds of genetic modification required by traditional methods. For example, using CRISPR, mice with multiple mutations can be produced within four weeks with an efficiency of 80-95%. This new-found ability to affect numerous genes in parallel has also provided a useful method for screening the whole genome for loss-of-function mutations that could be involved in disease. Compared with previous methods of genome-wide screening, CRISPR allows any DNA region to be targeted, including regulatory elements, which can have just as large an impact on gene activity as the gene sequences themselves. This has already enabled the investigation of tumour suppressor genes in in vivo mouse cancer models, and the discovery of genes essential for hepatitis C virus infection, including those required for cell-free entry and cell-to-cell transmission.

Alongside the rise of CRISPR/Cas9 as an everyday tool in the genetics laboratory, there has been increasing interest in its therapeutic applications – which promise to bring CRISPR right out of the lab and into the clinic. Arguably CRISPR’s greatest potential, and certainly one of its most widely touted assets, is the ability not just to understand but to correct the genetic defects that cause countless, often devastating, human diseases. Such an idea may not be far from reality, and has been demonstrated on numerous occasions in several experimental systems. Initial studies in human cell culture have already established the principles of gene correction using CRISPR; for instance, CRISPR has been used to replace mutations in the gene encoding the CFTR protein (which cause cystic fibrosis) with normal CFTR sequences, restoring normal protein function in the cells. Researchers are now attempting to develop CRISPR for adult gene therapy. For example, in 2014
Hao Yin and colleagues announced the first use of CRISPR gene editing in adult mice. Their work, published in the journal Nature Biotechnology, described how they corrected a mutation associated with the lethal metabolic disease tyrosinaemia. Although the success rate in this experiment was low – only 0.4% of the cells treated contained the corrected version of the gene – research groups and companies aiming to develop CRISPR gene therapy are now being set up around the world.

These efforts are paying off. In January this year, three research groups reported in the journal Science that CRISPR could partially heal mice with Duchenne muscular dystrophy (DMD) – a devastating disease in humans that can result in loss of movement, paralysis and premature death. DMD is caused by mutations in the dystrophin gene, which encodes a protein essential for muscle function; in people with DMD, muscle cells cannot produce a working version of dystrophin, and eventually deteriorate. The researchers used CRISPR to remove the harmful (i.e. mutated) sections of the dystrophin gene, allowing the DMD mice to begin producing normal dystrophin protein again. While the mice weren’t completely cured, they performed better in movement tests than untreated mice, and the improvements in muscle function were – encouragingly – long-lasting, one of the long-term goals of gene therapy.

DMD is a fitting example of the kind of horrendous disorder whose progression gene editing might halt. Similarly, disorders that previously required the implantation of healthy, donor-derived cells may now be treated by extracting diseased cells or tissues from patients, correcting the defects by inserting therapeutic genes to replace the faulty ones, and re-implanting the healthy cells back into the patient. This approach, known as ex vivo therapy, has been combined with CRISPR gene editing to repair mutated haemoglobin-β genes (which cause a severe anaemic
disorder known as β-thalassaemia) in patients’ own stem cells. In the past year, ex vivo CRISPR gene editing has also achieved what was once deemed impossible: cutting HIV DNA out of the genome of infected white blood cells. Experimental studies found that this reduced virus production and removed the reservoir of viral DNA lying latent in the host genome. Unfortunately, though, this use of CRISPR came with its own shortcomings: a 2016 study by Liang and colleagues showed that HIV could become resistant to a CRISPR attack. Strategies to overcome this could include targeting many sites in the viral DNA at once, to reduce the chance of any viruses escaping. However, this genetic ‘carpet-bombing’ approach could cause more problems than it solves: although CRISPR can indeed target multiple sequences, too many targets could increase the chance of the Cas9 cutter having off-target effects on the host’s genome.

The possibility of off-target effects raises a serious concern with DNA editing, whether using CRISPR or any other method: the fear of inadvertently introducing harmful mutations, or of inserting therapeutic genes in the wrong location. The latter can have especially disastrous consequences, such as accidentally switching on genes that encourage uncontrolled cell growth, leading to tumours, or likewise switching off tumour suppressor genes. This was in fact a very real problem in a gene therapy trial many years ago. The treatment targeted X-linked severe combined immunodeficiency (X-SCID), or ‘bubble baby’ disease, a condition in which the immune systems of affected children are so incapacitated that they need to be isolated
Finder, ﬁtter, and cutter: the key principles behind the CRISPR machinery
Zinc ﬁnger and TALEN systems are the immediate precursors of CRISPR
CRISPR adaptive immunity cycle in bacteria. The CRISPR mechanism allows bacteria to ‘remember’ pathogens that have previously infected them and mount a faster, more efficient response upon reinfection
in ‘bubbles’ immediately after birth. The trial involved inserting the normal, functional gene using viral vectors, which are artificially weakened and supposedly harmless versions of viruses that can still infiltrate cells and deliver their DNA load. However, within a few years of the treatment, three of the ten children included in the trial developed leukaemia due to the random insertion of the therapeutic gene, which subsequently affected the activity of growth regulating genes in close proximity. While CRISPR does offer significantly higher specificity and more precise genetic changes than previous methods of gene therapy, similar viral vectors are still used today to introduce the Cas9 and gRNA tools into cells in laboratories and clinical trials worldwide. Even in the age of CRISPR, the dangers of erroneous gene alteration and insertion continue to be a major concern of any gene therapy approach. Of course, any discussion of gene editing in humans must eventually extend beyond the immediacy of therapeutic approaches, to include the more general
– and ethically fraught – question of human genetic modification at any stage in life. Genetics research has spent decades without a way to carry out genome editing easily and accurately in any organism, let alone humans, but with the advent of improved editing technology like CRISPR, we can no longer avoid the issue so easily. In fact, UK researchers at the Francis Crick Institute have already received permission to begin CRISPR genome editing experiments in human embryos: on the 14th of January this year, the Human Fertilisation and Embryology Authority approved Kathy Niakan’s application to use the technique to investigate the role of key genes in early human development. Are these experiments the first real step towards ‘designer babies’ or eugenics? Not all scientists believe so. Peter Braude, emeritus professor of obstetrics and gynaecology at King’s College London, was quoted in Science as believing that Niakan’s work will help us understand both normal embryonic development and problems of pregnancy, such as failure to implant or miscarriage. Another sigh of relief for those fearing the spectre of designer babies comes from a Chinese research team led by Junjiu Huang, the first to report the use of CRISPR gene editing in human embryos, whose results were published in the journal Protein & Cell in April. The accuracy with which CRISPR edited the embryonic genome was far from ideal: of the 54 embryos that were tested, only four gained the correct gene insertion, and even these four carried off-target mutations. As long as CRISPR creates so many off-target mutations, this application of the technology remains unpredictable, and the potential impact on patient welfare is too great a risk to justify human genome editing so early in life. But would it ever be ethical to do this in the first place? Many people would say that the unethical action
CRISPR has many applications in research and medicine
to take would be to deny parents the right to rid their children of debilitating diseases. Take, for example, the father quoted in a 2016 Nature news article as saying, “CRISPR is a bullet train that has left the station — there’s no stopping it, so how can we harness it for good?”. However, as soon as you move away from severely disabling or lethal conditions, the line between illness and normal variation becomes unclear. That parents should edit out any inconvenient characteristic, be it deafness or a predisposition to obesity or alcoholism, goes against the ideal of an inclusive society. It also calls into question the right that parents have to influence not only the quality of life of their child, but also that of generations to come. Indeed, the potential heritability of CRISPR-induced changes is already an issue: studies using CRISPR to modify mosquitoes and fruit flies demonstrated that the changes could be passed to successive generations. In humans, this raises serious questions of consent. Should we, for example, use CRISPR to modify the genes of embryos, with no idea of whether they would have consented? Or
should we be moving away from this avenue altogether? Could CRISPR instead be refined for in vivo use in adults, to combat diseases like HIV or Huntington’s – a procedure to which an adult can fully consent? This might seem a moot point, given that even CRISPR has turned out to have problems with the specificity of insertion, as well as difficulty in introducing the corrected gene into adult tissue; CRISPR therapy would appear to have a long way to go before it can safely and effectively be used to cure genetic diseases in humans. Or does it? Researchers are moving quickly to improve the specificity of the technique. In January this year, one research group announced a tweak to the Cas9 cutter that reduced its off-target mutation rate, and in April another team improved its efficiency at swapping out single DNA bases. However, these improvements weren’t fool-proof, with the accuracy of the system hovering at a ‘best’ of 75% – still not nearly enough to safely consider its use in viable human embryos or adults. Will human gene editing result in a population of children with custom-designed looks and IQs off the charts? No. At least not in the conceivable future. But the recent discovery of CRISPR raises the question of what other gene editing technology is still waiting to be found. One gram of soil generally contains between 100 million and 1 billion bacteria, all battling it out to divide faster and survive longer than the rest; could one of these be host to an all-new, improved version of CRISPR? We have only begun to scratch the surface of the microbial biodiversity on our planet, and continued research into what else lies out there could yield a discovery as big as CRISPR – or bigger. Sophia Ho is a research assistant at the Department of Medicine.
CRISPR’s full therapeutic potential is yet to be realised
Glossary
CRISPR spacer - A short DNA sequence that matches a 20-nucleotide region of Cas9’s guide RNA in order for Cas9 to cut the DNA.
crRNA - CRISPR RNA, which is transcribed from a spacer sequence flanked by repeats in the bacterial CRISPR array.
TALEN - Transcription activator-like effector nucleases. These are restriction enzymes that cut DNA at speciﬁc DNA sequences. The protein has a cleaving domain, which can be swapped out or altered, and a recognition ‘TAL’ domain which can be modiﬁed by investigators to target speciﬁc genes.
PAM - Protospacer adjacent motif. A sequence of nucleotides (NGG) in the target DNA that is recognised by Cas9 and appears immediately next to the region matched by the CRISPR spacer.
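For readers who like to see these definitions in action, the 20-nucleotide target and NGG PAM rule can be put together in a short, purely illustrative sketch (not drawn from any real genome-editing toolkit; the function name and example sequence are invented for this magazine):

```python
# Illustrative sketch only: scan a DNA strand for candidate Cas9 target sites.
# Cas9 requires an 'NGG' PAM immediately downstream of a 20-nucleotide
# protospacer; this toy function reports each candidate protospacer together
# with the position where its PAM begins.

def find_cas9_targets(dna):
    """Return (protospacer, pam_start) pairs for every NGG PAM that is
    preceded by a full 20-nucleotide protospacer."""
    targets = []
    for i in range(20, len(dna) - 2):
        pam = dna[i:i + 3]
        if pam[1:] == "GG":          # the 'N' in NGG can be any base
            targets.append((dna[i - 20:i], i))
    return targets

example = "ATGCATGCATGCATGCATGCAGGTTT"
print(find_cas9_targets(example))   # one site: PAM 'AGG' at position 20
```

Real guide-RNA design tools do far more than this, of course – scoring each candidate for likely off-target matches elsewhere in the genome, which is exactly the safety concern discussed in the article.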
Alba Landra is an undergraduate at Selwyn College. Laura-Nadine Schuhmacher is a PhD student in the Department of Pharmacology.
CRISPR’s multiple effects may allow whole genomic screening
Amelia J. Thompson is a PhD student in the Department of PDN.
What the VAK? Winner of CamBRAIN’s Grey Matters writing prize 2016, Liam Wilson, debunks the visual, auditory and kinaesthetic model of learning WERE YOU EVER ASKED at school to figure out what kind of learner you are? Perhaps you were given a questionnaire that pegged you for a visual learner who needs pictures and diagrams, or an auditory learner who just likes a good old lecture. Or maybe you had a flair for the dramatic, and decided that kinaesthetically acting out osmosis in biology was the best way to learn. These three learning styles form the basis of the visual, auditory, and kinaesthetic (VAK) model, which proposes that we learn better when information is presented to us in our preferred sensory modality. Unfortunately, whatever learning style you decided (or were told) was for you, it is almost certainly irrelevant to the way in which you actually learn. The truth is that this concept is a neuromyth – that is, a myth about the brain that makes no sense from a neuroscientific perspective. Education is particularly rife with neuromyths, such as that old chestnut about us only using 10% of our brains, which has embarrassingly fuelled even modern science fiction, from the film Limitless to Lucy. The VAK neuromyth in particular has persisted for decades, despite having been repeatedly debunked by research, reviews, and even a surprisingly comprehensive Wikipedia page. In fact, according to a recent review by Howard-Jones published in Nature Reviews Neuroscience, over 90% of teachers in
CamBRAIN is a young-investigators’ society at the University of Cambridge. We aim to foster collaboration and communication amongst people working in, or just interested in, neuroscience. In addition to academic talks, presentations and workshops intended for personal and career development, we run a series of social events which allow our members to build successful networks. We launched ‘Grey Matters’ as a way to get our members to think about misconceptions in neuroscience and how science communication can be used to dispel them.
the UK, the Netherlands, Turkey, Greece and China believe that presenting information in a student’s preferred sensory modality improves learning. So how has this neuromyth persisted for so long? And why are we so enamoured by something so nonsensical, which science has explicitly refuted? Perhaps it is because this theory is so temptingly intuitive, and is readily supported by personal experience or anecdote. Many of us can probably conjure up examples of when we repeatedly failed to learn a difficult concept, but it just ‘clicked’ after we drew a mind map or made up a rhyme. Depending on the successful strategy, we might be led to believe that we learn best when information is presented in a specific modality, but it is not likely that any one strategy can be applied to all types of learning. Indeed, a study by Krätzig and Arbuthnott published in the Journal of Educational Psychology in 2006 showed that there is no correlation between a person’s supposed learning style and their performance on a corresponding memory test. On the contrary, according to a 2002 study by Constantinidou and Baker published in Brain and Language, people learn better when information is presented in multiple modalities, for example as words and pictures, rather than just words. And this makes far more sense when we consider how densely interconnected the brain is. While it is true that particular parts of the brain are highly specialised for processing sight, sound and touch, all of these areas are highly interlinked, allowing us to integrate information from across all of our senses. This multimodal approach to learning also fits much better with other scientific theories, such as Darwin’s theory of natural selection: if some of our ancestors had needed to interact kinaesthetically with a lion to learn about it, they would not have passed on many genes to the next generation. And there you have it: the VAK neuromyth debunked.
So next time you are at a talk and the speaker insists that they have included plenty of cartoons and video clips for all the visual learners in the audience, just tell them to VAK off. Liam Wilson is a fourth year PhD student in the Department of Psychiatry.
When Citizen Science Works ORAN MAGUIRE
Kimberley Wiggins gives us the story of an email that led to a medical breakthrough
THERE ARE MANY HALLMARKS of a great scientific mind: the ability to think outside the box, the capacity to see connections between seemingly unrelated situations, and the aptitude to ask relevant questions and think up a way to answer them. Although scientists try to nurture and develop these skills throughout their careers, you do not have to be a trained scientist to have a scientific mind. ‘Citizen Science’ is a term that describes the contribution of non-professional scientists to research. Most cases involve researchers asking the public for help collecting information, which can generate larger data sets covering a greater geographical area. For instance, the ‘Globe at Night’ programme by the National Optical Astronomy Observatory involves participants counting how many stars they can see from their current location to create a global map of light pollution. These projects can also be used to analyse huge data sets with a lower error rate than a computer. For example, scientists at Cancer Research UK designed the online game ‘Genes in Space’, allowing multiple people to independently analyse 46,000 sets of breast tumour samples to identify faulty regions of DNA. Citizen science can also work in the opposite direction, with non-scientists proposing a hypothesis and enlisting the help of professionals to test it. This was the case for Chris Hempel, a mother-of-two and businesswoman with a subscription to Nature. In 2010, Chris came across an article about cholesterol in atherosclerosis published by Eicke Latz’s group at the University of Massachusetts. Atherosclerosis is a condition that precedes heart attacks and strokes. During the early stages of disease, fats accumulate along artery walls at sites called ‘plaques’, which leads to the recruitment of immune cells. The article described how cholesterol crystals deposited in plaques directly activate a group of proteins within immune cells.
This then drives inflammation that can contribute to the progression of disease. The mention of cholesterol interested Chris because her daughters both have Niemann-Pick type C disease - a rare but serious cholesterol storage disorder. The genetically
inherited disease results from the mutation of the gene NPC1, which encodes a protein that regulates the transport of cholesterol within a cell. The defective protein causes the inappropriate accumulation of cholesterol inside cells. This affects several major organs, including the brain, and is a leading cause of juvenile dementia. At the time of the article, Chris’s children were being treated with a drug called cyclodextrin, which is used in the food industry to remove cholesterol by making it over 300,000 times more soluble. Chris emailed Professor Latz, asking whether cyclodextrin might also be able to clear the cholesterol crystals in atherosclerotic plaques, thereby preventing disease progression. This email sparked years of research that was recently published as a paper in Science Translational Medicine, on which Chris is an author. Latz and colleagues used an established mouse model of atherosclerosis to show that treatment with cyclodextrin led to smaller plaques containing fewer cholesterol crystals. A longstanding problem facing the atherosclerosis field is that preventative treatments often have limited use clinically. This is because people only realise they have cardiovascular disease after a heart attack or stroke, by which time the atherosclerosis is advanced. However, Latz’s group found that cyclodextrin could not only prevent plaque formation, but also reduce the size of existing plaques. This finding, coupled with the fact that cyclodextrin is already safely used in humans to treat other conditions, means that this drug represents a real candidate for preventing and treating cardiovascular disease. Chris’s story shows that you don’t need multiple science degrees to think like a scientist. She made the connection between her daughters’ extremely rare disease and the biggest causes of death in the developed world, and in doing so she identified a treatment with the potential to save many lives in the future.
Businesswoman Chris Hempel identified an intersection between two different areas of research, and her insight may represent an important advance in treating cardiovascular disease
Kimberley Wiggins is a PhD student in the Department of Medicine.
Tom Hitchcock tells the story of one of biology’s last great Renaissance scientists
Waddington saw himself as a Darwinian, although he drew issue with some Darwinian thinkers of his day. His intention was to build on Darwin’s work, in particular by emphasising the extensive interactions genes can have, rather than seeking to replace it as he is sometimes portrayed
Behind the Science
KNOWN AS THE FATHER OF EPIGENETICS, Conrad Waddington made a number of key discoveries in developmental biology and was the first to propose the notion of the ‘epigenetic landscape’, a metaphor for the different pathways towards, and influences on, a developing cell’s fate. He was also a keen Morris dancer, took great interest in modern art and published books on moral philosophy. What, then, was the academic life behind this distinctly Renaissance biologist? Waddington was born in 1905, the son of two tea planters, and spent his early years in Kerala, India, before moving back to England at the age of four to live with his aunt, uncle, and Quaker grandmother. He enjoyed collecting rocks and insects, first experiencing science through the local pharmacist and distant relation Dr. Doeg, who introduced Waddington to a range of subjects including chemistry and geology. Later, at the exclusive Clifton College in Bristol, Waddington learned chemistry from the noted writer of science textbooks, E.J. Holmyard. Here, he was introduced not only to modern chemistry, but also to the work of the Alexandrian Gnostics and the Arabic alchemists, leaving the young Waddington with a lasting interest in metaphysics. He went on to Cambridge, studying Natural Sciences at Sidney Sussex College and graduating with a first-class degree in geology. Despite his academic excellence, he spent most of his time reading not geology but philosophy, and in particular the work of A.N. Whitehead. Whitehead had been at Cambridge in the early years of the century, and his work continued to inspire Waddington throughout his life. After his degree, Waddington remained at Cambridge, working towards a PhD on the structure of ammonites; however, he soon dropped out to pursue embryology. Waddington’s interest in embryology was sparked by an experiment conducted a few years earlier in Germany by Hilde Mangold and Hans Spemann.
Spemann and Mangold were studying the development of the embryonic body plan – for example, the axis running from head to tail. They had grafted a section from one developing embryo onto another, and were able to induce a second embryological axis. In other words, the grafted section, which they dubbed the ‘organiser’, induced formation of what was essentially
a twin embryo conjoined to the host embryo. The striking implication was that a powerful signal emanating from this organiser controlled the fate of surrounding tissues. Waddington wondered whether a structure similar to Spemann and Mangold’s organiser existed in higher vertebrates, such as birds and mammals. After meeting Dame Honor Fell, a pioneer of techniques used to grow and study tissues in culture (i.e. in the laboratory environment, outside of the whole organism) and director of the Strangeways Laboratory at Cambridge, Waddington moved to Strangeways to begin his work on vertebrate embryos. His grafting experiments, using culture techniques based on Fell’s, were highly successful, and he published a series of papers in the 1930s detailing not only how the grafting of a region called Hensen’s node generated a second axis in a duck, but also how the region could be transferred from duck to chick, and chick to rabbit. This demonstrated that the same organising signal existed in both birds and mammals. Attempts to purify the chemical carrying the organising signal failed; we now know that the signal is present only in minute quantities, out of reach of the technology available at the time. Waddington soon realised the futility of that particular task and moved on, this time across the Atlantic, to the genetic revolution occurring in the laboratory of pioneering geneticist Thomas Hunt Morgan. Even during his Cambridge days, Waddington had developed an interest in genetics, becoming close friends with Gregory Bateson, son of William Bateson, who founded the Genetics Department. But it was during his year in the USA that he first began to truly link his understanding of embryonic development to genetics.
Working in Morgan’s lab at Caltech, he moved his focus onto the development of the fruit fly, Drosophila, and started to draw links between the embryonic organiser and the action of genes, which could also strongly control the identity of an entire bodily region. Like Morgan, Waddington began to see development as the result of an ongoing dialogue between the genes and the cellular environment in which they reside: a series of binary decisions, in which different cells proceed along a series of ‘paths’ before reaching their final differentiated states.
Waddington described these processes as epigenetic, and began to visualise them as a series of bifurcating valleys. The metaphor compares a cell to a ball rolling down a series of interconnected valleys to come to rest at the stable, lowest point: just as the ball rolls down one of several potential valleys, a cell moves through developmental time and makes developmental decisions as it transitions from an undifferentiated early embryonic cell to a mature cell (e.g. a neuron or a bone cell). In contrast with a real landscape, the topography of the epigenetic landscape is shaped by an organism’s genes and their interactions with each other and the environment. He asked his good friend and painter, John Piper, to illustrate this for him, building on some of the plaster models that the biochemist Joseph Needham had built whilst at the Theoretical Biology Club in Cambridge. Thus, from these sketches the metaphor of the ‘epigenetic landscape’ was born. After his time in North America, and a stint in the Royal Air Force during the Second World War, Waddington became chair of Genetics at Edinburgh University, setting up a new research institute for animal breeding and genetics. Whilst at Edinburgh he continued to flesh out his conception of epigenetics, and added mathematical rigour to the landscape concept. It was from this work on the landscape that the third great strand of Waddington’s work came, in the form of a progression from genetics and embryology to the level of evolutionary biology. Waddington had found that, despite variations in genetics and environment, most organisms develop in a remarkably uniform fashion. This robustness emerged naturally from his epigenetic landscape; however, he found a number of instances where certain environmental factors could induce specific traits, such as the disappearance of a fly wing vein in response to high temperature.
Waddington found that by interbreeding flies that exhibited this trait, he could produce offspring with the trait – even without exposure to the original temperature change that had induced it. He named this process genetic assimilation, and described how the original unusual phenotype had been ‘canalised’ so that it was produced under all conditions, not just the novel environmental ones. Waddington believed this to be a crucial new addition to Darwin’s ideas on evolution, and a potential way in which genetics, evolution and development could be synthesised. The latter part of Waddington’s career produced fewer experimental breakthroughs; instead he spent a great deal of time promoting theoretical biology, the highlights of which were four symposia at the Villa Serbelloni on Lake Como in Italy. Among the ideas that came out of these was Lewis Wolpert’s notion of positional information. He also began to explore his other interests further, writing monographs on the relation of art and science and the epistemology of complex phenomena, as well as a number of popular science books. Waddington’s eclectic mix of work left some questioning his legacy when he died in 1975. However, the arrival of the modern evolutionary/developmental biology field (‘evo-devo’), and an increasing emphasis on mathematical modelling and understanding development quantitatively, have made Waddington’s ideas increasingly relevant today. His epigenetic landscape, for example, has become the basis for models of cell fate and potency. But beyond his discoveries and ideas, Conrad Waddington should perhaps best be remembered for his holistic approach to science and life: a knowledgeable philosopher, a connoisseur of the arts, founder of a commune, a scientific communicator, and a pioneer of mathematical modelling in biology. Conrad Waddington was not only a great scientist, but maybe one of biology’s last great polymaths.
“With only a little imagination we can see the gene as sitting at the centre of a radiating web of extended phenotypic power.” — Richard Dawkins, The Selfish Gene
Tom Hitchcock is an undergraduate at Sidney Sussex College.
Behind the Science
30 Years of Therapeutic Monoclonal Antibodies Arthur Neuberger examines the history of these remarkable drugs and the Cambridge contribution to their story
The rise and rise of monoclonal antibodies: an example of a rapid translation from basic research to widely used therapy
An antibody is a complex Y-shaped protein-based molecule (often a glycoprotein, i.e. a protein with a sugar group attached) that specifically binds to a designated target, its antigen, which in the majority of cases is a foreign and infectious agent in the human body. Partial digestion of antibodies by a protein-digesting enzyme (protease) called papain reveals that antibodies can be divided into three fragments: two identical Fab fragments, which each have a specific binding site for the same antigen, and the constant Fc region. An antibody in complex with its antigen can then trigger immune response actions, for instance the lysis of cells or phagocytosis. In the latter case, a foreign and potentially harmful particle (e.g. a bacterial cell) is destroyed following its ingestion by a specialised type of cell called a phagocyte. The term ‘monoclonal’ indicates that this type of antibody is obtained from identical B-lymphocytes, a type of white blood cell in the mammalian immune system, which are themselves clones of a unique parent B-lymphocyte. Because of this monospecific origin, each monoclonal antibody has two identical antigen-binding regions against a specific antigen. Antibodies were first discovered by von Behring and Kitasato in 1890, when they analysed the blood serum of animals that had been immunised with various infectious agents. They provided the first evidence of antibody functionality by neutralising diphtheria and tetanus toxins. They were also the first to describe antibody specificity, by successfully demonstrating that the tetanus antitoxin cannot neutralise diphtheria toxin and vice versa. It was only in the 1960s that a group of scientists discovered that antibodies could only be produced by lymphocytes, as a reaction to infection and immunisation (e.g.
“A SINGLE MONOCLONAL ANTIBODY will only be the starting point of a variety of man-made secondary antibodies, each manufactured to fulfil a special requirement.” (Georges Köhler, 1984). Georges Köhler’s vision of a market for monoclonal antibodies (mABs) started to become reality two years after he was awarded the Nobel Prize in Physiology or Medicine, for the “discovery of the principle for production of monoclonal antibodies”, together with his colleagues Niels Jerne and César Milstein. In 1986, the US Food & Drug Administration (FDA) approved the first mAB for clinical use in humans, Orthoclone OKT3. Nowadays, more than 30 years after César Milstein forecasted in his 1984 Nobel Prize lecture that there was “no danger of a shortage of forthcoming excitement in the subject”, the monoclonal antibody industry is indeed a highly exciting multibillion-dollar business. In 2012, global annual sales exceeded $50 billion, and mAB drugs like Humira have ranked among the best-selling drugs for consecutive years. Several of these top-selling drugs are worth more than $6 billion each in global annual sales today; interestingly, though, the breakthrough technology for mAB discovery and engineering remained unpatented, and the earliest antibody technologies were available for modest sums. The first reported mAB deal was a licensing agreement between Centocor and the Wistar Institute in January 1980; it was ‘only’ worth $100,000! More recently, a leading mAB-producing biotech firm, Genentech, was acquired for over $46 billion in 2009, and licensing agreements are now worth many millions of US dollars. mABs have now been successfully developed for the entire spectrum of common indications, although most are approved for the treatment of cancer and inflammatory diseases.
vaccination), and a decade later that Milstein and colleagues developed hybridoma technology to produce large quantities of monoclonal antibodies. Making a hybridoma involves fusing mouse myeloma (cancer) cells with specific B-lymphocytes from a mouse that has been immunised with the desired antigen. This produces a cell line with the immortality of the cancer cell and the antibody-producing capacity of the B-lymphocyte. In 1986, 11 years after the invention of hybridoma technology, the first mouse-derived, or murine, monoclonal antibody, Orthoclone OKT3, was approved by the FDA. It was simultaneously the first monoclonal antibody to be approved for clinical use in humans, and helped to avoid acute rejection of transplanted organs in kidney transplantation patients by suppressing their immune system. This very first proof of a therapeutic use for mABs was the starting point of the mAB drug industry. However, it took almost another decade for this pharmaceutical submarket to establish itself on an equal footing with the dominant market of small-molecule chemical drugs. A major roadblock that had to be overcome first was the immunogenicity of murine mAB drugs. Patients’ immune reactions against the ‘foreign’ murine proteins, especially against the Fc region of the mABs, were very common in those days. In the same year that Milstein, Köhler and Jerne were awarded the Nobel Prize, Morrison and colleagues described a novel technique for the production of less immunogenic, chimeric monoclonal antibodies that were hybrids of mouse-derived Fab fragments and human Fc fragments. This technique reduced immunogenicity significantly, but not fully. In 1997, the chimeric monoclonal antibody rituximab, developed by Genentech, was approved for various indications, including different types of lymphoma. Immunogenicity, however, was still a major problem in clinical development and post-clinical observations.
Therefore, until the turn of the millennium, each therapeutic mAB approval was rather exceptional and considered a huge success for this emerging industry. What fostered this industry’s transition into a flourishing submarket, with an average of four successful approvals per year nowadays? The success of this industry has both its scientific and commercial roots in Cambridge. Whilst biochemist Milstein (Fred Sanger’s
Antibodies specific to a particular ‘threat’, here a virus, are produced by B-cells. Antibody molecules bind to the virus and recruit other immune cells to neutralise it.
Arthur Neuberger is a PhD student in the Department of Pharmacology.
student) was a Cambridge scientist who laid the scientific foundation for the existence of this industry, it was the pioneering work of two other Cambridge scientists, Sir Gregory Winter and Herman Waldmann, that led to its full commercial take-off. As early as 1986, Sir Gregory Winter and his colleagues described a technology that would later entirely revolutionise the monoclonal antibody biotech business: antibody humanisation. Humanised therapeutic monoclonal antibodies tremendously reduced the immunogenicity problem, since only the antigen-binding regions of such mABs are murine whereas the rest of the antibody is fully human. The fruits of this work could be harvested 15 years later, in a collaboration between Winter and Waldmann, when Winter’s technique was applied to Waldmann’s B-cell chronic lymphocytic leukaemia mAB drug Campath-1H (alemtuzumab); in 2001, this became the first humanised mAB to be approved. Meanwhile, in 1990, Sir Gregory Winter’s lab had described a technique using phage display for the generation of fully human monoclonal antibodies. Here, a gene that encodes the protein of interest is inserted into a coat protein gene of a phage (a bacterial virus), resulting in the protein’s display on the virus’s surface when it is co-expressed with the viral coat gene. This allows for efficient screening of engineered human-compatible antibodies against millions of potential antibody targets. It took, however, a further four years until fully human monoclonal antibodies were derived from transgenic mice carrying human immunoglobulin genes. Today, more than 50 mAB drugs have been approved for therapeutic use, with three new approvals already in the first half of 2016. At the current approval rate of approximately four new products per year, 70 monoclonal antibody products will be on the market by 2020, and combined worldwide sales will be approaching $125 billion.
With the hybridoma technology being a "by-product of basic research", as Milstein stated in his Nobel Prize lecture, the success story of the mAb industry "thus represents another clear-cut example of the enormous practical impact of an investment in research which might not have been considered commercially worthwhile, or of immediate medical relevance. It resulted from esoteric speculations, for curiosity's sake, only motivated by a desire to understand nature."
From bench to bedside: the monoclonal antibody Rituximab, used to treat autoimmune disease and cancers of white blood cells
Computing's Quantum Leap Adam Barker discusses how qubits could grant us more computing power than ever before FIFTY YEARS AGO, Gordon E. Moore, the co-founder of Intel, made a bold and seminal observation: the power of computers and their circuitry had doubled, and would continue to double, roughly every 18 months. And indeed, over the past 50 years we have seen computers shrink from warehouse-sized megamachines to hand-held, touchscreen devices that contain more computing ability than the entire Apollo programme. Still, in this era of 'big data', our appetite for quicker graphics processing, greater memory and computers capable of simulating systems of far greater complexity continues to grow. The high-performance computing facilities here in Cambridge and at other centres of research are capable of simulating the motion of a billion objects under classical mechanics, as in the Millennium Simulation performed at Durham University, yet fall short when attempting to solve the quantum mechanical equations for the six electrons present in a carbon atom. Moore's prediction was based on rapidly advancing improvements in transistors, the 'brickwork' of
computing infrastructure. Transistors produce the binary output used in computing operations, with millions of these tiny devices crammed into modern computing 'chips'. They have continued to become smaller, faster and better insulated, leading to exponentially growing performance across all computing devices. However, as circuitry becomes smaller and improvements become harder to extract, the exponential growth Moore predicted may begin to stall. As ever more transistors are jammed into chips, their speed and capability become limited by the size of the atoms themselves! To keep computing power growing, a revolutionary change in the infrastructure of the computer is needed, one that avoids the limitations imposed by building with macroscopic objects. One fairly recent advance in computer architecture, which attempts to address this problem, exploits the benefits of 'parallel computing'. Multi-core processors and Graphics Processing Units (GPUs) are capable of performing many computing operations simultaneously by using multiple, less powerful processors in addition to the usual Central Processing
The exponential growth of computing power that we have seen for so many years is beginning to saturate
can search unstructured datasets far faster than any classical method, whilst another can optimise processes relying on multiple variables, such as finding mixtures of elements to produce a high-temperature superconductor. Solving these problems with a classical computer requires brute force and patience, as the computer has to work through each individual data point, often never arriving at the optimum solution. A central question in the fields of quantum computing and quantum information processing is: what objects will make successful qubits in quantum computers? The inherent requirement is that the objects retain their quantum nature; hence macroscopic objects, in which quantum properties are no longer apparent, are not suitable. A Canadian technology company, D-Wave Systems, has now produced devices containing more than 2000 'superconducting qubits', which rely on quantum annealing. This is a sieving process, whereby the system is lightly disturbed in order to extract desirable pieces of material from a disordered mixture. The nature of these devices makes them extremely useful for solving optimisation problems; several are already in commercial use, with NASA recently purchasing the latest '2X' device. The increase in the power of a quantum computer as more qubits are added could sustain the exponential growth of computing capability that we desire. With many leading technology firms and research groups exploring possible qubits, perhaps it will not be long until we see a widely commercialised quantum computer. Within 20 years, the limitations that we now face in simulating the world around us may all be removed, thanks to the quantum peculiarities of a handful of atoms.
Adam Barker is a 4th year physicist at Pembroke College.
Unit (CPU). Parallel computing is a valuable resource for performing multiple similar tasks, such as producing high-definition graphics. Despite this, it does not provide any advantage where localised high-performance computing is required; for example, solving a quantum mechanical equation, a situation where a 'quantum computer' is predicted to excel. Quantum computing and its underlying methods of operation rely on the quantum-physical nature of microscopic objects. In our everyday macroscopic world, binary systems remain binary: cats are either alive or dead! A classical transistor is at any point a '1' or a '0', producing a long array of evolving binary code within a computing process. The many billions of atoms that make up this one transistor collectively produce only one of two possible outcomes at any time, a seemingly vast atomic over-staffing! In the mysterious quantum world, however, things are very different: in a famous thought experiment by the physicist Erwin Schrödinger, cats are allowed to be both alive and dead, at least until we observe them. As a result, a single quantum bit or 'qubit', which could point 'up' or 'down' on a sphere of possible configurations, can at any point exist in a mixture of the two states. Multiple qubits can form combined states: if we have a number N of 'entangled' qubits, the whole system can be in any mixture of the 2^N configurations. Here, 'entanglement' means that the behaviour of any qubit in the system is influenced by every other qubit, even if there are no apparent forces or connections between them. Einstein dubbed this effect 'spooky action at a distance', an amazing feature of the underlying relationships between particles in the quantum world. Entanglement allows the qubits within a quantum computer to work together to solve complex problems, as opposed to classical transistors, which operate in solitude to produce their bit in the binary code output.
To compare the two approaches: current transistors contain around 10^16 atoms, with around 10^9 transistors in an average chip. To provide a comparable amount of computing, a quantum computer would only require around 30 entangled qubits! Ignoring the infrastructure required to hold or manipulate these qubits, this is a vast reduction in atomic requirement, along with fewer limits on the speed of processing. It is the key feature of a 'quantum speed-up', whereby adding one qubit to the computer leads to a factor-of-2 increase in computing power, that distinguishes a quantum computer from its classical counterparts. The evolution of qubits in superpositions of the two possible states leads to insightful processes, or quantum algorithms, outside the capability of classical computing architecture. For example, one algorithm
The power of qubits: a binary ‘1’, ‘0’ and a mixture of both
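The scale of that comparison is easy to check for yourself: a classical simulator of an N-qubit register must track 2^N complex amplitudes, doubling with every qubit added. A minimal sketch in plain Python (illustrative only, not a real quantum simulator; the function name is invented here):

```python
# Toy illustration: the state of N entangled qubits is described by 2**N
# complex amplitudes, so each added qubit doubles the storage a classical
# simulator needs; this is the flip side of the 'quantum speed-up'.

def amplitudes_needed(n_qubits):
    """Number of complex amplitudes describing an n-qubit state."""
    return 2 ** n_qubits

# Around 30 entangled qubits already match the ~10^9 transistors
# of a typical classical chip.
for n in (1, 2, 10, 30):
    print(n, "qubits ->", amplitudes_needed(n), "amplitudes")
```

Running the loop shows that 30 qubits already correspond to over a billion amplitudes, in line with the transistor comparison above.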
Children of the Lesser Gods
Alex Bates looks at amateur designers of life in the lab and their offspring, while Geoff Ma tells us what the Cambridge iGEM team is up to this year
‘Katered’, bacterial bio-art that glows an alarming red when stressed
Member of the public gets a whiff of engineered banana scented bacteria, Techfest 2012, IIT Bombay, Mumbai, India
"Most of the science process is hidden behind institutions," says geneticist-artist-engineer Howard Boland. "We're about getting in at that, and bringing it out into a new kind of engagement." Boland does not look at genes like a scientist does. Where they see the blueprints of life, he sees a raw material that he can take, break apart, and mould into something new. To him they are as clay, only a means to an end, and a bizarre one at that. Somewhere out there, the bacteria he has engineered as works of art are respiring, pulsating and dying. His kindred hobbyists are interested in reprogramming bacteria for a plethora of other reasons: to glow green, make medicines, detect diseases, and create cheap commercial chemical compounds. They are tinkerers, creating synthetic life by altering bacterial DNA with bits of genes bought online, and they do all this outside of real laboratories, operating as 'outlaws'. They may be in the realm of science, but they are not reigned over by funding bodies, allied to institutional fiefs or tied up in the aristocracy of academia like many professional scientists. Within the scientific feudal system lies a treasure trove of complicated tools and technology previously inaccessible to those without the know-how. Now, however, genetic engineering has entered the public domain. Basic biological equipment is becoming increasingly cheap and easy to use, and people are using it to engage with science on a level that high school programmes, museum exhibitions and BBC documentaries could never achieve. Citizen science movements have become popular, and of these the biohackers have become the most famous, or perhaps more accurately, the most infamous. Indeed, with their new tools, they are the lesser Gods of microbial life, practising 'synthetic biology' to change its inner workings.
Synthetic biology is genetic engineering streamlined and standardised, seeking to 'copy and paste' gene 'parts' into living cells to make them work for our benefit. Just as screw threads, British Standards and unit operations led to the industrial revolution, this endeavour is catalysing a biological one. Instead of new steam engines we are seeing new synthetic life, and not all of it is spawned from professionals. Yet, even with professional biologists, this is
controversial. They may not see themselves as playing God, but that has not stopped the idea becoming extremely pervasive. This stigma makes it seem as if a Blade Runner dystopia is the geneticists' end-point. "But it's not. People hearing about genetic engineering think one thing – we're going to get some glowing trees – but they don't think about what is actually involved," explains Boland. He seems more like an artist: Dutch, soft-spoken, in a smart blazer. But he owns a lab coat. He has exhibited synthetic life internationally. "They miss out the processes involved and it's there that we have the real meat of synthetic biology." The Cult of the Biohacker | "Our lab is through here – it used to be a darkroom," says Simon, biohacker and drifter from The London Hackspace, a makers' den situated in Hackney. He does not offer his surname. The ad hoc bio-lab is tucked into a basement corner, perhaps because the centre's other, more conventional hobbyists do not trust it. Outside sits an old autoclave; Simon describes it as 'a science oven' used to decontaminate equipment. Beyond it sits a massive metal box that looks like an old IBM mainframe computer. "That's our flow cytometer; using lasers, it's meant to count the cells we make," says Simon cheerfully. He is unkempt and very enthusiastic. "We got it from Imperial College London when it broke. I don't think we can fix it." He beams crookedly. The lab itself would be no lab by institutional standards. There is a bacteria-stocked fridge in one corner, in another a pile of aged electronics, and in the others racks of test tubes and textbooks. On the work surface sits a 'polymerase chain reaction' machine, a 'photocopier' for mail-ordered DNA. It is secondhand, yellowed and clunky. "It looks more like a cash register," jokes Simon, drawing attention instead to the microscopes, the gel electrophoresis machine and a bottle of highly carcinogenic ethidium bromide, all of which are used to visualise individual genes.
There are cans of Becks on the worktops and a stale sandwich on the floor. It smells of sawdust. There are no windows. Most of the people that use this space have never received any formal training. They use money from the centre’s membership fees to order genes from synthetic
Glowingplant.com, a biohacker company, plan to sell glowing plants like this one over the Internet
The Adam and Eve of Biohacking | "It is because science gives us the power of manipulating nature that it has more social importance than art," Bertrand Russell once opined. However, biohacking has given art that power too. Genetic manipulations of living cells can change an organism's nature from a simple survival machine to a stunning piece of bio-art, and because the technical processes are becoming simpler, living bio-art has begun to boom. Standing in Boland's makeshift artist's studio-lab is quite different from Simon's ex-darkroom. It is closer to University College London's gleaming workspaces. There are lab coats on pegs and Gilson pipettes on racks. In the fridge, one finds flasks of modified bacteria. Some of these fluoresce in response to stress. From others one gets a whiff of banana. Boland, who has a PhD in 'Art from Synthetic Biology', notes that "ethically such art is really interesting. I was working on a project recently where I used the heart cells from mouse foetuses to make a pump that swam in water. While it's still alive you open the foetus up and scrape off the heart cells. When finished the animal dies. One of the technicians I was working with refused to do it for the purpose of art." Soon, he plans on making a sperm-powered Ferris wheel. For artist Charlotte Jarvis the ethics are more clear-cut. She had considered using human embryonic stem cells as part of her art, but decided to produce them less controversially from her blood and skin. Nevertheless, she is not afraid of challenging public opinion. "There is a pervasive idea that we shouldn't mess with nature. By doing it for art, we're being braver. It's saying 'this is what I believe in' and nailing our colours to the mast." Jarvis smiles at that; she is a little smitten, and certainly empowered by the science to which she has pinned her colours. She sprayed apples from The Hague with DNA crafted to encode
Engineer CJ Ong getting involved in Cambridge-JIC 2016's wet-lab
CAMBRIDGE-JIC IGEM 2016
Always Expect The Inquisition | The worry among the general public is that these projects would not survive an ethics committee either. What if these biohacker cultists took it into their heads to practise the darker arts of genetic engineering? This is why the FBI has become increasingly involved in the biohacking scene. They sponsor iGEM to increase their influence and run workshops to 'educate' biohackers on public security. "We engaged with DIYbio because we understood post 9/11 that we had to be more proactive," say Edward You and his underlings from the Weapons of Mass Destruction Directorate, so frequently at outlaw biology summits and iGEM jamborees that it is starting to sound like a mantra. There is a certain amount of paranoia, and the FBI will treat outlaws with the sword. In 2004, American professor Steve Kurtz was suspected of bioterrorism because bacterial Petri dishes were found at his home, after his wife had died of an unconnected heart attack. Victor Deeb had his home-lab demolished by FBI
and CIA agents after an unrelated fire broke out in his kitchen. He had been working on producing less hazardous materials to replace those currently used in beverage cans. The problem with biohacking is largely not one of malice. Rather, while biohacking is often bombastic and somewhat whimsical, the materials used are dangerous and unintended consequences could be serious. For example, how could biohackers keep well-meaning genetically modified foods separate from wild-grown food when even Monsanto cannot achieve that? "The danger is more to themselves. Not all chemicals smell and repel you…but they can seriously hurt you," says Boland. "Ethidium bromide is extremely toxic." But then, danger is an ever-present companion for any outlaw.
biology libraries and use them in simple gut bacteria. "We've made them glow. Others here think we're a safety hazard," says Simon, almost proudly, "but we mostly just sit around, drink and plan." MadLab in Manchester hosts a similar community biology lab that reaches out to the wider public with workshops that, for example, show one how to extract DNA from strawberries. In the US, big communities like GenSpace and BioCurious have been around a lot longer. All of these groups rely on cheap off-the-shelf wetware, makeshift hardware and skill-swapping. Often professionals pass these skills on. University College London recently invited Simon's biohackers to its labs, to improve their chances in the international Genetically Engineered Machine (iGEM) competition, an annual amateur synthetic biology extravaganza. But the phenomenon is not just a case of aristocratic, academic philanthropy. Many biohackers can fend for themselves. Although in the UK stricter genetically modified organism regulations require a licensed workspace, in the US synthetic life is being made in off-the-radar settings: sheds and attics. However, genesis in the garage is also not a story about toppling the way research science is currently conducted. Biohackers are never going to be able to achieve what fully trained professional scientists, with their greater funding, better equipment and larger knowledge base, can. A lab like Simon's can barely make E. coli grow, let alone glow. No one will be growing a designer baby in the sink anytime soon. What biohackers have to their advantage is total freedom within their means, and so the ability to work on completely unique projects that would not survive the cannon fire of an institutional funding committee.
CAMBRIDGE-JIC IGEM 2016
This is part of the low-cost biolistics gun that the Cambridge-JIC 2016 iGEM team is making
the entirety of the Declaration of Human Rights, and ate one, accepting the forbidden fruit of biotechnological knowledge. "But it's not about education," Jarvis says. "You need to look to iGEM if you want that." If bio-art is the Adam of biohacking, then iGEM was born of its rib. Every year, teams of biohackers, A-level pupils and university students slave away the summer months to design a synthetic life form capable of benefitting humanity. Although its participants are mostly drawn from the student serfdom, iGEM is a grassroots powerhouse that operates outside the institutional fiefs insofar as the projects are student-led. That means some are completely wacky. iGEM has seen teams suggest making a plastic island out of Pacific debris using oceanic bacteria, creating biological light bulbs, growing light-controlled tissue and synthesising palm oil products. The University College London 2013 team attempted to tackle Alzheimer's disease by re-engineering brain cells. "We all use the 'BioBrick' scheme," explains student Andy Cheng from the University College London iGEM team. "All the genes are in a certain format. It is basically like we've all decided to use Lego instead of Mega Bloks. We get formatted parts from a 'library' and add our own parts to it as part of the project. That's how we keep community science expanding collaboratively." Then he grins ruefully. "Actually it's a lot harder than Lego. Lego doesn't suddenly die." From the Garden to the Garage | Crowdsourcing science in the biohacker way is unlikely to be a game changer or world destroyer in its own right, but its indirect effects in terms of proliferating ideas and enthusiasm could be powerful. There is a monetary barrier to good science, but there is no such obstruction to good scientific ideas. Then again, one never knows for certain. In 1976, the first Apple computer was constructed on a wooden workbench in a Californian garage.
Since then Apple has been trying to deck out the world in sleek white and chrome. It may be that biohacking will give rise to a more gelatinous future, embedded with wetware. At any rate, since the field of genetics started with Gregor Mendel, one monk studying peas in his garden, it is certainly fitting that it enters this new chapter with a cult operating out of garages.
Alex Bates is a 1st year PhD student in Neuroscience at the MRC Laboratory of
The Cambridge-JIC iGEM 2016 Team | If you were to ask "who here did iGEM?" in a room full of synthetic biologists, a forest of hands would rise in response. iGEM is mostly an undergraduate grassroots movement, but it is now populating the entire field. Given the crazy ideas it often germinates, one might even say it is overgrowing its field. The 2016 iGEM Cambridge-JIC team, of which this writer is a part, consists of ten Cambridge undergraduates: five engineers and five natural sciences students. JIC stands for the John Innes Centre in Norwich, a research facility with a particular interest in botany, in which the Cambridge iGEM approach is firmly rooted. Standardisation: The Bedrock of iGEM | As DNA technology matures and advances, there is a growing need to share and acquire purposefully designed DNA sequences that encode 'new' bits of machinery. Innovative creations often become the baseline for further research, and it is critical that these are made available to the scientific community. This is, however, not a new concept. Notably, in other fields development is built on standardised platforms. The classic example of en masse standardisation in action is the industrial revolution, but there is another good one in how certain operating systems have come to dominate the computer and smartphone markets. How can DNA be standardised? Who decides on the protocol to be used, and will industry really adopt it? And there is the question of rationale – what, why, and ultimately, who? Within the community, standardisation is often debated, both in terms of rationale and practicality. Certainly, Cambridge's Department of Plant Sciences has its foot in the door through the OpenPlant Initiative, and has been supportive of creating a common syntax for the sharing of 'biological parts'. So, how does the iGEM competition fit into all of this? The iGEM Foundation has a 'registry'.
This is a library of parts that allows free distribution and access for iGEM teams and other researchers. A 'part' is simply a short section of DNA which has some specific purpose. It could encode part of a protein, but it does not have to; it could instead be a regulatory sequence that controls when and how a gene is active, for example. However, in order for a part to be used for a specific purpose, it may need to be joined with other parts, fused into a plasmid (a ring of DNA) and inserted into an organism, usually a species of bacteria. The relationship between the engineered plasmid and the bacterial chassis is similar to that
iGEM Grows into Plants | In recent developments, the iGEM Foundation decided to expand its registry into plants. This has led to the creation of a new standard for plant parts: the PhytoBrick standard. Moving in this direction has not been problem-free. Using plants causes significant complications compared with using bacteria (our first discovery this year). It is simply a problem of compatibility: parts made for one species may not be compatible with another, which does somewhat undermine the principle of the registry. Crucially, the DNA tools needed to make the plasmid can be unique to each species.
The Cambridge-JIC 2016 Project | We identified a good opportunity to work with the new PhytoBrick standard, somewhat influenced by Cambridge Plant Sciences' involvement with the OpenPlant Initiative. We decided to work with chloroplasts, which can be considered the photosynthetic power plants of the cell. Since our summer project is tightly time-limited, algae became the obvious chassis choice, thanks to their high growth rate. Drawing on expertise within the department, we decided to focus our efforts on the chloroplasts of Chlamydomonas reinhardtii. Three things you need to know about Chlamydomonas: it has one big chloroplast; cell density can double in 8 hours; and it is green. We noticed that chloroplast DNA transformations in plants are extremely useful, for example in bioenergy research, and yet seem to be poorly utilised by industry. In short, photosynthesis occurs in the chloroplasts, which gives chloroplast transformations the capability to produce yields several hundred times greater than conventional nuclear transformations. A literature investigation suggested that the poor utilisation of chloroplasts might be due to the lack of a standardised procedure. Consequently, we set about trying to establish a toolbox for chloroplast transformation. We hope that our work will be
beneficial for scientists around the world. Our approach is novel in our aim to express Cas9, a relatively recently adopted protein that allows scientists to accurately modify DNA, in the chloroplast of Chlamydomonas. The FOCUS article in this issue explains the great promise of Cas9 technology. Our aim is to reduce the time required to convert all the DNA in an alga's chloroplasts and hopefully bypass the toxicity normally associated with Cas9 in plants. In the process, we have created a number of PhytoBrick parts for the iGEM registry. Since the PhytoBrick registry is new, we are confident that our parts and work will be of use to future iGEM teams and researchers. To supplement our genetic toolbox, we decided to produce some actual hardware: an open-source 'gene gun' used to 'shoot' DNA-coated pellets into cells, and a Chlamydomonas growth facility. We hope that these tools will help scientists from smaller labs explore chloroplast research affordably. But it is still a competition. Running a successful and innovative project is not all it takes in iGEM: presentations and judging occur at the iGEM 'Giant' Jamboree in Boston at the start of November. Finally, I would like to add a little plug for joining iGEM. I have thoroughly enjoyed working on the Cambridge-JIC 2016 iGEM team, and have been particularly excited by this project. I have gained new skills and have made a contribution to our team's efforts in the competition. It is definitely something I would recommend: a fantastic opportunity and, most importantly, a fantastic experience. If you want your hand to be in that forest, keep an eye out for a call for applications later this year for Cambridge iGEM 2017. If you would like to follow our team's work, check out http://2016.igem.org/Team:Cambridge-JIC
Petri dishes with agar medium to support the algae
CAMBRIDGE-JIC IGEM 2016
between software and hardware. These iGEM DNA parts are special because their ends are designed to fit together like pieces of a jigsaw, which theoretically makes them easy to use. The key concept here is convenience. Although sharing sequences can be an effective method of distributing knowledge, it is difficult to actually utilise bare sequences in the lab, and there is still no guarantee of compatibility. Perhaps there will come a day when DNA synthesis of large sequences becomes favourable, but until then, a physical registry provides the convenience required.
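The 'jigsaw' idea can be caricatured in a few lines of code. The following toy model is purely illustrative (the flank sequences and function names are invented here, and real BioBrick assembly involves restriction enzymes and a shared plasmid backbone), but it shows why giving every part the same standard ends makes composition mechanical:

```python
# Toy model of standardised DNA 'parts': each part carries the same
# prefix/suffix flanks, so any two parts can be joined end to end.
# Illustrative only; not the real BioBrick assembly protocol.

PREFIX, SUFFIX = "GAATTC", "CTGCAG"  # hypothetical standard flanks

def make_part(core):
    """Wrap a core sequence in the standard flanks."""
    return PREFIX + core + SUFFIX

def compose(part_a, part_b):
    """Join two standard parts into one larger standard part."""
    assert part_a.startswith(PREFIX) and part_b.endswith(SUFFIX)
    # Drop the inner flanks where the parts meet, as an assembly would.
    return part_a[:-len(SUFFIX)] + part_b[len(PREFIX):]

promoter = make_part("TTGACA")
gene = make_part("ATGAAA")
device = compose(promoter, gene)
print(device)  # one larger part, still flanked by the standard ends
```

Because `compose` returns something that is itself a standard part, assemblies can be chained indefinitely; that closure property is the whole point of a parts standard.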
Geoff Ma is part of the Cambridge-JIC 2016 team and an engineering student at Emmanuel College.
Weird and Wonderful A selection of the wackiest research in the world of science
A Good Vintage A TREASURE-TROVE of pottery from ancient China
just yielded its greatest treasure yet: a 5,000-year-old beer recipe. Archaeologist Jiajing Wang and colleagues were working on a 3,000 BC site in North China when they discovered two pits that looked a lot like breweries. Both pits contained distinctive pottery vessels resembling brewing equipment from later periods, as well as pottery stoves (probably used to keep the booze warm while it fermented). Analysis of the beery residue on the pots revealed traces of millet, a seed called Job's tears, and various tubers, as well as barley, which surprised the researchers, since their ancient beer predated the first known use of barley as food in China by 3,000 years! This means that the cultivation of barley spread much faster across Europe and Asia than was previously thought. Whether or not people in China were eating barley five millennia ago is still unknown, but one thing is certain: they were drinking it! Wang and colleagues published their results in the journal PNAS earlier this year, complete with recipe. Applied microbiologists and home-brewers take note. JT
Optical Trickery AMAZINGLY, THE EYE and the brain's visual system can focus captured light onto the retina, send information via electrical messages down the optic nerve, and produce an image…all within a tenth of a second. To process light into an image of the world so quickly, the brain uses several tricks, and these are exploited by optical illusions. The key is that our perception of the world relies on constructing a picture in our mind that may not correspond exactly to reality. One type of optical illusion is caused by mechanisms within the light-perceiving cells of the retina. These cells can inhibit signals in neighbouring cells to enhance the contrast at the edges of shapes. In the famous Hermann grid illusion, this so-called lateral inhibition mechanism means we perceive grey dots at the intersections of a white grid on a black background. Lateral inhibition occurs less at the central point of focus of the retina, so that when we look directly at an intersection, the grey dot disappears. Another type of optical illusion involves
discrepancies between the light that is picked up by our retina and the image we create in our brain, since the brain uses contextual information when it interprets visual stimuli. For example, perception of depth is integrated from multiple sources, such as overlapping objects or converging lines. This results in the illusion that two parallel lines seem to warp when superimposed on converging lines. Studying optical illusions has helped us understand how the human brain perceives the world and fills in missing or ambiguous information. They reveal 'tricks' that are essential for our brain's ability to convert a two-dimensional image on our retina into a 3D representation of the world, illustrating the successes, rather than the failures, of the visual system. ZC
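Lateral inhibition is simple enough to mimic with a toy one-dimensional model: each receptor's output is its own stimulation minus a fraction of its neighbours'. At a light/dark edge the responses overshoot and undershoot, enhancing perceived contrast. The function and parameters below are invented for illustration, not physiological values:

```python
# Toy 1-D lateral inhibition: each cell's response is its own input
# minus a fraction k of its neighbours' inputs. Contrast at an edge
# is exaggerated, as in the Hermann grid and Mach band illusions.

def lateral_inhibition(stimulus, k=0.2):
    out = []
    for i, s in enumerate(stimulus):
        left = stimulus[i - 1] if i > 0 else s
        right = stimulus[i + 1] if i < len(stimulus) - 1 else s
        out.append(s - k * (left + right))
    return out

edge = [1.0] * 5 + [0.0] * 5          # a bright region meeting a dark one
resp = lateral_inhibition(edge)
# Just inside the bright side the response is boosted above the interior,
# and just inside the dark side it dips below the dark interior.
assert resp[4] > resp[3]
assert resp[5] < resp[6]
```

The overshoot and undershoot either side of the edge are exactly the exaggerated contrast the retina's wiring produces.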
Misidentified Flying Objects TRY TO CATCH a ball with one eye closed.
Difficult, isn't it? For many tiny flies, a lack of depth perception is just another hurdle of everyday life. To overcome this, some use 'pattern matching'. For example, killer flies in captivity compute a ratio of target size to angular velocity, taking off after prey whose characteristics best match those of their usual prey, Drosophila. Thus, killer flies can be duped into taking off after larger, faster-moving beads several times their size. Similarly, male houseflies will aggressively chase large distant objects to defend their territory, mistaking them for other males, whilst male hoverflies will chase fast-moving blocks of wood and dried peas, confusing them with attractive females. As a result, the flies risk capture by airborne predators. So why do they not compute absolute distance prior to take-off? They could use motion parallax, a comparison of target displacement against background displacement as the head is moved. But small flies make short flights, and to catch up with their targets they need to make snap decisions: hesitation could cost a meal! More complex computation also requires larger neural networks, incurring higher metabolic, temporal and spatial costs. For tiny flies, the risk of capture may be worth taking. HC
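The 'pattern matching' rule can be caricatured as a simple ratio test. The numbers and names below are invented for illustration; the matched filter measured in the real experiments is subtler:

```python
# Caricature of the killer fly's take-off rule: rather than computing
# distance, it checks whether a target's apparent size to angular
# velocity ratio matches that of its usual prey. Values are made up.

PREY_RATIO = 0.5        # hypothetical size-to-angular-velocity ratio
TOLERANCE = 0.2

def should_take_off(apparent_size, angular_velocity):
    """Take off if the target 'looks like' typical prey."""
    ratio = apparent_size / angular_velocity
    return abs(ratio - PREY_RATIO) < TOLERANCE

# A Drosophila-like target triggers pursuit...
assert should_take_off(1.0, 2.0)
# ...and so does a larger bead moving proportionally faster: the very
# mismatch that lets flies be duped by big, fast-moving beads.
assert should_take_off(3.0, 6.0)
```

Because only the ratio is tested, any target scaled up in both size and speed passes the filter, which is exactly how the duping experiments work.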
Illustrations by www.alexhahnillustrator.com
Write for us! Feature articles for the magazine can be on any scientific topic and should be aimed at a wide audience, normally 1000-1200 words. We also have shorter news and reviews articles. Please email email@example.com with your articles and ideas! For their generous contributions, BlueSci would like to thank: Churchill College Jesus College
If you'd like to get involved with our website, or join our web, radio, typesetting and film teams, contact firstname.lastname@example.org. If your institution would like to support BlueSci, please email email@example.com, or visit www.bluesci.org.