I,SCIENCE THE SCIENCE MAGAZINE OF IMPERIAL COLLEGE
Taboo | Spring 2017
Issue 36
Inspire young minds through teaching
INSPIRE PGCE programme
Become a newly qualified teacher (NQT) in just 10 months and join a network of colleagues who will help shape the future of STEM education. Places are now available for applicants who have, or will have, a Master’s or PhD degree in Physical Sciences, Maths or Engineering by August 2017. » Get hands-on teacher training in London-based secondary schools. » Receive a tax-free government bursary of up to £30,000* plus a £2,000 bursary from Imperial College London. » Work alongside Imperial researchers on bespoke activities to inspire the future generation of chemists, engineers, mathematicians and physicists.
WANT MORE INFORMATION? Informal information sessions run throughout the year. These are a chance to: » talk to the programme co-ordinators. » see the work of current INSPIRE students. Email INSPIRE@imperial.ac.uk for details and to book your place.
For more information:
www.imperial.ac.uk/inspire * The Department for Education provides bursaries for trainee teachers from the UK/EU. For eligibility details for your subject see: https://getintoteaching.education.gov.uk/funding-and-salary/overview
Teacher training with a difference Up to £32k in funding Bespoke science communication masterclasses Engage with pupils in over 10 schools
Editors-in-Chief: Bruno Martin, Madeleine Finlay
Magazine Editor: Raquel Taylor
Web Editor: Lucy Timms
Pictures Editor: Natasha Gertler
Business Manager: Liz Killen
Marketing and Social Media: Tori Blakeman
Radio Editor: Catherine Webb
News Manager: Sarah Barfield Marks
Online Features Manager: Katharina Kropshofer
Events Manager: Judit Agui
TV Editor: Vidish Athavale
Sub-Editors: Ipsita Herlekar, Frances McStea, Marcela Leite, Helena Spooner, Rachel Baxter, Emma Lisle, David Walker
In the last issue of I, Science we invited you to join us for an escapist journey to other worlds of scientific distraction. This term we ask something more challenging of you. This term, we want you to break the silence. So grab a seat, sit back and get comfortable, because we are going to talk about the *un*comfortable. In this issue, we explore some of society’s most stubborn hang-ups, from incest, to nudity, all the way to death. The blanket of silence cast by taboos can be a powerful inhibitor of social action, stalling the progress required to solve some truly epic problems. In science, taboos can arise from the knee-jerk reaction against ideas, and even evidence, which contradict established beliefs and social norms. It takes conviction, of the sort Galileo had, to break them completely. We may not be Galileo, but without further ado, let’s tackle our first taboo: poo. On page 6, Mary Barker reveals how our reticence to discuss egestion is enabling one of the biggest development crises of our time. If you like your science contemporary and controversial, turn to pages 12-13, where Katharina Kropshofer and Gaia Stucky de Quay introduce us to the researchers who have thrown down the gauntlet at the feet of giants, calling out the dogmas in Darwin’s and Einstein’s legacies.
I, Science, c/o Liam Watson, Level 3, Sherfield Building, Imperial College London, London SW7 2AZ Email: firstname.lastname@example.org Printed by: Leaflet Frog, 38 Britannia Way, Bolton BL2 2HH
Our Taboo issue sees the return of the ‘Great Debate’, in which Marek Wolczynski and David Walker go head-to-head on the question of whether we should be researching the genetics of sexuality, on pages 26-27. Our final word on these tough topics comes from Henry Bennie, who turns the analytical lens on science itself, probing issues of objectivity and truth in scientific research and publishing. This has been an exciting issue to produce. Our hope is that you will find its contents illuminating and provocative. In the discourse of science, no topic should be out-of-bounds, so go forth and speak up – there’s a lot more where this came from.
Madeleine and Bruno
I, Science is a publication of the Science Communication Unit, Centre for Languages, Culture and Communication, Imperial College London. However, it is a student publication, and as such the views expressed in I, Science do not reflect the views of the Unit, Centre or College.
Cover Illustrator source photograph by Andreea-Otilia Suiu Interested in advertising with us? Contact : email@example.com
For our regular ‘Science Behind the Photo’ centrefold, Annabel King presents her impassioned project to destigmatise nudity from the setting of her clinical photography studio, here in London. On page 20, Aran Shaunak raises the thorny subject of weapon development in the UK, followed by Judit Agui’s appeal to westerners to consider including insects as a sustainable component of our diets. No doubt you’ll be convinced to grab some grub, so check out her tasty grasshopper stir-fry recipe on page 23!
Find more great content on our website: www.isciencemag.co.uk
We’re always on the lookout for new contributors for both the magazine and the website. If you would like to get involved as a writer, editor or illustrator please don’t hesitate to get in contact. You can email us at firstname.lastname@example.org, tweet us @i_science_mag or contact us directly through our website www.isciencemag.co.uk.
Contents
keeping it in the family Social norms condemn incest in nearly every culture. Was this always the case, and will it always hold?
In the name of science: future
pooping in poverty The psychology of disgust is holding back solutions to one of the world’s biggest health crises. Let’s talk about toilets.
Humanity has a complicated relationship with death. Will technology one day enable us to break up with it completely?
in the name of science: past Five of the most blatantly unethical experiments of history... and their uncomfortably valuable results.
Five experiments to transcend the frontiers of knowledge... if scientists’ moral compasses were due South.
a pill for empowerment
gentle eugenics
clouded judgement
know your grubs
the great debate
lines in the sand
Sometimes evolutionary science doesn’t point to natural selection, but questioning Darwin still makes for difficult conversation.
SCIENCE BEHIND THE PHOTO Can I take a nude photo of you? A medical photographer’s view of the human body.
making a killing Investing in research and development of weapons is a moral minefield which we must tread carefully.
miss medicalisation Throughout history, the physiology of women has been misunderstood and mistreated. Are we still misinterpreting women’s health?
the best result Truth and objectivity. Two pillars of scientific practice on which knowledge can be safely built. Or, perhaps not…
EINSTEIN’S TEST Scientists at Imperial put theoretical physics on trial. Will evidence from the early Universe be Einstein’s undoing?
Genetic tools offer the chance to eradicate hereditary diseases. Misused, they could become instruments for prejudice.
The contraceptive pill is a modern triumph. Meet Margaret Sanger, who fought to make it happen.
breaking up with death
A short story of difficult sights and deliberate silences.
Snacking on scorpions and crunching on crickets doesn’t sound appealing, but it might just be the food revolution we need.
Should we be searching for scientific evidence proving whether sexual orientation is genetically determined or not?
Is it time to reassess the rules for embryo research?
by Sarah Barfield Marks
28 days later

For the first time, human cells have been successfully grown inside a non-human animal. The study, published in Cell, created a human-pig hybrid with the hope of one day growing human organs inside animals. Adult human stem cells were introduced into an early-stage pig embryo, before the hybrid was inserted into an adult female pig. The human-pig cells successfully survived gestation and were removed at 28 days. This avoided serious ethical concerns over mature hybrid organisms whilst still revealing how these different animal cells interact. Although growing fully matured human organs is many years off, the potential this study brings to biomedical science is vast. Demand for donated organs is four times the supply, and the need for safer and more effective ways of testing new medicines is great. This study provides a crucial step in this taboo field of science.
Picture: Igor Stramyk
A new study, published in Nature Climate Change, has revealed that the impacts of climate change on endangered wildlife have been hugely under-reported. It was previously thought that merely 7% of mammals and 4% of birds on the IUCN Red List of Threatened Species are negatively affected by climate change. The new study found that almost 50% of mammals and nearly 25% of birds are impacted. These animals are vulnerable to the extreme events and changes brought by climate change, including habitat loss and degradation. The scientists involved are pushing for wider public engagement with these figures, with co-author Dr James Watson stating that “Climate change is not a future threat anymore”.
How much pollution are you exposed to every day? The chances are, way too much if you live in London. A study by King’s College London revealed that the city had exceeded its annual air pollution limit just six days into January 2017. EU law states that hourly levels of toxic nitrogen dioxide must not exceed 200 micrograms per cubic metre more than 18 times per year. Nitrogen dioxide (NO2) is produced by factories and vehicles, especially diesel engines, and can cause respiratory problems such as shortness of breath and coughs. The negative effects are most severe in individuals already suffering from conditions such as asthma. The European Commission has issued a final warning to the UK to take action within the next two months. It remains to be seen whether these regulations will be upheld during the Brexit negotiations.

Picture: Taimit
One of us
One way of evaluating the intelligence of a machine is to subject it to the Turing Test. Developed by the mathematician Alan Turing, this test asks whether machines can trick living organisms into believing they’re communicating with one of their own. A study, published in ACS Central Science, developed artificial cells that were able to communicate with bacteria. The team of scientists created a tiny cell-like structure, capable of making RNA and proteins in response to a specific bacterial substance. Not only could these artificial cells respond to this bacterial molecule, they could also produce their own bacterial-like protein, which natural bacteria could interpret. Although science is a long way off from making these ‘robot cells’ self-reliant, the work offers potential for disrupting disease-causing bacteria. Studies like this increasingly blur the line between living and non-living organisms.
Pooping in poverty

The psychology of disgust is holding back solutions to one of the world’s biggest health crises. Let’s talk about toilets.

by Mary Barker
Poop. These four letters fill most of the world’s population with disgust. No one wants to work with poop, no one wants to talk about poop, so poop gets forgotten. Yet we all spend three entire years of our lives on the toilet. In most of the developed world, pooping is something we don’t even have to think about. However, for the 2.4 billion people worldwide who have no access to sanitation facilities, pooping is more problematic. The result is open defecation, an activity practised by over one billion people. For us, the idea of pooping in a field is disgusting, but for others it is a necessity. What is disgust anyway? Disgust is a universally acknowledged ‘basic emotion’, along with sadness, anger and happiness. Darwin proposed that disgust is an evolutionary adaptation that stops us eating foods which could make us ill. Scientists long accepted this view with little argument, preferring to study the other, ‘trendier’ emotions, but disgust research has recently grown in popularity and the emotion is increasingly viewed as an adaptive function. Disgust also differs by time and place. Human faeces, for example, provoke disgust in most humans, yet maternal childcare instincts reduce the disgust mothers feel towards their own children’s poop – an effect that wears off once the child grows up. Cultural circumstances play a role too: a large part of disgust is learned by observing others, creating a sense of what is deemed normal in that culture. So, if from a young age you have seen everyone in your village openly defecating in fields, this may not seem disgusting and it will
become part of day-to-day life. Now let’s talk about diarrhoea. In developed countries, many of us view diarrhoea as a temporary inconvenience. Without the hygiene afforded by a toilet, however, diarrhoeal disease can be a matter of life and death. For children under five, these diseases are the second leading cause of death. Even children who survive the first bouts of diarrhoea face a potentially grim future, with an increased risk of malnutrition and a weakened immune system that leaves them open to other poop-related illnesses. Poor health is the most obvious, but not the only, detrimental effect of lacking a toilet. Diarrhoea causes 5.5 billion productive days to be lost every year, and the problem begins as soon as children start school. Imagine going to school and needing to poop. At first, you find it difficult to concentrate while you try to hold it in. Finally it is break time, so you run out of class to relieve yourself. But there are no toilets to be found in the school. Your only option is to poop in front of the school building, with all your friends watching. Are you likely to keep going to school? And this is without the additional 443 million school days missed every year as a result of sanitation-related diseases. Now picture this occurring in schools all across developing countries. The pool of individuals able to apply for well-paid, professional jobs will be very limited. This is the vicious cycle of national poverty. The problem is multiplied for women. While poop is a topic most would rather not discuss, periods are downright unmentionable for some people. Over two-thirds of girls who start their period
in India have no idea why they appear to be bleeding. So, if you miss school to avoid pooping in front of your friends, there is no way that you will attend school during your period. As a result, school attendance is far lower for girls than for boys. And periods don’t just compromise education. Girls and women who struggle to make ends meet can’t “splash out” on tampons and sanitary towels when they barely have enough money for food. For them, rags and animal skins have to do. This causes infections, keeping women from work and driving them further into poverty. For women, pooping in the open also presents an additional risk, making them vulnerable to violence and sexual abuse. Toilets don’t just prevent a huge amount of disease and help a country’s economy – they also facilitate gender equality. So we just build loads of toilets, right? First of all we need money; unfortunately, toilets are not a ‘cool’ thing to fund. ‘Water and sanitation’ are often grouped together, but think how many more times you have heard about the charity “WaterAid” compared to “World Toilet Day”. No one wants to talk about toilets, so the funding is not there. The United Nations’ Millennium Development Goals attempted to improve global sanitation and access to water. Whilst the goal to halve the number of people without access to clean drinking water was met five years before the deadline, the goal to halve the number of people without access to basic sanitation was never met. What’s more, the improvement in access to toilets in urban areas was substantially higher than that in rural areas, even though the proportion of people without toilets in rural areas is almost double that in urban regions. The Bill & Melinda Gates Foundation has recently decided to invest
in water and sanitation aid; however, their statement contains the following clause: “Because the innovations we support can be most immediately valuable in densely populated areas, our main focus is on urban sanitation”. This is not to say that urban sanitation does not need improvement in many places – but the half of the rural population worldwide with no basic sanitation cannot be ignored. Even when there is a will, introducing toilets in rural areas is a challenge. People who have pooped in fields far away from
their houses their whole life will not like the idea of waste being disposed of directly outside their homes. The toilets which can be built are not luxury convenience rooms but basic roofless sheds with wooden doors just tall enough to cover the body of the squatting person inside. There is barely more privacy than going to a secluded spot in a field. Thus, even families fortunate enough to receive a toilet may end up rarely using it. What, then, can we do? First, learn to talk about poop. Once we all start talking about poop, we can raise awareness of the poop problem. Once people are
aware of the gravity of the poop problem, they will eventually want to fund poop solutions. People know that bed nets help prevent malaria and that condoms prevent HIV, but what people don’t know is how many lives can be saved by the provision of toilets, alongside education in toilet use. With the Sustainable Development Goals aiming for access to water and sanitation for all by 2030, something drastic needs to be done. And it can start with you. The poop problem is largely a communication problem, so talk sensibly about poop and we could all save lives.
Sculpture: Rachel Crowdy
Keeping it in the family Social norms condemn incest in nearly every culture. Was this always the case, and will it always hold?
by Helena Spooner
The cultural taboo surrounding incest is so ingrained in us that even talking about it can feel like the conversational equivalent of a damp, limp handshake – slightly uncomfortable and mildly creepy. Biologically speaking, it makes sense to avoid mating with close relatives. Humans have two copies of each gene, one from each parent, and many genetic disorders are recessive, which means both your copies have to be faulty to cause you problems. The more related two parents are, the more similar their genes, which increases their children’s risk of inheriting two faulty copies and manifesting a disease. Beyond biology, Freud argued that societies developed incest taboos to counteract the natural lust of members of the same family for one another. Another theory, the Westermarck effect, suggests that such cultural taboos arose from an innate sexual aversion to those we live with at a young age, particularly in the first six years of our lives. According to Westermarck, our brains have evolved to work out who we’re probably related to and help us instinctively avoid incest. Although there is some evidence supporting the theory, critics argue that the effect does not completely eradicate sexual attraction and that cultural norms remain important deterrents to incest. While aversion to incest may be partially instinctive, it has been overridden repeatedly throughout history. Pharaohs, for example, were believed to be descended from the gods, so inbreeding was a way of keeping sacred bloodlines pure. Unfortunately for Tutankhamun, whose parents were siblings, this resulted in a debilitating genetic bone disease which left him unable to walk unaided and plagued by infections (one of which probably killed him, aged just 19). Interestingly, it is thought that a high proportion of civilian marriages in ancient Egypt were also between siblings, suggesting incest was not as taboo then as it is today.
However, whilst the Egyptians might have been more relaxed about incest, in Ancient Rome it was punishable by death, and when Emperor Claudius engaged in it with his niece, he was considered a disgrace.
Incest remains unlawful in the Roman Catholic Church, but when King Henry VIII established the Church of England in 1534, he decreed that cousin marriages were in fact legal. This began a long tradition of cousin marriages within British royal families, which has not been without its consequences. Queen Victoria’s family, the House of Hanover, had been marrying their cousins since 1682, and Victoria herself married her cousin Albert. Their descendants continued to marry each other, but unfortunately Victoria was a carrier of a recessive disease called haemophilia, which stops blood from clotting and means that even tiny cuts can lead to dangerous blood loss. Inbreeding allowed the haemophilia gene to spread throughout the family – tragically, she lost one son, one grandson and five great-grandsons to the disease. The repercussions also spread far beyond personal losses; the reliance of her granddaughter, Tsarina Alexandra of Russia, on the monk Rasputin to treat her son’s haemophilia is
credited in part with triggering the communist revolution. Although modern royals like Prince William are shunning the tradition, the Queen and Prince Philip are in fact also related. Despite incest being a cultural taboo nowadays, it is actually legal in a number of countries, as long as it is between consenting adults. Legal regulation may seem shocking, but incest laws are usually in place to prevent offspring being born with serious genetic diseases or birth defects. This raises the question: is incest still unethical if both partners are consenting adults with no plans to procreate? Aside from the social consequences, there are arguably no biological disadvantages to childless incestuous relationships, so should we question the socially constructed moral code which makes us feel so uncomfortable with it? With contraception improving and more couples these days choosing to be childless, it will be interesting to see whether incest remains off-limits thousands of years from now.
Picture: Helena Spooner
Breaking up with Death Humanity has a complicated relationship with death. Will technology one day enable us to break up with it completely?
by Chun-Yin San
For all my love of science fiction, I try to avoid stories like H.P. Lovecraft’s Herbert West–Reanimator, which deal with the reanimation of the dead. The thought of taking dead, decaying bodies and manipulating them with medicine and technology until they come back to life is disturbing. Not only does my vivid imagination conjure up morbid images with foul stenches and ghastly sounds of dripping bodily juices, I also find myself horrified and indignant at such desecration of the dead. Surely the deceased should be laid to rest, not resurrected and defiled.
The fact that our world isn’t populated with the un-dead today is surely evidence that I am not alone in this line of thinking. But while the act of reanimation may not have much mainstream acceptance, the idea of the dead coming back to life certainly remains a source of cultural fixation. We only need to look at recent examples in modern pop culture, like (spoilers!) the resurrection of Jon Snow in Game of Thrones, to see this in play. It wasn’t all that long ago that reanimating the dead was a subject of fascination in science too, as researchers sought to push at, and break, the taboo. A famous example is that of Giovanni Aldini, an Italian physicist of the early 1800s, who became mesmerised by electricity at a time when the technology was still enveloped by an aura of magic and mystique. Aldini was particularly passionate about the potential for electricity to breathe life back into the dead. In an early example of sensationalised public engagement, he electrocuted the body of a freshly executed murderer in front of a live audience. The show, which saw the corpse demonstrating an unearthly flailing of limbs, left onlookers shocked, and helped Aldini attain the Copley Medal of the Royal Society. Yet Aldini’s work pales in comparison to the sort performed by researchers in the early 1900s. The American biologist Robert E. Cornish, for example, claimed to have revived the fresh corpses of two dogs – after strangling them to death with his own hands – by see-sawing them to induce blood circulation while injecting drugs as part of his ‘Lazarus’ project. He later attempted to secure the corpse of an executed convict to try his methods on humans, but was thwarted by prison bureaucracy. It may be tempting to see these experiments as acts of irresponsible, mad science of bygone days. However, just last April, Bioquark Inc., an American biotech company, prompted the popular press to resurrect the ‘Lazarus trial’ moniker when it emerged they had been green-lighted by regulators to recruit clinically-dead patients to test new techniques intended as a first step towards reversing brain death. Ultimately, the trial did not seem to go anywhere; Bioquark suspended patient recruitment after just seven months. One could take this as a sign that the fascination with resurrecting the dead is waning. It’s certainly difficult to identify any research in this area over recent decades:
perhaps due to increasingly rigorous ethical guidelines. That is not to say, however, that we are leaving Death alone. If anything, meddling with it through science seems more popular today than it ever has been. In a 2014 interview with The Telegraph, Peter Thiel, a Silicon Valley entrepreneur, gave this take on the idea of dying: “You can accept it, you can deny it or you can fight it. I think our society is dominated by people who are into denial or acceptance, and I prefer to fight it.” Thiel’s attitude is emblematic of a growing transhumanist movement around the globe to overcome death and achieve immortality through new innovations. And therein lies a step-change in our thinking since the time of Aldini. The question today is no longer how to reverse death, but how to prevent it from happening in the first place. The technology to do so is no longer pure fantasy. It is now realistic to speculate, for example, that developments in nanotechnology might one day allow nanoscale robots to carry out constant repairs within our bodies, extending them well past their ‘use-by’ dates. Progress in cellular biology, meanwhile, may be putting us on the cusp of overcoming senescence, the natural degradation of cells over time that is at least a key part of ageing. But perhaps we should be asking ourselves whether trying to make death obsolete is a good idea in the first place. Here I find myself agreeing with the futurist Ray Kurzweil, who once mused: “Death gives meaning to our lives. It gives importance and value to time. Time would become meaningless if there was too much of it.” In having a finite amount of time, we are forced to grow, to try at our ambitions and to embark on journeys before it is too late. It is confidence in finding meaning in life, not resurrection or life extension, that can truly face up to death’s taboo.
Picture: Cat Saunders
In the name of science: future
Five unethical experiments that could greatly advance science.
Let the inner evil genius take hold. Here are experiments to transcend the frontiers of knowledge… if scientists’ moral compasses were due South.
1. Mind control

How do the nearly infinite neural connections of the human brain cooperate to determine behaviour? Much of what we know comes from studying brain injuries, which allow us only to crudely infer the functions of different areas by assessing the effects of the damage. Enter optogenetics, a technique which has been successfully used to control brain activity in mice. It involves a harmless virus which, when injected into the brain, renders ion channels – the switches that turn cells ‘on’ and ‘off’ – light-responsive. By flashing focused light beams into brain tissue (via fibre-optic strands), cell activity rates can be modified instantaneously and the effects on the subject’s faculties studied. Importantly, the changes would be only temporary. Just imagine if shutting down a few cells in the amygdala would allow you to shed your worst phobias. Well, simply allow some electric gizmos to be implanted into your skull and we’ll see.

2. Twin separation

The enigma of ‘nature vs. nurture’ has tormented developmental psychologists for decades. Yet there exists one obvious, under-exploited resource: identical twins, who share almost 100% of their DNA. A few studies have been able to track twins separated in infancy, but it is impossible for researchers to know the many ways in which the twins’ lives may have differed from birth. To tease out the interplay of genetic and environmental influences, scientists would need to separate twins immediately after birth and raise each in a pre-designed environment, every feature of which – from diet to human contact – would be rigorously controlled and quantifiable. Such an experiment could reap enormous benefits for geneticists and psychologists alike, finally offering a scientific insight into why twins raised together often develop contrasting personalities, whilst those raised apart can turn out highly similar… at the meagre sacrifice of free will and sibling bonding.

3. Womb swap

The obesity epidemic costs the UK billions of pounds every year. But are some people genetically predisposed to be overweight? This is just one question that could be answered by switching the embryos of mothers-to-be. Many of the most important epigenetic influences on development – the ways in which gene expression is altered by the environment – take place in the womb. For instance, the babies of obese women tend to be overweight, even before birth. To determine whether epigenetics or original genetic makeup is to blame, the eggs of obese and thin women, fertilised using regular IVF, could simply be swapped between mothers. But this is just one breakthrough that could be made through such an act of selflessness; the same procedure could also be used to demonstrate the effects of pre-birth exposure to toxins such as pollutants, nicotine or alcohol. If only expectant mothers weren’t so protective…!

4. Human-chimp hybrid

Humans and chimpanzees share 95% of their DNA, so how did the two species turn out so different? It’s been proposed that this is simply down to developmental timing, wherein certain features of our last common ancestor (e.g. small jaws) are expressed in adult humans but only in infant chimps. How to explore this? Produce a half-human, half-chimp. The experiment could be frighteningly easy; IVF has a good chance of seeding a viable human-chimp embryo, and scientists have already bridged the comparable genetic gap between baboons and rhesus monkeys. Whilst the possession of an extra chromosome pair by chimps would likely yield infertile offspring, this is not an insuperable barrier, as ‘zebroids’ attest. Chimps are born smaller than humans, so it would make sense to grow the embryo in a human uterus. This experiment could shed real light on our evolutionary origins… but for its major ‘ick factor’.

5. Embryo mapping

Stem cell therapies hold extreme promise for conditions including Parkinson’s disease and leukaemia. But regeneration of damaged tissues depends on stem cells being introduced at the precise stage at which they are needed; that is, the right genes must be expressed at the right time. How to determine what these are? By tracking cellular genetic activity as an embryo develops into a fully formed human. In principle, we already have the necessary tools; a synthetic virus could be used to insert a detectable ‘reporter’ into an embryo’s DNA, allowing researchers to track how different genes are switched on and off as each cell divides and differentiates into a specialised tissue – heart, lung, etc. This knowledge could allow us to direct differentiation for intended purposes (e.g. beta cells for a diabetic pancreas). Unfortunately, given the associated risks, no mother is likely to allow her offspring to be used in this particular science project!
Pictures: Madeleine Finlay
Five unethical experiments that greatly advanced science.
Delve into the dark history of scientific research, where we expose the most blatantly unethical experiments… and their uncomfortably valuable results.
1. “Acres of Skin”: Holmesburg Prison Experiments
Retin-A, the world’s leading acne treatment, was hailed as “youth in a tube” when it first hit the market in 1969, and was later found to be effective in treating leukaemia. Less well known is that Retin-A was developed literally on the backs of thousands of unwitting prisoners. For 23 years, dermatologist Albert Kligman convinced inmates at Holmesburg Prison, Pennsylvania, to serve as test subjects for chemicals ranging from relatively benign cosmetic creams to the carcinogen dioxin, in exchange for small stipends. “All I saw before me were acres of skin,” the personable Kligman told reporters. Although Kligman’s research was technically in keeping with standard protocols of the era, the main controversy surrounding the study was that of informed consent. Inmates subsequently claimed that they had not been made fully aware of the risks and sued Kligman for a spate of long-term health problems.
2. Obedience: Milgram Experiments
“Could it be that Eichmann and his accomplices in the Holocaust were just following orders?” asked Stanley Milgram as he embarked on his 1961 study into the psychology of obedience. Forty volunteers were introduced to a partner (a confederate of Milgram’s posing as a participant) assigned the role of ‘learner’, whilst the volunteer fulfilled the part of ‘teacher’. The learner was asked to memorise a list of word pairs before being strapped to a fake electric chair, control of which was handed to the ‘teacher’. Volunteers were then instructed to test the learner on the pairs, administering shocks of increasing intensity with each wrong answer. The learner, under Milgram’s instruction, gave predominantly wrong answers, feigning cries of pain accordingly. This study yielded one terrifying conclusion: 65% of volunteers followed orders to the point that – had the shocks been real – they would have killed a fellow human being.
3. Effects of Fallout: human radiation experiments
In the heat of the nuclear arms race, US researchers performed thousands of government-funded studies to assess the effects of atomic radiation on the human body. And not just any human bodies – almost invariably those of vulnerable minorities. These despicable experiments ranged from supplying 829 pregnant women with “vitamin drinks” (containing radioactive iron) to study how quickly the isotope crossed the placenta, to the release of radioactive chemicals over entire cities. Most conspicuous on the world stage was “Castle Bravo” in 1954, in which an H-bomb – the country’s most powerful device ever detonated – was set off at Bikini Atoll. The ostensibly “unexpectedly high” fallout was then exploited in the underhand Project 4.1, which evaluated the radiation poisoning suffered by the atoll’s 239 residents. The fallout dispersed globally, reaching Australia, Japan, India and Europe, and even three years later, the islands remained unsafe for habitation.
4. Conditioning: the Monster Study
For his entire life, Wendell Johnson was plagued by severe stuttering. It was the lacerating impact of his affliction that spurred his “Monster Study”. In 1939, the prevailing theory held that the causes of stuttering were physiological. However, Johnson, at the University of Iowa, was convinced otherwise. Determined to prove that stuttering could be learned by any child, he performed a six-month study on 22 orphans. The children were placed into control and experimental groups, after which half were given positive speech therapy, praising their enunciation, whilst the others were belittled relentlessly: “…don’t ever speak unless you can do it right”. Unsurprisingly, the effects of this bullying soon manifested as speech defects, withdrawal, and point-blank refusal to speak. So whilst Johnson demonstrated that stuttering could indeed be conditioned, the lasting impacts of his study quite rightly came back to bite, by way of a multi-million-dollar lawsuit.
5. The Unconsented Cell Line: the immortal life of HeLa
In 1951, at Johns Hopkins Hospital, a 31-year-old black farmer named Henrietta Lacks died from an aggressive cervical carcinoma. In her final hours, a biopsy was taken from Henrietta’s cancer without her knowledge or consent, from which was born modern medicine’s most ubiquitous cell line: HeLa. Unlike other cells of the time, which died within a few generations in culture, HeLa cells had the remarkable property of being “immortal” – capable of multiplying indefinitely. Their discovery launched a multi-billion-dollar industry and has made invaluable contributions to science, including development of the polio vaccine, IVF, and ongoing research into AIDS and cancer. Yet for over 20 years, whilst laboratories worldwide disseminated and capitalized on these “wonder cells”, the Lacks family, struggling in poverty, were left in the dark about the events surrounding Henrietta’s death, and saw not a penny of the profits.
Defying Darwin by Katharina Kropshofer
Within biological science, there is a man who represents what Mozart is to classical musicians, what Jamie Oliver is to amateur cooks and what Lena Dunham is to a new generation of feminists. The man’s name? Charles Darwin. Questioning him and his established theories is quite possibly one of the biggest taboos in science. Words of critique are often associated with creationism, an evolutionist’s sworn enemy.
However, as the 158th birthday of On the Origin of Species approaches, it’s time to revisit this hugely influential book within the context of the social changes and scientific developments that have transpired since its publication. This assessment demands a critical review of Darwin’s legacy. Is it taboo to say that Darwin has left us with many gaps to fill?

In 1831, on board the famous HMS Beagle, Darwin came up with his key theories. However, he waited another twenty years to publish them, and they may never have been published at all, if it weren’t for the naturalist Alfred Russel Wallace. Wallace sent Darwin a letter suggesting that natural selection could lead to species change over time, an idea almost identical to Darwin’s. It was this unexpected letter that led Darwin to quickly publish his findings, catapulting him to success. What made him wait? Was it fear of the established and popular views of the Church? Or was it clear to him that his theories were still missing something?

“In the distant future I see open fields for far more important researches,” Darwin wrote. He was apparently well aware of the potential to expand his research, but also of the flaws of his theory. He knew little about the variations and mechanisms for change between generations. Instead, it was the Austrian monk Gregor Mendel who established the basic rules of inheritance still considered valid today.

Darwin was a strong defender of the thesis that competition between individuals allowed only the winners to survive and reproduce. Conversely, scholars like Wallace believed that it was adaptations to environmental pressures that guaranteed species variety. This idea links to the theories of Jean-Baptiste Lamarck, whose ideas are normally considered incompatible with Darwin’s and therefore rejected. Lamarck stated that the influence of circumstances (“l’influence des circonstances”) leads to animals adapting to their environment during their own lifetimes.

Lamarckism in the 21st Century

Lamarck’s theory is known as soft inheritance and is regaining popularity among the scientific community today. Though Lamarck is often portrayed as an ‘old fogey’, his ideas were progressive: “On our planet, all objects are subject to continual and inevitable changes (…) They take place at a variable rate according to the nature, condition, or situation of the objects involved.” For a long time it didn’t seem like variation in phenotype (the observable characteristics of an organism) could lead to heritable change, but the field of epigenetics makes this notion very plausible.

Epigenetics studies the non-genetic mechanisms (notably a chemical process known as methylation) capable of altering how DNA is used. These epigenetic changes can then be stored and passed on to the following generations. It means that malnourished rats can give birth to undersized pups by passing on some acquired traits. It also means that elements outside the genome can regulate how genes are expressed, activated and de-activated. This is especially interesting when it comes to psychology, and the effects of traumatic experiences. Researchers from Emory University conditioned male mice to be afraid of a specific smell. The surprising thing? Their offspring responded to the scent in the same way. The only way this response could have been transmitted is via epigenetic changes in the mice’s sperm.

Moving away from molecular insights, our social behaviour can provide us with some more examples that throw into question the classical Darwinist approach. Could cooperation be the key to understanding how species evolve?

Martin Nowak, mathematician and biologist at Harvard University, is using game theory – often applied to economics – to produce mathematical models of conflict, cooperation and zero-sum games. These models can be applied to the way organisms interact on different levels, such as single cells cooperating to form multicellular organisms. In cooperation between species and individuals, this follows the logic of “tit for tat”, or reciprocity.

According to Nowak’s research, some organisms will even relinquish their own benefits to ensure the success of a companion. In terms of fitness, this means indulging our counterpart, and so could be considered to oppose ‘natural selection’. Nowak himself is a controversial figure in the field of evolutionary biology – not least because he is a staunch Christian – but he has also proposed interesting mechanisms by which cooperation may become favoured in evolution over selfishness. He draws on research into reciprocity to suggest a mechanism of ‘reputation’: an organism may help another due to the expected compensation by someone else. ‘Cooperation, not competition’: a possible extension to Darwin’s theories of inheritance and evolution. Also generally wise words to live by.
Einstein’s Test by Gaia Stucky de Quay
Over the centuries, shifting paradigms and shattering dogmas have paved the path of science. For every iconic scientist who establishes a new theory, another inevitably comes along to erase what was once written in stone. Copernicus, Pasteur, and Darwin, among countless others, produced some of the most revolutionary research in history, only to be met with dismissal, scorn, and even accusations of heresy.
Albert Einstein challenged centuries of scientific thought with his general theory of relativity and work on quantum mechanics, and today much of his work describing our universe is accepted as fact. Take, for instance, the speed of light, which is 299,792,458 metres per second. It is one of the most fundamental constants of nature. But physics is evolving at a fast pace; it expands, taking in more fields, and sharpens, becoming more precise and certain of the ways of the universe. In this age of ruthless scientific advancement, will there come a time when Einstein is brought down? And more importantly – will we resist it?

People naturally fear the new. In fact, they reject it with such frequency that there’s a psychological syndrome to explain it: the Semmelweis reflex. This phenomenon describes our reflex-like tendency to reject new information when it contradicts our established beliefs, and it originates from the tale of Ignaz Semmelweis. A young doctor in 1840s Vienna, Semmelweis noticed a connection between mothers dying after childbirth and doctors performing autopsies before delivering babies. Although germ theory was not then known, Semmelweis made the simple suggestion that doctors performing deliveries should wash their hands. In Semmelweis’ hospital, mortality rates drastically decreased when doctors washed their hands with a chlorine solution. Amazingly, he was ostracized by the orthodox medical establishment, who believed that ‘gentlemen’ could not transmit disease, and his life-saving findings were ignored for decades.

The tragedy of Semmelweis is a tale of the past, but the rejection and ridicule of outlandish theories still persists today. In 2011, the OPERA experiment announced that it had observed neutrinos violating the speed limit set by Einstein. The experiment involved firing neutrinos and measuring the time it took them to travel the 730 km from the CERN laboratory in Geneva, Switzerland, to the lab in Gran Sasso, Italy. When the neutrinos arrived 60 nanoseconds early, the OPERA team made headlines and shocked the world. An entire century of physics was about to be overturned. During this time, the scientific community doggedly scrutinized the results. Heinrich Paes, a physicist at Dortmund University, suggested that the neutrinos could be taking a shortcut through space-time, travelling from Geneva to Gran Sasso through other dimensions. However, only a few months later a loose fibre optic cable was found to be the culprit. When the neutrino anomalies were debunked, the
eccentric theories of Paes, along with many others, were torn apart, and support for Einstein’s relativity theory increased tenfold. Professor Ereditato, who led the experiment, was even forced to resign. Ultimately, if you plan on waging a war against an unwavering paradigm, the stakes are high, and the fall great.
Testable predictions

Here at Imperial College London, the battle continues. Contrary to Einstein’s laws, researchers have suggested that in the early universe, just moments after the Big Bang, the speed of light could have been higher. Professor João Magueijo, from Imperial’s Department of Physics, along with Dr Niayesh Afshordi at the Perimeter Institute in Canada, has made a prediction that could test this claim.

The structures in the universe that we observe today were formed from fluctuations in the early universe: tiny differences in density early on determined the distribution of galaxies, clusters and much more. Magueijo and Afshordi believe that these fluctuations were influenced by variations in the speed of light, and that they have left a record in the ancient radiation from the Big Bang – the cosmic microwave background (CMB) – which stretches out over the entire observable universe. Using their theory of a variable speed of light, Magueijo and Afshordi can predict this record in the form of a ‘spectral index’. Soon enough, it will be possible to compare the index obtained from their model with the increasingly precise CMB readings from cosmologists. Their prediction is a very precise 0.96478; the current estimate from readings is around 0.968 – close, but not yet close enough. Magueijo explains: “The idea that the speed of light could be variable was radical when first proposed, but with a numerical prediction, it becomes something physicists can actually test. If true, it would mean that the laws of nature were not always the same as they are today”. Let’s hope that when the decisive reading comes through, there are no errors or loose cables getting in the way!
Picture: Karolina Jankiewicz
A Pill for Empowerment
The contraceptive pill is a modern triumph. Meet Margaret Sanger, who fought to make it happen.
by Sophie Protheroe
In the nineteenth century, contraception was a taboo topic. Respectable women were expected to display modesty and sexual innocence. This pretence was dangerous for women in more ways than one. Firstly, contraception was not only controversial, it was illegal. In the US, the Comstock Law of 1873 forbade the sending of contraceptives, or information about them, in the mail. This meant many women were forced to resort to desperate measures to avoid unwanted pregnancies. Common methods of inducing miscarriage included drinking gin, ingesting lead or even ‘falling’ down the stairs.
The middle classes in nineteenth-century society adhered to clear gender roles, and so a woman’s role was defined by motherhood. There was a notion that the female body was designed solely for reproduction, and that the ovaries were the ultimate source of femininity. Rudolf Virchow, a German physician, declared that “All that we admire and respect in a woman as womanly is merely dependent on her ovaries.” Physicians examined the ovaries in search of the female ‘principle’, and in the 1930s, aided by the new field of biochemistry, sex hormones were identified and chemically isolated.

Knowledge of the chemicals involved in reproduction offered new possibilities for contraception, and the development of female contraceptives was pioneered by women themselves. Women began to demand the right to exercise choice over their own bodies. Margaret Sanger, an American feminist and birth control activist, believed that the most important threat to women’s independence came from unwanted and unanticipated pregnancies. Sanger was motivated by personal tragedy; her own mother had died of tuberculosis at the age of 50, weakened after giving birth eleven times and having seven miscarriages. Sanger had also witnessed the horrors of five-dollar, back-alley abortions whilst working as a nurse in poor communities. She began to devote more and more time to campaigning for better contraceptives. In 1914, Sanger coined the term ‘birth control’ and began providing women with information and contraceptives. This was a risky undertaking and she was arrested on several occasions.

By the 1950s, Sanger had won many legal battles, but her work was not finished. Women were still restricted by the limited birth control options available to them. There had been no advances in contraceptive methods since the invention of the diaphragm in 1842, and the subsequent introduction of the rubber condom in 1869. Sanger was searching for a ‘magic pill’ that could provide women with the contraception they needed. She had a vision of a ‘one-size-fits-all’ contraceptive that could be used by women of any age, race or class.

In 1951, Sanger approached Gregory Pincus, an American reproductive biologist, and persuaded him to start research on hormonal contraceptives. Women were so desperate for birth control that many wrote personally to Sanger offering themselves as participants in clinical trials. From this work, multiple variations of a female contraceptive pill were created and four large-scale clinical trials were organised. ‘Enovid’, containing progesterone plus a little oestrogen, was the most successful and became known as ‘the Pill’. The Food and Drug Administration approved the Pill in 1957 as a treatment for severe menstrual disorders, with the side effect of preventing pregnancy. Unsurprisingly, an astonishingly large number of women began reporting that they suffered from severe menstrual disorders. Finally, in 1960, the Pill was approved for contraceptive use. Although the Pill may look modest, an enormous amount of labour went into the development of this tiny tablet, and today it remains a potent symbol of female empowerment.
Sculpture: Amelia Owens
Genetic tools offer the chance to eradicate hereditary diseases. Misused, they could become instruments for prejudice.
by Andris Piebalgs
Eugenics has a dark history. In its simplest definition, it refers to the notion of improving the human gene pool through direct intervention. But the term brings up bad memories: a movement arose in the early 20th century which attempted to ‘purify’ the gene pool by forcibly sterilising patients suffering from mental illness. Eugenics is now seen as an abhorrent practice that violates human rights and enshrines the idea of a ‘perfect’ individual.

Recently, scientific developments have allowed us to read and edit DNA with unprecedented scale and accuracy. This technology offers the chance to eradicate genetic diseases, while raising fears of ‘designer babies’ whose physical traits could be hand-picked by parents. These fears have been brought about by two recent developments in the field of genetic engineering. Firstly, there has been a massive drop in the cost of DNA sequencing due to the adoption and increased commercialisation of Illumina dye sequencing. In 2001, it cost around $100 million to sequence a genome; nowadays, it can cost as little as $1,500. Secondly, the advent of the CRISPR/Cas9 genome editing technique has allowed researchers to edit DNA quickly, cheaply and effectively. It has been a breakthrough for scientists studying disease, allowing them to complete in a few weeks work that would previously have taken an entire year.

Researchers are already making big strides towards improving healthcare. Currently, pregnant women can undergo a procedure known as amniocentesis, whereby a small volume of amniotic fluid is extracted to genetically test the foetal cells for chromosomal conditions such as Down’s, Edwards’ or Patau’s syndromes. In fact, embryos created using in-vitro fertilisation (IVF) can be genetically pre-screened to test for any chromosomal abnormalities, allowing doctors to select only those they deem fit to be transferred to the womb.
Sculpture: Lois Liow

This procedure sparked debate when, in 2014, Richard Dawkins claimed that it is “immoral” to consciously choose to have a child suffering from Down’s syndrome. The Down’s community hit back, with the advocacy group Saving Downs saying that it is right for parents to have a choice and that “there is nothing more demeaning to someone living with a disability than to be told that your life is of such little worth”.

Additionally, with the use of CRISPR, scientists now have the tools to seamlessly change the genetic code in human embryos to weed out several harmful mutations at a time. However, there are concerns that editing the germline (the genetic code that can be passed on to subsequent generations) may be dangerous. In fact, Jennifer Doudna, one of the inventors of CRISPR, has called for a moratorium on editing the germline: “It is very important to consider the unintended genetic consequences of making an intended change, because there are all sorts of genetic interactions that occur in cells during cellular development, especially in humans but also other organisms as well.” It may take many years until making genetic changes to human embryos is deemed safe.

While many are excited by the technology’s potential to eliminate disease, others are concerned by its power to summon up the demons of the past. It is foreseeable that, in the future, parents may be able to choose the personality and physical characteristics of their offspring according to their own prejudices. Dystopian visions pop into people’s minds: a colourless future where everyone looks, thinks and acts the same. Some conjure up Brave New World, where the gulf between rich and poor is exacerbated by genetic differences. Although at the moment this seems unlikely (as many traits, such as intelligence, are not controlled by a single, easily modifiable gene), many still oppose the use of genetic modification. The Economist remarks: “There are those who will oppose CRISPR because it lets humans play God.
But medicine routinely intervenes in the natural order of things—saving people from infections and parasites, say.” Throughout recorded history, humans have been using tools to improve their lives. We have to ask, where does the line between medicine and meddling fall, and whose responsibility is it to decide?
Science behind the photo by Annabel King
Can I take a photo of you nude?
Clinical photography is the healthcare specialism that documents medicine’s weird and wonderful. Photographs of patients are routinely used for teaching, dermatology, surgical planning, and audit. As part of the medical record, clinical images are strictly non-sexual, but is photographing nudity still taboo? To investigate, I asked volunteers to strip off for a clinical photograph of their torso.
Objectivity and repeatability are imperatives in clinical photography. These are achieved through standardised lighting, alignment, positioning and scale. Standardisation extends to the subjects themselves: they are asked to remove all clothing and jewellery and to tie back their hair, and their face must be expressionless or excluded completely. The resulting image should offer no insight into the person behind the flesh. Volunteers’ anonymity was assured, but many remained hesitant to bare all.

While nude photos are commonplace in the media, we are rarely exposed to a representative diversity of human bodies. Instead, a distorted impression of reality is created, where all bodies are young, toned and flawless. Participants who felt that their body failed to conform to the socially constructed norm tended to be less willing to undress for the camera. I believe that all bodies deserve to be accepted, not measured up against an unrealistic standard.

Medical professionals become desensitised to nudity through repeated exposure. The numerous torsos in this series are arranged to mimic this effect. Prospective volunteers agreed that individuals become insignificant amongst the many, but some still felt uncomfortable imagining their own body in the line-up. Though non-judgemental or even complimentary of others’ bodies, participants were often highly critical of their own. Nudity and vulnerability seem inherently linked.
The taboo surrounding nudity encompasses issues of gender, age, body-confidence, culture and spiritual belief. It is evident in some of humanity’s oldest stories: in the Bible, Eve becomes ashamed of her nudity after eating the forbidden fruit. Body-shame today is alive and well.
Is clinical photography taboo? Perhaps. But it could be key to challenging cultural norms about the body. In this ongoing project I would like to represent the many diverse body shapes, skin colours and textures that exist. My purpose: to promote a broader acceptance of bodies in all their forms – ultimately, liberation from the taboo of nakedness!
Clouded Judgement A story of difficult sights and deliberate silences.
by Silvia Lazzaris
On the morning of 6 December, 1952, Mrs Huntington opened her bedroom window, lazily meandered through to the kitchen, and started to prepare her usual bowl of porridge. It was a particularly cold December and she could feel the damp, freezing air making its way down to her bones. Mr Huntington had already left. Nothing about that morning impressed itself on her particularly. For her, it was just like many others.

Mrs Huntington went out from her luxurious flat in Trevor Square, Knightsbridge, and crossed the road to wait for the bus to take her to work. She could sense something strange in the air, but wasn’t sure exactly what. The noise of a motorcycle got closer and closer. The sound stopped right behind her. Close. Too close. Despite everything, Mrs Huntington did know where things should be. She knew, for example, that a motorcyclist shouldn’t be on the pavement. “Which way to Knightsbridge tube stop?” asked the rider, his voice broken by panic. “Well, seeing as you’re on the pavement, carry on 20 yards, and you’ll go straight down the steps into the station!” replied Mrs Huntington with her unique tone that managed to perfectly blend perplexity, cheerfulness and indignation. “Oh my god, I’m so sorry, today I just can’t tell where I am!” “Oh, you don’t know where you are!” she thought. Instead, she said, “Are you alright?” But the rider had pushed his bike onwards and could not hear her anymore. “How rude!” she thought, mildly disappointed, yet somewhat amused.

Mrs Huntington’s job required her to be invulnerable to rejection, made her willing to work on reluctance, gave her the ability to break the barriers of the body, and helped her to deeply understand human reasons. Mrs Huntington didn’t have a common job, nor an official one. Her job meant sitting, talking, asking, answering, then bundling it all up together to give a sense to everything, and writing it carefully out on her special typewriter.
In the evening, Mr Huntington, a journalist for the Daily Telegraph, would read her notes from work, which she had left on the table. He would consider whether they held a good story inside. If they did – and usually they did – he would put them in his briefcase in the manner of someone borrowing money without permission, and have material for the next morning’s article. Mr Huntington liked to tell himself that her job was to be his unofficial assistant. Mrs Huntington had fallen in love with him because he smelt of a good man. At first, she hadn’t been able to detect his sloppiness, but when she did, she realised it simply meant that she couldn’t expect anything too horrible from him. He cared about his work – up to a point. He did his work well – up to a point. He liked people – up to a point. People liked him – up to a point. Because of Harry’s ‘up to a point’-ness, the union of the Huntingtons was enshrined in an impossibly unspeakable truth: that, without her, he would never have been anyone.

In front of her, cars were crawling along in a traffic jam. A little farther off, a mother was scolding her son in a neurotic tone: “Today you need to stay close to me! Do you understand?” The air was thick with something, and passed with difficulty through the nostrils. It was material, tangible. Every breath was like inhaling on a cigarette. A bite of coal. Mrs Huntington was born in London and had never gone anywhere else for longer than a fortnight: this fact made her very used to dust and the strong odours of the city. Indeed, she loved them; they were part of her identity. But on that morning of December 6th, there was something different in the air.
The bus hadn’t arrived yet. Time was sitting still, like the air around her. Mrs Huntington too felt still. The people moving around her were leaving an aura of stimuli. An old lady slowly shuffling past was wearing a tuberose perfume. A gentleman coughing loudly had just drunk peppermint tea. A girl, perhaps quite young, had jingling jewellery that slammed against her chest with each of her steps. The smell of coal impregnated Mrs Huntington’s nostrils again. Across the street, two children were pretending to be on a secret spy mission, but both were coughing and spluttering between scenes. Someone slammed the door of a car. The driver got out and began shouting incomprehensible words with an ebbing voice and withdrawing footsteps. Mrs Huntington thought this was strange, although admittedly nothing particularly out of the ordinary for a Londoner. And yet, she wondered, something was slightly off.

“Excuse me,” she asked a young man standing a few metres away, “do you know when the bus should arrive?” “With what’s around today, I don’t have the foggiest.” She had the feeling that he wasn’t looking at her, that he seemed distracted. Well, there was definitely something peculiar about that morning. Perhaps an accident? Or a fire? “I’m sorry to bother you, but could you elaborate a little on what it is that is happening today?” “Lady, seriously, don’t you see the problem?”

Oh, there was the problem. When the problem is not being able to see, either Mrs Huntington utterly understands what it means, or she doesn’t know what you’re talking about at all. White, black, red, green: for Mrs Huntington these words are empty of meaning. People usually like to ask her if she sees all black, or all white, or any kind of shadow, but she never knows what to answer. She could try to explain that, for her, without sight, there’s no such thing at all. There’s no black, nor white. Light does not exist.
The world of Mrs Huntington is made up of sounds and smells: tangible substances that make up an atmosphere dispersed around things and people. She doesn’t need to look upon a face to figure out whether it is beautiful, or whether it is lying. She doesn’t have to be beholden to the eyes in order to understand the emotional states of the people around her. She doesn’t even need to observe the colour of an apple to see if it’s rotten. There is a whole realm of fundamental details that you can smell, touch, hear and feel in the body, without sight, to understand the world. Often, Mrs Huntington felt concerned that other people might lose these essential details just because they were constantly overwhelmed by having to see too much. People were always getting distracted by the irrelevant surface of things. They relied on the sign of a shop to orientate themselves in space; they focused on a glimpse of a smile or a raised eyebrow, misunderstanding the tone of voice – and by doing this, they missed the real truth of it all. Mrs Huntington sought these eclipsed meanings with a peculiar kind of investigative rigour, pulling together information and building a structure that would give sense to the world. Until that moment, however, Mrs Huntington had missed something very important: London was, in fact, muffled by fog, as if the whole city had been picked up and dipped in whipped cream. She couldn’t have known, until that moment, that everyone was blinded apart from her. But now she could see what was so strange about December 6th. A scream interrupted the revealing exchange with the young man. A group of people were shouting confusedly, a little way from the two of them. With difficulty, she got up from the bus stop bench and walked toward them. “A doctor! He needs a doctor!” they shouted. An old man was lying on the ground. “A doctor, quick!” Mrs Huntington cried out with them. She turned to a woman. “What’s happening?” “He’s choking!”
He was coughing to death, and they were coughing with him, despairing, screaming like a bunch of crazed monkeys. “The hospital! A doctor! Help!” Then, finally, a long gasp from the ground plunged them into silence.
She ran home and called Harry. Everything was fine. He’d had a mad day; he would tell her about it once he was back at home. No, he wasn’t coughing. Yes, he was fine. How was she? She was fine. A little shocked; she had to tell him what had happened. Yes, he was about to leave work; it might take him longer, because of the fog. Mrs Huntington went to her room and picked up the suitcase with her Braille typewriter. She took it into the kitchen and placed it on the table. She crossed her legs, then uncrossed them, crossing and uncrossing again and again, whilst leaning her elbows on the table. Eventually, she got up, sat back down, and put together the pieces of what she had smelled and heard, trying to make sense of the sick air, and of that man’s gasp. “I have a story for you,” she said, welcoming her husband home. “The paper’s already full,” he replied with an awkward satisfaction. Apparently, the fog allowed people to hide, to be unrecognisable, to escape easily. London was in chaos, in the hands of pickpockets on their bikes, in the hands of thieves climbing up the façades of the buildings: looting, beating, killing. Ambulances couldn’t rescue many people; there were too many cars lying abandoned in the road. Mrs Huntington felt all the excitement of her husband, his compassion blended with the enjoyment he felt when dealing with disasters. Yet she also felt embarrassed for him. “What did you want to tell me?” he asked, still revelling in the particulars of his own stories. “A cough,” replied Mrs Huntington. A cough? Yes, the air was making everyone ill. It had taken away a man’s breath, his life, as he lay on the street. “Well, humidity makes people sick! But coughing doesn’t kill you, you don’t suffocate on the spot! He must’ve already been ill,” Harry rebutted. “But still: whilst it remains, the Telegraph should be advising everyone not to go outside,” argued Mrs Huntington over Harry’s ignorance.
But why would the fog persist? “Because it smelled like the pit of a fireplace, Harry. It wasn’t fog, it was coal.” Harry implied she was catastrophising, as always. Mrs Huntington never catastrophised, though; she handled her senses with care and balance. “Anyway, how could London continue
Picture: Kalyani Lodhia
to function if everyone barricaded themselves at home? What’s the sense of making people panic over a bit of fog?” the dear, sloppy husband went on. Maybe it is worth the panic, suggested Mrs Huntington. How? For example? “Well, if the coal factories are poisoning the air…” She stopped for a moment. She already knew what his answer would be. She already knew that her thoughts would sound like nonsense to the majority of people. She went on anyway. “Perhaps, for example, we should ask whether the coal factories are so necessary?” “Don’t be ridiculous,” he replied with the classic tone he always used to try and patronise her. “It’s impossible to live without coal,” he said. Other words sounded like they were stuck in his throat, as if he was trying to think about something, trying to picture something in his head. He concluded, “It’s impossible to even imagine a working world without coal energy. You just don’t know how these things work, Emma.” At that last sentence, Mrs Huntington decided to drop the discussion. She got up from her chair at the kitchen table and began to prepare the dinner. In any case, she would leave her straightforward story of the choking man on the table. Then, accepting or refusing the truth would be his decision – the predictable decision of a sloppy man. In the end, Harry, as always, would decide to close his eyes to unpleasantness. He would decide to shut his mouth, take his fingertips off his typewriter, and consciously avoid the hard path. He was a coward, a victim of the desperate need for social recognition: what was true – or rather beneficial – to the relevant people, was true to him. Stick to the approved version; that was his journalistic golden rule. For this reason, Harry often deliberately made himself mute, deaf and blind to those truths which were too painful, too awkward, too difficult to mould into something graspable. Not so for her.
There was definitely something big going on for Mrs Huntington, on that December 6th: she had never felt less blind in her life.
Making A Killing
Investing in research and development of weapons is a moral minefield which we must tread carefully.
by Aran Shaunak
Killing people isn’t a nice thing to talk about, yet in the UK we spend over £35 billion on our military every year. A lot of that money goes into research and development to produce new weapons of war. It’s not something we discuss often, but is spending vast amounts of money to find more efficient ways to kill people actually ethical? The cutting edge of 21st-century weapons development is in robotics. If you’ve read the news in the last five years, you’ll be aware that weaponised unmanned aerial vehicles – or, as they are better known, drones – have become the new weapon of choice among western countries. Throughout the wars in Iraq and Afghanistan, Predator drones were deployed to great effect; they allowed precision targeting of high-value targets deep in enemy territory without risking the lives of soldiers on the ground. It sounds like a perfect solution. However, the actions of a drone are entirely dependent on the quality of the intelligence fed to it. With a simple aerial view, a drone cannot replicate the judgement a soldier on the ground might exercise before pulling the trigger. For example, imagine a drone hovering over a school. If the intelligence says that it is an ISIS training camp, that drone will likely fire. A group of soldiers with the same information, however, would be able to assess the situation in a way a drone couldn’t, and would be unlikely to execute a building full of schoolchildren. If we are taking human lives, even in war, people deserve a moment of consideration before the trigger is pulled. This is certainly worth reflecting on; estimates of civilian casualties from the seven-year American drone programme in Iraq and Afghanistan are hotly contested, ranging from 117 to over 800.
Yet, common sense suggests that another real benefit of using drones is the decreased stress and pain for soldiers, since drone operators can be separated from their targets by thousands of miles. However, research by the US Defence Department has indicated that drone pilots suffer the same levels of PTSD as soldiers in combat. It seems that pulling the trigger has the same effect on a person regardless of whether your target is in your sights or on a screen. So robotic weapons aren’t perfect. What about chemical weapons? Absolutely abhorrent. Biological weapons? Arguably worse. Not acceptable to develop, make or use under any circumstances. Except, not quite: our own government has had considerable interest in bio-chemical warfare. Porton Down: the British Government’s secret country pile in Wiltshire. It was founded in WWI with the directive of researching the chemical weapons being used at that time, such as mustard gas and chlorine. After WWII, when new nerve toxins were found in German army stockpiles, it played a key role in analysing them. But Porton Down’s activities didn’t stop there. Since then it has taken those nerve toxins and perfected them, developing a particularly potent weapon known as VX. Human trials have even been performed with chemical weapons, one of which resulted in the death of Aircraftsman Ronald Maddison in 1953. But it isn’t just chemical weapons. During the Cold War, the Ministry of Defence released (supposedly harmless) airborne bacteria into the Underground to study the effects of a biological attack on London. Even today, Porton Down’s secure labs contain samples of some of the most aggressive and dangerous pathogens the world has ever seen. Ebola, anthrax and the plague are all on our doorstep. If they get out, the citizens of this country could become casualties of the
efforts we have taken to defend ourselves. So our hands are not exactly clean. In the UK’s defence, the government has since signed, ratified and obeyed the Geneva Protocol, which prohibits the use of chemical weapons, and despite developing VX the British have never (as far as we know) used it in war. However, to this day Porton Down still manufactures both biological and chemical weapons. Can we justify producing such weapons? Undoubtedly, there is a constant risk that such agents might be used against the British Army, so they must be studied in order to develop effective defences and treatments against them. For this reason, a soldier might say yes, making these weapons is justified, since the research and development done in these labs is their best chance of defending themselves from such weapons on the front line. Certainly, I would argue that since our armed forces are putting their lives on the line for the rest of us, we have a duty to protect them from biological and chemical weapons as best we can. For many, however, the ethical and safety concerns around the research done at Porton Down are far too great to justify these lines of research. As is clear, developing drones, making chemical weapons and storing the world’s most dangerous pathogens are grey areas at best. But let me leave you with a more difficult question. It recently came to light that cluster bombs (made illegal in the UK in 2010) had been sold by the UK to Saudi Arabia and were being used in Yemen. It’s an example of a bigger problem: the UK exports around £9 billion of weaponry annually. We might feel we can trust ourselves with these weapons, but can we trust our clients? In fact, much of that weaponry is sold to regimes which are recognised human rights abusers. How do we justify that?
Picture: Lizzie Riach
Know your grubs
Snacking on scorpions and crunching on crickets doesn’t sound appealing, but it might just be the food revolution we need.
by Judit Agui
In a cosmopolitan city like London, when it comes to choosing our snacks, we can be quite adventurous. It is easy to find a friend or two who has brought back chilli buffalo worms from their visit to Vietnam, or toasted grasshoppers from Mexico. Being curious, we give them a go, but never really think about eating them again. But what if we could include insects in our daily diet, and what benefits might that bring? In the past few decades, concerns about ethical crops and food waste have made the most environmentally conscious of us try to reduce our meat consumption or turn to veganism. However, without replacing the nutrients found in meat and dairy, a vegan or vegetarian diet can be detrimental to our health, and in some cases perhaps even lead to malnutrition. Entomophagy, or the human use of insects as food, seems to be a very promising solution to all of our sustainable-eating issues. As money matters, let’s start with the benefits for our local and global economies. Insect farming is more cost-effective than conventional farming because it requires less energy, less space and less time. In comparison to beef, where only 40% of the animal is edible, crickets, for example, are 80% edible: significantly increasing efficiency in the production and use of resources. Eating insects could not only boost our economy but also help protect our crops against pests and epidemics. Producing far fewer greenhouse gases, insect farming would also help to reduce the massive ecological footprint of meat production, which occupies 70% of agricultural land. Entomophagy is not only good for the environment and our economy, but also good for our bodies. Per gram, insects have as much, or even more, protein than the majority of fish and meat. They also contain essential nutrients such as calcium and vitamin B12. Not only this, but as they are taxonomically
different from humans, insects are also much less likely to transmit diseases to their human companions than conventional meats! So, if eating grubs is nutritious and environmentally sustainable, why are we not gobbling them up? One of the biggest challenges in entomophagy
is obtaining public interest. For us meat-eating westerners, anything with lots of legs that grows in dirt seems like an extremely unappetising alternative. Interestingly, this is not the case everywhere. Many other countries have a long tradition of dining on insects: about two billion people around the world eat bugs. In Mexico, for example, it’s common to find maguey worms at the bottom of Mescal bottles as a seal of authenticity, and in Vietnam coconut worms are eaten alive to preserve their juiciness! Designing more appealing formats, such as protein bars, could be the solution to making entomophagy a reality in the west. However, Indroneel Chaterjee, a scientist at Brooks University who is challenging the stereotypes of insect eating, argues that “People are mainly interested in the novelty, they are looking for the experience and not the benefits.” This means that when it comes to eating insects we would rather try a live buffalo worm than bother with a sticky protein bar. So, even if we do try insects, it could prove difficult to get us to incorporate them into our diet as a viable protein source. Yet, as I have discovered, including insects in your daily meals is not really that hard. To the dismay of my flatmates and the amusement of my Twitter followers, I recently ordered a bag of grasshoppers from EatGrub. I must admit I wasn’t that excited when they arrived, and it took me a couple of days to motivate myself to try them. Apparently, you can grind them into flour for batter or meatballs, or add them to a curry or salad. In the end, I decided to try them in a stir fry. The result was actually quite good: the insects added a crunchy texture and umami taste, giving the dish an interesting twist. Eating insects is not going to be for everyone. If you’re a bit squeamish, you might find it tough to sprinkle crickets into your morning smoothie. However, if, like me, you crave culinary challenges, grab some grubs and try out my recipe on the next page.
Picture: Leanna Crowley
Picture: Judit Agui and Leanna Crowley
Throughout history, women’s bodies have been misunderstood and mistreated. Are we still misinterpreting women’s health?
by Madeleine Finlay
A month ago I was sat on a bus, creeping its way from Peckham to South Kensington. I was running late that day. I’d slept through my alarm, as I often do, and had rushed to get ready – paying little attention to drying my hair or finding matching socks. But I had managed to catch the bus on time, and was finally able to sit and slow my breathing. As the big metal box encasing me trundled along the grey streets, I let my mind wander back to my bedroom, and I began to review the rush to get out the door. Damn. I knew I had forgotten something. Always,
I always forget one thing. That morning I had forgotten to take my pill. I have a specific alarm for it on my phone, but I am an expert at turning off alarms and going back to sleep. Which I must’ve done, and in the ensuing chaos of waking up late I hadn’t remembered to take it. And now I was on the bus, already late, and I was not going back. Damn. I sat, stewing over my mistake, and began to feel jealous. I felt jealous that my male friends don’t have this problem. They don’t have to wake up every morning to take a pill, or have something implanted into their arms or genitals, or go for an injection. They don’t have to think about increased risks of
cancer, stroke, depression, mood changes, weight gain… How had it come to be that my fertility had to be managed, like an illness, rather than theirs? And when would I be freed from this plague of pill-taking? My experience is common. UK statistics from 2006-2007 showed that 84% of women aged 20-24 were using contraception, and of those, 80% were using either the pill, the injection or the implant. Long-term contraceptive use does decrease with age, but medical interventions continue throughout a woman’s life. During pregnancy, there are scans and blood tests, and childbirth has moved from the home into the hospital. In the UK, the frequency
of caesarean sections has grown from 10% of births 30 years ago to nearly 25% of births today. And, once they hit menopause, an estimated one in 10 women in their 50s will use hormone replacement therapy (HRT), taking daily oestrogen pills. Clearly, there are advantages to the medicalisation of women’s reproductive systems. However, condoms remain the only accessible contraceptive option for men. The necessity of many procedures during pregnancy and childbirth, including the rising rate of caesareans, is now being questioned. The menopause, a natural change in hormones for women as they age, is regularly referred to as a ‘deficiency disease’, and emerging evidence shows that HRT may be far more harmful than beneficial in many cases. So, there are definite disadvantages too. You might assume that this is a modern issue. However, women’s bodies have been medicalised for a very long time. So long, in fact, that we may step all the way back to the 5th century BC and place ourselves amongst the ancient Greeks. We stand together in a cramped, dusty room, filled with scrolls and strange-looking instruments, and watch as a midwife places a stinking, acrid substance next to a woman’s face, whilst a sweetly scented liquid sits near her vulva. A physician, overseeing the treatment, notices the confused look on your face and explains that the woman’s uterus is wandering throughout her body, causing anxiety, fainting and nervousness. He has consulted Hippocrates’ new work ‘On the Diseases of Women’ and believes the uterus has meandered upwards, and so is trying to draw it back down to its proper place. The woman on the floor says nothing, but gags slightly. Despite his incantations, the uterus keeps wandering. It wanders all the way to the Middle Ages, and so we follow it there too. It stops, and becomes filled with fluid, causing insomnia and irritability. The physicians of this age are sure it can be treated with marriage and regular sexual intercourse.
Of course, the fluid doesn’t go away, and so the uterus begins on its journey again, to the 17th century. Here, within a dark and dreary room, barely lit by a few lonely candles, we are told once again that the fluid can certainly be cured by marriage and regular sex, because, we
are assured, semen has healing properties. We exchange a smile and a raised eyebrow, and leave the uterus here. Let us step forward through time once again, into the early 20th century. We sit in a doctor’s office and watch as woman after woman expresses natural emotions, interests and sexual desires. The doctor believes that they are ill with hysteria and recommends treatments such as smelling salts, bed rest, herbal remedies, orgasms, surgical removal of the clitoris and abstinence from reading. As I clasp my hands in frustration, we spin into the 1960s – and another off-white office lined with books, where we see the doctor giving out prescription after prescription for diazepam, under the brand name Valium, to all the women who are dissatisfied with their lives. Society is not the problem, it is the women, but they can be fixed with minor tranquilisers.
The walls around us begin to swirl and expand outwards, and we find ourselves gliding through the years to 2017. We are sat on blue plastic chairs in a school classroom, surrounded by 12-year-old girls fidgeting behind desks, facing a whiteboard screen showing a diagram of the female reproductive system. The teacher is describing the monthly cycle. She explains to the class that we might experience PMS, premenstrual syndrome, with symptoms like bloating, breast pain and mood swings. No one looks particularly happy at this oncoming prognosis. One girl asks if there’s anything a doctor can do. She knows that syndromes and symptoms can usually be fixed with medication. The room freezes, and we have time to think about those words for a moment. Syndrome. Symptom. Suddenly this natural biological process appears to us as foreign, abnormal, frightening. We’d quite like to fix it too. The scene goes black. A computer screen lights up in front of us. We see the website for The London Clinic, the UK’s largest private hospital. We are on the ‘conditions’ section, under Problems with Breast Development. Placed in the very centre of the screen are the words: ‘The most common breast development problems that women experience result in breasts that are too large or too small, or that have sizes and shapes that are not perceived as “normal”.’ Medical terms flash in front of us. Hypomastia. Gigantomastia. Ptosis. Small, large, sagging breasts. We crash downwards. You’re on the bus to South Kensington with me. You notice, as the bus is swaying around the corner, that I’m scrawling something in a small leather notebook. The female reproductive system isn’t an illness with syndromes and symptoms; it’s as natural and normal as a beating heart or a pulsing nerve. Diversity in the female body isn’t a medical condition to be fixed towards a constructed ideal; the differences are beautiful and interesting and make us who we are. Society, and all of us in it, should reflect on how women’s bodies are perceived, understood and treated. The bus has stopped. You watch as I stuff my notebook into my bag, begrudgingly pull myself up and carry on with my morning.
Pictures: Helena Spooner
The Great Debate
Should we be searching for scientific evidence that sexual orientation is genetically determined?
Challenging the choice
by Marek Wolczynski
‘We are all equal!’ An ideological hope, but sadly not a reality. Widespread tolerance of each individual culture, tradition and lifestyle is yet to be realised. It could be argued that genetics is a source of inequality, as genes are, in fact, what make us all different. But could understanding genes actually help us fight for equality? Most genetic research focuses on the physical manifestations of our DNA. Instead, what if we put more effort into researching how emotional traits and behavioural differences, such as our mental health, are part of our genetically inscribed identity? Specifically, can genetics explain why we love who we do? Understanding the secrets of our DNA could lead to increased acceptance of sexual diversity, potentially reducing discrimination against homosexual people around the world. Discovering the genetic background of sexual orientation would show the connection between the physical and emotional side of humans, and finally help to dismiss
homophobic arguments, such as the idea that gay people might have been influenced into their sexuality. It could illustrate the emotional complexities hidden inside everyone, confirming the power of what is encoded in our genes. One of the first studies into the genetics of sexual orientation examined chromosome linkage among gay brothers, and showed that they tend to have more gay relatives on the maternal, rather than the paternal, side. This suggests that sexual orientation may be a result of genetic inheritance involving the sex chromosomes. Further research led to the discovery of a region of the X chromosome, Xq28, that appears to be shared more often by gay men: nicknamed the ‘gay gene’. So, there is already some scientific evidence to show that sexual orientation is, at least in part, dictated by our genes. This scientific understanding of sexual orientation could be used to challenge prejudiced attitudes towards gay men and women. Revealing, understanding and explaining the genetics of love is the way to demonstrate that this diversity is
a natural and uncontrollable element of humanity. If love need no longer be considered a matter of choice, but rather a beauty determined by nature, driven by the miraculous power of DNA, then the punishment of, or discrimination against, gay people could not be excused with arguments about emotional behavioural norms, ‘lifestyle preferences’ or chemical imbalances. Research into sexual orientation can give powerful insight into human love and can suggest possible explanations for homosexuality, based on the patterns written into our genetic codes. Exploring homosexuality from a scientific point of view could prove that sexual orientation is genetically driven behaviour rather than choice. This has important consequences for understanding ourselves and others, and should increase tolerance in our society. Surely we must keep researching.
Picture: Maddy Dench
Understanding our undoing
by David Walker
What, exactly, are we trying to prove? What purpose would it serve to discover a genetic component which controls human sexuality? Perhaps a rebuke to the people who still say ‘lifestyle choice’ – as if such factual caveats ever have much effect on a person’s prejudice and hate. Nor does a genetic component a race make. Should a ‘gay gene’ be found, its carriers would not suddenly be treated as a scattered, diasporic race to be covered under existing racial discrimination laws. Such a thing would change nothing in the political struggle for equality in progressive countries, or the struggle to survive in theocratic ones. In fact, such a discovery presents an all too real danger. Last year, the American NGO Human Rights Watch compiled evidence that at least eight countries in Africa and Central Asia continue to use forced anal examinations as evidence against men and transgender women accused of homosexuality. This repellent practice, which they claim can reveal homosexual conduct, is a hateful hybrid of sexual assault and torture with no basis in science. But imagine if they did have science on their side; imagine if the 70 or so other countries in the world that criminalise same-sex relations could use a simple, inescapable and untrickable genetic screen to prosecute people. At least ten nations still reserve the death penalty for such a heinous crime, not to mention the vast areas under the control of militant groups such as Daesh (the self-styled Islamic State), who take special pleasure in hunting ‘perpetrators’ for public execution. Simple genetic testing, an inevitable consequence of discovering a reliable
genetic component, would make such enterprises far more efficient. For extremist-controlled lands, population-wide screening, and the chilling historical implications of identification, would hardly seem a far-flung prospect. These issues do not extend only to the more barbaric corners of the world. Given that many people within our own ‘developed’ societies view homosexuality as a disorder or a bodily malfunction, it is no stretch to conceive of sexuality testing making the roster of in-utero screening we already use for many genetic diseases. Selective abortion is not even the end of it. Any system which is able to control sexuality could be targeted with drugs or even medical genetic manipulation, making the already commonplace, and often non-consensual, ‘gay cure’ therapy a tangible reality. Like many, my intellectual curiosity about the origins of human sexuality is immense. How could such a seemingly counter-Darwinian phenomenon have evolved within us in such numbers? Are we a symptom of communal altruism? Epigenetic ephemera? A by-product? A mistake? How can we possibly begin to explain the diverse spectrum of human sexuality? While some of these attempted explanations may be more complimentary than others, the implications for personal identity are utterly trivial compared to the catastrophic potential of opening this Pandora’s Box. Any desire to know the answer, be it for the sake of validation, justification or simply scientific enquiry, must surely be outweighed by what might be an immense human cost. Understanding ourselves may come at the cost of many lives – gay and otherwise. We should divert that funding to HIV research, and maybe save some lives instead.
The Best Result
Truth and objectivity. Two pillars of scientific practice on which knowledge can be safely built. Or, perhaps not…
by Henry Bennie
The startling red-ink scribblings of Robert Millikan’s 1912 lab books brazenly flaunt a disregard for objectivity. The notebooks record each iteration of his and Harvey Fletcher’s celebrated oil-drop experiment, which isolated and measured the charge of the electron. Millikan received worldwide recognition for the experiment, including the 1923 Nobel Prize in Physics. The textbook ‘scientific method’ of dispassionately testing a hypothesis and subsequently attempting to falsify it, championed by the philosopher Karl Popper, had clearly been abandoned by Millikan. Ambitious to present the ‘best’ results, he cherry-picked the data, excluding results which seemed erroneous and problematic and including only the data he liked. Science has often progressed through this troubling method, and many researchers admit that they know of others who have been selective with their data in order to get their work published, or have been so themselves.
Pictures of Robert Millikan’s 1912 lab book, courtesy of the Archives, California Institute of Technology, showing his notes for the oil-drop experiment which measured the charge of the electron.
Millikan later won a Nobel Prize for this work.
Luck would have it that Millikan was correct (his result differs by less than 1% from the currently accepted value), but this practice of selective reporting may be seriously harming science. Although Millikan consciously decided to select his data, even more ‘objective’ and careful scientists may fall prey to selective reporting because, like all humans, scientists suffer from cognitive biases – common ways of thinking that lead to wrong but convenient answers. The most problematic bias in science is known as ‘motivated reasoning’, where scientists interpret observations to fit their hypothesis. This is likely to be exacerbated by the stresses of an academic career. To advance her career, a scientist needs to be published as often as possible, in the highest-profile journals possible. To do this, she will need to produce articles that are more likely to be published. It turns out these are articles that report positive, original, clean results (‘I have discovered…’ papers), whereas negative, confirmatory or unclear results (‘I have disproved…’, ‘We confirm previous findings…’, ‘It is not clear how to interpret…’ papers) are less likely to be published, and are often considered less important, or uninteresting, by journal editors. This can be seen in the content of journals: currently only 13% of papers report negative results – down from 30% in 1990. The saviour, at least at first glance, should be the process of peer review and scientific publishing, because even if an individual scientist fails to be objective, others usually have no hesitation in critiquing their results. However, the scheme is slow and is currently facing a crisis: reproducibility.
In 2016, Nature conducted a survey on reproducibility in research and found that more than 70% of the 1,576 researchers surveyed had tried and failed to reproduce another scientist’s experiments. More than half had failed to reproduce their own. The ability to reproduce experiments is at the heart of science: if you cannot reproduce an experiment, you can neither verify nor challenge its results. The absence of reproducibility from research and publishing is both widespread and worrying.
Nature asked in the same survey what researchers felt had led to the issues with reproducibility; 60% of respondents pointed to two factors: pressure to publish – and to publish new findings rather than reproduce experiments – and selective reporting. Both problems are clearly aggravated by competition for grants and tenure. Under current pressures, and with the looming crisis of reproducibility, the peer review and scientific publishing scheme is failing to be objective. If journals don’t want to publish studies that verify results, and scientists are not rewarded for verifying them, are we building the great castle of science on foundations of sand?

However, if Millikan’s cherry-picking of results and the issues with reproducibility are distressing, they are nothing compared to the unspeakable taboo in science, which goes against everything science purports to be: fraud. Retraction Watch, a blog which tracks articles withdrawn from publication, reported that of the more than 800,000 papers published in 2015, only 684 were retracted. Set this against the best current, but probably conservative, estimate by the Stanford researcher Daniele Fanelli: approximately 2% of scientists have falsified data at some point in their career. It seems safe to assume that a large proportion of fraudsters are slipping by, infecting science with false claims – claims on which others will build their own work.

Dangerously complacent or pragmatic?

“The number of dishonest scientists cannot, of course, be known, but even if they were common enough to justify scary talk of ‘tips of icebergs’, they have not been so numerous as to prevent science’s having become the most successful enterprise (in terms of the fulfilment of declared ambitions) that human beings have ever engaged upon.”

– Peter Medawar, the British immunologist and Nobel laureate, 1983

Certainly, many researchers believe that the rate of fraud is much higher than the estimated 2%, and that the daunting job of policing and identifying fraud in science should not rely on individual whistleblowing scientists – a rather thankless task, since accusing a colleague of scientific misconduct is, as the historian Daniel Kevles observed, ‘akin to pederasty among priests.’ To help catch these frauds, and to spot honest scientific mistakes, an overhaul of the secretive and elite world of peer review and scientific publishing is needed.

Michele Nuijten at Tilburg University demonstrated that approximately half of all psychology papers in journals contained a statistical error.

C. Glenn Begley and Lee M. Ellis reported in Nature that they were only able to replicate six out of 53 ‘landmark’ cancer studies.

John Ioannidis, a professor of medicine at Stanford University, published an incendiary paper titled ‘Why most published research findings are false’.

When a scientist publishes a paper in a journal, the process of peer review takes place in secret and is not discussed publicly; any subsequent retractions, edits or comments are passed by the journal’s editors before the public is informed. Should scientists be pushing for a more transparent and open scheme? One area of physics is already there: high-energy physics. High-energy physicists have put in place a system of distributing preprints of papers through an open-access repository. The benefit of their system was highlighted in 2011, when a neutrino, a sub-atomic particle, was measured as travelling faster than the speed of light. Einstein’s theory of relativity proposes an absolute speed limit: the speed of light in a vacuum. Overwhelmingly, scientists favoured the view that there must have been an experimental flaw. The open access to the data, and the preprint of the paper, allowed – and actively encouraged – many physicists to scrutinise the experiment and its results. It was quickly confirmed that a fast-running clock and a faulty connection had produced the mistaken figure for the speed of the neutrinos – a conclusion that could have taken years to reach if left to the process of peer review. So could such a system be a good method of identifying scientific fraud, and part of a solution to our objectivity problems too? Should we make scientific data public and have it checked by algorithms during, and after, the publishing process? To any working scientist this may sound unimaginably draconian, as well as creatively stifling. But, for an enterprise which is largely publicly funded, shouldn’t we do everything we can to ensure scientists remain objective and truthful?
Lines in the Sand
Progress in embryology research techniques has allowed scientists to reach the legal limits of human embryo development. Is it time to reassess the rules?
by Tim Davies

In 1984 a line was drawn in the sand: research on human embryos was limited to 14 days of development, effectively setting the point at which an embryo becomes too human to be experimented upon. Today, however, recent scientific advances have led some researchers to call for the rules to be relaxed. The constraint on the experimentation on embryos to 14 days post-fertilisation, the so-called ‘14-day rule’, was first proposed in 1979 by the US Ethics Advisory Board. Currently, 12 countries around the world have enshrined the limit into law, while five others have 14 days written into their scientific guidelines. Clearly, a solid international consensus exists, and many nations and scientific advisory bodies have come together to agree on this cut-off. But why? What is so special about 14 days?

The first reason is a philosophical distinction: fourteen days is the first point at which an embryo can be said to have an individual identity. Before this, embryos can still fuse, or split to form monozygotic twins. Secondly, at this point a band of cells known as the primitive streak forms, marking the beginnings of a head-to-toe axis – the start of the transition from a ball of cells towards something that begins to resemble a foetus. Researchers also favour the primitive streak because it is easy to identify, providing a useful developmental marker, as embryonic progress may not always happen at the same speed.

Until recently, the rule was not a concern for scientists: rarely could researchers maintain embryos past seven days outside of the womb. This changed last year, when two ground-breaking studies reported new techniques to create a chemical mimic of the womb. Using these methods, embryos could be grown for as long as 13 days. For the first time, researchers found themselves butting up against the 14-day rule.

Now some scientists are calling on the UK government to look again at the rules, and extend the limit past 14 days. One such advocate for change is Professor Simon Fishel, who was part of the team involved in the creation of the first IVF baby. He supports moving the limit to 28 days. Speaking to the BBC, he said: “I believe the benefits we will gain by eventually moving forwards, when the case is proven, will be of enormous importance to human health.” In particular, many cite the impact such work would have on our understanding of fertility issues and our ability to treat them. But some are opposed to an extension of the limit, Professor Fishel concedes: “There are some religious groups that are fundamentally against IVF, let alone IVF research in any circumstances, and we have to respect their views.” Indeed, some fear that extending the 14-day rule could be the beginning of a slippery slope. The bioethicist David Jones, founder of the Centre for Bioethics and Emerging Technologies, is one critic of the idea, saying in an interview with the BBC: “It would be a stepping stone to the culturing of embryos and even foetuses outside the womb. You are really beyond the stage when the embryo would otherwise implant, and that is a step towards creating a womb-like environment outside. People will then ask why we can’t shift it beyond 28 days.” How, then, should we reconcile people’s ethical concerns with the need to update the rules for embryo research to allow for 21st-century techniques? One challenge will be reaching an international consensus on where a new line should be drawn. And the question still remains: whose responsibility is it to define the limits on human embryo research? Scientists and researchers, or bioethicists and theologians? This is ultimately a question for us all, and for society as a whole, to consider. It is our responsibility to recognise the views of different groups, and to weigh the benefits to society against the moral arguments put forward. Inevitably, we must ask ourselves where we feel comfortable setting the limit.
Pictures: Karine Gray
Don’t talk about the blight upon the college.
Don’t talk, don’t stare, don’t say you’ve seen it too.
The mortar’s looking mouldy
But It’s Always Been This Way,
(So there’s nothing more to say, no more to do).

With tactful (British) silence we ignore it.
But the inky stain, it only grows unsaid
Creeping into shadows cast
By those sleeping during class,
Always hanging over my Professor’s head.

I thought by being clever I’d outthink it.
Didn’t notice till the splashes reached my breast.
You can’t escape the hall-mark,
Bench-mark, walls-marked in its spores,
In the dark-bred-damp I’m drowning with the rest.

In surging viscous pitch I’ve not forgotten
There’s a lifeline in a laminated hall.
But a line is just a string
When the whole place is sinking
When to let in air means knocking down a wall.

Don’t talk about the people going under,
Don’t talk, just swim - don’t waste your precious air.
Don’t talk, just save yourself,
Don’t bring up the mental health
Of the college, see, it’s Always Been This Way.
27% of UK university students report having a mental health issue. Only half of these would feel comfortable talking to family and friends about it. YouGov Survey, 2016
NEW MRes in Molecular Science and Engineering
Course Structure • Introductory components ensure a broad skill-set for the integrated curriculum • Advanced core modules bridge engineering and natural sciences • Research projects are at the heart of the programme and comprise two parts: an integrated collaborative industrial placement (3 months), and further consolidation and honing of research at Imperial (3 months).
Are you a scientist wanting to focus on the practical application of your research? Are you an engineer wanting to better understand the fundamentals of molecular science? Are you excited by the possibilities at the interface of molecular science and engineering?
Do you want to gain practical experience in industry? A unique 1-year Master of Research, starting in October 2017. Find out more and apply:
www.imperial.ac.uk/imse/education • email@example.com