EU Research Summer 2018
Stephen Hawking 1942-2018
Disseminating the latest research under FP7 and Horizon 2020 Follow EU Research on www.twitter.com/EU_RESEARCH
Editor’s Note
Science means different things to different people. That’s probably why it is so selectively cherry-picked, ignored or challenged by those with political agendas.
It doesn’t help that there are inherent problems with science. For example, it doesn’t stand still for very long and it updates, a lot. One thing we know about people who run things is that they often fear challenging information that upsets the status quo, and that can put scientists in the firing line.
Poor old Galileo Galilei, the 17th Century astronomer called by some ‘the father of scientific method’, dared to state, on the basis of his observations, that the Earth revolved around the Sun. For this he was forced to spend the end of his life under house arrest as a criminal, as the powers that be were furious that he was challenging their belief that the Earth was the centre of the Universe. A scientist who is not toeing the party line, or who is simply in the wrong place at the wrong time, can also find themselves being targeted. Albert Einstein, a brilliant mind, a non-practising Jew and a pacifist, was working in the US when the Nazis burned his books in public, along with other literature that was declared ‘un-German’.
As a seasoned editor and journalist, Richard Forsyth has been reporting on numerous aspects of European scientific research for over 10 years.
Science can also be easily warped into propaganda and used for flag-waving purposes. The space race between the USSR and the USA to put the first man on the Moon became President Kennedy’s legacy. He said in a famous speech: ‘it will not be one man going to the moon … it will be an entire nation. For all of us must work to put him there.’ You could argue that the desire for dominance of space became a far more political goal than a scientific one.
He has written for many titles including ERCIM’s publication, CSP Today, Sustainable Development magazine and eStrategies magazine, and remains a prolific contributor to the UK business press. He also works in public relations, helping businesses communicate their services effectively to industry and consumers.
Today there are similar challenges for scientists. Environmental science, even in 2018, continues to be under fire from sceptics and some politicians. Sometimes consumers, too, will opt for political views that keep them comfortable. Suggesting we ought to change how we do things can upset a great many people used to their lifestyles, wealth and long-embedded habits. For big industries like coal, oil, palm oil, transport and cattle farming, it can seem too problematic to propose upheaval, even to prevent a global catastrophe for future generations. If we can’t do it, and seriously, maybe we can’t, it will be the scientist, not the politician, who we’ll call on to innovate and save what they can of our collective future as we fall foul of the consequences. It should, of course, be the scientist who we listen to in the first place. The hardest truth is that politics, for all its power, cannot change scientific realities.
Hope you enjoy the issue.
Richard Forsyth Editor
Contents

36 DROPSA
The Dropsa project has been developing a cost-effective, integrated approach to pest management that will help protect crops and boost the European fruit industry, as Dr Neil Audsley explains
4 Research News
EU Research takes a look at the latest news in scientific research, highlighting new discoveries, major breakthroughs and new areas of investigation
10 Stephen Hawking EU Research takes a look back at the life and career of the eminent physicist Stephen Hawking, who died recently. He not only achieved several important breakthroughs, but also helped to bring science to the wider public through his books and writings
14 BIOSUSAMIN Dr Francesco Mutti tells us about the Biosusamin project’s work in designing and developing biocatalytic cascades for producing amines, in particular chiral amines
16 HOTSPOT Whole genome duplication can help to enhance the adaptability of a lineage, yet there are serious biological challenges in dealing with twice as many chromosomes. We spoke to Dr Levi Yant about his work investigating how different organisms adapt to having a doubled genome complement
18 GPS-BAT Researchers are currently using miniature sensors to investigate how bats make decisions on food foraging. Dr Yossi Yovel tells us about the GPS-BAT project’s work in monitoring the behaviour of individual wild bats within a colony
19 Improving the Conceptualisation... Professor Mirjam Sprangers tells us about her work bringing together different areas of research to improve the conceptualisation of quality of life, and the sensitivity with which it is measured
We speak to Gianni Medoro, Maximilian Sergio and Elena Bevilacqua about their work in developing the DEPArray™ technology, a new approach to cell isolation that could open up new possibilities in precision medicine
Professor Haike Antelmann tells us about her research into specific protein modifications caused by ROS, molecular switches that play a role as oxidative stress defense mechanisms in bacteria
26 Angela Terry Interview
In an interview with Richard Forsyth, environmental scientist Angela Terry talks about her latest initiative to help prevent a runaway global temperature rise by inspiring consumers to make greener choices.
30 PHLOEMAP Understanding how plants respond to climate variability is central to forecasting how they will respond to future change, as Dr Elisabeth Robert and Dr Jordi Martínez-Vilalta explain
Dr Hartmut Stalb, Dr Heather McKhann, Professor Michael Bruford and Dr Marta Pogrzeba talk about the work of FACCE-JPI in supporting transdisciplinary research into interconnected challenges around climate change, sustainable agriculture and food security
34 LANDDAPP Researchers in the LanDDApp project aim to gain deeper insights into land degradation and desertification (LDD) in the North West Province of South Africa by assessing the extent of bush encroachment, a process often linked to LDD, as Dr Elias Symeonakis explains
The Amazon region holds great interest to climate scientists, because large amounts of CO2 enter and exit the atmosphere here, with strong impacts on the climate system. Now researchers in the ASICA project aim to gain a more precise understanding of the extent of CO2 uptake by the Amazon rainforest, as Professor Wouter Peters explains
42 INTERTRAP Loess deposits cover about 10 percent of the earth’s land surface, and are an important resource in reconstructing palaeoclimate. We spoke to Dr Alida Timar-Gabor about the INTERTRAP project’s work in investigating and analysing loess samples from three different continents, with the aim of developing improved dating methods
44 Food Security Population growth, climate change, excessive harvesting, the destruction of natural resources, urbanisation and desertification are all increasing the pressure on agriculture and on feeding people. What kinds of solutions can science offer to ease the strain?
48 SMART-E Evolving fields such as Artificial Intelligence, big data analytics, embedded systems, cloud and human-robotic interactions will play a part in the 4th Industrial Revolution – an era dubbed ‘Industry 4.0’. This is why the SMART-E project created a training and research programme, to advance robotics in manufacturing
51 SI-DRIVE Social innovation projects help to change the way we live and work, yet the field itself is relatively under-researched; now the SI-DRIVE project is taking a fresh look at the topic. We spoke to Jürgen Howaldt, Christoph Kaletka and Antonius Schroeder about their work
Researchers in the SCPs project are drawing inspiration from nature as they aim to develop new methods of synthesising sequence controlled polymers, which could have interesting new functions, as Professor Rachel O’Reilly explains
56 VIRMETAL The Virmetal project aims to accelerate the discovery of new materials by means of multiscale modelling strategies that will enable scientists to design, process and test advanced metallic alloys in silico before they are manufactured, as Professor Javier Llorca explains
58 Magneto The Magneto project is attempting to develop new smart composite materials with unusual magnetomechanical properties. Project lead, Kostas Danas, carries out experiments to give magnetic properties to polymers, which could lead to applications in many sectors
61 OIO Professor Patrick Legros tells us about the OIO project. By bringing together insights from the industrial organization and organization economics literatures, the new approach shows how market performance and firms’ organizational choices are co-determined
62 Relativism Professor Martin Kusch and his colleagues aim to explore the historical roots of modern forms of relativism, and to determine which forms of the doctrine deserve a sympathetic reconstruction and defense
64 Rule & Rupture Government institutions often do not work effectively in weak states, and some of the most central aspects of people’s lives seem to be governed outside statutory institutions, as Professor Christian Lund of the Rule and Rupture project explains
70 Political Science Independent
Science is seen as a tool for influence and that makes it powerful and vulnerable at the same time, depending on who is using it and the point they intend to make. By Richard Forsyth
We spoke to Dr Dennis Hetterscheid about the work of the Cu4Energy project in studying molecular copper catalysts for water oxidation and oxygen reduction, reactions which are central to the performance of fuel cells
74 CySTEM The ERA Chair for the Eastern Mediterranean (CySTEM) project aims to enhance Cyprus’s research capacity in solar energy, as Professor Manuel Blanco and Professor Costas N. Papanicolas explain
76 SUPERCELL ICREA Professor Neus Sabaté and her team are developing a new kind of fuel cell that can power diagnostic readers from a bodily fluid being analysed. These and the diagnostic devices in the SUPERCELL project are made from paper
77 HYP Dr Jim Mallinson tells us about his research into the origins of ha ṭha yoga, and how it evolved into the practices that we see in yoga studios across the world today
80 Ottoconfession Researchers in the Ottoconfession project are taking a fresh look at the process of differentiation between Sunni and Shii Muslims in the early 16th Century, as Dr Tijana Krstić explains
82 Corruption Roots Researchers are using an approach built on behavioural ethics to probe the roots of corruption, which could help inform the development of policy that encourages ethical behaviour, as Dr Shaul Shalvi explains
84 IOW The nature of armed conflict is changing, as normative and technological changes result in individuals playing an increasingly prominent role. Researchers in the IOW project are investigating the wider impact of this shift, as Professor Jennifer Welsh explains
86 MULTI-POP We spoke to Professor Nate Bastian about the Multi-Pops project’s work in studying globular clusters, which could lead to new insights into how galaxies are assembled
EDITORIAL Managing Editor Richard Forsyth firstname.lastname@example.org Deputy Editor Patrick Truss email@example.com Deputy Editor Richard Davey firstname.lastname@example.org Science Writer Holly Cave www.hollycave.co.uk Acquisitions Editor Elizabeth Sparks email@example.com PRODUCTION Production Manager Jenny O’Neill firstname.lastname@example.org Production Assistant Tim Smith email@example.com Art Director Daniel Hall firstname.lastname@example.org Design Manager David Patten email@example.com Illustrator Martin Carr firstname.lastname@example.org PUBLISHING Managing Director Edward Taberner email@example.com Scientific Director Dr Peter Taberner firstname.lastname@example.org Office Manager Janis Beazley email@example.com Finance Manager Adrian Hawthorne firstname.lastname@example.org Account Manager Jane Tareen email@example.com
EU Research Blazon Publishing and Media Ltd 131 Lydney Road, Bristol, BS10 5JR, United Kingdom T: +44 (0)207 193 9820 F: +44 (0)117 9244 022 E: firstname.lastname@example.org www.euresearcher.com © Blazon Publishing June 2010
Cert no. TT-COC-2200
The EU Research team take a look at current events in the scientific news
UK receives less EU research funding since Brexit vote British universities raise the alarm after a two hundred million pound drop in funds. The share of European Union research funding going to the UK has dropped significantly since the 2016 Brexit vote, UK government figures reveal. In September 2016 – the earliest available figures, and three months after the referendum – the UK had won 15.6 per cent of all funding from the multibillion-euro Horizon 2020 research and innovation programme since it launched in 2014, excluding Euratom nuclear research. According to Times Higher Education analysis of the latest government figures, the UK has received just 13.4 per cent of the roughly €12 billion (£10.4 billion) up for grabs from September 2016 until March 2018, meaning that its record at winning money appears to have worsened significantly since the vote. The latest figures also show the UK winning a lower share of the “excellent science” pillar of Horizon 2020 since the vote. This is likely to further raise concerns that uncertainty over the UK’s future in EU framework programmes has made it harder for UK-based researchers to win money from sources such as the European Research Council.
Last September, Universities UK raised the alarm over earlier statistics that showed the beginning of a downturn in success at winning money, and urged universities to clarify to researchers that they were still eligible for money. There have been mixed signals as to whether UK researchers have been shut out of EU funding since the vote. In 2017, UK-based researchers recorded a bumper year for advanced ERC grants, the most lucrative on offer, winning 66 in total and regaining the lead position from Germany. But other figures show a sharp drop – amounting to half a billion euros – in the value of joint Horizon 2020 projects led by the UK. There have been particular concerns that Brexit uncertainty will deter European researchers from pursuing a career in the UK. Dr Gardner pointed out that for ERC consolidator grants in 2017, not a single grant winner was choosing to move to the UK to carry out their work. By contrast, other countries including Germany, France, the Netherlands and Switzerland will all benefit from ERC grantholder immigration.
EC to Fight Fake News with Power of Blockchain
The European Commission (EC) has recently named blockchain technology as part of its framework to combat the spread of false information online. The EC has identified blockchain as a critical part of what it will call the Code of Practice on Disinformation, which it intends to introduce by summer 2018.
According to a recently published EC press release, blockchain is one of several “emerging technologies which are changing the way information is produced and disseminated, and have the potential to play a central role in tackling disinformation over the longer term.”
The press release follows a report published in March by the EC High-Level Expert Group (HLEG) that calls for more transparency from online platforms to fight the spread of false information online. The Commission’s next step is to develop the EU-wide Code of Practice on Disinformation, set to be published by July 2018.
The EC says blockchain applications can help provide transparency, reliability and traceability of news on the Internet, and that distributed ledger technology (DLT) can be combined with other identification processes:
“Innovative technologies, such as blockchain, can help preserve the integrity of content, validate the reliability of information and/or its sources, enable transparency and traceability, and promote trust in news displayed on the Internet. This could be combined with the use of trustworthy electronic identification, authentication and verified pseudonyms...”
Blockchain development will also be included in the research activities of the Horizon 2020 Work Programme, run by the Union’s research funding body, which is considered “the biggest EU research and innovation funding programme ever.”
On April 11, the EC announced the signing of a Declaration creating a European Blockchain Partnership made up of 22 countries. EC Vice-President Andrus Ansip had previously urged the EU to take action on blockchain development in an effort to make Europe a world leader in digital innovation.
Commission in talks over simpler state aid rules for research projects The new rules would enable researchers to access structural funds for projects when Framework programme R&D funding is not available. Research proposals that receive the seal of excellence from EU peer reviewers but fail to get a grant from the EU Framework R&D programme will in future be eligible for money from structural funds. The seal of excellence will “automatically be considered state aid compatible, even if you are funded from another public source,” Jean-David Malo, director for open science at the European Commission, told delegates at the ‘Research Infrastructures beyond 2020’ conference in Sofia last month. The change follows an agreement between the directorate for research and innovation and the competition and regional policy directorates, with research commissioner Carlos Moedas and competition commissioner Margrethe Vestager backing the plan. The Commission introduced the seal of excellence certificate in Horizon 2020 for project proposals that were rated highly, but for which no money was available. The idea is that with that quality stamp, the projects should be able to raise money easily from better-funded structural funds initiatives.
In fact, very few researchers were able to do that, partly because not all member states offered access to other funding streams, but also because their projects were not compatible with state aid rules applied to structural funds. “The complexification was perfectly illustrated,” said Malo. “We have tried to smooth a little bit the situation, but we are aware that it is not satisfactory.” At the moment, the EU does not apply the same state aid rules to projects funded from the research and innovation programme, and those funded from regional funds. If a managing authority decides to finance a project that has received the seal of excellence under Horizon 2020, it will be able to do so under the rules of the next Framework programme, not under the rules governing structural funds. “Same rules, same eligible costs, same state aid rules,” said Malo. Because the seal of excellence is mainly targeted at single beneficiary projects, this would also be useful for funding the construction of new equipment and ongoing costs in research infrastructures.
Precision Agriculture Will Change How Food Is Produced Technology like drones, autonomous tractors and IoT applications is playing a role in the transformation of agriculture. Food waste during the production process is costly both environmentally and economically. According to a 2016 McKinsey & Company report on how big data will revolutionize the global food chain, about one-third of all food is lost during production each year globally in developing and emerging countries, while at the same time 795 million people go hungry. Food loss and waste costs about $940 billion and has a carbon footprint of 4.4 Gt CO2-equivalent, which is more than eight percent of global greenhouse-gas emissions. In March 2018, the Federal Aviation Administration (FAA) released a statement indicating that the number of commercial drones and drone operators will quadruple over the next five years, with almost half a million drones in action by 2020. Combined with the advancements in drone technology and the impact the technology can have on farm production and irrigation, the commercial drone market is quickly expanding to cover new industries and applications like agriculture. According to Kim Niederman, CEO, Freewave, RF tech provides the communications backbone for autonomous vehicle Real-Time Kinematics (RTK), drone deployment and smart sensor ecosystems. “Real-Time Kinematics comes from the combination of RF and GPS which allows autonomous tractors to navigate and course-correct throughout crops with up to one centimetre of accuracy. A wireless Machine-to-Machine (M2M) network with wireless RF devices installed has the potential to solve major connectivity issues in autonomous agricultural settings,” said Niederman.
“An RF-powered drone can help optimize resources such as fresh water, fertilizers, and pesticides as well as identify healthy and unhealthy crops and irrigation problems.” Niederman says that smart sensors can be applied to soil monitoring to pivot irrigation but that RF is more efficient and affordable in
remote sensor ecosystems that need to transmit data beyond cellular or Wi-Fi range. “While automation isn’t new to the agriculture industry, the use of drones to make farm production more precise is still in its infancy,” said Niederman. “As human populations increase, utilizing every centimetre of arable land and conserving resources is paramount to meet demand and for sustainable agriculture systems.” “Farmers can deploy drones to get real-time data of their fields to identify issues such as irrigation problems or poor performance areas,” added Niederman. For example, if a sensor in the field detects that crops are being over- or under-watered, or that the soil doesn’t have enough nutrients, it can automatically alert drones to be deployed to scope out whether it’s an issue with the irrigation system, what the soil condition looks like, or even whether it’s a false read. Or, when the drone flies over farmland, it can also send back video and multispectral images that can reveal the differences between healthy and unhealthy crops, so that resources can be deployed accordingly. This helps farmers see issues they may not be able to see regularly, especially if they have to survey hundreds of acres. Michael Chasen, CEO, PrecisionHawk, said that to optimize farm management decisions, farmers continuously have to make trade-offs between different parcels of land. “For example, say a farmer can only collect data over half of their fields in one day using the ‘old’ method. The farmer looks at half the fields and decides to fertilize field 14. Unfortunately, field 22, which the farmer didn’t get to that day, is the field that needs to be fertilized immediately,” said Chasen. “By capturing data over your entire operation, you can catch the most pressing issues early enough to make an adjustment. When you are flying in visual line of sight, you just cannot acquire all that information in one day.”
Urgent action needed on climate change: UN
© NASA/GSFC/METI/ERSDAC/JAROS, and U.S./Japan ASTER Science Team
The world must redouble efforts to halt global warming before it is too late, the UN’s climate chief said Monday. “Our window of time for addressing climate change is closing very quickly,” Patricia Espinosa told journalists. “We need to dramatically increase our ambition.”
The 12-day technical talks are focused on hammering out an “operating manual” for the landmark 2015 Paris climate pact, which calls for capping global warming at “well below” two degrees Celsius (3.6 degrees Fahrenheit), and at 1.5C if possible.
Earth’s average surface temperature has already risen by 1.0C since the mid-19th century, enough to push up sea levels and boost the severity of cyclones, drought, and deadly heat waves.
Voluntary national pledges to reduce greenhouse gas emissions, annexed to the Paris agreement, fall well short of the target, and would yield a 3C world that scientists say would strain the fabric of human civilisation. “A rise of this magnitude would be extremely destabilising,” Espinosa said at a press conference, webcast live. “We cannot allow this to happen.” The Paris Agreement calls for revisiting nations’ carbon-cutting pledges in 2023, but on current trends, experts warn that may be too late. After remaining flat for three years, global CO2 emissions in 2017 went up by 1.4 percent, dashing hopes that they had peaked.
The 197-nation treaty also promises $100 billion (82 billion euros) a year from 2020 to help developing nations green their economies and prepare for climate change impacts already in the pipeline.
“The international community must act now to ensure our Paris goals do not slip out of reach,” said Gebru Jember Endalew, a diplomat from Ethiopia who heads the Least Developed Countries (LDC) negotiating bloc of nearly 50 nations.
“There remains a vast gap between the support needed and support received,” he said in a statement.
Diplomats in Bonn are mindful of the need to build bridges between the political process and the global economy, which has started the needed shift away from fossil fuels towards renewables and low-carbon energy. The Bonn meeting is one of a dozen climate-related gatherings this year that will culminate in a UN climate summit in Katowice, Poland, in December—the cut-off for adopting a nuts-and-bolts “rule book” for implementing the Paris Agreement. The issue is also on the agenda at the Quebec G7 summit in June, and the Buenos Aires G20 in November, as well as half-a-dozen ministerial meetings. The treaty goes into effect in 2020.
World’s smallest chemistry experiment creates big possibilities Researchers at Harvard University managed to coax together a single sodium atom and a single caesium atom, manipulating them into a single molecule. The Harvard research, published this month in the journal Science, represents the world’s first bespoke creation of a single molecule. This feat of quantum-scale chemistry has two significant implications. First, it shows that the complete control of chemical processes is theoretically possible. This opens the door to combinations of elements that do not share an easy affinity. Second, the precision-tooling of molecules at this scale bodes well for the development of quantum computers. This futuristic version of computing, which promises to be more powerful than classical computing, relies on “qubits”, the quantum equivalent of classical bits. A sodium-caesium molecule (NaCs) would possess just the right electrical properties for a qubit, which is why Professor Kang-Kuen Ni and her colleagues decided to make one. Sodium and caesium are not natural chemical bedfellows. They are both alkali metals, lurking in the same region of the periodic table. Normally, this would make them unreactive partners. The challenge was to see whether a naturally highly improbable chemical partnership could be orchestrated at the level of individual atoms. The first technical hurdle was to hold one atom of each metal: each atom, cooled to almost absolute zero, was gripped by a magneto-optical trap in a vacuum chamber. Then, using “optical tweezers” — extremely cold laser traps — the two atoms were lured together. Each
tweezer used a different wavelength of light to keep its atom captive. As the atoms drew closer, adding a photon into the mix fused them into a molecule. The result was a distinctive new chemical signature: a molecule of NaCs had been created. The molecule existed in a single quantum state, a property critical in quantum computing. Jun Ye, from the University of Colorado, who was not involved in the work, told Chemistry World that the newly minted molecule was a dramatic moment for the field: “When individual constituents can be brought together, and molecules can be assembled or disassembled under fully quantum mechanical processes, then we have gained a full control of chemistry.” Around 25 elements can be cooled and handled in this way, according to Professor Ni. For other elements, tweezers of the right wavelengths are yet to be developed. Still, the research paves the way for the creation and manipulation of hitherto unknown molecules. Meanwhile, the toy box from which these made-to-measure molecules will emerge is expanding. There are 118 known elements in the periodic table, running from hydrogen (the lightest) to oganesson (formerly known as ununoctium). Last year, researchers in Japan announced they would team up with American rivals to hunt for elements with atomic numbers 119 and 120. Plutonium (atomic number 94) is the heaviest element to exist naturally; heavier ones have all been synthesised in the lab by crashing lighter elements together.
Science in North Korea Easing the nuclear stand-off might bolster research: as political tensions thaw, researchers hope for greater collaboration. The isolated nation publishes fewer than 100 scholarly articles a year, but after historic peace talks last week with South Korea, scientists have reason to hope that the country will open up to more collaborations. In the 20 April speech in which he announced the halting of nuclear tests, North Korean leader Kim Jong-un said his country would start to focus on boosting its economy through science and education. That’s an exciting prospect, says James Hammond, a seismologist at Birkbeck, University of London, who returned from Pyongyang on 21 April after a meeting to develop research proposals with North Korean scientists. Hammond works in a rare collaboration with scientists in the United States, China and North Korea to study a hazardous volcano — which the Koreans call Mount Paektu — on the Chinese–Korean border. “There’s a general positive feeling in North Korea now,” Hammond says. “It’s clear that science and scientists are held in very high regard there. Our colleagues have been enthusiastic to pursue science and to conduct international collaboration, and this new atmosphere can only help with that.” North Korea has already been expanding its presence in the international research literature: its scientists published more than 80 articles in mainstream journals last year, more than four times their output in 2014, according to the Web of Science database. (Records of North Korea’s output are not entirely reliable, but other scholarly indices, such as Dimensions, report similar changes; these numbers exclude papers from conference proceedings.) The country’s biggest collaborator is China — its scientists co-published 60% of their papers with Chinese co-authors over the past three years — followed by Germany and South Korea (see ‘North Korea’s science’).
South Korea’s research output currently dwarfs its neighbour’s. Last year, South Korea produced more than 63,000 research papers, up by around 16% from 2014.
Hammond hopes that research will be a vehicle for North–South political rapprochement. “Science is a subject around which you can build relationships,” he says. He and his colleagues are preparing a paper on their first analysis of the volcano that uses seismic data from both the Chinese and North Korean sides. “If we want to understand this volcano, it requires a cross-border approach,” he says. Collaboration will still involve numerous diplomatic hurdles. Last year, the United States said that people with US passports could not travel to North Korea without special dispensation. And United Nations sanctions currently prevent scientific collaborations with North Korea unless a UN committee agrees that the research won’t contribute to nuclear or military-related activities. (The scientists have received their clearance from the UN, Hammond says.) North Korea’s research on nuclear energy (which isn’t published in international journals) has largely been built through extensive collaboration with Russia, says Jenny Town, assistant director at the US-Korea Institute at the Johns Hopkins School of Advanced International Studies (SAIS) in Washington DC. And North Korea and the United States previously engaged in scientific exchanges on topics including energy and climate change, Town notes. “There is certainly plenty of interest in getting back in the game, but there has been little opportunity since sanctions have become stricter and stricter.” Kim also pledged last week to close North Korea’s Punggye-ri nuclear complex — where it conducted nuclear-bomb tests — and said it was willing to allow international inspectors into the site to verify the shutdown. 
“Closing the site and allowing inspections is an important diplomatic concession, but one that can still be portrayed to domestic audiences in a more moderate way,” says Town, who is also managing editor of the website 38 North, a project of the US-Korea Institute at SAIS to which some US nuclear experts contribute analysis and reporting. “I know our scientists and geologists are very excited at the prospects of getting boots on the ground,” she says.
NASA on brink of solving massive particle physics mystery

As soon as summer 2018, NASA could have enough data on neutron stars to understand how matter behaves when pushed to its wildest extremes.

It takes 512 years for a high-energy photon to travel from the nearest neutron star to Earth. Just a few of them make the trip. But they carry the information necessary to solve one of the toughest questions in astrophysics. The photons shoot into space in an energetic rush. Hot beams of X-ray energy burst from the surface of the tiny, ultradense, spinning remnant of a supernova. The beams disperse over long centuries in transit. But every once in a while, a single dot of X-ray light that’s travelled 156 parsecs (512 light-years) across space — 32 million times the distance between Earth and the sun — expends itself against the International Space Station’s (ISS) X-ray telescope, nicknamed NICER. Then, down on Earth, a text file gains a new data point: the photon’s energy and its arrival time, measured with microsecond accuracy. That data point, along with countless others like it collected over the course of months, will answer a basic question as soon as summer 2018: just how wide is J0437-4715, Earth’s nearest neutron-star neighbour? If researchers can figure out the width of a neutron star, physicist Sharon Morsink told a crowd of scientists at the American Physical Society’s (APS) April 2018 meeting, that information could point the way toward solving one of the great mysteries of particle physics: how does matter behave when pushed to its wildest extremes? On Earth, given humanity’s existing technology, there are some hard limits on how dense matter can get, even in extreme laboratories, and even harder limits on how long the densest matter scientists make can survive. That has meant that physicists haven’t been able to figure out how particles behave at extreme densities.
There just aren’t many good experiments available.
“There’s a number of different methodologies that people come up with to try to say how super-dense matter should behave, but they don’t all agree,” Morsink, a physicist at the University of Alberta and a member of a NASA working group focused on the width of neutron stars, told Live Science. “And the way that they don’t all agree can actually be tested because each one of them makes a prediction for how large a neutron star can be.” In other words, the solution to the mystery of ultradense matter is locked away inside some of the universe’s densest objects — neutron stars. And scientists can crack that mystery as soon as they measure precisely just how wide (and, therefore, dense) neutron stars really are. Morsink told EU Research that she wasn’t trying to tease the upcoming announcement. NICER just hasn’t collected enough photons yet for the team to offer up a good answer. “It’s like taking a cake out of the oven too early: You just end up with a mess,” she said. But the photons are arriving, one by one, during NICER’s months of periodic study. And an answer is getting close. Right now, the team is looking at data from J0437-4715 and Earth’s next-nearest neutron star, which is about twice as far away. Morsink said she isn’t sure which neutron star’s radius she and her colleagues will publish first, but she added that both announcements will be coming within months. “The aim is for this to happen later on this summer, where ‘summer’ is being used in a fairly broad sense,” she said. “But I would say that by September, we ought to have something.”
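The distance figures quoted for J0437-4715 can be cross-checked with a couple of standard unit conversions. A minimal sketch (the conversion constants are standard astronomical values, not from the article):

```python
# Sanity-check of the distances quoted for J0437-4715, starting from
# the article's 512 light-year figure.
LY_PER_PARSEC = 3.2616   # light-years in one parsec
AU_PER_LY = 63241.1      # astronomical units (Earth-sun distances) in one light-year

distance_ly = 512
distance_pc = distance_ly / LY_PER_PARSEC
distance_au = distance_ly * AU_PER_LY

print(f"{distance_pc:.0f} pc")               # ~157 pc, matching the quoted 156 parsecs
print(f"{distance_au / 1e6:.1f} million AU")  # ~32.4 million Earth-sun distances
```

The "32 million times the distance between Earth and the sun" figure in the article falls straight out of the light-year-to-astronomical-unit conversion.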
X-ray: NASA/CXC/University of Amsterdam/N. Rea et al; Optical: DSS
Stephen Hawking 1942 – 2018
A Brief History

Stephen Hawking forms part of our collective identity as a generation. He represented achievement. He had the popularity of a movie star with the general public, and a scientific career that won him admiration and awards from his peers. Despite his struggle with a deadly immobilising disease, Hawking never stopped reaching for the stars and making new discoveries. His death this year was a marker in history: the ending of a life that conquered time and space again and again, whilst facing unimaginable challenges.
At the World Science Festival in 2015, Lucy Hawking, Stephen Hawking’s daughter from his first marriage, revealed in her presentation that one of his teachers had stated in a school report, rather cruelly: “this boy will never amount to anything.” As it turned out, Stephen Hawking’s signature move would be proving eminent people wrong, from doctors to scientists, throughout the course of his life. Hawking had a sharp, enquiring mind. He graduated from University College, Oxford, with a first in Natural Science before moving to the Department of Applied Mathematics and Theoretical Physics at the University of Cambridge in 1962, to conduct research in cosmology.

When every word counts
Stephen Hawking at NASA’s StarChild Learning Center, 1980s.
It was around the same time as his move to Cambridge that he began to notice some unusual physical problems manifesting, such as having trouble tying his shoelaces. At the tender age of 21 he was informed that he had Amyotrophic Lateral Sclerosis (ALS), commonly known as motor neurone disease, a degenerative condition that meant he would eventually lose control of all voluntary movement in his muscles. He would lose the ability to walk, write, chew, talk and ultimately to breathe. In the progressed state of the illness he would be ‘locked-in’, in paralysis, despite keeping all his mental faculties intact. To compound the horror of this discovery, the doctors told him, optimistically, that he had about three years and would not see his 25th birthday. “My expectations were reduced to zero when I was 21. Everything since then has been a bonus,” Hawking once reflected. Time was precious to Hawking; he did not waste it. Being threatened with imminent death gave a true value to time. The medical prognosis and short life expectancy posed a serious problem for Stephen Hawking, because he was nowhere near ready for the conclusion of his life – he had just begun his journey in scientific exploration. He needed to understand how the universe worked, and doing that would require far more than a year or two of research.
When he died this year at the age of 76, it was widely speculated that part of what kept him going was his deep desire to unravel the mysteries of the cosmos. His work held such a high place in his mind. “Work gives you meaning and purpose and life is empty without it,” was his perspective. Adapting to change became his daily challenge, and he realised, as he put it: “Intelligence is the ability to adapt to change.” He once shared a personal observation that his disease was in some ways giving him more focus: “By losing the finer dexterity of my hands, I was forced to travel through the universe in my mind and try to visualise the ways in which it worked.” Hawking became very well known as his scientific achievements harvested publicity around the world, and in 1979 he attained the most prominent of scientific positions, following in the footsteps of Isaac Newton to become the Lucasian Professor of Mathematics at Cambridge. His disease did not detract from his performance in this eminent role, despite the fact that by this stage he could not feed himself, had slurred speech and used a wheelchair. There were times, however, when he was completely blind-sided by the symptoms and impacts of his condition. In 1985, on a trip to CERN in Geneva, Stephen caught pneumonia and was put on a ventilator in critical condition. His wife at the time, Jane Wilde, was asked if she would consider turning off the life support, but she refused. As part of successfully managing the infection he had a tracheotomy, which meant Stephen permanently lost the ability to use his voice. From here, he initially used spelling cards – communicating with movements of his eyebrows. He went on to benefit from progressively more helpful electronic devices, which meant he could write at 15 words per minute.
In 1988, Stephen’s voice took the form of an electronic voice named ‘Perfect Paul’ by its creators, and Hawking would stick with it as he liked it, despite the offer of upgraded versions. The disease continued to degrade his physical abilities and eventually robbed him of the nerve power in his thumb. By 2008 Hawking could no longer rely on his hand, and by 2011 he was back to two words a minute. A team at Intel stepped in and helped develop technology just for Hawking, working continuously with the scientist to ensure a device could read the micro-movements in his face, most specifically in his cheek muscle, and link them to an interface that could not only type characters but also take shortcuts to lengthier speech, much like predictive text on a mobile. Whilst most scientists can scribble notes, record their voices into devices, and edit and re-edit their formulas and ideas at speed, Hawking carried out his work under unbearable restraints of method. Whilst Hawking would not want disease as a defining reference for him, it is important as context to what he achieved, because whilst this relentless struggle to communicate continued through the years, Stephen managed to accomplish a vast array of work, where each piece of research and writing became definitive in its respective field.
An elite physicist

Hawking’s earliest work at Cambridge made it clear he was a standout talent, but one of Hawking’s pivotal moments was his contribution to the understanding of the dynamics of black holes. In 1974 he described how ‘blackbody’ radiation can be released by black holes due to quantum effects at the event horizon. He surprised the physics community by arguing that black holes thermally create and emit subatomic particles until they run out of energy and evaporate completely. He showed that black holes lose mass. It was shocking because it meant black holes were no longer completely black, and they would not last forever. Hawking became the first to realise that black holes could not only shrink but also die. The process he revealed became known as Hawking radiation. He demonstrated this with some complex maths which, in essence, estimated the temperature of a black hole. To do this he combined Einstein’s theory of relativity, which describes gravity on a large scale, with quantum mechanics, which studies the smallest elements. An event horizon has gravity that is so immensely powerful that even light cannot escape.

Work gives you meaning and purpose and life is empty without it.

So, the event horizon of a black hole is a great place to study these fundamental forces, which make for a fascinating and weird playground of physics, ideal for scientific study that looks at extremes to unearth truths. Hawking’s research here felt like a stepping stone to an ultimate theory that combined general relativity with quantum mechanics, and for cosmologists it was quite tantalising. With his increasing fame, Hawking decided to write a book in 1988, to communicate the wonders of cosmic science to a wide, largely non-scientific audience. He focused on subjects of his research to date, like the big bang, black holes, gravity and the quest to find a unifying theory for the workings of the universe. He called it A Brief History of Time. It eventually sold some ten million copies and was on the London Sunday Times bestseller list for four years, longer than any other book. The book catapulted the professor further into global fame. He had become a household name – the A-lister of the scientific world.
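The temperature estimate at the heart of Hawking radiation can be illustrated numerically with the standard formula T = ħc³/(8πGMk_B). A minimal sketch (the constants are standard textbook values; the code is illustrative, not Hawking’s own calculation):

```python
import math

# Standard physical constants (rounded CODATA values)
hbar = 1.0546e-34   # reduced Planck constant, J s
c = 2.998e8         # speed of light, m/s
G = 6.674e-11       # Newton's gravitational constant, m^3 kg^-1 s^-2
k_B = 1.381e-23     # Boltzmann constant, J/K
M_sun = 1.989e30    # mass of the sun, kg

def hawking_temperature(mass_kg):
    """Black-body temperature of a black hole: T = hbar c^3 / (8 pi G M k_B)."""
    return hbar * c**3 / (8 * math.pi * G * mass_kg * k_B)

# A black hole of one solar mass is astonishingly cold - about 6e-8 K,
# far below the ~2.7 K of the cosmic microwave background.
print(f"{hawking_temperature(M_sun):.2e} K")
```

Note that temperature is inversely proportional to mass: the smaller a black hole becomes, the hotter it gets, which is why evaporation runs away towards the end of a black hole’s life.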
The event horizon of celebrity

The truth is that, in parallel to the unprecedented success of his first book, the fame took a toll on home life. Although it was an amicable parting, the pressures of popularity, the scientific work and the relentlessness of homecare culminated in a point where Stephen and Jane, with whom he had three children, came to a decision to separate. This was a blow, because love and companionship were a keel for Hawking. He said: “If you are lucky enough to find love, remember it is there and don’t throw it away.” In 1995 Stephen found companionship again and went on to marry his nurse, Elaine Mason. It was during the late nineties that a new wave of fame hit Hawking, and he found himself a guest on popular TV shows such as The Simpsons and Star Trek: The Next Generation. He had truly become a popular icon. Hawking’s publishing career was not limited to A Brief History of Time either. He wrote many more successful books, with the constant need to close in on that elusive theory that tied everything together. For instance, in 2001 he wrote The Universe in a Nutshell, which attempted to draw together subjects as lofty as 11-dimensional supergravity, p-branes, M-theory, quantum mechanics, general relativity, 10-dimensional membranes, superstrings and, of course, black holes. Constantly aware that this kind of work required a shift in imagination that some might find a stretch, he worked with Moonrunner Design, who created intricate illustrations to explain some of the concepts in a more intuitive way. He indicated in the foreword that although we had advanced a long way since A Brief History of Time, there was still far to go for a complete understanding of the mechanics of the Universe. Only a year later, in 2002, he published The Theory of Everything, which similarly draws together the most complex theories to date on the origins and workings of the universe. He was merging different fields to come up with clues to THE answer, mixing an alchemy of sciences to get closer to that big elusive theory he knew was hiding in plain sight.
Exploring every possible universe

Hawking had been chasing the ultimate theory of the universe all his life with a dogged tenacity and curiosity. However, in 2007, it seemed from the outside like he had decided to embrace some more personal goals. For instance, Stephen changed direction with publishing in 2007, when he began to co-author a series of science fiction novels with his daughter, Lucy. The first in the series was called George’s Secret Key to the Universe, and this time around the target reader was primary school children, not academics or amateur cosmologists. It was as if he made a conscious decision to have a little fun, to take a small step outside the intensity of academic circles and work closely with his daughter. Again, it gave him time to appreciate love in his life, weaving the relationship with his daughter around a hinge-pin of shared interest in science. In 2009, Stephen took time out to be awarded the Presidential Medal of Freedom in a ceremony at the White House. He was personally presented with the medal by the then President, Barack Obama.
There is no such thing as a standard or run of the mill human being but we all share the same human spirit. However difficult life might seem, there is always something you can do and succeed at.
Two thousand and seven was the year he took the chance to experience weightlessness, like an astronaut, during a zero-G flight in Florida, where a modified plane dived eight times to create four minutes of ‘floating’ time for the professor. Afterwards he beamed: “It was amazing. Space, here I come.” A few years later, in 2012, he became a highlight on the stage at the opening of the London Paralympics, broadcasting live to the world. “We are all different,” he said. “There is no such thing as a standard or run of the mill human being, but we all share the same human spirit. However difficult life might seem, there is always something you can do and succeed at.” And succeed he did. In 2013 he became one of the first winners of the Breakthrough Prize, recognising his theoretical work on the discovery of Hawking radiation – a prize which incidentally banked him $3 million. He also agreed to be the subject of a major movie. In 2014 a film was made about Hawking’s earlier life in Cambridge, revolving around his first marriage, and was aptly called The Theory of Everything. It starred Eddie Redmayne as Hawking, earning the actor an Oscar and a Bafta for the role. The film grossed £77 million and had the unexpected effect of bringing Stephen and Jane closer together since their separation, in the same way a counsellor may show a reflection of a couple’s entwined realities when, in the past, they had been too close to truly comprehend how they had affected each other.
It was still within 2014 that Stephen made headlines for reasons other than the popular movie of his life. This time his commentary on artificial intelligence was the subject matter. He warned a BBC reporter: “The development of full artificial intelligence could spell the end of the human race...” adding that AI “could redesign itself at an ever-increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete and would be superseded.” It remains an interview that scares people. Hawking’s appraisal of the AI situation gave it an authenticity and considered accuracy that few others could command. Hawking had a bit of everything that made him a science superstar. He had imagination, charisma, a sense of fun and philosophy, and acute intelligence, and he wasn’t afraid to say what he thought. His thoughts defined him, not his disease nor his personal challenges. Hawking will be missed, not just by the science community or by those who knew him personally, but by millions of people who connected with and shared his drive for exploring the unknown and the universe. What’s more, whilst he opened the door to many new realities, he left just enough mystery out there in the cosmos for other inspired scientists to carry on where he left off.
Photograph © Murdo MacLeod
Developing new pathways for biocatalytic cascades

Sustainability is an increasingly prominent issue in the chemical industry, as researchers seek to develop new methods of producing key compounds that don’t depend on the use of fossil-based resources. Dr Francesco Mutti tells us about the Biosusamin project’s work in designing and developing biocatalytic cascades for producing amines, in particular chiral amines.

A type of organic compound, amines are commonly applied in the synthesis of a number of chemical products and materials, including certain drugs, polymers and dyes, with chiral amines particularly important for pharmaceutical manufacturing. Current methods of producing amines are mainly based on the use of ketones as the starting material, yet with concern growing over the sustainability of these methods, researchers in the Biosusamin project are looking to develop alternatives. “We are going to run out of oil at some point, and then we will have to change the way we make these amines,” stresses Dr Francesco Mutti, the project’s Principal Investigator. While not many ketones are available via renewable resources, this is not the case with bio-based molecules, which often contain alcohol functional groups – a major area of interest to Dr Mutti. “This is one important reason to develop methods to convert alcohols into amines,” he says. “Another important reason centres on how amines are currently made in industry, which is relatively inefficient. Several steps are involved, leading to waste.”
Intermediate steps

This complexity does not however guarantee that the final product will be suitable for use in industry, and it may be necessary to include further steps, to improve the purity of the compound for example. There is not a direct relationship between the number of steps and the amount of waste, as one step may be less efficient than another, but in general, additional waste is generated with each step in a multi-step chemical process. “This is especially the case if, after a single step, you need to isolate and purify your intermediate before starting the next step, as is unavoidable in traditional chemical production,” stresses Dr Mutti. One key goal in the project is therefore to minimise the number of biochemical steps involved in the conversion of the starting material. “The aim is to minimise the number of steps, and to avoid intermediate chemical work-ups, such as purification and isolation. Using the strategies developed in Biosusamin, based on biocatalytic cascade chemistry, we aim to emulate what nature does in a normal metabolic pathway,” says Dr Mutti. Many steps are typically involved in the natural metabolic pathways that researchers are seeking to emulate in the project. The main difference from the traditional synthetic chemistry approach is that Dr Mutti and his colleagues are designing cascades in which the steps run sequentially and concurrently. “The overall process runs from one step to the other without stopping,” he says.

Analysis and comparison of the 3D structure of amine dehydrogenases.

Using the strategies developed in Biosusamin, based on cascade chemistry, we aim to emulate what nature does in a normal metabolic pathway

Researchers are using amine dehydrogenases (AmDHs) as biocatalysts in the key steps of these pathways. “This is a new class of enzymes that we are creating in the laboratory by inducing mutations in other enzymes to create amination activity,” explains Dr Mutti. “We study the crystal structure of available enzymes – we do computational studies of enzymes, analyse them and do molecular simulations. In this way we can gain an understanding of where we have to induce mutations in order to obtain the final catalyst that we need for a specific transformation.” The amine dehydrogenases generated so far through the current approach have been relatively limited in scope however, capable of acting effectively on some molecules, while failing with others. Researchers are working to improve existing methodologies, which it is hoped will prove effective on a wider range of molecules, providing a strong foundation for continued development in this area. “This will allow us to obtain a toolbox of AmDHs, to transform different types of molecules,” says Dr Mutti. While the initial aims of the project were relatively limited, it has since become apparent that quite a wide range of products could potentially be obtained with this new family of enzymes, reinforcing the wider relevance of this research. “With this type of methodology, we do not need to use any toxic intermediate, compound or reagent,” explains Dr Mutti. “With this multi-step cascade, molecular oxygen is used as an oxidant and everything runs at room temperature. So it’s safer, and a lot of energy is saved.”
BioSusAmin The design and development of efficient biocatalytic cascades and biosynthetic pathways for the sustainable production of amines
In general, research in Mutti’s lab aims at the development of novel atom-efficient and sustainable biocatalytic routes for the manufacture of high-value chemical products and materials. This work involves the creation of enzymes with improved or unprecedented activities (i.e. not known in nature). The research line includes bioorganic chemistry, enzyme engineering, biochemical characterisation of enzymatic reactions and mechanisms, as well as computational studies.
This project has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement No 638271, BioSusAmin).
Atom efficiency

This is an important issue in terms of resource efficiency, and Dr Mutti is keen to stress that atom efficiency is also greatly improved. This relates to the conversion efficiency of a chemical process. “If all the atoms used along the route are incorporated into the final product, then you have the maximum efficiency of 100 percent. However, with traditional chemistry, some waste is typically generated,” outlines Dr Mutti. By contrast, Dr Mutti says that the efficiency of the new methodology developed in the project approaches 100 percent, which will help in meeting ever-more stringent environmental standards. “This is an increasingly important issue,” he points out. “A second important point is that there is also an advantage from an economic perspective. If you can optimise the utilisation of resources by reducing waste, then that also helps maximise profits.” These are important issues for many companies, so there is a corresponding level of interest in the project’s research in the commercial sector. However, while one patent has already been filed around the exploitation of the alcohols-to-amines transformation, Dr Mutti says that more research is required before wider exploitation can be considered. “We need to improve the efficiency of the catalysts, for example the AmDHs, and also other enzymes. This is lab-scale work that we can do in my group. At the moment we are developing these processes, which are running very effectively on the lab-scale. Then, later on we will look towards exploitation,” he says. The project’s research could also be integrated with findings from other initiatives working in similar areas. “By combining the results of our project with results from other projects, I think that we can help to build a new generation of biochemical processes for use over the next decade,” continues Dr Mutti. This points to a wider change in the chemical industry, as sustainability becomes an ever more prominent issue. While work in Biosusamin has centred around the production of chiral and achiral amines, Dr Mutti says that his research in this area will continue beyond the term of the project. “We aim to generate a new type of bioorganic chemistry, in which we can start with material from renewable sources, and use it to produce a wide variety of the compounds which we need in our daily lives. This will reduce our dependence on fossil-based resources,” he outlines.
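The 100-percent figure Dr Mutti describes corresponds to the standard notion of atom economy: the molecular weight of the desired product divided by the combined molecular weight of all reactants. A minimal sketch of the calculation, using an illustrative alcohol-to-amine conversion in which water is the only by-product (the example reaction and figures are our own, not from the project):

```python
def atom_economy(product_mw, reactant_mws):
    """Atom economy (%): molecular weight of the desired product
    divided by the total molecular weight of all reactants."""
    return 100.0 * product_mw / sum(reactant_mws)

# Illustrative example: hexan-1-ol (102.18 g/mol) + ammonia (17.03 g/mol)
# giving hexylamine (101.19 g/mol), with water (18.02 g/mol) as by-product.
efficiency = atom_economy(101.19, [102.18, 17.03])
print(f"{efficiency:.1f}%")  # roughly 85% - only the water is 'wasted'
```

A multi-step route with stoichiometric reagents and protecting groups would add each auxiliary reagent to the denominator, which is why cascades that avoid intermediate work-ups push the figure towards 100 percent.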
• Professor Tom Grossmann, Department of Chemistry, VU University Amsterdam, The Netherlands • Dr Frank Hollmann, Department of Biotechnology, Technical University Delft, The Netherlands • Dr Karim Cassimjee, EnginZyme AB, Stockholm, Sweden • Dr Michael Breuer, BASF SE, White Biotechnology Research, Ludwigshafen, Germany
Dr Francesco Mutti, Associate Professor & Head of Biocatalysis, University of Amsterdam
Van’t Hoff Institute for Molecular Sciences (HIMS)
Faculty of Sciences
University of Amsterdam
Science Park 904
1098 XH Amsterdam
The Netherlands
T: +31 (0)205 258 264
E: email@example.com
W: www.hims-biocat.eu
Dr Francesco Mutti
Dr Francesco Mutti obtained his master’s and PhD in chemistry at the University of Milan (2008). He was a postdoctoral researcher at the KF University of Graz (2009-2012) and at the Manchester Institute of Biotechnology (MIB), The University of Manchester (2013-2014). After a short period as PI at the MIB, he was appointed tenure-track chair of the Biocatalysis group at the University of Amsterdam (2015), where he is currently associate professor.
Schematic representation of the biocatalytic reductive amination.
The most dramatic of mutations: whole genome duplication

Whole genome duplication is an important force in evolution, and while it can help to enhance the adaptability of a lineage, there are serious biological challenges in dealing with twice as many chromosomes. We spoke to Dr Levi Yant about his work investigating how different organisms adapt to having a doubled genome complement.

An organism with more than two complete sets of chromosomes is known as a polyploid, and while this occurs throughout the eukaryotic kingdom, it is particularly prevalent in plants. Currently based at the John Innes Centre in Norwich, UK, Dr Levi Yant is the Principal Investigator of an ERC-funded project investigating the impact of whole genome duplication (WGD), the process that gives rise to polyploidy. “There are serious implications around dealing with twice as many chromosomes, as it means there are twice as many opportunities for novel associations in the nucleus,” he outlines. The focus of the ERC project is on understanding how different organisms can adapt to having a doubled genome complement, despite the attendant challenges. “Understanding how a lineage with a doubled genome can deal with some of the associated challenges is a broadly interesting topic, because this impacts the adaptability of a species across kingdoms, from crop domestication to polyploid cancer lineages in humans,” says Dr Yant.
interest in the project is meiotic crossovers, which are important for stabilising the segregation of chromosomes. “One crossover between homologous chromosomes is necessary to stabilise meiosis, while there’s also the benefit that this cross-over reassorts genetic material,” he explains. “So, half a chromosome from the mother is switched to be associated with half a chromosome from the father.” Arabidopsis arenosa, a major model of adaptation to genome duplication. Photo credit: Filip Kolář
Repeatability of Evolution
Meiotic crossovers This research builds on the observation that meiosis, a type of cell division, is not a straightforward process in some of these young, genome-doubled lineages. This is largely because there are simply too many opportunities for association between similar chromosomes. “Instead of having just one possible partner - a homologue - suddenly there are three other possible partners, because of the doubling of the genome,” outlines Dr Yant. A major area of
environment. “There’s a lot of risk initially, as when the genome has these interactions, too many crossovers and entanglements can lead to chromosome breakage,” explains Dr Yant. “However, if you’re able to tweak meiosis early on, you then suddenly have twice as many chromosomes to play with, and that almost certainly increases the adaptability of the lineage. This has been shown rigorously for example in yeast, where groups have been able to take haploid, diploid and tetraploid genomes, and show that more chromosome copies can enhance evolutionary potential in experimental evolution studies in vitro.”
A greater number of meioitic cross-overs means that there is a greater diversity of genetic combinations in the population, which is a positive point in evolutionary terms. Genome doubling can be thought of as a high-risk, high-gain strategy, in terms of a population’s ability to adapt to the
The main focus of attention in the ERC project is on looking at the repeatability of the striking adaptations which have been found in independent lineages. A key part of this work centres around a large-scale comparative genomics and population genomics study on different, independently-derived polyploid plant and amphibian species. “We’re looking to understand the evolutionary changes in each of these species, and how repeatable these changes are,” outlines Dr Yant. By resequencing many individual diploids, and closely related tetraploids, Dr Yant and his colleagues have been able to observe the footprints of selection on the genome following genome duplication. “We’ve resequenced hundreds of genomes in the last three years, using large-scale population genomic studies that are now beginning to present clear candidate genetic changes that allowed the stabilisation of meiosis in these tetraploid plant and animal species,” he says.

Meiosis-specific proteins ZYP1 (red) and ASY1 (green) juxtaposed between the chromatin loops (blue) during meiosis in Arabidopsis arenosa. Homologous chromosomes have paired and recombined, which is essential for normal levels of genetic crossovers as well as for promoting correct segregation of chromosomes during sexual reproduction. We discovered ZYP1 and ASY1 evolving rapidly in response to genome duplication (polyploidy). Credit: James Higgins

A diverse range of techniques is being applied in this project, including genetic, genomic and cytological approaches, to gain deeper insights into how different species adapted to the challenges associated with whole genome duplication. This work has yielded some interesting findings so far. “The first striking result was that the number of crossovers in meiosis was indeed successfully reduced in young tetraploids,” says Dr Yant. Now, by analysing the genes involved in this adaptation, researchers are looking to identify hotspots for genome evolution across different genomes. “There appear to be functional hotspots – that is, it appears these adaptations are functionally constrained: the pathway and the output is the same, but fascinatingly, the way the genome does it differs in each case,” continues Dr Yant. “So there is a flexibility and innovation in each case: there are different ways in which genomes can come up with solutions to the same chromosomal challenge.”

Functional changes
The longer-term goal in this research is to build a deeper mechanistic understanding of the functional changes in each of these cases and to derive more general principles. This work also holds wider relevance in terms of our understanding of crop domestication. “Crops are typically polyploid, so they have doubled genomes. So, lots of things that we’ve been working on are directly relevant to crops, and they also suffer from some problems during meiosis similar to those which naturally adapted tetraploids have overcome,” explains Dr Yant. More broadly, this research could inform approaches to synthetic biology and rational crop improvement, helping to enhance crop resilience. “Some specific challenges linked to chromosome stabilisation include the ability to adapt to temperature volatility – which of course is also a very important problem for crops,” says Dr Yant. The main focus of attention in the project was on investigating the repeatability of adaptation mechanisms to WGD, yet Dr Yant and his colleagues have also been able to explore other topics of interest. This includes research into some of the positive aspects of being genome-doubled. “Some of these polyploid species have been able to colonise really difficult, extremophile environments, such as toxic mines. We’ve found that tetraploids are able to deal with toxic mine sites where there’s a lot of lead and cadmium and low nutrient availability – for example, low levels of nitrogen and phosphorus,” outlines Dr Yant. This kind of situation, where plants have been able to colonise a seemingly unpromising environment, is highly interesting from an evolutionary genetics perspective. “In evolution there’s an initial challenge, and we understand the genes that mediate that. That then feeds into an adaptability – we aim to understand the genetic correlates of that, and we’re able to describe it using high-resolution, dense population genomics,” says Dr Yant.
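The kind of population-genomic selection scan the project describes – comparing resequenced diploid and tetraploid populations to find footprints of selection – can be sketched in miniature. This is an illustrative outline only, not the project’s actual pipeline: the squared allele-frequency difference, the window size and the cutoff fraction are all simplifying assumptions (real scans use statistics such as F_ST computed from many genomes).

```python
# Minimal sketch of a window-based differentiation scan between a
# diploid and a tetraploid population. Windows with unusually large
# allele-frequency differences are candidate targets of selection.
# Illustrative only, not the HOTSPOT project's method.

def window_differentiation(freq_dip, freq_tet, window=5):
    """Mean squared allele-frequency difference in sliding windows.

    freq_dip, freq_tet: per-site alternate-allele frequencies (same order).
    Returns a list of (window_start_index, score) tuples.
    """
    assert len(freq_dip) == len(freq_tet)
    scores = []
    for start in range(0, len(freq_dip) - window + 1):
        diffs = [(freq_dip[i] - freq_tet[i]) ** 2
                 for i in range(start, start + window)]
        scores.append((start, sum(diffs) / window))
    return scores

def candidate_windows(scores, top_fraction=0.2):
    """Return the highest-scoring windows (putative selection 'hotspots')."""
    cutoff = max(1, int(len(scores) * top_fraction))
    return sorted(scores, key=lambda s: s[1], reverse=True)[:cutoff]

# Toy data: one region (sites 5-9) strongly differentiated between ploidies.
dip = [0.1, 0.1, 0.2, 0.1, 0.1, 0.9, 0.8, 0.9, 0.9, 0.8, 0.1, 0.2]
tet = [0.1, 0.2, 0.1, 0.1, 0.2, 0.1, 0.1, 0.2, 0.1, 0.1, 0.1, 0.1]

scores = window_differentiation(dip, tet)
top = candidate_windows(scores)  # the window starting at site 5 ranks first
```

In a real scan the windows flagged this way would then be inspected for genes with known meiotic roles, which is how candidates such as ZYP1 and ASY1 emerge.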
HOTSPOT The population genomics of adaptation Project Objectives
This programme aims to understand evolutionary repeatability and constraint in the context of adaptation to whole genome duplication, a prevalent force in evolution and a driver of evolutionary diversification. This will provide insight into how organisms adapt to altered cellular environments and how conserved fundamental biochemical processes, such as meiosis, evolve nimbly when required.
Major collaborators include Kirsten Bomblies (JIC also: http://bomblies.jic.ac.uk/). Funding is provided by a Starting Grant from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement 679056).
• Cambridge University, UK • Charles University, Prague, CZ • Harvard University, USA • Stirling University, UK • University of Nottingham, UK • University of Leicester, UK
Levi Yant, Project Leader John Innes Centre Norwich Research Park Colney Lane Norwich NR4 7UH United Kingdom T: +44 1603 450 000 E: firstname.lastname@example.org W: http://yant.jic.ac.uk/ Levi Yant Levi Yant is a Project Leader in Cell and Developmental Biology at the John Innes Centre in Norwich, U.K. His work focuses on wild plant populations, to learn how evolution finds solutions to significant environmental and physiological challenges. Among these are adaptations to high-metal soils with low nutrients, as well as the internal struggle in the nucleus following whole genome duplication. Using population genomic approaches, this work identifies changes specific to adapted populations, revealing candidate genes and processes mediating adaptation. A major goal of his work is to define which evolutionary routes are constrained and predictable, and which meander along diverse paths. This promises a view into the rules broadly underlying evolutionary change. Dr Yant is currently moving his research programme to The University of Nottingham as an Associate Professor of Evolutionary Genomics in the School of Life Sciences and University Beacon for Future Foods.
A cell following genome duplication. Blue is DNA and red is ZYP1, a protein we discovered to be undergoing adaptive evolution to help stabilise meiotic crossovers following genome duplication. Image: Chris Morgan
Following the signs of decision making There are more than 1,300 bat species, and their behaviour is extremely variable; researchers are now using miniature sensors to investigate how they make decisions on foraging for food. Dr Yossi Yovel tells us about the GPS-BAT project’s work in monitoring the behaviour of individual wild bats within a colony The underlying basis
of animals’ decision-making is a topic of great interest in biology, influenced by both their own neural mechanisms and the environments they inhabit, yet gathering data on their behaviour in the natural environment is a challenging task. Based at Tel Aviv University, Dr Yossi Yovel and his colleagues are working to bridge the gap between neuroscience and ecology. “We’re trying to develop methods that allow us to run experiments on animal behaviour in the field that are highly controlled – almost as controlled as in the lab,” he outlines. In particular, Dr Yovel is deeply interested in echo-locating bats, which perceive their environment through sound. “Bats are great models – they’re wild and extremely variable. It’s a huge group with more than 1,300 species, and a lot of behavioural variability. Some bats migrate, some navigate far, some near, they eat different things, some are social, some aren’t,” he continues.
GPS-BAT Foraging Decision Making in the Real World – revealed from a bat’s point of view by on-board miniature sensors Project Coordinator, Dr Yossi Yovel Department of Zoology, Faculty of Life Sciences Tel-Aviv University Levanon st. P.O. Box 39040 Tel-Aviv 6997801 Israel T: +XX 03 6407304 E: email@example.com W: http://yossiyovel.com/index.php/research Dr Yossi Yovel is an associate professor in the School of Zoology and in the School of Neuroscience at Tel-Aviv University. He received a B.Sc. in Physics, a B.Sc. in Biology and an M.Sc. in Neuroscience from Tel-Aviv University, and a Ph.D. in Biological Cybernetics from the University of Tuebingen, Germany. Before joining Tel-Aviv University as a faculty member, he was a post-doctoral fellow at the Weizmann Institute of Science and at the University of Chicago.
GPS-BAT project This variability is an important advantage in terms of studying animal decision-making, a topic which lies at the core of the GPS-BAT project. As the project’s Principal Investigator, Dr Yovel is developing miniature sensors that can be mounted on the bats, including a GPS sensor and a microphone to track and record them. “This allows us to detect, for example, when a bat is attacking prey, when it is interacting with another bat, and other types of behaviour as well,” he explains. Dr Yovel’s team is monitoring an entire colony, gathering data on how bats make decisions on foraging for food, such as whether they rely on their own experience and memory, or whether they are influenced by the decisions of other bats. “Imagine that we find that a bat returns, year after year, in the same season, to the same mango tree. This tree had ripe fruits a year ago – it’s now the same season again, and the bat immediately goes there. That would be indicative of spatio-temporal long-term memory,” says Dr Yovel. Now, the question is: what would the neighbouring bat in the colony do? Would it go to the same tree as the first bat, or to another tree it knows? The researchers hypothesise that bats may glean information from their neighbours, perhaps by smelling them, through vocal communication, or via some other mechanism. “Now you have this more complex scheme of decision-making, where you glean information about the availability of mango from your neighbour, but then make a decision based on a combination of social and personal information. Those are the kinds of insights that we’re hoping to gain,” outlines Dr Yovel. There are also differences in behaviour between individual bats. “For example, some individuals are very exploratory, and will always look for new places, while others are exploitative, and tend to return to places where they have been before. This is another level of decision-making, where the personality of the bat influences their decision,” continues Dr Yovel.

The goal of the research is to track specific individuals and gather behavioural data on a very fine scale, from which new insights can be drawn about how decisions are made. The focus of attention in the project is bats, yet Dr Yovel says the monitoring system could potentially be applied to other animals in future. “We are already applying it on rats in a different project, and several people have spoken to me about applying it on different animals. The main advantages of the system are its size – it’s tiny (<2 grams) – and the fact that it includes a lot of complementary sensors. You can record things like acceleration, audio, and heart rate,” he says. Dr Yovel and his colleagues have gathered a lot of in-depth information about a single colony; in future he plans to look at networks of colonies, monitoring the movement of bats between them. “Do bats move to a new colony to get new information about the environment? We would also like to start manipulating the availability of food by developing automatic feeding sites,” he outlines.
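One behaviour an on-board microphone can reveal is a prey-capture attempt: echolocating bats emit a rapid “feeding buzz” – a burst of calls with very short inter-pulse intervals – just before attacking prey. A minimal sketch of how attack events might be flagged from call timestamps follows; the 10 ms threshold and minimum run length are illustrative assumptions, not the GPS-BAT project’s parameters.

```python
# Flag putative prey-attack events from echolocation call timestamps.
# A 'feeding buzz' is approximated here as a run of at least MIN_RUN
# consecutive inter-pulse intervals shorter than BUZZ_IPI seconds.
# Thresholds are illustrative, not taken from the GPS-BAT project.

BUZZ_IPI = 0.010   # assumed: intervals under 10 ms count as buzz-like
MIN_RUN = 4        # assumed: at least 4 short intervals in a row

def detect_buzzes(call_times):
    """Return start times of putative feeding buzzes."""
    intervals = [b - a for a, b in zip(call_times, call_times[1:])]
    buzzes, run_start = [], None
    for i, ipi in enumerate(intervals):
        if ipi < BUZZ_IPI:
            if run_start is None:
                run_start = i
        else:
            if run_start is not None and i - run_start >= MIN_RUN:
                buzzes.append(call_times[run_start])
            run_start = None
    if run_start is not None and len(intervals) - run_start >= MIN_RUN:
        buzzes.append(call_times[run_start])
    return buzzes

# Toy sequence: search calls ~100 ms apart, then a buzz of ~5 ms intervals.
calls = [0.0, 0.1, 0.2, 0.3,
         0.305, 0.310, 0.315, 0.320, 0.325,   # buzz
         0.5, 0.6]
events = detect_buzzes(calls)
```

Cross-referencing such acoustic events with the GPS track is what turns raw sensor logs into the behavioural records described above.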
Field work Mexico 2014
Getting to the heart of the quality of life Health is an important factor in an individual’s quality of life, but it’s not the only consideration. Professor Mirjam Sprangers tells us about her work bringing together different areas of research to improve the conceptualisation of quality of life, and the sensitivity by which it is measured A high proportion
of patients with multiple morbidities report a stable quality of life (QoL), despite their health problems. One factor behind this stability is that many of the questionnaires currently used in research may track stable characteristics of an individual, rather than monitor their current state, says Professor Mirjam Sprangers. “The most common type of research in the past 30 years has used retrospective measures. For example, questionnaires asked patients things like: ‘Have you had a headache in the past week? Did your disease interfere with your social activities? Please provide a rating, from not at all to very much,’” she explains. Based at the University of Amsterdam, Professor Sprangers is the Principal Investigator of a multi-disciplinary project aiming to improve the conceptualisation and measurement of QoL. “We work together with researchers from different disciplines, among which are religious studies and ethics,” she outlines.
Quality of life
The core principle here is that only the individuals themselves can judge their own QoL. A self-report of QoL is still prone to bias, however, or to other influences that may affect the measurement, an issue central to the project’s research. “We want to improve the measurement of QoL by improving the measurement itself, and the analysis of change over time,” says Professor Sprangers. The project is targeting adults with a cardiac disease and multiple co-morbidities. “I chose a population of cardiac patients who were going to undergo different interventions. One intervention would dramatically improve QoL, while the other would also improve QoL, but not as dramatically,” continues Professor Sprangers. “I wanted to pursue research over a time period where people would experience change in their QoL – that was necessary to test my measures and my analytical procedures.” This research is built on group-level retrospective data as well as on data about individual patients’ QoL, gathered at random intervals during the day. Alongside questions on an individual’s mood, such as how they feel at that specific moment, researchers are also gathering data on other issues. “We ask about their physical symptoms, so whether they’re experiencing pain and fatigue for example. We also ask for contextual information, such as where are they? Whom are they with?” says Professor Sprangers. An individual’s QoL is not necessarily directly related to their health, as there are many cases where people have objectively poor health and excellent QoL, and vice-versa, a point of great interest to Professor Sprangers. “What people experience in their own QoL is only partially correlated with objective parameters of health,” she says. “In general you would expect that people whose health has improved would also experience a comparable improvement in their QoL, but that relation is generally weak.”
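The contrast between retrospective questionnaires and sampling patients at random moments during the day can be made concrete with a small sketch. Everything here is illustrative: the 0–10 mood scale, the prompt window and the toy ratings are assumptions, not the project’s instrument.

```python
# Contrast retrospective and momentary (experience-sampling) measurement.
# Momentary prompts are scheduled at random times during the waking day
# and the in-the-moment ratings are averaged; a retrospective item asks
# for a single summary rating afterwards. All values are illustrative.

import random

def schedule_prompts(day_start_h=9, day_end_h=21, n_prompts=5, seed=0):
    """Pick random prompt times (in hours) within the waking day."""
    rng = random.Random(seed)  # fixed seed so the sketch is reproducible
    return sorted(rng.uniform(day_start_h, day_end_h) for _ in range(n_prompts))

def momentary_summary(ratings):
    """Average of in-the-moment mood ratings (0 = worst, 10 = best)."""
    return sum(ratings) / len(ratings)

prompts = schedule_prompts()   # five random times between 09:00 and 21:00

# Toy day: mostly good moments with one salient bad episode near the end.
ratings = [8, 8, 7, 8, 2]
avg = momentary_summary(ratings)   # 6.6 on the assumed 0-10 scale

retrospective = 4   # hypothetical recalled rating, dragged down by the low point
recall_gap = retrospective - avg   # negative: recall underestimates the day
```

The gap between the averaged momentary ratings and the single recalled rating is one way the bias in retrospective measures can show up.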
Individualised treatment This research could hold important implications for medical decision-making and healthcare economics. When guidelines on preferred treatment for specific conditions
are developed, for example, or when the reference value for a disease is determined, it is important to balance the cost of treatment against the cost per quality-adjusted life year (QALY) gained. “Those are the macro decisions that might be affected by not taking into account the fact that people differ in their dispositions and their ability to adapt to changed circumstances,” explains Professor Sprangers. The project will gain important insights in this respect, with two of Professor Sprangers’ PhD students working on research papers. “One student has developed a theoretical model and questionnaire to examine how disease can be integrated into one’s life and how that might affect QoL,” she outlines. “Another student is doing more analytical work, looking at QoL from another perspective which involves – among other areas of research – network analysis.”
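The balance between treatment cost and quality-adjusted life years gained is usually expressed as an incremental cost-effectiveness ratio. A toy calculation follows; every figure in it is hypothetical and none comes from the article.

```python
# Incremental cost-effectiveness ratio (ICER): the extra cost per extra
# quality-adjusted life year (QALY) gained by a new intervention over
# the old one. All figures below are hypothetical, for illustration only.

def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Cost per additional QALY of the new treatment over the old."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical: the new intervention costs 12,000 more and yields 0.8 extra QALYs.
value = icer(cost_new=30_000, cost_old=18_000, qaly_new=2.3, qaly_old=1.5)
# 12,000 / 0.8 = 15,000 per QALY gained
```

If patients adapt to changed circumstances in different ways, as Professor Sprangers notes, the QALY figures feeding such a ratio can be systematically distorted.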
Improving the conceptualisation and measurement of quality of life of patients with multiple chronic morbidities, exemplified by patients with cardiac disease undergoing cardiac intervention The overall objective is to improve the conceptualisation of quality of life and to enhance the sensitivity and comprehensiveness of its measurement by taking the trait–state distinction and response shift into account. Professor M.A.G. Sprangers Department of Medical Psychology, J3-211 Academic Medical Center/University of Amsterdam Meibergdreef 15 1105 AZ Amsterdam, The Netherlands T: +31 020 566 4661 E: firstname.lastname@example.org W: www.amc.nl/medischepsychologie
Mirjam Sprangers is Professor at the Department of Medical Psychology, Academic Medical Centre, University of Amsterdam. She coordinates a research line on quality of life that addresses the theoretical and methodological conundrums of patient-reported outcomes research in somatic settings.
Personalised treatment of cancer depends on a deep understanding of the unique genetic characteristics of a tumour. We spoke to Gianni Medoro, Maximilian Sergio and Elena Bevilacqua about their work in developing the DEPArray™ technology, a new approach to cell isolation that could open up new possibilities in precision medicine
A single cell technology for personalised medicine The genetic makeup of a tumour varies significantly among individual cases, and treatment tailored to the precise characteristics of the tumour offers the best chance of a good response. This relies on an accurate molecular diagnosis of the individual case and an understanding of the precise nature of the tumour, a topic that has attracted a lot of attention in clinical research. “There is a clear need to deepen our understanding of the underlying molecular mechanisms, which can be achieved by analysing the genome of tumour cells. This is a new approach that is emerging in the field of personalised medicine,” says Gianni Medoro, PhD. As Chief Technology Officer at Menarini Silicon Biosystems, Dr Medoro led the development of DEPArray™, a proprietary technology that could open up new possibilities in precision medicine, particularly with respect to cancer. “What we have developed with the DEPArray™ technology is a new approach to cell isolation, which enables accurate genetic analysis of tumours at the single cell level,” he explains.
This is part of a wider paradigm shift in the treatment of disease, away from a generic approach, towards more personalised treatment. The DEPArray™ technology holds clear potential in these terms, enabling scientists to gain deeper insights into a tumour. “The DEPArray™ technology is aimed at enabling more precise molecular diagnosis of a tumour. In future this will allow clinicians to match each tumour with the correct combination of drugs, that will then work more effectively,” says Elena Bevilacqua, PhD, Product Manager at Menarini Silicon Biosystems. This kind of personalised approach to therapy may offer several advantages. “For example to avoid the use of therapies that cannot work for some specific patients. This is important to prevent adverse side-effects and to offer alternatives to the patient that might work more effectively,” points out Dr Medoro. The DEPArray™ NxT System, the latest development of the technology, is composed of three elements: a benchtop
instrument, a disposable microfluidic cartridge and proprietary software – the CellBrowser™. The core of the technology is the microsystem cartridge, which integrates a silicon chip, microfluidic chambers and valves. Microelectronics and microfluidics are combined synergistically in the cartridge to provide unique single-cell sorting capabilities in a highly-automated platform, providing a simple and reliable system for isolating pure, single, viable rare cells from a heterogeneous sample, for culture or molecular analysis. The working principle of the DEPArray™ technology is based on the ability of a non-uniform electric field to exert forces on neutral, polarizable particles, such as cells, that are suspended in a liquid. This electrokinetic principle, called dielectrophoresis (DEP), is exploited to create – in the micro-chamber at the core of the cartridge – a patterned force field composed of tens of thousands of microscopic attraction regions (called DEP ‘cages’), each capable of ‘trapping’ a cell in stable levitation at a controlled position.
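The dielectrophoretic principle described here has a standard textbook formulation, which the article does not give. The time-averaged DEP force on a spherical particle of radius r suspended in a medium of permittivity ε_m is commonly written as:

```latex
\mathbf{F}_{\mathrm{DEP}} \;=\; 2\pi r^{3}\,\varepsilon_{m}\,
  \mathrm{Re}\!\left[K(\omega)\right]\,\nabla\lvert \mathbf{E}_{\mathrm{rms}}\rvert^{2},
\qquad
K(\omega) \;=\; \frac{\varepsilon_{p}^{*}-\varepsilon_{m}^{*}}
                     {\varepsilon_{p}^{*}+2\varepsilon_{m}^{*}}
```

where K(ω) is the Clausius–Mossotti factor and ε_p*, ε_m* are the complex permittivities of the particle and the medium. When Re[K(ω)] is negative, the particle is pushed away from field maxima, which is what makes it possible to hold a cell in stable levitation inside a closed ‘cage’ of electric field minima.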
A selection of images of the core of the technology: the microsystem cartridge, which integrates a silicon chip, microfluidic chambers and valves.
“An array of about 300,000 microelectrodes integrated in a semiconductor chip allows the generation of up to 30,000 ‘DEP cages’, each able to capture a single cell in stable levitation through the application of gentle dielectrophoretic forces,” explains Maximilian Sergio, PhD, integrated circuit design (ICD) manager at Menarini Silicon Biosystems. The fluorescence microscope integrated in the DEPArray™ instrument allows the acquisition of high-resolution images of each individual cell in the sample, thus enabling accurate cell analysis and selection based on fluorescence and morphology. High-resolution imaging is key to minimizing errors and adding precision and robustness to the system. The proprietary CellBrowser™ software then elaborates the images and enables automatic or operator-assisted selection of the desired cells. “Once identified, each target cell can be isolated from the other cell types automatically. DEP cages are moved by a change in the electric field pattern, concurrently and independently, step by step, along trajectories calculated by the software, dragging each selected cell from its original location into a dedicated Parking chamber,” explains Maximilian Sergio. From the Parking chamber, target cells are moved into the Recovery chamber, and from there they are ejected directly into a recovery support through accurate microfluidic control, without any risk of cross-contamination between different samples. The recovery procedure can be repeated to obtain, from the same sample, multiple separate recoveries of individual target cells or groups of cells.
Gianni Medoro (right), CTO and Nicolò Manaresi (left), CSO of Menarini Silicon Biosystems presenting the DEPArray NxT at AACR Annual Meeting in New Orleans.
Applying DEPArray™ in liquid biopsy
Liquid biopsies are essentially non-invasive blood tests to detect circulating biomarkers, such as circulating tumour cells (CTCs), that are shed from tumours into the bloodstream. Genetic analysis of these cells provides all the information necessary to understand the genetic mutations of the tumour and to identify molecular targets for personalized therapies. DEPArray™ technology sets a new standard of excellence in this setting, as it allows the precise identification and isolation of these rare CTCs from enriched blood, right down to the single cell. “CTCs are very important, as they are a very good representation of a patient’s tumour genomic variability, so there is a strong motive for isolating and analysing those cells, so as to understand the genetic characteristics of a tumour,” continues Dr Medoro. “The problem with circulating tumour cells is that they are extremely rare – they are present in blood at a ratio of around one tumour cell to every billion normal cells. Our technology provides a solution to this problem by offering the capability to identify and isolate those cells with 100 percent precision, down to single cells, with a high level of automation and reliability.”

The DEPArray™ technology is aimed at supporting precision medicine in oncology by enabling precise molecular characterization of a tumour. This will help clinicians to develop and implement protocols to match each patient with the correct combination of drugs, which will then work more effectively and improve patient outcomes.

The DEPArray NxT benchtop instrument

CTCs play an important role in the advanced stages of cancer, as they are effectively the seeds of metastasis. “Once these cells have entered the bloodstream, they may then reach and colonize a secondary site in the body, which can lead to metastasis,” outlines Bevilacqua. The ability to isolate these cells and analyse them at the single-cell level gives an indication of the state and genetic makeup of the disease. This will give clinicians the foundations on which they can identify the correct treatment, and then follow the patient’s progress as the disease evolves. A tumour by nature evolves and changes over time, also in response to therapy, so it may be necessary to adapt and modify treatment; applying the DEPArray™ technology in liquid biopsy offers a non-invasive means of monitoring the progression of a tumour over time, which can then help to inform therapeutic decisions. “The great advantage of this type of analysis is that it will allow clinicians, in the near future, to follow the evolution of the disease over time with a simple, non-invasive blood draw,” explains Dr Medoro. “The goal is to develop a blood test that can be done at regular intervals to see if a therapy is working, helping doctors not only to identify the best treatment for individual patients but also to follow how the disease is responding to that treatment.” The benefits of the DEPArray™ technology in liquid biopsy are well illustrated by a recent study at the Manchester Cancer Research Centre in the UK (https://www.nature.com/articles/nm.4239). Researchers there used DEPArray™ to isolate circulating tumour cells from the blood of patients affected by small-cell lung cancer, then analysed the genetic profile of those
cells one by one. “By analysing the copy number profiles of single circulating tumour cells, they were able to identify a genetic signature correlated with the response of these patients to chemotherapy. This is an important result, as it represents an example of how useful the DEPArray™ is for isolating those cells individually, to support the study of whether a particular therapy might be an effective treatment for an individual patient,” explains Dr Medoro. This is a clear example of the wider importance of this research. “It is widely understood now that not all patients respond in the same way to therapy, so a generic therapy for a cancer patient may not prove effective,” says Bevilacqua. “Therefore you need to go into the genetics of the tumour; you need to understand what is driving it.”
Dissecting tumour heterogeneity in tissue biopsies A further important characteristic of a tumour is its heterogeneity; every tumour is different, and even within a single tumour specimen there are different populations of cancer cells and normal cells with distinct characteristics. DEPArray™ technology holds clear importance for personalized medicine because of its ability to ‘dissect’ the heterogeneity of tumour biopsies: breaking the tumour down into its component populations so that they can be analysed separately. Starting from a minute, low-cellularity tumour biopsy, DEPArray™ can isolate and recover groups of tumour cells or individual cells, allowing a complete and precise understanding of tumour biology, which can then guide a decision on the most effective molecular target treatment. “The technology can be used to achieve a level of purity which enables the genetic analysis of tumour cells, even from small biopsies,” outlines Dr Medoro. “Furthermore, each tumour cell may have different genetic characteristics, so analysing just the bulk of
Representative images of the CellBrowser™ software which enables image-based selection and recovery of specific cells and cell populations in a sample.
the cells in many cases is not sufficient to gain a deeper understanding of the tumour overall. The DEPArray™ technology is able to isolate tumour cells one by one or in pools, with high levels of purity, to reduce the background noise that might make it difficult to interpret the genetic data.” This underlines the wider potential of the DEPArray™ technology, and the team at Menarini Silicon Biosystems is now looking to introduce it more widely in clinical cancer research and, from there, to diagnostic settings. Bevilacqua says there are cultural and technical barriers to overcome before this can be achieved. “There are certain standard, well-established practices in the clinic, and we need to ensure that the value of the technology is well understood. The other major challenge is technological, which is one of the major reasons why Menarini Silicon Biosystems is part of the ADMONT project, an EU-backed initiative developing a pilot line for certain technologies.”
ADMONT project The DEPArray™ platform is currently installed in several research centres worldwide as a ‘Research Use Only’ instrument to support clinical research activities in life science, but its potential is not limited to the life-science market. “With our involvement in the ADMONT project, we could accelerate the optimization of the technology, bringing it from a research instrument to an instrument that has all the characteristics to be widely adopted in clinical settings,” says Dr Medoro. “We are working to introduce into the pilot line all the requirements to ensure that this technology is compatible with the diagnostic market, not only in terms of cost and quality, but also ease of use and automation.” “At the end of the project, we will have a technology which perfectly matches the needs of the users,” stresses Sergio.
ADMONT Advanced Distributed Pilot Line for More-than-Moore Technologies
The DEPArray™ system is an innovative technology, developed by Menarini Silicon Biosystems, to sort, manipulate, and collect individual rare cells or groups of cells from heterogeneous samples. Using an electronic chip-based microfluidic cartridge and fluorescent image-based analysis, the DEPArray™ isolates 100% pure tumor cells from tissue biopsies or single circulating tumor cells from blood. By participating in the ADMONT project, Menarini Silicon Biosystems is optimizing the DEPArray™ system (CMOS/MEMS, microfluidics, automatic machine and software) to accelerate its penetration into the cancer diagnostics market.
The ADMONT project has received funding from: - the ECSEL Joint Undertaking under grant agreement No 661796. This Joint Undertaking receives support from the European Union’s Horizon 2020 research and innovation programme and Germany, Finland, Sweden, Italy, Austria, Hungary. - the Austrian Ministry for Transport, Innovation and Technology (BMVIT) under the program ICT for future (IKT der Zukunft). - the Swedish Governmental Agency for Innovation Systems.
Dr Karl-Heinz Stegemann (Coordinator) X-FAB Dresden GmbH & Co. KG Grenzstrasse 28 01109 Dresden, Germany T: +49 351 4075 6214 F: +49 351 4075 6607 E: email@example.com W: https://admont-project.eu
Gianni Medoro, Ph.D., is Chief Technology Officer and co-founder of Silicon Biosystems (now Menarini Silicon Biosystems). He is the inventor of the core technology patent of DEPArray™, and co-inventor of more than 30 European and US patent families. He holds a PhD in Electrical Engineering and Computer Sciences, and is co-author of more than 70 scientific publications in the field of Lab-on-a-Chip.
Unravelling the role of redox modifications Reactive oxygen species (ROS) can cause oxidative stress and protein damage in organisms, but they are also involved in signalling and regulate cellular processes. Professor Haike Antelmann tells us about her research into specific protein modifications caused by ROS – molecular switches that act as oxidative stress defense mechanisms in bacteria and are important for the development of effective treatments against specific pathogens The exposure of bacteria to ROS results in oxidative stress responses and cellular damage, but low doses of ROS are also implicated in signalling processes and regulate specific thiol-switches. Based at the Freie Universität Berlin, Professor Haike Antelmann is the Principal Investigator of the Mycothiolome project, an ERC-backed initiative investigating the topic. “My project is investigating the role of these thiol-switches, protein modifications that play a role under oxidative stress conditions,” she outlines. When bacteria are exposed to ROS during infection, or just in the course of everyday life due to aerobic growth, specific proteins inside the bacteria are modified by ROS. “ROS can damage proteins, and hence bacteria need specific protection and defense mechanisms,” explains Professor Antelmann. “Low molecular weight (LMW) thiols help to protect against protein damage and to keep the redox balance.”
Low molecular weight thiols in bacteria This forms a core part of Professor Antelmann’s research agenda. Actinomycetes, specific members of the Gram-positive bacteria, are an area of particular interest in the project. “Mycothiol is the major LMW thiol in Actinomycetes, which can modify proteins under oxidative stress,” says Professor Antelmann. In eukaryotes and most Gram-negative bacteria, the major LMW thiol is glutathione (GSH), an antioxidant that protects cellular components from damage by ROS. There is a large body of research on the role of protein modifications by GSH in humans, which are implicated in many physiological and pathophysiological processes, and hence can be thought of as molecular switches controlling human health and disease. However, Gram-positive bacteria such as Actinomycetes and Firmicutes do not produce GSH, but instead utilize alternative LMW thiols, such as mycothiol (MSH) and bacillithiol (BSH). “We have found that proteins are protected under oxidative stress by MSH and BSH. These protein modifications can lead to changes in the activity of the proteins, meaning that the proteins become inactive or active, which has a regulatory effect,” continues Professor Antelmann.
We identified 58 proteins with S-mycothiolations in M. smegmatis under HOCl stress, shown in this Voronoi treemap by color codes based on their % oxidation and classified into different functional categories. Cell size denotes protein abundance in the total proteome. [Figure 2 published in Hillion et al., Scientific Reports 7: 1195 (2017)].
The project’s primary focus is fundamental research into the function and structure of specific proteins, how they are modified, and the physiological consequences for the bacteria. Professor Antelmann and her colleagues are using sophisticated techniques, including mass spectrometry and novel thiol-redox proteomics approaches, to analyse the S-mycothiolome under oxidative stress conditions. “In this project we have applied new redox proteomics methods and visualization tools to unravel the different kinds of protein modifications in a quantitative manner. More than a thousand proteins are modified in different ways under oxidative stress conditions,” she says. There are several different forms of oxidative stress; one major area of research is hypochlorous acid stress. “Hypochlorous acid (HOCl) is a very potent oxidant,” explains Professor Antelmann. “Mycothiolations, protein modifications by the LMW thiol MSH, can help to protect proteins from lethal damage in these conditions.”

We are investigating which modifications occur under oxidative stress, and the effect on the physiology of the cells. We have found that proteins are protected and redox-controlled under oxidative stress by mycothiol, which could be important under infection-related conditions

A number of interesting results have been gained, while researchers are also investigating the process by which proteins are switched on or off between different conformational and functional states. Thiol switches play a central role in this regard. “We aim to find out what the most important thiol switches are with respect to these protein modifications. With these thiol switches, you can almost switch a protein on or off,” explains Professor Antelmann. Many thiol switches have been found, and researchers now aim to develop a deeper understanding of the underlying mechanisms involved and their wider effects on cellular physiology. “Do these modifications play an important role in protecting the proteins against damage, or do they change protein activity? We want to understand the physiology behind these protein modifications,” outlines Professor Antelmann.
Corynebacterium diphtheriae and Mycobacteria

This research is quite fundamental in nature at this stage, yet it also holds real importance for our understanding of the body’s response to certain pathogens. The project aims to explore the comprehensive S-mycothiolome of Corynebacterium diphtheriae, the bacterium that causes diphtheria, and of Mycobacterium smegmatis, the model bacterium for pathogenic mycobacteria. Professor Antelmann says this work could help in future drug development. “We have found important regulatory thiol switches that are controlled by MSH and are also conserved in Mycobacterium tuberculosis, the pathogenic agent of tuberculosis. If we find that mutants are compromised in their resistance to oxidative stress, this could indicate that the protein is a novel drug target,” she outlines. “We could use these proteins as targets to develop novel antimicrobial drugs. This is something we will investigate in a future study.” Thus, there is significant scope for continued research in this area, both to pursue further fundamental investigations and to translate new findings into improved treatment of pathogenic bacteria. “Developing drugs to inhibit specific proteins could be an effective method of treating multi-resistant bacteria. This is our major long-term vision, and we are focusing on exploring the role of redox switches that are important for pathogen protection,” continues Professor Antelmann. The development of new drugs that target redox-regulated proteins depends on fundamental knowledge of the functions of target proteins in the pathogen’s defense against the immune system. Professor Antelmann emphasizes the importance of continued basic research to the discovery of new drug targets. “There is not much fundamental research on redox-switches in pathogenic bacteria that are modified by their own LMW thiols and confer
Protein S-mycothiolation and real-time redox imaging in Corynebacterium diphtheriae during ROS stress and infection conditions
resistance to the host defense.” The project has already discovered several new redox regulators that control antibiotic resistance and also function as ROS defense mechanisms, which may also be important in M. tuberculosis. Moreover, the redox control mechanisms of key metabolic enzymes were studied in great detail. Some of these thiol-switches are conserved in the major pathogen Staphylococcus aureus, which is another research focus. Thus, the redox tools developed in the project have been applied to other important pathogens, such as S. aureus, to study adaptation under infection conditions. With the project more than halfway through its funding period, Professor Antelmann is now concentrating her research on the molecular details and mechanisms of the thiol-switches, using genetics, molecular biology, biochemistry and microbial physiology. “We focus on characterising the most important redox switches to understand the role of the thiol-switches in microbial physiology and pathogenicity,” she says. Translating this fundamental knowledge into the development of new drugs would be a task for a future project. “We have found several interesting new drug targets and redox regulators, and we could potentially map out a research path for the next 10 years for detailed investigation of these thiol-switches,” continues Professor Antelmann. “We work closely with other research groups, and have established many national and international collaborations at the Freie Universität Berlin.”
The ERC Consolidator Grant MYCOTHIOLOME aims to elucidate the physiological role of mycothiol (MSH) for post-translational thiol-modification of proteins in Corynebacterium diphtheriae and Mycobacterium smegmatis and to monitor in real-time the changes in the MSH redox potential under oxidative stress using genetically encoded redox biosensors.
European Research Council (ERC) Consolidator Grant MYCOTHIOLOME / GA No. 615585
• Brussels Center for Redox biology, Vrije Universiteit Brussels, Belgium • Department of Structural Biology, Freie Universität Berlin, Germany • Helmholtz-Zentrum für Umweltforschung, Leipzig, Germany • Institute for Microbiology, University of Greifswald, Germany • Centre for Biotechnology, University of Bielefeld, Germany • Center for Organismal Studies, Heidelberg, Germany
Project Coordinator, Professor Haike Antelmann Freie Universität Berlin Institut für Biologie-Mikrobiologie Königin-Luise-Straße 12-16 14195 Berlin T: +49-30-838-51221 E: firstname.lastname@example.org W: http://www.bcp.fu-berlin.de/en/ biologie/arbeitsgruppen/mikrobiologie/ ag_antelmann/index.html Professor Haike Antelmann
Since 2015, Haike Antelmann has been full Professor of Molecular Microbiology at the Institute for Biology at the Freie Universität Berlin. Her research is focused on the physiological role of thiol-switches and the low molecular weight thiols bacillithiol and mycothiol in redox-regulation, cellular metabolism and virulence mechanisms in pathogenic Gram-positive bacteria.
MD simulations of BSH docked into the active site Cys of GapDH of S. aureus indicate that S-bacillithiolation does not require major structural changes. The figure was generated in collaboration with Frauke Gräter (Heidelberg) and Agnieszka Pietrzyk-Brzezinska (Lodz).
A Climate for Change

Environmental scientist Angela Terry talks about her latest initiative to help prevent a runaway global temperature rise by inspiring consumers to make greener choices. After this interview by Richard Forsyth, she remarked she ‘felt incredibly sad’ dwelling on the impacts of climate change. Only urgent action, by everyone, can make a difference.

EU Researcher: Tell us about the Climate Alliance. How and why was it set up, and who is involved?

Angela Terry: The aim of the Climate Alliance is to galvanise industry support for climate action. The Climate Alliance was set up to speed up the transition to a low carbon future by creating an online marketplace for promoting clean technology, backed up by a cross-industry marketing campaign called One Home. The rationale was that scientific understanding and technological developments are advancing at huge speed and technologies are becoming more cost-effective all the time. However, consumers don’t know this. We want citizens to understand how accessible these technologies are and how
beneficial they are to them. The Climate Alliance is a social enterprise that is supported by members from the clean tech industry, including wind and solar companies. The Climate Alliance team includes experts in communications, behavior change, public relations and social media, as our focus is on creating stories of change that resonate with citizens.
EU R: Please explain the aims of your new website, One Home, and how it will help make a difference. AT: One Home makes it easy for people to go green. It is a portal to explain and demystify clean technology. It exists to enable and
encourage consumers to make positive choices that significantly reduce their carbon emissions and prepare them for the impacts of a changing climate. We highlight the solutions that are becoming more affordable, accessible and effective by the day. People are worried about climate change but don’t yet understand the benefits of taking action or how easy it is to do so. One Home will help people harness technological advances to save money, whilst improving comfort and quality of life. This is the time to mainstream low carbon lifestyles and One Home intends to make this happen.
EU R: You have worked hard to network in the renewables and green industries to get them involved in backing the website. Can you explain the kind of people and industries you are reaching out to, and how much support there is for this idea? AT: I find people in the clean tech industry connect with the concept and understand the need almost straight away, as they often feel frustrated at the speed of progress in this sector. People believe this is a good idea, with companies signing up on the concept alone, which is reassuring. Having said that, some companies are more cautious, as they want to see immediate business benefits and large lists of followers from the beginning. This is a barrier most startup companies face, and we have set membership fees at an accessible level to build an alliance of companies. We are expecting significant numbers of members to join. EU R: What is the most upsetting aspect of climate change for you?
AT: We are sleepwalking into a future of irreversible and catastrophic change, and we need a public debate on this issue, as we haven’t had a serious one yet. Given the scale of the damage and the transition required to stop and adapt to global warming, the silence is unbelievable. This silence is deadly. The clean tech sector has a great story to tell, but in the past we just haven’t been good at telling it. No one wants to play Russian roulette with our homes, water, safety and food security. Yet, despite 25 years
of talking about emission reductions, the amount of fossil fuels burnt each year increases because we engage in business as usual; from switching on our boilers at home, filling up the car at a petrol station, or never quite getting around to installing a solar panel system. However, if you ask anyone if they want more clean energy, inevitably they will say ‘yes’. Our purpose is to galvanise that support into climate action.
EU R: Do you think, in general, people will be more motivated to change their everyday behaviour through fear of environmental disaster, or by receiving personal rewards – like efficiencies at home? AT: We’ve been talking about climate change for decades and global carbon emissions have not decreased. Highlighting global impacts and paralysing people with fear does not work. However, explaining how climate change impacts them and offering positive solutions with a better lifestyle as part of the mix is empowering. Focusing on desired outcomes such as clean air, warmer homes, cheaper green energy, safer travel and protecting the planet are just a few benefits and these outcomes are ‘no brainers’ for most people.
Every time we make a purchase we are deciding what kind of world we support. This is true every time we choose to invest, every time we purchase food, every time we choose an energy supplier or a travel destination. It is urgent because climate change is happening now!
Ultimately, we need strong leadership in the form of legislation on carbon taxes and caps. Without those, the future starts to look quite bleak.
EU R: Is limiting the global temperature rise to 1.5 degrees C a realistic aim? Already there is talk of higher temperature goals by the UN, which might be more realistic to achieve.

AT: Limiting warming to 2 degrees will take nothing short of a miracle. More attention to carbon reduction and more resources for adaptation are needed. Droughts and downpours, along with other impacts, will dominate the headlines from here on in, and we also know it is the poorest who will suffer most, so the sooner there is a real price on carbon, the better.

EU R: How many key industries are dragging their heels by not engaging with greener methods and innovation? What would you say to CEOs of industries that are still ignoring climate change?

AT: I recently saw an investment fund manager give his view of companies to invest in. Surprisingly, no one sector was off limits, but the companies failing to embrace the inevitable march to clean energy were the ones to avoid. So, there are two questions I would pose to CEOs. My first question is ‘what’s your legacy?’ The other is, ‘how are you steering your company to benefit from one of the fastest growing industries in the world?’ Change is inevitable. The rate of change is the only uncertainty. The longer we delay, the more costly it will be, the more damage we will see and the more people will die as a result.

EU R: What problems or barriers are in people’s way when making consumer choices that are better for the environment? How can these barriers be overcome?

AT: Awareness is the biggest problem, as well as challenging the myths that exist around new technologies. Understanding that clean energy technology is viable and economically competitive is a big issue. There was an example where a council looked into supporting electric vehicles: some members of the public had anxiety about the range of electric vehicles, but cars are not usually being driven all day, so they can be charged while they are parked. Ensuring people understand that their choices make a difference will help to mainstream climate action. We have to show that electric cars are affordable and desirable, and also illustrate how petrol and diesel cars are morally unacceptable. We know fossil fuels are on their way out, so let’s make changes sooner and make a difference, rather than later. By far the best people to promote clean tech are the early adopters. They are evangelical about their solar panels, their electric car or their smart thermostat. That is why you end up with streets where so many people have installed solar panels, because they ask their neighbours about them and have a trusted installer recommended, which takes away the uncertainty.
EU R: How are technology and innovation providing real solutions? AT: The third industrial revolution offers economically advantageous ways to have services on demand rather than owning depreciating fixed assets. For example, automated transport rather than a car parked outside your home, often bought on expensive credit. Or smart thermostats that only heat water when it is needed, rather than twice a day on a timer.
EU R: How urgent is it that we make greener consumer choices? AT: Every time we make a purchase we are deciding what kind of world we support. This is true every time we choose to invest, every time we purchase food, every time we choose an energy supplier or a travel destination. It is urgent because climate change is happening now.
EU R: Can you give some great tips on changes people can make today, to reduce their impact on climate change? AT: First, if you are thinking of making a change, just do it. Even if it costs just a little bit more, don’t delay or overthink it, just do it, act. Heat and transport are the greatest sources of emissions so insulating your homes, traveling by public transport and choosing holidays that are not a long, long way from home are great ways to save money,
improve comfort levels and avoid time spent trapped in planes and cars. Try to weigh climate change as heavily as financial reasons and justifications. See climate action as a reason to make a commercial choice. It may feel like a bleak situation when you look at what is happening with climate change, but my motto is ‘focus on what you can do’, and make changes now.
Angela Terry

Angela Terry is an environmental scientist and founder of the Climate Alliance. She has twenty years of experience in clean energy, having worked on hydro, solar, on-shore wind and biomass schemes. She pioneered community renewables in the UK, raising £15 million in positive investments, and was key to setting up the first wind farm in the South East of England. She was the Head of Wood Fuel for the Forestry Commission and is active in the development of the UK renewable energy sector. E: email@example.com
Effective strategies to deal with climate change

Climate change is set to have a significant impact on Europe’s forests, with changing levels of water availability likely to affect their ability to provide key ecosystem services. Understanding how plants respond to climate variability is central to forecasting how they will respond to future change, as Dr Elisabeth Robert and Dr Jordi Martínez-Vilalta explain.

Our forests play a number of ecologically important roles, including both producing oxygen and removing CO2 from the atmosphere. However, climate change and other pressures are likely to affect forests’ ability to perform these roles in future, a point which underlines the importance of the Phloemap project’s work. “We aim to understand how plants adjust to changes in climate, and with that build a better understanding of how they might respond to future climate change,” says Dr Elisabeth Robert. A key research priority is to investigate the traits that determine how tree species use water. “Our main aim was really to understand better how plants adjust to different levels of water availability. So we’re focusing on the traits that we suspect are related to this,” outlines Dr Jordi Martínez-Vilalta.
PHLOEMAP Hydraulic functional traits as determinants of forest function and drought responses. Putting xylem and phloem attributes into the functional trait map EU H2020 Marie Curie IF grant (659191) Centre for Ecological Research and Forestry Applications Universitat Autònoma de Barcelona - Campus de Bellaterra - Edifici C 08193 Cerdanyola del Vallès, Spain T: +34 935 81 38 11 E: Jordi.Martinez.Vilalta@uab.cat PHLOEMAP project: http://www.creaf.cat/ hydraulic-functional-traits-determinants-forestfunction-and-drought-responses-putting-xylemand-phloem-attributes-functional-trait-map FUN2FUN project: http://www.creaf.cat/ functional-traits-approach-forest-function-anddynamics-implications-provision-ecosystemservices-under-climate-change
Elisabeth M.R. Robert (right) obtained a PhD in Sciences from the Vrije Universiteit Brussel (Belgium) in 2012. In 2016, she joined CREAF to work on the PHLOEMAP project, thanks to an EU Marie Skłodowska-Curie individual fellowship. Jordi Martínez-Vilalta (left) obtained a PhD in Environmental Sciences at the Universitat Autònoma de Barcelona (UAB) in 2001. He is now Senior Lecturer at UAB and researcher at CREAF, has been an Honorary Fellow at the University of Edinburgh (UK) and currently holds an ICREA Academia award.
Tree species

This research centres on analysing specific traits in six different tree species commonly found across Catalonia, and indeed much of the Western Mediterranean. The selected traits represent the anatomy of the tree, part of the structural component of the wood. “One of the traits we measured was, for example, the size of the cells which transport water in trees. Are there differences among the different tree populations and different species in relation to water transport capacity?” asks Dr Robert. The project researchers are studying these species across 90 study sites; despite its relatively small size, Catalonia has a high degree of climate variability, and the study sites have very different levels of water availability. “We sampled each of the species at 15 of these sites. We took comprehensive measurements of environmental variables – so for example measurements of climate variables and certain soil characteristics,” continues Dr Robert.
Climate change

The backdrop to this research is growing concern about the impact of climate change. Catalonia as a region is experiencing higher temperatures, which leads to higher demand for water in forests. “The fact that temperatures are rising means that there will be more atmospheric water demand for forests. This essentially means that water will evaporate faster and that forests will need more water,” explains Dr Martínez-Vilalta.

Our main aim was really to understand better how plants adjust to different levels of water availability. So we’re focusing on the traits that we suspect are related to this

The project’s work is complemented by the Fun2fun initiative, in which scientists have taken additional measurements of the same trees, aiming to improve their knowledge of forest ecosystems. The combined work of these two projects enables researchers to build a deeper picture of how forests adapt to climate variability. “We can draw bridges between the cellular level – where the actual water transport occurs – and the forest level, at which decision making takes place,” explains Dr Robert. “The special aspect of these two projects is that these different scales are brought together in a design that allows us to understand the structural (anatomy) and functional (physiology) variability which exists in a forest, and from there upscale and draw conclusions at forest scale.” The Spanish government has also conducted several large-scale national forest inventories over the last 25 years, so Dr Martínez-Vilalta and his colleagues have access to a wealth of further information. “We’ve got information on the dynamics of how forests change over time. We can then relate this with the data that we are gathering now to determine which traits underlie the response of these forests to climate over recent decades,” he explains. This could help researchers build a deeper understanding of how forests might respond to future climate change. “We can use this information to improve our ability to forecast how forests might respond in the future,” says Dr Martínez-Vilalta.
The project will play an important role in understanding and forecasting how forest ecosystems will change in the future, and Dr Martínez-Vilalta is working with regional-scale modelling of vegetation dynamics to gather information. “A very important factor that we need to consider in our models is wildfires. The higher the temperature, the higher the likelihood of wildfires,” he says.

Three-dimensional visualisation of a small proportion of the water-transporting tube system within the stems of trees. Different water-conducting tubes are visualised in different colors.
Coordinating research to overcome the challenges of agriculture, food security and climate change

Close collaboration between scientific researchers across Europe is central to addressing major societal challenges. We spoke to Dr Hartmut Stalb, Dr Heather McKhann, Professor Michael Bruford and Dr Marta Pogrzeba about the work of FACCE-JPI in supporting transdisciplinary research into interconnected challenges around climate change, sustainable agriculture and food security.

Joint Programming Initiatives (JPIs) are designed to enable coordinated action on major societal challenges, with countries sharing research expertise and resources to investigate key areas of societal concern through the alignment of national priorities and research programming. One such challenge is maintaining food security and agricultural productivity in the face of the impacts of climate change, a topic that lies at the core of FACCE-JPI, an initiative which coordinates research across these areas in 24 countries. FACCE-JPI is supported by a coordination and support action, FACCE EVOLVE (see panel information). “We work with the commission to identify research priorities across this broad area of food, agriculture and the bioeconomy,” says Dr Heather McKhann, one of the coordinators of FACCE-JPI. The European Research Area (ERA) provides a framework for research cooperation between Member States on specific areas of research, and several ERA networks (ERA-NETs) have been proposed by FACCE, while JPIs are more about strategic planning. “FACCE is really about this intersection between agriculture, food security and climate change. So we are taking
a system-based approach, not a topic-based approach,” outlines Dr Hartmut Stalb, the Chair of FACCE-JPI. This helps to ensure resources are allocated effectively and prevents duplication of research, while also putting in place the foundations for future collaboration. While there is of course a lot of ongoing research into different aspects of agriculture and food production, Dr McKhann says it’s important not to view it in isolation, but to consider the wider picture. “With FACCE, we’ve really tried to look around at initiatives working on different topics that touch on our remit. Then we can exchange information with them, and in many cases even work with them and put in place joint actions,” she outlines. The research landscape evolves continuously, so Dr McKhann and her colleagues are also aware of the need to monitor emerging topics of interest. “We hold exploratory workshops and bring together experts to see if there is something which needs to be addressed by FACCE,” she says. “FACCE has a portfolio of research actions including several ERA-NETs, and there are research projects running within each of the actions. We try to assess what’s been covered up to that point, so we’re always thinking about the future direction of research.”
ClimGen project One action is the FACCE ERA-NET Plus, which examines ‘Climate Smart Agriculture’ through 11 research projects. A major research priority is investigating the impact of climate change on agriculture, a topic that is central to the work of the ClimGen project. “Our project is about trying to understand the genomic architecture of different livestock breeds, and predicting how they could function and operate outside their climate comfort zone,” says Professor Michael Bruford, the project’s Principal Investigator. Researchers have been examining genome-scale data in four species, looking at breeds living in very different climatic environments. “We’re looking at genome data from livestock species in tropical Africa and tropical south America, all the way through to breeds in Yakutia, which is one of the coldest parts of Russia, as well as some from northern Finland and Iceland,” continues Professor Bruford. “From there we can ask; what does that tell us about how breeds adapt to climate differences?”
Photograph by Øyvind Antonsen
Yakutian cattle are raised in an environment where the temperature regularly drops as low as -50°, demonstrating that under some circumstances livestock can adapt to very extreme conditions
While many factors need to be considered in terms of a species’ ability to adapt to environmental change, the primary focus in the project is on genetic and physiological adaptability, with Professor Bruford and his colleagues measuring changes in the animals through the genes they express. “For example, one part of our project involves looking at the red-legged partridge, which is being bred in southern Spain, an area where temperatures are really rising rapidly,” he outlines. Researchers are looking at the transcriptomes of those animals. “We’re interested in how the characteristics of their different transcripts change as a function of heat-related immune stress,” continues Professor Bruford. “We’re also looking at the epigenome, which is expressed differently in each tissue. The DNA is modified by a process called methylation – and expression patterns can rapidly change according to the environment.” The project’s work has also involved investigating cattle from Colombia, descendants of Spanish bulls that were taken to the New World by Christopher Columbus. Over a relatively short time, these bulls
have developed adaptations to the tropical Colombian environment. “Their genetic architecture and expression patterns have effectively been modified. We’re looking at this process in cattle and pigs, at the effect of temperature on how the epigenome is organised. Our team is also comparing sheep and goat populations living in colder environments at high altitudes in the Atlas mountains with those living at low altitudes in much hotter, desert environments,” says Professor Bruford. The wider goal in this research is to develop new breeding strategies and help future-proof livestock under various climate change scenarios. “The final aim is to take the data that we’ve generated – on how many genomic signatures of selection there are and how important they are in the traits of the animals – and incorporate that in breeding simulation models to see how we can accelerate climate adaptation in vulnerable livestock populations,” outlines Professor Bruford.
Miscomar project A number of other projects are also funded under another ERA-NET called FACCE SURPLUS on sustainable and resilient agriculture for food and non-food systems, which funded 14 research projects in a first call, including MISCOMAR, a project developing
techniques for the production of biomass on marginal land. The soil on marginal lands is often unsuitable for food crop cultivation, so researchers are investigating possible alternative uses. “In our project, we are trying to convince farmers to use this land not for food production, but to produce bioenergy crops,” says Dr Jacek Krzyzak, a key figure in the project. The project is investigating the potential of growing different novel hybrids of Miscanthus, a high-yielding energy crop, on marginal land, with researchers trialling it in three different locations. “We have a trial at a site in Poland, in heavy-metal contaminated soil. In Germany, we planted Miscanthus in high-clay content soil. The third site is in Lincolnshire in the UK,” outlines Dr Marta Pogrzeba, the project’s Principal Investigator. “The problem in Lincolnshire is that the soil was previously used quite intensively for agriculture, and now there is a very low concentration of nutrients. So the soil is very low-quality.” Researchers are also working to identify possible options for the eventual utilisation of this crop in terms of energy production. The crops have been divided into green biomass – collected in the autumn – and brown biomass, which is collected during the winter. “We use this winter biomass for combustion for example, and we’re also looking at anaerobic digestion,” says Dr Pogrzeba. Currently, biofuels are often produced from agricultural land, leading to tensions with food producers; the project’s research holds clear potential in these terms, opening up the possibility of producing biomass elsewhere, while also helping to strengthen the rural economy. “Biomass production is a possible way of helping local people gain a greater degree of independence in their energy supply, but that’s more of a long-term goal,” says Dr Krzyzak. “What’s important for us at the moment is to improve the management of soil in contaminated areas, to try to find a solution for such soil, and to help farmers and land users in such areas build a sustainable source of income.”

Figure 1: Main FACCE-JPI joint research actions
Agriculture, Food Security and Climate Change
Photograph by Øyvind Antonsen
Funding the future

The funding through the ERA-NET is central to the future of the project, giving researchers a firm financial foundation on which to investigate questions of importance to wider society. The funding has allowed researchers
to build on earlier investigations into biomass production in agricultural areas. “We previously focused on selecting crop species for use on land contaminated with heavy metals. With the MISCOMAR project, we can develop a more or less full value chain for the biomass from contaminated and marginal land – so it really helps us, both from the scientific point of view, and also from the stakeholder point of view,” outlines Dr Krzyzak. FACCE-JPI will continue to support this kind of research in future, contributing to the goal of building a bio-based economy in Europe, while Dr Stalb says the initiative will also evolve in line with emerging challenges. “We will continue to develop our strategic research agenda, and we will update our implementation plan, according to the challenges that have been identified in the field of agriculture and climate change,” he outlines. “We will also look to coordinate research more effectively with other JPIs in related areas, while we will also develop our international strategy.”
The Joint Programming Initiative on Agriculture, Food Security and Climate Change (FACCE-JPI) was launched in 2010, bringing together 24 member countries. Its aim is to build the European Research Area tackling the challenges at the intersection of agriculture, food security and climate change that cannot be addressed solely at the national level (Fig 1). This is being realised through the alignment and integration of national and European research programmes, the funding of new research programmes, and through exploring innovative approaches for the member countries to work together to address the challenge of ensuring a secure food supply to an ever-increasing global population in the context of climate change.

FACCE-JPI VISION: An integrated European Research Area addressing the challenges of Agriculture, Food Security and Climate Change, to achieve sustainable growth in agricultural production to meet increasing world food demand, and to contribute to sustainable economic growth and a European bio-based economy, while maintaining and restoring ecosystem services under current and future climate change.

FACCE-JPI MISSION: To achieve, support and promote integration, alignment and joint implementation of national resources in Europe under a common research strategy to address the diverse challenges in agriculture, food security and climate change.
FACCE EVOLVE – Coordination and support action
FACCE ERA-NET Plus
FACCE SURPLUS – ERA-NET Cofund
Amélie Sordet
Institut National de la Recherche Agronomique
Rue De L’Universite 147, 75338 Paris Cedex 07, France
E: firstname.lastname@example.org
FACCE-JPI Secretariat
T: +33 (0)1 42 75 94 07
E: Facce-Secretariat@inra.fr
W: https://www.faccejpi.com
Dr Hartmut Stalb, FACCE-JPI Chair
Dr Hartmut Stalb is the FACCE-JPI Chair, and is also currently the Head of Division for Research and Innovation within the Federal Ministry of Food and Agriculture, BMEL (Germany). He is an agricultural economist by training and obtained his doctoral degree at the University of Kiel (Germany) and the University of Newcastle upon Tyne (UK).
HOLDING BACK THE BUSHES
Earth Observation for monitoring and preventing land degradation

Land degradation and desertification (LDD) is a significant problem across the world, affecting soil fertility and, in some areas, even leading to food insecurity. Researchers in the LanDDApp project aim to gain deeper insights into LDD in the North West Province of South Africa by assessing the extent of bush encroachment, a process often linked to LDD, as Dr Elias Symeonakis explains

The encroachment of bush or woody vegetation into previously grassy areas is a serious problem in South Africa’s North West Province, leading to reduced soil fertility and even food insecurity in some areas. As the Principal Investigator of the LanDDApp project, Dr Elias Symeonakis aims to assess the extent of the problem in the area. “With this project we are trying to monitor the growth of woody vegetation in North West Province. We are using remote sensing tools and data sets to identify woody vegetation as accurately as possible,” he explains. Woody vegetation can be identified highly accurately with modern remote sensing technologies. By combining recent estimates with earlier data from the ‘80s and ‘90s, researchers can gain new insights into how African savannahs are changing and evolving over time. “Where we see steady increases in woody vegetation, we can identify those areas as potential hotspots of land degradation and desertification (LDD),” continues Dr Symeonakis.
Land degradation and desertification

LDD affects many areas across the world, with more than 50 percent of the Earth’s land surface thought to be prone to it. LDD can be broadly defined as the inability of the land to provide those services that it would have been able to provide previously, due to a combination of anthropogenic processes. A number of reasons have been put forward to explain the growing severity of the LDD problem, yet Dr Symeonakis says the primary factor is now widely thought to be climate change. “A lot of blame was previously put on the management of the area by the local community and the herdsmen, so it was thought to be related to issues like overgrazing, fires and bad management practices, generally. Nowadays we are confident that climate change is the most important factor,” he outlines. There is now a concerted focus in research on understanding the extent of the problem, an issue that lies at the core of the project’s work. “We’re looking at the problem of bush encroachment specifically,” says Dr Symeonakis.
Typical land cover types of the study area: (a) woody vegetation; (b) grassland; (c) cropland, and (d) non-vegetated land.
The impact of this on local eco-systems can be serious, affecting the availability of food for local animals and disrupting the local economy. The problem is particularly acute in the North West Province, where livestock are a key source of income. “When these grasses have been taken over by the bushes, by the woody vegetation, then cattle, the only source of income for many locals, becomes malnourished,” outlines Dr Symeonakis. This has knock-on effects on the local economy, underlining the wider importance of the project’s work. “Alongside identifying the presence of woody vegetation, we also aim to map its density, i.e. the percentage of it in specific areas,” continues Dr Symeonakis. “If an area that previously had 10 percent shrub coverage, with the rest of it typically being grasses and bare soil, changes to 20 percent woody vegetation, then that constitutes a 100 percent increase, which can be quite alarming. It’s important to know if we can identify the percentage of woody vegetation with sufficient accuracy.” This would allow researchers to build a deeper picture of regional variations, and
from that identify in which areas it might be necessary to take mitigation measures. The North West Province itself covers an area of approximately 100,000 km², and there are variations across the region in the extent of land degradation. “On the western side of the province, the situation is pretty serious. There are poorer communities and smaller properties which don’t have the resources necessary to deal with the problem,” says Dr Symeonakis. The way the land is used is also an important consideration. “There are areas in which bush encroachment is a bigger problem than others. It might be that the land is really only used in some areas for raising cattle for example, whereas in other parts of the North West Province, it is mostly used for crop cultivation,” says Dr Symeonakis. “In the central part of the Province, most of the land is devoted to agricultural practices, and bush encroachment is less of a problem there.” The scale of the LDD problem is still a matter of debate, with international authorities and organisations, such as the United Nations Convention to Combat Desertification (UNCCD), looking to gain a
deeper understanding. The project’s research will make an important contribution in these terms, helping to identify hotspots of savannah land degradation, which could also enable policy-makers to target mitigation measures more effectively. “Decision-making can be assisted by the types of maps that we can provide with these remote sensing tools. They are highly accurate and they cover large areas,” explains Dr Symeonakis. With more data about land quality in specific areas, as well as other relevant information, researchers could then look to map the progression of LDD under certain climate scenarios, a topic Dr Symeonakis plans to investigate in the future. “We haven’t done any specific scenario work in this area yet, but it’s on the to-do list,” he says.
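The distinction Dr Symeonakis draws between an absolute gain of 10 percentage points of cover and a 100 percent relative increase can be sketched numerically. This is an illustrative fragment only, not project code; the function name is hypothetical:

```python
def relative_change(old_cover_pct: float, new_cover_pct: float) -> float:
    """Relative change (in percent) between two woody-cover estimates."""
    return (new_cover_pct - old_cover_pct) / old_cover_pct * 100.0

# The article's example: shrub coverage rising from 10% to 20% of an area
absolute_gain = 20.0 - 10.0                   # 10 percentage points of cover
relative_gain = relative_change(10.0, 20.0)   # a 100% relative increase
```

The same absolute gain in a sparsely encroached area thus registers as a far larger relative change, which is why percentage-based hotspot maps can look alarming even where total cover remains modest.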
With this project we are trying to monitor the encroachment of woody vegetation in the North West Province. We are using remote sensing tools and data sets to identify woody vegetation and its density and how that might be linked to land degradation in the region

Future progression

This work will involve some modelling, based on data about past and present conditions, from which deeper insights can be drawn into the likely future progression of LDD, which is central to wider debates around environmental change and sustainability. While there is still some level of debate as to whether bush encroachment is really a problem - in fact it may even be beneficial for certain species - Dr Symeonakis says it can have a serious impact on some communities. “When we consider the effect of bush encroachment into savannah grasslands that were previously used for feeding livestock, it’s entirely reasonable to label it as land degradation,” he outlines. Researchers are currently working on the methodology for estimating the percentage of woody cover, while Dr Symeonakis is also keen to build collaborations with projects in complementary areas. “We’re working together with ecologists and applying our remote sensing techniques to other areas in East Africa as well as South America and Southeast Asia,” he says. There is also the possibility of including additional data in the models. Dr Symeonakis and his colleagues are looking to combine data sets from drones with other sources, which could enable researchers to differentiate between woody vegetation species. “The ability to identify different types of woody vegetation is also quite important,” he says. Some types of woody vegetation are beneficial, for example providing shelter for animals and fuelwood to humans, which is very different from other invasive thorny and unpalatable species encroaching in areas essential for feeding. “We are in a position where we have all these different data sets and techniques, and we’re looking to apply them,” says Dr Symeonakis.
During the last 25 years, a large part of the North West Province has been converted to woody vegetation cover (depicted here in brown).
LanDDApp
Land Degradation and Desertification Appraisal for South Africa

Project Objectives
LanDDApp assessed land degradation caused by bush encroachment in the pilot study area of the North West Province of South Africa. LanDDApp used multi-temporal, multi-sensor and multi-seasonal satellite data to identify degrading areas where mitigation measures are required, in order to provide a management tool for the prioritisation of such measures.
The project was funded by an EU FP7 Marie Curie Career Integration Grant (PCIG12GA-2012-3374327).
• Humboldt University of Berlin (Germany)
• National Technical University of Athens (Greece)
• Northwest University (South Africa)
• Free State University (South Africa)
Project Coordinator
Dr Elias Symeonakis
Senior Lecturer in GIS & the Environment
School of Science & the Environment
Manchester Metropolitan University
John Dalton Building E410a, Chester Street, Manchester M1 5GD, UK
T: +44 (0)161 247 1587
E: email@example.com
Twitter: @EliasSymeonakis
W: www.land-degradation.org

• Higginbottom, T., Symeonakis, E., Meyer, H., van der Linden, S., 2018. Mapping Woody Cover in Semiarid Savannahs using Multi-seasonal Composites from Landsat Data. ISPRS J. Photogramm. Remote Sens. 139, pp. 88-102. DOI: 10.1016/j.isprsjprs.2018.02.010
• Symeonakis, E., Higginbottom, T.P., Petroulaki, K., Rabe, A., 2018. Optimisation of Savannah Land Cover Characterisation with Optical and SAR Data. Remote Sensing 10(4), 499. DOI: 10.3390/rs10040499
• Symeonakis, E., Petroulaki, K., Higginbottom, T., 2016. Landsat-based woody vegetation cover monitoring in Southern African savannahs. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives, 41, pp. 563-567. DOI: 10.5194/isprsarchives-XLI-B7-563-2016
Dr Elias Symeonakis
Dr Elias Symeonakis is a Remote Sensing scientist with a Master’s degree in GIS and a PhD in Geography. He is a Senior Lecturer at Manchester Metropolitan University and has previously worked at the CSIRO, Royal Holloway, King’s College London, CGIAR/CIAT, the University of Valencia and the University of the Aegean.
Protecting the fruits of farmers’ labour

Insect pests and pathogens represent a significant threat to the EU fruit industry, destroying crops and causing significant financial losses. The Dropsa project has been developing a cost-effective, integrated approach to pest management that will help protect crops and boost the European fruit industry, as Dr Neil Audsley explains
A type of fruit fly, Drosophila suzukii was first recorded in Europe (Italy and Spain) in 2008, and it has since had a significant impact on soft fruit production. Drosophila suzukii is extremely difficult to control with pesticides, as Dr Neil Audsley, the coordinator of the Dropsa project, explains. “It lays its eggs and larvae develop inside the fruit, causing the fruit to rot. Furthermore, the harvest to market interval, when pesticides are not applied, coincides with the time when the fly infests the fruit,” he outlines. The major aim of the Dropsa project was to improve the management and control of new and emerging pests like D. suzukii, as well as specific pathogens that affect fruit production. “We’ve also been working on three pathogens: Pseudomonas syringae pv. actinidiae (Psa) on kiwi fruit and Xanthomonas species on soft and stone fruits. These pests and pathogens normally arrive on imported fruit or plants, then spread more widely.” Many different pests and pathogens have arrived on new shores in this way, and more could potentially be introduced into Europe via the fruit trade, as vast quantities of goods cross borders every day. Dropsa has investigated the pathways of introduction of different pests and pathogens, which will provide a basis for the development of preventative strategies. “The pathways and the risks involved with the trade of certain fruits and what would be carried by those fruits have been reviewed. An alert list for soft fruits, as well as apples, table grapes, oranges and mandarins, has been created to try and identify the pests that could potentially be introduced,” says Dr Audsley.
Since its arrival in Europe D. suzukii has spread across the continent, causing significant financial losses to fruit growers. In Europe D. suzukii is not regulated by natural factors - unlike in its region of origin - hence populations are unchecked and can grow rapidly, causing serious problems. “You need to understand the ecology and biology of the pest to help identify means to control it. A lot of time has been spent looking into this,” explains Dr Audsley. “Drosophila suzukii for example is quite cold-tolerant. It breeds best at 20-25 degrees, but it will survive at low temperatures (-5°C).”

Project coordinator Dr Neil Audsley attending to a Drosophila suzukii monitoring trap.

A major problem with this particular pest is that it has a very large host range. It attacks not only horticultural crops, such as strawberries, raspberries and other soft fruits, but also a lot of wild fruits. “It survives in woodland and hedgerows where there’s food present. When crops start to ripen, it migrates into the crop. You can’t just indiscriminately spray pesticide along hedgerows and in woodlands,” points out Dr Audsley. Given this background, an area-wide control strategy, such as biological control, is required. “We’ve been investigating parasitic wasps that attack D. suzukii, both those that are native to Europe and also parasitoids from Asia, where D. suzukii originates. These parasitoids may help to keep the pest in check in the region of origin,” outlines Dr Audsley. “However, a non-native parasitoid from Japan cannot be released in Europe without authorisation to do so – it’s a bit like introducing another pest.” The introduction of an exotic biological control agent may affect native species and disrupt the local eco-system, so it’s essential to understand its wider impact before it can be introduced. ”Exotic
parasitoids have been identified which show potential in controlling this pest in the laboratory,” he says. “The work on these parasitoids will continue over the longer term, in the hope that permission to release them as a means of controlling D. suzukii will be granted.”
Pathogens

Early detection is essential for the management of plant disease, especially for bacterial diseases such as bacterial spot of Prunus (caused by Xanthomonas arboricola pv. pruni; Xap), bacterial canker of kiwifruit (due to Psa) and angular leaf spot of strawberry (caused by Xanthomonas fragariae). Diagnostic assays based on loop-mediated isothermal amplification (LAMP) have been developed for all three bacterial pathogens. Diagnosis can be achieved in 30 minutes in the field, and the assays could be developed into commercial kits for on-site pathogen detection. There are no effective strategies for the control of these bacterial diseases
in the EU. Dropsa has been developing novel products, formulations and delivery technologies. A novel antimicrobial peptide was effective at controlling kiwifruit, peach and strawberry pathogen infections. The peptide is biodegradable and has very low toxicity. Stem injection methods have been developed to deliver antimicrobial peptides,
which protected kiwifruit and peach plants against infections of Psa and Xap. Biological control agents (BCAs) were identified and evaluated, resulting in two potential candidates (Lactobacillus plantarum and Bacillus amyloliquefaciens) that are being developed as novel microbial pesticides. The bacteria were produced by
Drosophila suzukii monitoring trap on a netted cherry tree © N. Audsley (Fera).
Experimental plot testing netting on individual Cherry trees to exclude Drosophila suzukii. © N. Audsley (Fera).
DROPSA
Strategies to develop effective, innovative and practical approaches to protect major European fruit crops from pests and pathogens
To develop reliable, robust and cost-effective approaches to protect the major European fruit crops from Drosophila suzukii and bacterial pathogens. This has been achieved by investigating their pathways of introduction, determining their biology, ecology and/or epidemiology to develop innovative, preventative and integrated control solutions, and developing forecasting and decision support systems and risk mapping as components of integrated pest management.
FP7-KBBE-Control of pests and pathogens affecting fruit crops
• Fera (UK)
• Stichting Dienst Landbouwkundig Onderzoek (Netherlands)
• Cab International (Switzerland)
• Universitat De Girona (Spain)
• Alma Mater Studiorum-Universita Di Bologna (Italy)
• University of Leeds (UK)
• Imperial College London (UK)
• Consiglio per la Ricerca E La Sperimentazione In Agricoltura (Italy)
• European and Mediterranean Plant Protection Organisation (France)
• Institut National De La Recherche Agronomique (INRA)
• Endoterapia Vegetal (Spain)
• Oxitec Ltd (UK)
• Julius Kühn-Institut (Germany)
• Zurcher Hochschule Fur Angewandte Wissenschaften (Switzerland)
• Agriculture and Agri-Food Canada
• The New Zealand Institute for Plant and Food Research Limited
• United States Department of Agriculture
• Yunnan Agricultural University (China)
• Pherobank BV (Netherlands)
• Agrifutur SRL (Italy)
• Hokkaido University (Japan)
• Ecologia Proteccion Agricola SL (Spain)
• Handelsonderneming Vlamings BV (Netherlands)
• Instytut Ogrodnictwa (Poland)
Dr Neil Audsley, Project Coordinator
Fera Science Limited
Sand Hutton, York, North Yorkshire YO41 1LZ, United Kingdom
T: +44 1904 462628
E: firstname.lastname@example.org
W: www.dropsaproject.eu
Male (left) and female Drosophila suzukii on a strawberry. © Fera.
fermentation processes in a bioreactor, and innovative formulations have also been developed to improve the fitness of the BCAs. Preliminary field tests have confirmed the efficacy of both bacteria for control of the quarantine bacterial diseases. More extensive field testing is in progress in different areas of Spain and Italy.
Management methods

Although methods have been developed to help manage these pests and pathogens, no single strategy is going to be effective on its own, so the emphasis in the project was on integrated pest management. “Working with fruit growers, both chemical and non-chemical methods have been evaluated in the field,” says Dr Audsley. “This includes the use of nets to protect the fruit from flies, and orchard management, such as mowing, pruning diseased plants, and removing and destroying fruit after harvest.” The wider goal of this research was to develop improved management methods to help reduce fruit losses from insect pests and pathogens. Around 14 percent of all potential food production globally is thought to be destroyed by insect pests, while it has been estimated that the EU fruit industry loses 3 million tonnes of produce due to pests and pathogens, underlining the wider importance of the project’s work and sharing best practice. “We interact closely with the industry and with growers,” says Dr Audsley. “Workshops and clinics have been held over the course of the project for example, helping to inform growers about the best way to protect their crops.”

Brown spot on Kiwi vines infected with Pseudomonas syringae. © F. Spinelli (UNIBO).
We’ve been looking at parasitic wasps that attack Drosophila suzukii. This is both parasitoid wasps that are native to Europe and also parasitoids from Asia, where Drosophila suzukii originates
Dr Neil Audsley
Dr Neil Audsley is an invertebrate physiologist with more than 30 years of experience. He provides strategic research and development for the management of native, invasive and emerging pests of agriculture, horticulture and tree health importance. His major research interests include invertebrate physiology and endocrinology, alternative pesticides, insecticide resistance and integrated pest management. He manages collaborative projects for UK government and the European Commission.
Parasitic wasp on a blueberry searching for Drosophila suzukii larvae. © T. Hayes (CABI).
Finding the Balance: Climate Change, Carbon Cycling and the Amazon

The Amazon region holds great interest to climate scientists, because large amounts of CO2 enter and exit the atmosphere here, with strong impacts on the climate system. Now researchers in the ASICA project aim to gain a more precise understanding of the extent of CO2 uptake by the Amazon rainforest, as Professor Wouter Peters explains

The Amazonian region
has been affected by a number of serious droughts since the turn of the century, which affected the amount of CO2 that the rainforest removed from the atmosphere. Normally, the Amazon region acts as a net sink of carbon, as Professor Wouter Peters explains. “In normal years the Amazon rainforest takes up a lot of CO2 through photosynthesis, as it’s an extremely large, green area with a lot of plants, yet it also releases a lot of CO2 mainly through wildfires. Overall, it removes a little bit of CO2 from the atmosphere,” he outlines. However, the region experienced serious droughts in 2005, 2010 and 2015, which proved highly disruptive in these terms. “In 2010 we saw that the influence of the drought on vegetation was really turning the Amazon from what is normally a sink of CO2 to being a source of CO2,” says Professor Peters. This holds significant implications in terms of our understanding of the climate and how it is likely to evolve in future. While it is thought the Amazon region has been acting as a carbon sink for a fairly long period, the records in this area only extend back around 10-15 years; now researchers in the ASICA project are gathering more atmospheric data. “We’re collecting air samples from light aircraft flying over the area,” explains Professor Peters, ASICA’s Principal Investigator. These samples are quite well-mixed. “Motion in the atmosphere has brought signals from the surface into a very deep layer of the atmosphere,” says Professor Peters. “So we’re not just measuring at the leaf, or over the head of the individual tree – we’re really sampling a very large area. We call this the integrating capacity of aircraft measurements.”
In normal years the Amazon rainforest takes up a lot of CO2 through photosynthesis, as it’s an extremely large, green area with a lot of plants, yet it also releases a lot of CO2. Overall, it removes a little bit of CO2 from the atmosphere

The exchange of CO₂ and its stable isotopes during uptake by leaves. When passing through small openings called ‘stomata’, the ratios of 13C, 18O and 17O in CO₂ change by a measurable amount. ASICA aims to measure and use this signal to estimate photosynthesis over the rain forest.

Data gathering

The data itself is being gathered from four sites across the Amazon region. This data is complemented by ground-based measurements, from which Professor Peters and his colleagues in the project can then look to build a more complete picture. “People on the ground are looking at specific eco-systems, or plots of trees. They’ve been monitoring them over long periods and doing very intricate measurements, to measure different processes going on inside trees and leaves,” he says. The challenge is to build a deeper understanding of what’s happening on those scales and connect it with the wider picture in the Amazon as a whole. “The idea is that the behaviour of this one large area, of a million square kilometres, is in the end what drives CO2 in terms of climate. This is the climate signal that we need to understand and to simulate,” continues Professor Peters. This is a complex challenge, even in areas where there are well-established atmospheric monitoring networks. ASICA is playing a pioneering role in this respect, with light aircraft used in the project to gather samples at different altitudes, up to around 6,000 metres. “There is a micro-computer on the aircraft, which we have programmed in the lab to fill twelve flasks at different levels in the atmosphere,” outlines Professor Peters. The pilot follows a pre-programmed schedule, gathering samples from different altitudes, which are then sent to a laboratory in Brazil for measurement and analysis; this is technically challenging work. “There’s not a lot of CO2 in this air to begin with. In this programme we specifically try to measure the stable isotopes of CO2 – 13C, 17O and 18O – which are much, much less abundant,” says Professor Peters. These isotopes are also very tricky to handle. CO2 molecules containing them are a little heavier than normal CO2, which means that they tend to stick to things, so Professor Peters says a lot of care is required when collecting the samples. “You have to be really careful when collecting this air to make sure that you don’t come into contact with certain surfaces like liquid water, and that you don’t have pressure gradients,” he explains. It is also important to consider the overall composition of the air samples when looking to measure stable isotopes. “One of the biggest challenges we’ve been able to solve in this programme is getting rid of water vapour, which is very abundant in the tropical atmosphere. It completely destroys the signature of the oxygen isotopes that we’re interested in, the 17O and 18O isotopes in the CO2 molecule,” continues Professor Peters.
ASICA New constraints on the Amazonian carbon balance from airborne observations of the stable isotopes of CO2
View of the Madeira river shortly after take off for a CO₂ sampling flight over the Amazon rain forest. Flasks with dry air collected during this flight are analyzed for stable isotopes in the ASICA program.
CO2 uptake

The approach to measuring 13C in CO2 is by contrast much more established, and researchers have been able to gather a lot of data over the course of the project. This provides the foundations for Professor Peters and his colleagues to investigate wider questions. “Some of our staff in the project are dedicated to modelling, numerical modelling analysis, and trying to understand what our measurements mean for CO2 uptake over the Amazon rainforest,” he says. Researchers are looking at two main questions in particular. “The first is, how large is the carbon uptake by the Amazon rainforest? This means the one-way uptake – the gross primary production (GPP), all the CO2 that is removed through photosynthesis,” outlines Professor Peters. “The second question is – how does that number change during serious droughts?” The problem in addressing these questions is that the one-way flux of CO2, effectively how much goes into the forest through photosynthesis, is almost completely balanced by how much comes out again. However, once this exchange has happened there is a change in the amount of 13C in CO2 in the atmosphere, from which new insights can be drawn. “The plants that take up CO2 have a preference for the lighter isotope – 12C – over 13C. So that means that after you’ve had a lot of exchange there’s a bit more of the heavier molecule – 13C – in the atmosphere. That’s a signal we are trying to measure,” says Professor Peters. A more detailed understanding of these processes and the overall CO2 uptake over the Amazon could help to inform the ongoing development of climate models, underlining the wider relevance of the project’s work. Alongside providing rigorous figures for the uptake of CO2 over the Amazon region, Professor Peters also hopes to gain new insights into the underlying processes behind these figures. “Those will
be the main results of the ASICA programme, acting as atmospheric constraints on the exchange of CO2 from the Amazon,” he says. “With the project infrastructure in place and the data-gathering practices now fairly well-established, we’re starting to focus more on numerical analysis and interpretation.” Further data will still be gathered over the remainder of the project however, and looking further forward Professor Peters believes that it’s important to continue taking these measurements beyond the end of the ASICA funding term. It is predicted that the Amazon region will change dramatically in future due to the impact of climate change, so Professor Peters says it’s essential to carry on gathering data. “We only have one chance to measure things – you can always model and analyse them again afterwards,” he points out. “Future modellers are going to be looking for historical data about this area, to see how it’s changed over time. That’s what we’re gathering now.”

Staff at the LaGee lab in Brazil receive training on isotope measurements using laser spectroscopy. With this technique, ASICA researchers have made the first ever measurements of C17OO/C18OO ratios over the tropical rain forest.
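The 13C discrimination signal Professor Peters describes is conventionally reported in delta notation, per mil (‰) relative to the VPDB reference standard. The sketch below is illustrative only, not ASICA code, and the sample ratios used in it are hypothetical:

```python
R_VPDB = 0.0112372  # 13C/12C ratio of the VPDB reference standard

def delta13C(r_sample: float) -> float:
    """delta-13C in per mil for a sample with 13C/12C ratio r_sample."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

# Photosynthesis discriminates against the heavier 13C, so air that has
# exchanged heavily with the canopy retains relatively more 13C: its
# 13C/12C ratio, and hence its delta value, creeps upward.
before = delta13C(0.0111120)  # hypothetical background air
after = delta13C(0.0111145)   # hypothetical air after canopy exchange
assert after > before
```

Because the shift per unit of exchanged CO2 is tiny, the precision demands on the flask measurements are severe, which is why contamination by water vapour or surface contact matters so much in the sampling protocol.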
Severe droughts in Amazonia in 2005 and 2010 caused widespread loss of carbon from the terrestrial biosphere. This loss, almost twice the annual fossil fuel CO2 emissions in the EU, suggests a large sensitivity of the Amazonian carbon balance to the more intense drought regime predicted for the coming decades. This is a dangerous inference though, as there is no scientific consensus on the most basic metrics of Amazonian carbon exchange: the gross primary production (GPP) and its response to moisture deficits in the soil and atmosphere. Measuring them on scales that span the whole Amazon forest has thus far been impossible, but in this project we aim to deliver the first observation-based estimate of pan-Amazonian GPP and its drought-induced variations.
EU contribution: EUR 2,275,993
• WAGENINGEN UNIVERSITY, Netherlands • UNIVERSITEIT UTRECHT, Netherlands • RIJKSUNIVERSITEIT GRONINGEN, Netherlands • UNIVERSITY OF LEEDS, United Kingdom • INSTITUTO NACIONAL DE PESQUISAS ESPACIAIS, Brazil
Professor Wouter Peters
Department of Meteorology and Air Quality, Environmental Sciences Group, Wageningen University, The Netherlands
& Centre for Isotope Research, Energy and Sustainability Research Institute Groningen, University of Groningen
Wageningen University, Droevendaalsesteeg 4, 6708 PB Wageningen, The Netherlands
T: +31 (0)317 486654
E: Wouter.Peters@wur.nl
W: http://www.asica.eu
Professor Wouter Peters
Professor Wouter Peters investigates the rising levels of carbon dioxide in our atmosphere. His research combines measurements of greenhouse gases and their isotopic composition with powerful computer models of weather and climate. His current research focuses on the rapidly changing carbon cycle of the Amazon rain forest.
Shining new light on environmental analysis
Loess deposits cover about 10 percent of the earth’s land surface, and are an important resource in reconstructing palaeoclimate. We spoke to Dr Alida Timar-Gabor about the INTERTRAP project’s work in investigating and analysing loess samples from three different continents, with the aim of developing improved dating methods
Many large deposits of loess can be found across the world, covering around 10 percent of the earth’s land surface and providing a rich source of data for scientists to investigate the development and evolution of the global climate over time. Loess itself is essentially a type of wind-blown dust, samples of which Dr Alida Timar-Gabor and her colleagues in the INTERTRAP project have been analysing in great detail. “We have been working on dating the loess-palaeosol deposits in Eastern Europe for some time. Now we will extend our work and date loess-palaeosol deposits in China, Europe, and the US,” she outlines. These loess-palaeosol sequences represent important archives of past climate, from which scientists can draw new insights.
Luminescence and sedimentology sample collection from loess (Loveland, Iowa).
“They basically represent a succession of changes between glacial and warm past climate. So when we have a warm period, like the Holocene period which we are living in now, soils are formed,” says Dr Timar-Gabor. “During a glacial period, loess is deposited.” These are important considerations in terms of our understanding of the global climate, as dust has a significant influence on climate. For example, dust concentrations in the atmosphere affect the radiative balance of the earth, while Dr Timar-Gabor says there are also other factors to consider. “Accumulations of dust with a certain grain size can lead to the creation of nucleation centres, then more clouds and cooling. We know that historically during cold periods, we had two times more dust available than now – so dust is definitely an important factor in climate,” she explains. Dust is a correspondingly important consideration in the development of climate models, an issue of which Dr Timar-Gabor is well aware. “The wider picture in research is that dust is an important factor in climate models, which is a major motivation behind looking at palaeoclimate sequences,” she acknowledges. “However, the main goal of the project at the moment is developing better dating methods, that’s the core of the project.”
Alternating loess-palaeosol deposits that reflect the Pleistocene/Holocene transition at Enders site, Nebraska.
INTERTRAP project
This goal has been identified in recognition of the relative limitations of current dating methods. While optically stimulated luminescence dating methods are effective in dating some loess samples up to a certain point, inaccuracies have been identified when they have been applied to samples over around 40,000 years of age (40 ka). “We’ve shown that the results are not as accurate as we previously thought they were,” says Dr Timar-Gabor. The core of the project centres on combining more of the available dating methods in a new protocol, in order to identify why results from luminescence dating on these older samples are inaccurate. “We have the right chronology up to 40 ka – but then, for some reason, we don’t get the right results. In INTERTRAP, we use both luminescence methods as well as electron spin resonance for dating. By using the latter method we hope to improve our understanding of the mechanisms of optically stimulated luminescence production in quartz.”
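The arithmetic behind these trapped-charge methods is simple, even if the measurements are anything but: an age is an equivalent dose (the radiation dose the mineral has absorbed since burial) divided by the environmental dose rate. A minimal sketch, with invented example values rather than project data:

```python
# Hedged sketch of the age equation shared by luminescence and ESR
# (trapped-charge) dating. The 120 Gy and 3 Gy/ka figures below are
# illustrative, not measurements from INTERTRAP.

def trapped_charge_age_ka(equivalent_dose_gy, dose_rate_gy_per_ka):
    """Age (ka) = equivalent dose De (Gy) / environmental dose rate (Gy/ka)."""
    return equivalent_dose_gy / dose_rate_gy_per_ka

# A loess sample with De = 120 Gy in a 3 Gy/ka environment dates to 40 ka,
# right at the limit where the article says quartz OSL results start to drift.
print(trapped_charge_age_ka(120.0, 3.0))  # 40.0
```

The difficulty the project addresses lives inside both terms: estimating the equivalent dose from the measured luminescence or ESR signal, and the dose rate from the radioactivity of the surrounding sediment, which is where the inaccuracies beyond 40 ka appear to originate.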
Baicaoyuan: the “Sea of Loess” on the Chinese Loess Plateau. It comprises the most extensive and continuous archives of Quaternary (the last ~2.58 million years) climate changes.
“The main aim of the project is to develop better dating methods for these archives,” continues Dr Timar-Gabor. “These dating methods could then also be applied to other wind- or water-borne sediments, not only loess.” Researchers aim to integrate several different dating methods, in order to establish an improved dating protocol, bringing together optically stimulated luminescence, thermoluminescence and electron spin resonance. Dust samples have been gathered from several of the sites of interest in the project, and now Dr Timar-Gabor and her colleagues will look to analyse them. “We have some proxies that we are looking at – so things like the grain size of the dust, and the magnetic susceptibility, which is a proxy that is usually employed when looking at these archives,” she outlines. Some differences have been observed in the samples that have been gathered so far. “The big climatic changes are generally synchronous around the world. But we also have some samples with many weak palaeosols, which are characteristic to certain regions only,” says Dr Timar-Gabor. “There were periods where Eastern Europe was pretty dry for example, while Western Europe was more moist.”
Climate modelling
The wider goal in this research is to develop correlation protocols that could be applied by climate scientists across the world, which holds important implications in terms of climate modelling. Loess deposits hold rich potential in terms of deepening our
understanding of the global climate, how it is formed and how it is likely to evolve, yet this potential can only be exploited more fully with highly rigorous dating methods based on absolute chronologies. “Loess is basically mineral dust, which is an important component in climate modelling, but this is not yet fully exploited,” stresses Dr Timar-Gabor. The project aims to make an important contribution in these terms. “We can basically recreate the past accumulation rates of dust, and so generate input data for climate models,” continues Dr Timar-Gabor. “In order to verify the way these climate models run, we need to feed them with past data and see if they give the right results. This can be achieved only if we have a rigorous chronology for these loess-palaeosol sequences.” These sequences represent the only archive that can be used to reconstruct palaeoclimate in some regions of the world, underlining their wider significance in understanding the history of the global climate. With an integrated approach to dating loess samples, researchers hope to also be able to probe regional differences in climate. “We already have some results that show that there is a lot of regional influence,” outlines Dr Timar-Gabor. A number of loess samples have been collected from different areas of the world, and now Dr Timar-Gabor and her colleagues will look to carry out the analytical work. “We have had to set up a lot of equipment, in order to develop these integrated dating methods. Now we are starting with basic analysis of the available samples,” she says.
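Recreating past accumulation rates from a dated sequence, as Dr Timar-Gabor describes, amounts to dividing the sediment deposited between two dated horizons by the time elapsed between them. A minimal sketch of that calculation, with invented depths, ages and a typical dry bulk density (none of these are project values):

```python
# Hedged sketch: turning a dated loess sequence into a dust mass
# accumulation rate, the kind of input data for climate models described
# in the text. All numbers are illustrative assumptions.

def mass_accumulation_rate(depth_top_m, depth_base_m, age_top_ka, age_base_ka,
                           dry_density_kg_m3=1500.0):
    """Dust flux (kg m^-2 ka^-1) between two dated horizons in a sequence."""
    thickness = depth_base_m - depth_top_m   # metres of sediment deposited
    duration = age_base_ka - age_top_ka      # thousands of years elapsed
    return thickness * dry_density_kg_m3 / duration

# Two metres of loess deposited between horizons dated to 20 ka and 40 ka:
mar = mass_accumulation_rate(1.0, 3.0, 20.0, 40.0)
print(f"{mar:.0f} kg m^-2 ka^-1")
```

The sketch makes the project's point concrete: the flux estimate is only as good as the two ages in the denominator, which is why an absolute, rigorous chronology is the prerequisite for using loess as model input.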
INTERTRAP
Integrated absolute dating approach for terrestrial records of past climate using trapped charge methods
Project Objectives INTERTRAP will
(i) test for the existence of a terrestrial lead or lag at the Pleistocene/Holocene boundary as recorded by loess/paleosol deposits from 3 continents, by explicitly testing the assumption of synchronicity of climate changes recorded by these terrestrial archives;
(ii) provide a significant improvement in our understanding of quartz luminescence production. We anticipate that INTERTRAP will extend the applicable range of quartz luminescence dating of loess and other sediments.
Project Partners and Funding
INTERTRAP is implemented at Babeş-Bolyai University, Cluj-Napoca, Romania, and received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme, ERC-2015-STG (grant agreement No ).
Project Coordinator, Dr Alida Timar-Gabor Babeș-Bolyai University Faculty of Environmental Science and Engineering Environmental Radioactivity and Nuclear Dating Centre Interdisciplinary Research Institute on BioNano-Sciences Treboniu Laurean 42, 400271, Cluj-Napoca, Romania T: +40 264 454 554 E: email@example.com W: http://enviro.ubbcluj.ro/personal-daim-2/ W: http://cercetare.ubbcluj.ro/primulproiect-erc-implementat-la-ubb/ An interview with the PI (Dr. Alida Timar-Gabor) can be found at: http://oldnews.ubbcluj.ro/conf-univ-dr-alidatimar-gabor-facultatea-de-stiinta-si-ingineria-mediulu/
Dr Alida Timar-Gabor
The most studied Romanian loess-palaeosol archive is found at Mircea-Vodă, Dobrogea, and is thought to comprise at least five glacial/interglacial cycles.
The loess-palaeosol complex from the Roxolany site, S-W Ukraine, represents a nearly complete Pleistocene terrestrial record of palaeoclimatic and environmental changes in the Northern Black Sea region.
Dr Alida Timar-Gabor pioneered the application of trapped charge dating methods in Romania. She is Associate Professor of Environmental Radioactivity at Babeş-Bolyai University, Cluj-Napoca, where she established the Luminescence and Electron Spin Resonance Dating Laboratories.
The Growing Problems of Food Security
Global food supply is a ticking time bomb, given the multitude of threats to our resources. Population growth, climate change, excessive harvesting, the destruction of natural resources, urbanisation and desertification – all these factors increase the pressure on agriculture and feeding people. What kinds of solutions can science offer to ease the strain? By Richard Forsyth
There are 7.6 billion people on Earth, but this figure will rise to 9.7 billion by 2050 and 11.2 billion by 2100. There will be disproportionate growth rates in different countries. It’s projected by the UN that Nigeria alone will surpass the population of the US by 2050. We can expect a lot of mouths to feed and more food needed in hot, less wealthy countries and cities. Global demand for food will grow by 40% by 2030 and 70% by 2050, according to Research Councils UK. While food security is essentially about supplying a nutritious, balanced and sustainable diet so people can live and grow at a normal rate, there is a second consideration to food security that is important for context. At the same time as the human population increases,
climate change will continue to go through transitions that will have serious impacts on food stocks. Essentially, without changes and interventions, there will be less food – yet more people to feed.

Climate change is a threat to food. As greenhouse gas emissions raise the global temperature, crop yields will fall, whilst pests and weeds are encouraged. With a lack of precipitation there will be crop failure and long-term declines in the production of foods. It’s true that in some areas agriculture may benefit from climate change, but generally there will be a negative impact, unless we change the way we farm.

Land for farming is a problematic issue alone. The world has already lost an estimated third of its arable land in the past 40 years due to pollution and erosion, according to research by The University of Sheffield’s Grantham Centre for Sustainable Futures. Excessive ploughing of fields and heavy fertiliser use have degraded soils around the world – and erosion has occurred at a rate 100 times greater than the rate of soil formation. To compound this, there are not vast reserves of available farm land to expand into. Food production already takes up almost half of the planet’s land surface. Whilst there is significant deforestation at present to meet farming demands – a crisis in itself – the available land for agriculture is ultimately a limited resource.

It’s widely accepted that for any chance of saving existing natural ecosystems like forests from clearing, we have to find ways to be more efficient with farming. This will mean accepting a helping hand from science and innovation. Scientists are currently looking into interesting ways to bolster food security and make food production more reliable and efficient. Research addresses some of the key challenges, for example, developing genetically modified (GM) food for a higher yield.
Photograph by Michał Gałężewski
From lab, to field, to table
Despite an element of public fear around the safety and uncertainties of GM food, where crops are engineered in the lab to be more robust, the advantages are potentially game changing, not just for feeding populations but also in terms of climate impact. A report published in Nature’s Scientific Reports distilled over 6,000 papers to understand the impacts of using GM crops, and indications were that the risks paled compared to the benefits. It was found that crop yield could increase by up to 25%, equivalent to a fifth less farm land to use – potentially slowing the destructive cycle of deforestation and greenhouse gas emissions. Bioengineering crops is about learning nature’s tricks for plant survival in harsh conditions and engineering them into the food that we eat. Drought-resistant plants have a mechanism called crassulacean acid metabolism (CAM), so they survive with little water. A form of photosynthesis, CAM means the pores in the plant’s leaves close in the day when the sun is out, so water cannot escape from the leaves. By genetically engineering metabolic processes like these, rice, soybeans and wheat can be made more drought resistant. For those who are squeamish about the GM concept, there are more natural ways to modify crops – such as advanced breeding techniques that influence strains with desirable characteristics. Recently, the International Centre for Research in the Dry Areas (ICARDA) developed one such crop, a durum wheat, which is resilient to unrelenting 40°C temperatures. It is being harvested in Senegal as a sustainable alternative crop for the harsh climate there. An even more basic solution is replacing maize in hot countries with
existing drought-resistant crops like millets and sorghum. In western Kenya during the short 2016/17 rainy season, close to 63 tons of sorghum, finger millet and groundnuts were produced as alternative crops. Such changes required re-training farmers, and the results are promising.

Harvesting marine plants also has much to offer global food security. For example, kelp is being studied for its potential as a food source that ticks all the boxes for sustainability. It grows fast and long, absorbs carbon dioxide from the sea – so it has a carbon-negative footprint – and it does not require much labour, nor is it threatened by many diseases or pests. Marine-based farming like this would reduce the pressure on land resources.
Meat demand
Growing alternative crops isn’t the only problem solving needed for food security. Some of the meat we choose to eat has serious drawbacks for sustainability and our planet. As previously discussed, food security is as much about reducing the impact of our food production on the environment whilst demand for more food rises, as it is about simply meeting demand. Demand for meat is one of the big drivers of climate change, because the land needed for cattle and other animals leads to deforestation. Cattle have a heavy carbon footprint: the world’s 1.5 billion cows produce methane, a greenhouse gas far more potent than CO2. Suggested solutions to this have included: reducing our consumption
of meat, eating smaller animals like chickens as opposed to cattle, and consuming insects as an alternative source of protein. The latter point has basic logic behind it. According to Proteins in Food Processing, a volume in the Woodhead Publishing Series in
Food Science, Technology and Nutrition, insects have fats, proteins, vitamins and minerals, and have a greater conversion efficiency, with lower greenhouse gas emissions, while requiring less water and land compared to traditional farm animals. One solution is to cut meat out altogether and put all the emphasis on vegetables. Veganism was one of the fastest growing food trends in 2017 and is set to continue its growth as a market. What’s interesting today is that veganism is now linked with reducing climate impact as a reason to change to the diet, so it’s no longer solely the preserve of animal rights activists. Becoming vegan can as much as halve your greenhouse gas emissions. A more radical scientific idea for reducing the carbon footprint and problematic issues of animal farming is growing meat in the laboratory. Lab-grown meat could be on sale by the end of 2018, according to a so-called ‘clean meat’ manufacturer called Just. The meat is cultured tissue grown in vitro from a few stem cells, harvested from a biopsy of a living animal. At present the process is too expensive, so it has to be developed, made more efficient and cheaper, before becoming economically viable for retail.
Very human challenges
An issue that food security science sometimes struggles to fully comprehend is the public perception of solutions: the consumer adoption side of food. Many of the proposed scientific solutions could suffer from lack of adoption. When it comes to taste, unless it is literally a survival emergency, what we put in our mouths can be down to cultural upbringing around food. For example, eating insect products, kelp, clean meat from labs and GM crops – these may all pose sound scientific solutions, but an important element to them working is that people opt to eat them. It would take a large marketing effort and a shift in attitudes to ensure some of the best solutions on paper become reality in a marketplace. In countries not yet in, or perceived to be in, food crisis, we know that if consumers find a solution unpalatable or believe there is a health risk, whether there is one or not, there will be barriers to uptake of new and radical solutions, however good they seem.
However, having a choice of food is already a privilege for large proportions of our global population. 2018 is seeing a record high for food insecurity. A mixture of climate impacts and man-made conflicts has meant that 124 million people in 51 countries are facing food insecurity, says an annual food crisis report by the Food Security Information Network (FSIN). This is up from 108 million people in 48 countries the previous year. A seemingly insurmountable problem, as difficult as climate change to de-escalate, is war. War always impacts on food supply. FSIN highlights that in 2017, 74 million people were living in 18 war-torn countries, like the Democratic Republic of Congo and South Sudan, in harsh, food-starved conditions. Many of these wars are relentless and prolonged in nature. The destruction of agriculture, food supply lines and whole economies during war means food is one of the casualties of conflict. Denying access to food is also used as a crude weapon of war in sieges. Whilst science can help in many aspects of food security, in the context of war it is hard, ugly and complex to navigate with scientific solutions.
Waste not, want not
Despite all the problems with food security, a great deal of food is simply thrown away. In both the developed and developing world, food waste can be as high as 40%. The developing world loses it through food chain infrastructure (lack of cold storage, for example), whilst in the developed world a huge amount of food is wasted because it is not sold in an allocated time or is not grown to a prescribed standard. In truth, a lot of food that is thrown away is in fact fit for consumption, but safety margins and regulations forbid its sale. Cultural attitudes
toward food also play a part in waste. There are clear potential wins on reducing waste in the future – for example, finding cost-effective cold storage solutions for poorer countries, and education around food lifespans and safe use. Intervening after picking and prior to packing can extend the best-before date of fruit and vegetables by stalling the spoilage process. A US startup called Apeel, funded in part by UK Aid, is pioneering a process for keeping moisture in, reducing oxidisation and keeping bacteria out. The company takes lipids and glycerolipids from edible plants (these exist in peels and seeds) and applies them to fruit and vegetables via a dip – creating a preserving barrier that protects against bacteria and keeps the fruit or vegetable fresh for many days, or even weeks, beyond its natural decay rate. The beauty of this is that it’s a basic technique and can be applied on a packing line.
Europe leads the agenda
It’s plain to see that there are many ways to use scientific methods, innovation and research to alleviate and tackle this pressing challenge of resourcing adequate food. There is still much research and work to be done if we are to successfully navigate our food needs. What we can do, through research efforts, will go a long way to helping feed populations in the future. The EU specifically is funding many projects in the name of food security, and is the biggest development actor in food and nutrition security in terms of political and financial support. Europe is dedicated to keeping the topic of food and nutrition at the forefront of the international agenda. This is important because how we eat, what we eat and our cultural attitudes toward food will need to change and adapt as the world changes around us.
SMART-E: Reshaping robotics research and training for Industry 4.0
Evolving fields such as Artificial Intelligence, big data analytics, embedded systems, cloud and human-robotic interactions will play a part in the 4th Industrial Revolution – an era dubbed ‘Industry 4.0’. This is why the SMART-E project created a training and research programme, to advance robotics in manufacturing
Before we delve into the accomplishments of the SMART-E project, let’s explore what Industry 4.0 looks like. One promise of this industrial revolution is the so-called smart factory, where physical systems like production lines and robots communicate with each other, as well as with humans, through the Internet of Things, and are linked with cyber systems that can make simple decentralised decisions with a level of autonomy. Human intervention is minimal and intuitive, and where humans can benefit from help, robotic technologies can assist to make tasks both easier and safer. Such factories would be highly efficient and have a competitive advantage. With this scenario in mind, it’s clear that researchers devising more effective robotic technologies will have a huge impact on manufacturing and other sectors in the near future. Today, an array of new and innovative technologies can be applied to robotics to enable a step change in the way robots can be used within industrial settings. In smart factories, we will work alongside robots, will be able to customise assembly lines, and robotic technologies will assist us to be more productive in our jobs. Preparing for this, and ensuring the new, ground-breaking robotic technologies are sustainable, is a challenge for a new
generation of scientists. The SMART-E project (Sustainable Manufacturing through Advanced Robotics Training in Europe) was created to facilitate research and training in the specialist fields related to advanced robotics, to support roboticists who aspire to play central roles in the 4th Industrial Revolution.
Preparing next generation expertise
SMART-E developed a world-class doctoral training and research programme, providing a platform for the next generation of graduate engineers to nurture and progress advanced robotics in manufacturing. The programme’s ultimate purpose was to become a catalyst for shaping the future of this important field. The project involved 13 Early Stage Researchers (ESR) and 3 Experienced Researchers (ER) to guide them. The trainee engineers were able to gain hands-on experience, conducting experiments alongside European peers, whilst experiencing different working cultures in academic and business sectors internationally. SMART-E brought together many research institutions, including AGCO GmbH, the University of Zurich, Scuola Superiore Sant’Anna, the Italian Institute of Technology, the Technical University of Munich and the Advanced Manufacturing Research Centre (USFD), in addition to partners in the manufacturing industry and R&D companies. SMART-E, by necessity, needed to address emerging issues that will require attention in this new era, such as embodied intelligence, verification and testing, interoperability, worker support by cyber-physical systems, autonomous delocalised decision making, plus practical business considerations like ensuring new manufacturing processes are sustainable and cost-effective. By using state-of-the-art techniques and novel technologies, the trainees became adept at the new opportunities and possibilities for industrial use. The programme went well beyond the purely technical side, to teach soft skills such as leadership, business and interpersonal
SMART-E second summer school in Livorno-Italy.
The 4 Industrial Revolutions
1st – 1765. Steam power, mechanisation and weaving looms. This was a transition from hand production to machines.
2nd – 1870. Known as the Technological Revolution, it featured the use of electrical energy and production lines. The expansion of telegraph lines and rail led to the first true wave of globalisation.
Design and development of a high-performing adaptive gripper for the food industry.
An example of the controller moving point-to-point through an asymmetric trajectory intended for a bathing scenario. The manipulator is made up of pneumatics and cables.
3rd – 1969. This era encompasses nuclear energy, microprocessors, automation, telecommunications and computers. It is also the age of space technology and biotechnology. Robots are another feature of this period.
4th – Just beginning. Cyber-physical systems, networks, the Internet of Things. This is a merging of lines between physical, cyber and biological systems. It includes Artificial Intelligence, big data and cloud-based systems. ICT, manufacturing and autonomous machines all make use of merging the virtual world with the physical world.
Adaptive and robust grasp control for heavy payload industrial manipulators.
skills, which could make the difference for success when pioneering new technologies or partnering for commercial applications. “It will create jobs for high-level, skilled operators and increase productivity, saving millions of pounds in capital and operational costs over the coming years,” explained Prof Samia Nefti-Meziani, Project Coordinator.
A detail of the latest prototype exoskeleton.
It will allow European manufacturing companies to adapt their production processes to the trends that will define Industry 4.0. It will ensure Europe’s competitiveness.
State of the art robotics
The scientific focus of the project was divided into three main areas, which when combined covered the concepts relevant for the majority of emerging robotic technologies we expect in Industry 4.0. These areas were forged into three Work Programmes covering the following:
• Dexterous, soft and compliant robotics in manufacturing
This is the development of ‘mechanically intelligent’ machines which adapt, manipulating soft and hard, light and heavy objects, with variable stiffness and dexterous motion in changing or new environments. Traditionally, industrial robots are ‘blind’ in their performance, set to grasp one way, with one gripping force. A more attuned sensitivity to the environment is a key advance to ensure safe human-robot interaction. It takes a combination of a soft manipulator, learning-based control schemes and soft sensors. Such robotic applications will prove useful for high-precision tasks on assembly lines in factories and for performing assisting roles for humans, in settings like surgery or working in marine or nuclear reactor scenarios. For pick-and-place tasks, or, for example, giving close to real-life grip sensitivity in prosthetic hands, this technology is invaluable. One positive outcome from the project is that the UK nuclear industry has already recognised the potential of the SMART-E gripper, which has been incorporated into a recent project for use in nuclear decommissioning.
• Reconfigurable and logistics robotics
A problem with robotic production lines is the upheaval and logistical challenge associated with changing that line’s operation. For an SME with budget and time constraints to consider, changing its automation could be an investment that’s unpalatable and time consuming. The SMART-E project set out to address this issue with the understanding that it could lead to substantial economic benefits for businesses. The solution was a quickly deployable, flexible automation system for sustainable manufacturing. It worked by advancing the control-system-related technology of compliant and modular, reconfigurable robots. These systems adapt efficiently to frequent changes in the production line. A new learning approach means robots can be trained in-line, without interruption of the production cycle. European manufacturers can adapt their production lines, which means they are competitive and, as a direct result, will drive employment for operators. Advanced machine learning techniques also improve the monitoring and maintenance of complex
The proposed method has been used to teach the coordination required during a pouring task. The snapshots show a reproduction in which the orientation of the bottle is automatically inferred from the position of the left and right hand.
Sustainable Manufacturing through Advanced Robotics Training in Europe
The proposed training network will prepare the next generation of leading Advanced Roboticists to secure a Sustainable Manufacturing (SMART-E) sector in Europe. It will train 13 Early Stage Researchers (ESRs) and 1 Experienced Researcher (ER) and develop a leading European doctoral training programme, sustainable beyond the network’s duration.
Funded under Marie Curie Action FP7-PEOPLE-2013-ITN
The SMART-E network draws together 7 partners with world-class expertise in robotics, autonomous systems and advanced manufacturing. Each partner is hosting at least one Marie Curie researcher – either an ‘early stage researcher’ (ESR) or a more experienced researcher (ER). • University of Salford (USAL), UK • Advanced Manufacturing Research Centre (USFD), UK • University of Zürich (UZH), CH • Fondazione Istituto Italiano di Tecnologia (IIT), IT • Scuola Superiore di Studi Universitari e di Perfezionamento Sant’Anna (SSSA), IT • Technische Universität München (TUM), DE • AGCO GmbH (AGCO), DE The SMART-E consortium is delighted to be working with additional partners from across Europe to deliver the training and industry experience which will allow our researchers to become future leaders in robotics and advanced manufacturing. Each of our associate partners is involved in training activities, secondments, summer schools and other events under the umbrella of SMART-E. • Festo Didactic GmbH, DE • RURobots Ltd., UK • The Shadow Robot Company Ltd., UK • Cranfield University, UK • Narvik University College, NO • Food Manufacturing Engineering Group, UK • BMW, DE • Marel, IS • KUKA, DE • Rolls Royce, UK • DLR, DE • Airbus, FR • Robotnik, SP • Istanbul Technical University, TK
The University of Salford 43 The Crescent Salford M5 4WT United Kingdom T: +44 161 295 4540 E: s.Nefti-Meziani@salford.ac.uk W: http://smart-e-mariecurie.eu/ Prof. Samia Nefti-Meziani
Professor Nefti-Meziani holds a Doctorat d’État in robotics and artificial intelligence and is Director of the Centre for Autonomous Systems & Advanced Robotics and Chair of Robotics at the University of Salford. In this role, she leads a multidisciplinary team of 6 academics and 12 researchers. She has 25 years’ experience in advanced theoretical research in the areas of embodied intelligence and advanced robotics, where the focus of her contribution is on the development of concepts, mechanisms and algorithms. She pioneered the first application of Soft Robotics in manufacturing.
Learning task-space synergy controllers from demonstration.
robotic systems, making the manufacturing process more sustainable. • Safe human-robot interaction and cooperation The ‘holy grail’ of robotics, in terms of its socio-economic benefits, is developing robots that work safely alongside humans. By creating an artificial ‘skin’ for the robot, a skin with flexible sensors that detect points of contact, interaction capabilities improve. The stretchable material of this skin does not interfere with the robot’s mechanics. Another advance in robotics that is key to shaping Industry 4.0 is the development of a user-friendly programming system which allows programming by physical demonstration, essentially tracking and copying movements. This means robots can be intuitively trained by non-experts. The upshot for industry is that SMEs can use and instruct robots effectively without needing to hire specialist programmers. Finally, there is a very exciting aspect to the project in the development of robust control techniques for wearable assistive robots, namely exoskeletons. Such exoskeletons, with assistive components strapped on to a worker’s arms, legs and torso, have the potential to reduce physical strain during demanding tasks, lower the risk of injury (and negate subsequent claims for injury settlements), and will be a welcome relief for many workers with physically demanding jobs in industrial settings.
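The programming-by-demonstration idea described above can be illustrated with a minimal sketch: a human physically guides the arm, joint positions are sampled along the way, and the robot replays a smoothed version of the recorded trajectory. This is a toy illustration only, not the SMART-E software; the sampled angles and the moving-average smoother are assumptions made for this example.

```python
# Illustrative sketch of programming by physical demonstration, assuming a
# single joint for clarity. Not the SMART-E software: the sampled angles
# and the moving-average smoother are assumptions for this example.

def smooth(trajectory, window=3):
    """Return a replay-ready trajectory: a moving average over the recorded
    waypoints, which removes the jitter of a hand-guided demonstration."""
    half = window // 2
    smoothed = []
    for i in range(len(trajectory)):
        segment = trajectory[max(0, i - half):i + half + 1]
        smoothed.append(sum(segment) / len(segment))
    return smoothed

# A human guides the arm; joint angles (radians) are sampled along the way.
demonstrated = [0.00, 0.10, 0.45, 0.40, 0.80, 1.20, 1.15, 1.50]
replay = smooth(demonstrated)  # the robot replays this smoothed path
```

In a real cell the same idea scales to full joint vectors and is combined with learned models, but the principle stays the same: non-experts teach by moving the robot, not by writing code.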
A modular robotic arm and a BMW car door with elastic joints built during the development of a modular robot application to car manufacturing.
Outcomes and next steps SMART-E achieved a great deal, from the development of bio-inspired manipulators, the synthesis of modular robots and the design of exoskeletons to the control of flexible and rigid manipulators. The project also succeeded in its contribution to scientific research, publishing 50 papers in high-impact scientific journals and conferences, including Soft Robotics, the International Conference on Intelligent Robots and Systems (IROS) and the International Conference on Robotics and Automation (ICRA). Most importantly, SMART-E played a part in the training and support of a new generation of pioneering researchers and developers in the field of industrial robotics. Since the project ended in 2017, many of the research fellows have been recruited by world-class research organisations and industries, where they will continue their ground-breaking work in robotics. SMART-E research is currently being used to improve automation with robotics for the food and aerospace sectors and is finding applications in partner companies. Although the project is complete, there will be an application for further European funding to focus on commercialisation of the outputs. The knock-on effects and positive outcomes from the work undertaken by this research network will continue to have impact as we embark on the next industrial revolution. “It will allow European manufacturing companies to adapt their production processes to the trends that will define Industry 4.0. It will ensure Europe’s competitiveness,” concluded Prof Samia Nefti-Meziani.
The goal of the teleoperation is to remove a protective cap (in green) from the collimator mock-up.
Getting to the core of social innovation Social innovation projects help to change the way we live and work, yet the field itself is relatively under-researched; now the SI-DRIVE project is taking a fresh look at the topic. We spoke to Jürgen Howaldt, Christoph Kaletka and Antonius Schröder about their work in extending knowledge on social innovation and laying the foundations for further research The concept of social innovation has a
major role to play in addressing contemporary social and economic challenges, as Europe moves towards a more knowledge-based economy. While technological innovation has clearly been crucial in shaping modern society, social innovation is also a major factor in changing the way we live, work and travel. “Social innovation can be broadly defined as new social practices that diffuse into wider society and influence change processes,” says Jürgen Howaldt. Based at the Technical University of Dortmund, Howaldt is a key member of the SI-DRIVE project team, an EC-backed initiative investigating and analysing the concept of social innovation, together with his colleagues Antonius Schröder and Christoph Kaletka. “We are looking at social innovation in a broad sense,” says Kaletka. “One of the misunderstandings which we believe has emerged over recent decades regarding social innovation is that it is quite narrowly defined. Social entrepreneurship is a very important element of social innovation, for example, but it’s certainly not the only one.”
Researchers in the project are taking a broader social view, looking at a large number of innovation initiatives across the world in seven major policy areas, with the aim of building a deeper understanding of their nature, characteristics and wider impact. These social innovations are often related to specific societal challenges, commonly at the local level. “It might be that established systems have failed in some way, or that new demands have emerged from say government or civil society, or other motivated actors and innovators. Social innovation is a way of finding new solutions and changing the social practices of the population,” outlines Schröder. This might mean a car-sharing scheme to combat traffic congestion, for example, or the development of new healthcare models; one of the main goals of the project is to improve our understanding of the relationship between social innovation of this kind on the ground, and wider social change on the macro level. “While of course some social innovation initiatives influence social change, most social innovators do not
actually start out with the ambition of creating social change,” says Kaletka. The focus for social innovators is more typically on addressing specific social challenges in their local area, such as alleviating poverty, or tackling loneliness among elderly people. Technology can play an important role in addressing these types of issues, yet Howaldt says it must be embedded in social practice if it is to have a sustained impact. “Technology alone is not the solution, but in some cases it enables new social practices to develop, to cope with major societal challenges or deal with emerging demands,” he explains. The relationship between social and technological innovation is a major area of interest in the project, and researchers have been looking at case studies across each of the seven different policy areas. “For example, in the eHealth field, technology plays a very important role, but it’s less integral in fighting against poverty,” continues Howaldt. “It’s really a very interesting picture, where we can look to understand the relationship between social innovation and technological innovation,
across different case studies and social innovation initiatives.” A social innovation project may be initially rooted in a local area, but technology can help to heighten awareness of its impact, potentially then inspiring people in other areas to establish similar initiatives. This point of how a social innovation is perceived by wider society, and whether it is then taken up by other actors, is an important aspect of the project’s research. “Is that social innovation diffused into society, is it widely accepted? Does it lead to the establishment of new institutions that help us to deal with those challenges? SI-DRIVE is looking at the impact of social innovation initiatives in terms of social change,” says Howaldt. Researchers adopt an objective perspective in this regard, looking at the full impact of social innovation, not just the positive effects. “From a scientific and research perspective we always try to understand not only the positive outcomes of a social innovation, but also the possible negative repercussions of such developments. It’s very difficult to say if a specific social innovation is intrinsically ‘good’,” explains Kaletka. An example could be a social innovation organising the re-distribution of excess food from restaurants and supermarkets to the homeless. While this has positive
social effects, bringing food to people in need and helping to allocate resources more efficiently, it may also lead to some level of disruption for others. “Some actors might be negatively affected. For example, those who maybe previously found work in generating energy out of food waste,” points out Kaletka. These different perspectives need to be considered in terms of understanding the impact of social innovation, and also the
potential for these types of initiatives to be replicated elsewhere and contribute to wider social change. “An important question is whether an initiative or idea is sustainable. Could it be replicated or diffused to other regions? Or at least, if it has been implemented in this specific city, has it been successful? Has it been maintained over a longer period? So the sustainability question is very important,” stresses Kaletka. This may be affected by the organisation of the specific social innovation initiative and whether it is part of a wider eco-system, with links to other stakeholders. Many NGOs and not-for-profit organisations are involved in social innovation initiatives, along with other actors. “When considering solutions to a social challenge, it is very important that there is a kind of social innovation eco-system, integrating all the relevant stakeholders from different sectors and areas. For instance, it could be relevant to integrate the church, the
public administration, local businesses and other civil society actors to address a problem in a common and sustainable way,” says Schröder. Analysis of social innovations shows that research institutes and universities are major players in a relatively small proportion of cases, a finding which surprised Howaldt. “Universities and research institutes played an important role in only a small percentage of the initiatives that we analysed,” he says. “I think there is undeveloped potential in social innovation.”
SI-DRIVE This stands in stark contrast to technological innovation projects, in which universities typically play a far more prominent role in research and development. The scaling and diffusion of social innovation projects is another important point in this regard. “In some cases social innovation initiatives may have developed, and yet the participants may not be aware that something similar has already been done in other areas,” says Schröder. This points to a need for more effective information-sharing and support for social innovation. “It’s not only about support for the development of scaling strategies; that’s only one side of the diffusion of research and innovation. It’s also about enhancing the capability of society to take up and imitate solutions that have been developed in other parts of the world,” outlines Howaldt. “We found from our mapping that the greater part of the social innovation initiatives that we analysed had utilised ideas from other social innovation initiatives. Imitation and innovation are closely connected.”
Eco-systems There are clearly a multitude of different factors to consider in analysing social innovations and understanding why some
scale successfully, while others fail to have a lasting impact. Researchers in the project aim to investigate these factors, to build a clearer picture of the innovation eco-system, with the wider aim of informing social policy development. “We are developing a policy declaration that we will present at our final conference, together with our colleagues in the project. We will describe the insights that have been drawn from the project,” says Howaldt. The project team is also closely involved in the development of a more comprehensive innovation policy in Germany. “We focus on social innovation as part of a comprehensive innovation policy, describing new ways and concepts of promoting social innovation,” outlines Howaldt. The second major outcome of the project will be to further strengthen the social innovation research community, laying the foundations for continued investigation. A European School of Social Innovation has been established, bringing together researchers from different countries, and Kaletka believes it’s important to encourage continued collaboration between researchers. “Different social innovation projects and research communities are starting to exchange their views and help one another,” says Kaletka.
Social Innovation – Driving Force of Social Change Project Objectives
The project’s research is guided by the following four objectives and expected outcomes: • To determine the nature, characteristics and impacts of social innovation as key elements of a new paradigm of innovation (strengthen the theoretical and empirical base of social innovation as part of a wider concept of innovation that thoroughly integrates social dimensions) • To map, analyse and promote social innovations in Europe and world regions to better understand and enable social innovations and their capacity for changing societies • To identify and assess success factors of social innovation in seven particular policy areas, supporting reciprocal empowerment in various countries and social groups to engage in social innovation for development, working towards Europe 2020 targets and sustainable development (e.g. Sustainable Development Goals (SDG)) • To undertake future-oriented policy-driven research, analyse barriers and drivers for social innovation; develop tools and instruments for policy interventions.
FP7 Programme for Research of the European Union – Collaborative project Socio-economic Sciences and Humanities SSH.2013.3.2-1 Social Innovation – empowering people, changing societies?
Please visit website for full details
Antonius Schröder, Member of Management Board - European Research / Infrastructure Research Area 3 “Work and Education in Europe” Sozialforschungsstelle Dortmund - sfs Technische Universität Dortmund Evinger Platz 17 D-44339 Dortmund T: +49-(0)231-8596-243 E: firstname.lastname@example.org W: www.si-drive.eu Antonius Schröder, Professor Jürgen Howaldt and Dr Christoph Kaletka
Antonius Schröder (Left) is a Senior Researcher and member of management board of the social research centre (sfs) at TU Dortmund University. Professor Jürgen Howaldt (Centre) is Director of Sozialforschungsstelle Dortmund, TU Dortmund University and professor at the Faculty of Economics and Social Sciences. Dr Christoph Kaletka (Right) is a Senior Researcher and member of the management board at Sozialforschungsstelle, central scientific unit of TU Dortmund University (TUDO).
Teaching plastic to be fantastic Current approaches to polymer synthesis are relatively imprecise in comparison to natural methods. Researchers in the SCPS project are drawing inspiration from nature as they aim to develop new methods of synthesising sequence controlled polymers, which could have interesting new functions, as Professor Rachel O’Reilly explains A lot of
attention in chemistry research over recent years has been focused on controlling the molecular weight of a polymer; now Professor Rachel O’Reilly and her colleagues in the SCPS project are looking towards the next level of complexity in development. This involves thinking not just about controlling the molecular weight of a polymer, but also actually controlling the individual monomer units and how they’re located along the polymer chain. “We’re investigating the sequence of how the monomers are put together. So we’re trying to find methods of controlling how the monomers are put together,” says Professor O’Reilly. Methods have already been developed for this purpose by scientists, yet many are specific to a particular type of monomer; Professor O’Reilly is taking a slightly different approach. “We are looking to draw inspiration from the ribosome and think about templation and segregation, to allow for control of monomer additions,” she outlines. This research is built on strong foundations, as Professor O’Reilly and her colleagues in the project have long experience in polymer investigation. A lot of inspiration is drawn from nature in this work. “We’ve been working on programming DNA and templating chemistry, so looking at how you can use DNA sequences to
New approaches developed for the synthesis of sequence controlled polymers using DNA templated chemistry or polymer template approaches.
induce specific small molecular reactions, and thinking about how we might bridge some of those very specific oligomers. Then we can look to learn from some of the templation methods we use in that to try and extrapolate that to develop robust methods for polymers,” outlines Professor O’Reilly. The chemistry in a polymer chain
has not historically been utilised to expand function; now researchers are looking to manipulate it, with the goal of developing sequence controlled polymers. “If you reorder a polymer, or fold it, or assemble it in a particular way, you can do a lot more than by self-assembly alone,” explains Professor O’Reilly.
SCPS Developing sequence controlled polymers for organization, templation and recognition
Sequence control The current focus in the project is on achieving precise sequence control, which effectively means controlling where the actual monomers are placed within the polymer chain. The questions around this are being addressed in different ways within the project. “In one part of the project we’re really thinking about precision, about trying to make perfect polymers. In another part, we’re looking at more approximate approaches, where we might not need to make perfect polymers,” explains Professor O’Reilly. This does not mean developing entirely new chemistries, but rather using known monomer units in a different way. “We’re not making new polymers in the sense of new chemistries and new functional groups - we’re using monomers that have been used before, but we’re putting them together in such a way that they behave differently,” continues Professor O’Reilly. “We are using polymerization methods and DNA-templated chemistries, to try and
embed certain molecules into our polymer materials.” A lot of the tools currently used to assemble these materials are relatively basic. Hierarchical self-assembly and folding have not historically been used to generate function in polymers, but if researchers can fold polymers in a particular way, Professor O’Reilly believes new possibilities may emerge. “We might be able to start to get some more interesting functions,” she says. Using selective recognition or crystallisation enables researchers to make fundamentally different shapes or constructs, that might have new functions because of both their shape and their chemistry. “One of the sequence elements we’re interested in is using crystallisation to drive the formation of nanostructures, which is quite difficult as it requires quite pure phases. We’re looking at more generalised approaches, so then we can use new methods to actually grow particles that have sequence specificity, in the sense that we are able to put functionality in a particular place,” says Professor O’Reilly. This research could help lay the foundations for the future development of materials with particular properties tailored to certain functions. However, Professor O’Reilly says the project’s work is more exploratory at this stage, rather than being directed towards the development of a specific material. “It’s very much about discovery and developing the methods and routes at the moment,” she stresses. One of the major challenges in this area is that researchers do not know the end result of these methods, so Professor O’Reilly is also involved in another programme running in parallel to the SCPS project, looking to relate the composition of the polymer to its eventual properties. “We’re working together with colleagues from the University of Oxford, looking to try and do discovery. So this is about being able to make libraries of materials, looking at their functions, and then going back a step to learn more about their sequence,” she outlines.
Current approaches in polymer synthesis lack the precision and complexity of natural polymers such as proteins, which use just 20 amino acid monomers in a specific 1D sequence to give 3D structures such as enzymes with specific functions like catalysis, or DNA, which uses just 4 nucleobase monomers in a specific sequence to store and propagate information. As polymer scientists we have more monomers available, but we don’t yet have robust methods to program a sequence and hence connect them in a controlled manner. This means we can’t yet access some of the advanced functions of natural SCPs. The aim of this grant is to develop robust chemical approaches for the synthesis of SCPs, to enable us to bridge the biological-synthetic materials divide as well as go significantly beyond the current state of the art.
ERC Consolidator Grant - Synthetic Chemistry and Materials
Project Coordinator, IDEAS-ERC Professor Rachel K. O’Reilly FRSC Chair in Chemistry School of Chemistry University of Birmingham Edgbaston Birmingham B15 2TT T: +44 (0)121 414 7757 E: email@example.com W: https://www.oreillygrouplab.com/ W: http://www2.warwick.ac.uk/fac/sci/chemistry/research/oreilly/oreillygroup Twitter: @RORgroup
Rational design This could in the long term lead to a more rational, tailored approach to materials development, while new avenues of investigation are also emerging out of the project’s research. While one important outcome from the project is the development of methods for preparing SCPs, another is simply demonstrating the wider potential of polymers. “There’s actually a lot more to a polymer than just a coil,” stresses Professor O’Reilly. In future, Professor O’Reilly plans to continue her work in this area, looking to build a deeper understanding of polymer synthesis and the development of precision materials. “We’re starting to think about discovery in our collaboration with the group in Oxford, so making libraries of compounds,” she outlines. “Basically we’re trying to make our approaches a bit more scalable in a combinatorial sense. We might be able to make a certain quantity of a material; now we’re looking at making many different derivatives, and seeing if we can select for function.”
Professor Rachel O’Reilly
Professor Rachel O’Reilly is Chair of the Chemistry Department at the University of Birmingham. She is interested in using polymer synthesis and precision synthesis to allow for the preparation of functional and responsive nanostructures, which can be utilised in a wide range of applications, from materials science to medicine.
Multiscale modelling strategies for designing new materials The discovery of materials with novel or enhanced properties is central to technological progress, opening new possibilities across many areas of industry. The VIRMETAL project aims to accelerate this process by means of multiscale modelling strategies that will enable scientists to design, process and test advanced metallic alloys in silico before they are manufactured, as Professor Javier Llorca explains The development of
new materials plays a central role in technological innovation. There are many instances throughout history where the synthesis of a material with novel properties has led on to significant technical breakthroughs. This is the case, for instance, with the synthesis of multilayers with giant magnetoresistance, which led to a dramatic increase in the capacity of hard disk drives. An alternative route to technical progress is through the progressive improvement of existing engineering materials for novel applications, as has been seen with superalloys and composites in the aerospace industry, for example. In practice, the slow, largely empirical nature of both of these routes acts as a brake on technological progress, yet recent developments in modeling tools, along with advances in multiscale modeling strategies and continued increases in computational power, promise to open up new possibilities to accelerate the discovery and design of new materials for engineering applications. A wide range of modeling tools are available nowadays to simulate the behavior of materials at particular length and time scales, including density functional theory, molecular mechanics, computational thermodynamics, finite elements, etc. These techniques have already been used to design materials with improved properties or unexpected structures, such as new catalysts or Lithium (Li)-based materials for batteries. However, this is only possible because the critical structure or properties depend on phenomena which take place at particular time and length scales, which can be simulated using just a single one of the aforementioned techniques. This is not always the case, and in fact it is unlikely to be so in materials intended for structural applications. Balanced mechanical properties like stiffness, strength and toughness depend on many different processes which take place along nine or more orders of magnitude in length scales, from nanometers to meters.
This dependence on different length scales is even more apparent in multifunctional (smart) materials.
VIRMETAL Project This challenge forms the backdrop to the work of the VIRMETAL project, an ERC-backed initiative which aims to develop novel multiscale modeling strategies to carry out virtual design, virtual processing and virtual testing of advanced metallic alloys for engineering applications. The ultimate goal of the project is to enable the design, testing and optimization of new metallic alloys in silico before they are actually manufactured in a laboratory, which would dramatically reduce the time needed to discover and incorporate new materials in industrial applications.
Nevertheless, not everything can or should be computed, and critical experiments constitute an integral part of the research program for the calibration and validation of the multiscale strategies. Research is focused on two metallic alloys from the Al-Cu-Mg (Aluminum-Copper-Magnesium) and Mg-Al-Zn (Magnesium-Aluminum-Zinc) systems, both of which hold considerable industrial interest. Some exciting results have already been achieved. For instance, a multiscale modelling strategy has been developed to predict the homogeneous and heterogeneous nucleation of θ′ (Al2Cu) precipitates in an Al-Cu alloy during high-temperature aging. The model parameters that determine the different energy contributions (chemical free energy, interfacial energy, lattice parameters, elastic constants) were obtained from computational thermodynamics or first-principles density functional theory.
From this information, the evolution and equilibrium morphology of the θ′ precipitates are simulated in 3D using the phase-field model. The model was able to reproduce the evolution of the different orientation variants of plate-shaped θ′ precipitates with the orientation relationship
Figure 1. (a) Multiscale simulation of the nucleation and growth of θ’ (Al2Cu) precipitates on dislocations during high temperature aging of an Al-Cu alloy. (b) Transmission electron microscopy micrograph showing the formation of a staircase structure of θ’ precipitates on a dislocation. (From H. Liu, B. Bellón, J. LLorca. Acta Materialia 132 (2017) 611-626.)
(001)θ′//(001)α and [100]θ′//[100]α during homogeneous nucleation, as well as the heterogeneous nucleation on dislocations, leading to the formation of precipitate arrays (Fig. 1). Heterogeneous nucleation on pre-existing dislocations was triggered by the interaction energy between the dislocation stress field and the stress-free transformation strain associated with the nucleation of the θ′ precipitates. Moreover, the mechanisms controlling the evolution of the morphology and the equilibrium aspect ratio of the precipitates were ascertained. All the predictions of the multiscale model were in good agreement with experimental data, demonstrating the capability of the bottom-up multiscale approach to predict the structure of the material from first-principles data. The next step, once the precipitate structure has been obtained, is to predict the hardening induced by their presence. This can be achieved by means of dislocation dynamics simulations in which a dislocation has to propagate through a forest of precipitates. The lattice parameters, elastic constants and stress-free transformation strains of the precipitates were obtained by ab initio calculations, while molecular dynamics simulations were used to determine the dislocation mobility. Thus, the multiscale simulations to predict the mechanical properties of the alloy are, again, based on information obtained from simulations at lower length scales. The information obtained from the dislocation dynamics simulations can be used to develop a dislocation-based crystal plasticity model that can simulate the behaviour of polycrystals. These models can take into account the storage of dislocations at grain boundaries and so can be used to predict the strengthening of
polycrystals as a function of the grain size, known as the Hall-Petch effect (Fig. 2a). The experiments and simulations again show close agreement, and the mechanism responsible for the grain-size strengthening – the accumulation of dislocations at the grain boundaries – is clearly revealed in the contour plot of the dislocation density in Figure 2b. This work is ongoing, with researchers aiming to demonstrate that multiscale modelling can be used to predict the development of a microstructure during solidification and thermo-mechanical processing, as well as to extend the virtual testing capabilities to include damage and fracture, both of which are major concerns for industry.
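The grain-size strengthening shown in Figure 2a is classically captured by the empirical Hall-Petch relation; the form below is the standard textbook expression, not a formula or set of values taken from the VIRMETAL project itself:

```latex
\sigma_y = \sigma_0 + \frac{k_y}{\sqrt{d_g}}
```

where \(\sigma_y\) is the flow (yield) stress, \(\sigma_0\) is the friction stress of the lattice, \(k_y\) is the Hall-Petch strengthening coefficient, and \(d_g\) is the average grain size. Smaller grains provide more grain-boundary area at which dislocations accumulate, and hence a higher flow stress, which is the mechanism revealed in Figure 2b.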
Expected outcomes
The wider goal is to demonstrate that the structure and properties of two standard engineering alloys can be obtained from first principles by bridging a cascade of modelling tools at the different length scales. Once this has been proven, further research will lead to continued growth in the number of multiscale simulation tools, as well as the extension of their capabilities. This holds important implications for both academia and industry, enabling the virtual design, processing and testing of new materials before they are manufactured, saving money and helping researchers to identify potential applications of a new material at a very early stage in development. Once the benefits of these simulation tools become more apparent, it is expected that they will be widely applied across many important areas of European industry, including the aerospace, automotive, rail transport, energy generation and engineering sectors.
VIRMETAL
Virtual Design, Virtual Processing and Virtual Testing of Metallic Materials
Project Objectives
The VIRMETAL project aims to develop multiscale modelling strategies to carry out virtual design, virtual processing and virtual testing of advanced metallic alloys for engineering applications, so that new materials can be designed, tested and optimised, before they are actually manufactured in the laboratory. The focus of the project is on materials engineering, namely understanding how the structure of the material develops during processing, the relationship between this structure and the properties, and how to select materials for a given application.
The VIRMETAL project is funded under H2020-EU.1.1. - EXCELLENT SCIENCE - European Research Council (ERC) Advanced Grant.
Project Coordinator, Professor Javier Llorca, Calle Eric Kandel 2, Tecnogetafe, 28906 Getafe, Madrid, Spain T: +34 91 549 34 22 E: firstname.lastname@example.org W: http://materials.imdea.org/proyecto/virmetal/
Professor Javier Llorca
Professor Javier Llorca is Scientific Director at IMDEA Materials Institute and Professor at the Technical University of Madrid. His main research interests lie in establishing the relationship between processing, microstructure and mechanical behaviour of materials at different length scales by means of the development of modelling tools and multiscale simulation strategies.
Figure 2. (a) Experimental results and multiscale modelling predictions of the flow stress of Cu as a function of the inverse of the grain size, 1/dg (mm-1), for different values of the applied strain, ε = 0.5% and 5%. (b) Contour plot of the dislocation density (m-2) showing the storage of dislocations around the grain boundaries during deformation in a Cu polycrystal with an average grain size of 10 µm. (From S. Haouala, J. Segurado, J. LLorca. Acta Materialia 148 (2018) 72-85.)
The new dawn of magnetic polymers The Magneto project is attempting to develop new smart composite materials with unusual magnetomechanical properties. Project lead, Kostas Danas, carries out experiments to give magnetic properties to polymers, which could pave the way for exciting new applications in many sectors Working at the
Ecole Polytechnique Campus, Principal Investigator and CNRS researcher Kostas Danas is halfway through his research term on the Magneto project, with just over two years to go. He is a pioneer in materials science and mechanics, a branch of research that has the potential to redefine the possibilities of how materials behave. It's an important task, since the creation of a new composite material could open up possibilities for engineering new applications that would otherwise be challenging to develop. “It’s about creating new materials that have interesting properties,” explains Kostas Danas. “Magnets are usually a hard material – they are made of hard metal, which means they are not easily deformable. The Magneto project’s idea is to develop polymers that have magnetic properties and operate them in their unstable regime. As polymers are soft materials, they can be easier to use in a number of applications because they are deformable. The question that drives the Magneto project is, ‘how do you make a polymer magnetic but still keep it deformable?’”
Kostas argues that, at a basic level, the methods he is using to give magnetism to a soft material consist of mixing the right substances and architecting them in the right way.
Cooking up a composite
“The way we do it feels much like following a recipe when you are cooking. You need the right ingredients and the right quantities, so on one level this is a straightforward chemistry experiment,” says Kostas.
“For the sake of explaining, let’s take the example of a polymer we can all comprehend: a plastic glove, like the kind you do the washing up with in the kitchen. This glove is not magnetic, so how do I make a glove that is? I first make the polymer by mixing two liquids together. Suppose I then mix the liquid polymer with nano- or microparticles of iron or of permanent NdFeB micromagnets, which come as a powder. You see how this is like cooking? I then create a new glove, which is almost as soft as the original glove but now, because of the particles, this material deforms or sticks when it is attracted by a magnet. Also, if it is bent or deformed towards a critically stable state, wrinkles and controlled roughness can be obtained on the surface of the material. All this is called magneto-mechanical coupling, because when you apply a magnetic field to it, it deforms, and vice versa. Such materials can be magnetically activated and act as soft sensors or actuators.” Because of the methods and ingredients involved, creating such smart composite
materials is relatively cheap to do, which is a plus for the research but also for real-life applications. For increased control of the final geometry and material architecture, the project makes use of 3D printing to accelerate and better control the process of creating moulds. “If we didn’t have 3D printing we would need to go to a technician who would have to produce the mould for us, which is much more difficult and sometimes impossible. With 3D printing we can design complex geometries efficiently and fast, and we can fabricate moulds that can’t be made with standard techniques,” says Kostas. With 3D printing, the composite material can be architected, validated and probed for its response.
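Magneto-mechanical coupling of the kind Kostas describes is commonly formalised, in the continuum-mechanics literature on magnetoelastic solids, through a free-energy function of both the deformation gradient F and the magnetic flux density B. The split below is a generic textbook sketch, not necessarily the project's exact model:

```latex
W(\mathbf{F}, \mathbf{B}) = W_{\mathrm{mech}}(\mathbf{F}) + W_{\mathrm{coupl}}(\mathbf{F}, \mathbf{B}),
\qquad
\mathbf{P} = \frac{\partial W}{\partial \mathbf{F}},
\qquad
\mathbf{H} = \frac{\partial W}{\partial \mathbf{B}}
```

Because the stress \(\mathbf{P}\) depends on \(\mathbf{B}\) and the magnetic field \(\mathbf{H}\) depends on \(\mathbf{F}\), applying a magnetic field deforms the material and deforming the material changes its magnetic response – exactly the two-way coupling described in the quote above.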
Programmable curvature
A technique known as programmable curvature brings hope for the development of a number of possible applications relevant to the Magneto project’s research. “We begin with a flat material that doesn’t occupy much space and, by compressing it or pushing it in different directions – or, even better, magnetising it – it becomes 3D. One way to imagine this is to think of creating the Eiffel Tower from a film that is flat but becomes 3D when compressed. The process can create properties such as better stiffness, optimised geometry and even electro-magneto-mechanical properties. This is a very scalable process that can contribute to civil engineering to create, for example, buildings that you can transport easily. They don’t occupy space, but when you reach the site, you can unfold them into 3D structures. This is obviously on a very big scale – but you can do the same thing on small scales, which is what we are interested in and which has important applications in small electronic devices, biology and bioimplants.” Some of the application concepts born from the project rely on programming curvature at small scales. “Imagine stents that are used to keep arteries open in the human body. You could make it easier to put these objects inside people. You could introduce a small flat or compact object inside a body, drive it towards its desired position and unfold it into a 3D stent – and all this by simply applying external magnetic fields.”

Increasing Magnetic Field: Full-field numerical magneto-mechanical simulations of magnetoelastic films upon magnetoelastic substrates. The strain field is shown in colour. Roof-type patterns called crinkles are obtained. Such patterns can potentially help unravel and read DNA sequences.
Limited only by imagination
Whilst Kostas is working at the laboratory level, there are clearly identified ideas for future applications of such materials, which industry could exploit. A number of people from academia and industry interact with Kostas in brainstorming discussions, where new ideas come to the fore and take shape. Some of these academic ideas can then be taken to the laboratory. One example is a braille reading device, which could rely on a thin film that deforms when a magnetic field is applied. By using small magnetic fields, micro-bumps could be made on the film, forming braille characters. This could be achieved with a high degree of control over manipulating the
Experimental setup for magneto-mechanical experiments. Several surface patterns of sinusoidal and roof type (crinkles) are obtained by depositing a magnetoelastic film on a passive substrate.
Magneto
Active Magnetorheological Elastomers: From Hierarchical Composite Materials to Tailored Instabilities
The aim of the MAGNETO project is to develop new composite materials with extreme properties, specifically magnetorheological elastomers (MREs), which combine magnetic particles embedded in a soft, non-magnetic polymeric matrix, giving rise to a coupled magneto-mechanical response at the macroscopic scale when subjected to external magneto-mechanical stimuli.
European Research Council Starting Grant 2014
Principal Investigator • Kostas Danas Key Team Members • Laurence Bodelot • Nicolas Triantafyllidis Post-Docs • Gabriella Tarantino • Krishnendu Haldar • Vivekanand Dabade Graduate Students • Erato Psarra • Dipayan Mukherjee • Jean-Pierre Voropaieff • Othmane Zerhouni
Principal Investigator, Kostas Danas LMS, Ecole Polytechnique Laboratoire de Mécanique des Solides École Polytechnique, CNRS UMR 7649 Route de Saclay Palaiseau Cedex, 91128, France T: +33 (0)1 69 33 57 86 E: email@example.com W: http://www.kostasdanas.com W: http://www.kostasdanas.com/erc-magneto/
instabilities, to form the bumps. Other ideas include new haptic and touchscreen technologies, and even a way to enable movement in the joints of a prosthetic leg – where soft solids could bend in the right places when activated. Another research direction that Kostas and his team are looking into is the possibility of actively controlling material stiffness, for cell growth.

“I am ultimately a scientist, so I am not necessarily the best person to develop devices for industry, but the concepts are there for a company equipped and able to do the engineering for this and to take it to market,” says Kostas. Revelations about how these materials could become applied are regularly forthcoming. The former Director of NASA’s Jet Propulsion Laboratory visited the Laboratory of Solid Mechanics (LMS) at Ecole Polytechnique in France and commented that one of Kostas’ experiments could have purpose for a NASA mass spectrometer in future missions to Mars, because magnetic fields can be applied in non-gravitational systems. “This is an application that I wouldn’t have had in mind because I am not even close to that field. It’s impossible for me to think of all the applications possible from this research, but new ones keep being revealed as time goes on. The potential is far reaching. Close and brainstorming-type interaction between scientists and industrial people is perhaps the best way to go from an academic laboratory to a real-life application.”
Refreshable braille display
Kostas Danas is a Senior Research Scientist in the Solid Mechanics Laboratory at CNRS and Associate Professor at Ecole Polytechnique. His main research interests are in the field of solid mechanics and physics with an emphasis on smart composite materials and structures.
Research coordination on industrial organisation
Professor Patrick Legros tells us about the OIO project. By bringing together insights from the industrial organization and organizational economics literatures, the new approach shows how market performance and firms’ organizational choices are co-determined The behaviour of
firms in the commercial marketplace has a direct impact on our everyday lives, as strategic decisions affect the price and availability of goods. Research into industrial organisation has historically focused on the consequences of major companies exerting market power. But situations that are harmful to consumers seem directly linked to dysfunctions in the organization and governance of companies, e.g., the Enron bankruptcy or the financial crisis. Research in organizational economics has paid attention to incentive problems, conflicts of interest in firms, and how the allocation of decision rights among stakeholders affects a firm’s performance. Researchers in the OIO project aim to bring these two elements together. “The goal of this project is to integrate the two, and to show that when it is difficult to solve conflicts of interest, the market conditions affect – and are affected by – the way firms form (their ownership boundaries, their scale and scope of products), the way they are organized, and their performance,” says Professor Patrick Legros, the project’s Principal Investigator.
Industrial organisation and decision rights
People in positions of authority within a firm will have a major influence on its performance, whether or not they are the owners. “Shareholders may want to maximise profits, while managers may want to manage their own career. That creates tensions and conflicts of interest in the firm, and also affects the investment, pricing and quantity decisions made in the enterprise,” Professor Legros explains. The way in which these conflicts are resolved affects how firms behave on the market, and in turn the returns that different stakeholders can get on the market affect the decisions made within enterprises. “Once you realise that you have these different stakeholders, the firm doesn’t really have one objective (like profit maximization). What you will observe in terms of how enterprises are organized and perform is something that cannot be understood in isolation from the market conditions,” continues Professor Legros. The external market conditions indeed
How the division of revenue (R) affects the willingness of firms A and B to merge.
determine the means by which conflicts of interest can be resolved. A good example is that of coordinating decisions made by different suppliers who have different cultures of producing their goods. Generally, such differences lead to coordination failures (too little output is produced, or the quality of the output is below standard), unless suppliers can be incentivized. One possibility is to offer suppliers monetary stakes in the output; but this may be difficult because individual contributions to output are hard to identify, and it is also costly for the owners. An alternative could be a merger, such as a transfer of authority over decisions to headquarters, which can impose coordination; but the loss of control may impose large ‘private’ costs on suppliers, making them reluctant to abandon their decision rights. A merger, and coordination, will happen only if the benefits of cooperation – in particular the monetary revenue from production – are sufficiently large to compensate for the (unobserved) costs of losing control. This is a simple illustration of how a market variable, like the price of – or the demand for – the output, affects the way enterprises are organized and perform. In such an environment, high prices tend to favour integration (hence merger activity), while low prices tend to favour non-integration (or divestitures if the firms are already integrated). Sometimes this dependence of organizational choices on prices leads to ‘reorganizational dampening’ of technological innovation, which results in little or no gain in industrial productivity. “So, what happens outside the firm will matter for how it is organised, and how a firm is organised will affect what happens in the market. This is why combining the two approaches is important,” says Professor Legros. So far the theory is independent of any single sector, but it will be applied in
a more detailed, industry-specific way in future. “We will adapt the general theory to particular industries like the health sector,” continues Professor Legros.
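The price-dependent merger logic described above can be illustrated with a deliberately simple toy model. The function name, the even revenue split and all the numbers below are illustrative assumptions for this sketch, not part of the OIO project's formal theory:

```python
# Toy illustration of the merger trade-off: merging yields coordinated output
# sold at the market price, but each party bears a 'private' cost from losing
# control of its decisions. Non-integration avoids that cost but suffers a
# coordination failure (lower output).
def prefers_merger(price, coordinated_output, uncoordinated_output, private_cost):
    """Return True if a party's revenue share under integration, net of the
    private cost of lost control, exceeds its share under non-integration.
    Assumes an even 50/50 revenue split between the two parties."""
    gain_integrated = 0.5 * price * coordinated_output - private_cost
    gain_separate = 0.5 * price * uncoordinated_output
    return gain_integrated > gain_separate

# High prices magnify the revenue gain from coordination, favouring mergers;
# low prices do not compensate for the private cost of losing control.
print(prefers_merger(price=10.0, coordinated_output=12.0,
                     uncoordinated_output=8.0, private_cost=15.0))  # prints True
print(prefers_merger(price=2.0, coordinated_output=12.0,
                     uncoordinated_output=8.0, private_cost=15.0))  # prints False
```

The same comparison run at different prices reproduces the pattern in the text: integration above some price threshold, non-integration (or divestiture) below it.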
OIO Organizational Industrial Organization Funded by the European Research Council under the European Union’s Seventh Framework Programme (FP7-IDEAS-ERC) / ERC Grant Agreement n. . Part of this research is done in collaboration with Professor Andrew Newman from Boston University. Patrick Legros, Professor of Economics Université libre de Bruxelles (ECARES) and Northeastern University T: +32 2650 4219 E: firstname.lastname@example.org W: http://www.plegros.net
Patrick Legros is a Professor of Economics at Université libre de Bruxelles in Belgium and Distinguished Professor of Economics at Northeastern University in Boston. His main research interests are in theory of contracts, microeconomics, industrial organization, competition policy and regulation. He is currently the Managing Editor of the Journal of Industrial Economics.
Knowledge, truth, and morality: the history of relativism Relativism has often been criticised as a threat to rational debate. Professor Martin Kusch and his colleagues aim to explore the historical roots of modern forms of relativism, and to determine which forms of the doctrine deserve a sympathetic reconstruction and defense The emergence of
relativism as a philosophical viewpoint was accompanied by a fair amount of criticism, with many philosophers arguing that it threatened to undermine the foundations of morality or knowledge. While relativism itself is difficult to define precisely, some core elements are clear. Consider epistemic relativism, the main focus of the Emergence of Relativism project. “Its first commitment is to the thought that ‘epistemic judgements’ – judgements concerning the question of whether a belief qualifies as knowledge – are always relative to a set of standards or principles. Such standards or principles tell you what it takes for a belief to be knowledge (e.g. which instrument is reliable),” says Professor Martin Kusch, the Principal Investigator of the project. “The second ingredient of epistemic relativism is that there is – or could be – more than one set of standards or principles, say in different cultures or different historical periods. And the third element is the thought that none of these sets of standards is ABSOLUTELY correct.” In the eyes of many critics, relativism undermines the thought that disagreements can always be resolved and decided by rational debate and discussion. “Relativism raises the spectre of us getting to a point where we just have to say; ‘well, this is right from your point of view, and this is right from mine.’ That’s the worry – that relativism undermines rational engagement with one another, and therefore also potentially undermines the pillars on which democratic societies are based,” explains Kusch. The historical thesis behind the project is that relativism as we understand it today is a child of German-speaking philosophers
and scientists from the 19th and early 20th centuries. “We have focused on the time period when you find the first inklings of this modern understanding, in people like Herder, and we have traced it through to the 1920s and ‘30s, when our modern understanding of relativism is well established,” says Kusch. But the project is not just historical: “We also investigate whether one can give a sympathetic rendering of relativism; we want to test which of its many versions might be defensible.”
Definition and emergence
The wider goal of this research is to explore the history of relativism and to develop a sympathetic rendering of the position. The problem is that there have been few such sympathetic renderings, and thus it has been the CRITICS of relativism who have been most prominent in defining what the position actually entails. Concerning the historical side of the project, Kusch stresses that “there has been a considerable philosophical preoccupation with relativism since the end of the 19th century: yet if you go back to philosophers like Kant or Hegel two centuries ago, relativism doesn’t even feature as a distinct philosophical view. It hadn’t even been identified conceptually.” Kusch and his colleagues study the causes behind the emergence of relativism. “We’re trying to understand how and why, in the 19th century, certain forms of relativism emerged, and how they spread and became a matter of concern,” he explains.

Philosophical research
How can one defend epistemic relativism? One important area here is the history of science. It has been argued by some relativistically-minded philosophers that it is a graveyard of epistemic standards. Scientists have often modified or dropped even their
most cherished epistemic standards in order to make sense of processes in the natural world. “Some philosophers have been so impressed by these fundamental changes in epistemic standards that they have come to doubt that there is ultimately but one single correct system of norms for forming beliefs about the natural world,” says Kusch. In particular, Thomas Kuhn’s 1962 work, The Structure of Scientific Revolutions, is central to understanding the emergence of relativistic themes in the history and philosophy of science. “Kuhn pointed out that scientific work is conducted differently in different scientific communities – and that different scientific communities use different paradigms, model achievements which they follow,” outlines Kusch. ‘Paradigm’ was of course Kuhn’s central idea. It has two meanings. One is that of a model
achievement, as when, during the ‘Chemical Revolution’, Lavoisier analysed water with great precision and identified its two elements. The second meaning of ‘paradigm’ was a set of assumptions, standards, and the aforementioned exemplary achievements. “So science is always conducted within a system of norms and models of how to do scientific work,” Kusch explains. The relativistic dimension of Kuhn’s paradigms becomes visible once we take change into account. Paradigms may change over time, as new research emerges which challenges the previously accepted viewpoint. Kuhn argued that as standards of evaluation are internal to a given paradigm, there is no neutral point from which different paradigms can be compared. “This idea, though much criticised, is one important source of relativistic views,” says Kusch. Kuhn’s work was also influential in the sociology of science, another area of interest to Kusch and his colleagues. They seek to reconstruct the relativistic arguments that surface in these debates and fields, and to test their plausibility and validity. Here they take into account influential contemporary criticisms and defences of epistemic relativism. Especially important are recent developments in epistemology and feminist philosophy. For example, some feminist philosophers think that different groups – e.g. genders – have different ‘epistemic standpoints’, none of which is absolutely correct.
Historical research
“Kuhn and other researchers suggested that norms of scientific enquiry changed throughout history. In our historical work we have tried to show that such themes already appeared in the 19th century, primarily in German-speaking philosophy and science,” Kusch continues. “Thus we have looked at the tradition of German historical writing. How did German historians of the 19th century argue for – or presuppose – ideas concerning the relative or absolute status of moral and epistemic norms? And how did they think about changes in the norms of historical writing?”
In the early 19th century, history was often regarded as a positive source of self-understanding and education, of fostering a national identity. It was thought that history was a guide to the correct way of understanding oneself and the world. This changed towards the end of the 19th century. “Historians increasingly looked in detail at different cultures, and different historical stages of their own culture. The historians came to feel that they should not analyse different historical periods by how rational or irrational they were – but rather try to explain and understand why people in different historical periods thought what they did,” says Kusch. This change in method, and the loss of certainty in historical work, led to relativistic themes gaining prominence in history as a field. “So from being the source of reassurance – that we are going towards an ever-deeper understanding of ourselves in the world – history actually became a source of relativistic worries,” outlines Kusch.

The late 19th century also saw the emergence of sociology as a field of study. One important figure in the field was the German philosopher and sociologist Georg Simmel. He was interested in how different communities organise their knowledge and moral practices. Simmel was one of the first authors to systematically develop and defend epistemic relativism. Simmel is often thought to be part of the tradition of the ‘philosophy of life’. This school of thought maintained that thought and knowledge need to be understood as processes within human (social) life. It has its origins in Nietzsche and it later became part of Nazi ideology. Many positions within this tradition leaned towards relativism.
Relativism
The Emergence of Relativism – Historical, Philosophical and Sociological Perspectives
Project Objectives
The main objectives of this project are to: (1) retrace the intellectual history of the emergence of important forms of relativism (and the counterpart versions of antirelativism) in 19th and early-20th-century German-speaking philosophy and science; (2) explain some key junctures of this intellectual history in sociological terms; and (3) critically evaluate the central arguments for and against relativism as they evolved in the period under investigation, and as they have been developed further in more recent discussions.
This project is funded by an ERC Advanced Grant (ERC-AG-SH4 – The Human Mind and its complexity), with a budget of 2.5 million euros.
• Dr Natalie Ashton • Dr Katherina Kinzel • Katharina Sodoma • Dr Robin McKenna • Dr Johannes Steizinger • Niels Wildschut
Project Coordinator, Martin Paul Heinrich Kusch Professor of Philosophy of Science and Epistemology University of Vienna. Universitätsring 1, 1010 Wien, Austria T: +43 1 4277 46422 E: email@example.com W: https://emergenceofrelativism.weebly.com
Outreach
Kusch and his colleagues are also eager to bring their results to bear on the contemporary world beyond academia. “For example, in our blog we investigate the relationship between our research and political concerns around ‘alternative facts’, or relativistic challenges to scientific authority,” he says.
Professor Martin Paul Heinrich Kusch
Professor Martin Paul Heinrich Kusch was born in Leverkusen (Germany) in 1959. He studied in Berlin, Jyväskylä and Oulu (Finland) from 1979 until 1989. He held academic positions in Oulu, Toronto, Auckland, Edinburgh, Helsinki and the University of Cambridge (where he was Professor for Philosophy and Sociology of Science). Since 2009 he has been Professor for Epistemology and the Philosophy of Science at the University of Vienna. He is married with three children.
Getting analytical on our political power, the priority of authority Government institutions often do not work effectively in weak states, yet this does not mean that a country has descended into anarchy. In weak states some of the most central aspects of people’s lives, such as property rights and rights to political participation, seem to be governed outside statutory institutions, as Professor Christian Lund of the Rule and Rupture project explains A number of countries across the globe are characterised as weak states, in which government institutions are not working effectively. However, this does not mean that there is no government at all in these countries, says Professor Christian Lund. “Once you get closer to people’s daily lives in weak states, you see that it’s not a state of pure anarchy, there’s a lot of governance going on. It just doesn’t take place through government institutions,” he explains. Based at Copenhagen University in Denmark, Professor Lund is the Principal Investigator of the Rule and Rupture project, an initiative studying the local institutions which exercise political authority in different countries
across the world; this could be a village council, for example, or even institutions with no formal mandate. “A village council is among the lowest official bodies in the state, but sometimes chieftaincies take on governance roles, although they are not formal institutions, or it might be NGOs. In Indonesia and Colombia, you have areas which are occupied by farmers’ movements or other social movements, which are outside state control,” says Professor Lund. The research focuses on six different countries in Asia, Africa and Latin America, all of which have experienced different kinds of political ruptures over recent years. While these countries have significant
social and economic differences, Professor Lund believes that a certain pattern can be identified in terms of governance. “The pattern is basically that people’s demands for land rights, or political rights, are addressed not just to statutory institutions, but to all kinds of local power-holders,” he outlines. This effectively empowers these local power-holders, and over time they come to be seen as the relevant authority, rather than the statutory institutions. “When we talk about failed states, it’s usually from the perspective of statutory institutions, but this is not always a true picture of how governance is
performed,” says Professor Lund. “When we talk about failed states, very often an almost unrealistic image of what is going on is presented. It’s either that it’s complete chaos, or that nothing happens. Both are wrong – it’s not chaotic, and a lot of things happen, that’s what we’re trying to investigate.”
Social contracts
A statutory institution may take responsibility for certain issues in weak states, while other areas of policy are left to local institutions or organisations which local people themselves have created. This is a topic of great interest to Professor Lund. “How does this work?” he asks. In many weak states, some of the most central aspects of people’s lives, such as property or citizenship rights, seem to be governed outside statutory institutions, a topic at the core of the project’s research. “We’re essentially trying to look at social contracts, where some kind of authority authorises access to land or to political participation, for example. The people who claim rights simultaneously acknowledge the authority of the institutions they address to grant it,” outlines Professor Lund. “It was important to find situations in which it’s not just the state that issues these rights, which re-confirms the authority
of the state – it was also about contexts where people were instituting authority in local organisations, which then exercise that authority. To do that, we needed to find moments of rupture, where old social contracts had broken down, and new social contracts were established.” The aftermath of the Maoist insurgency in Nepal in the ‘90s provides a good example, a time when central institutions were not in control of parts of the country. Similarly, at certain points in history, government has been virtually absent from parts of the Congo. Following these periods of instability, the population then had to re-establish relationships with the authorities. “We aim to identify those points in time when people’s rights to land and to political participation have to be re-established,” says Professor Lund. Points of rupture, of a breakdown of conventional political authority, give researchers a window into how it is then re-established. “I’ve spent some time in Indonesia for example, in rural areas where people have been occupying land. The people who occupied the land believe it was their land, but it was taken over by a Dutch plantation in the early part of the 20th century,” explains Professor Lund. “With Indonesian independence, they got the land
back, but when the Suharto regime took power in 1965, they were excluded from the land again. Then from 1998, they occupied the land once again.” This land occupation was technically illegal, as the Indonesian government did not recognise the land rights of these people. The people involved could try to gain recognition of their rights to the land from either the local farmers’ movement, which helped them get the land back initially, or the local village council. “Now, the village council is part of the Indonesian government – they can’t recognise the land as the property of these farmers. However, the farmers basically collected taxes among themselves and paid it to the village council as a kind of property tax,” outlines Professor Lund. The village council isn’t mandated to collect taxes, but of course money is always welcome, so in the end it took the money; while this was technically illegal, Professor Lund says this, to an extent, legitimised the actions of the farmers. “It gave some kind of public recognition of people’s land claims, and it also gave the village council some level of status as a public authority with a mandate to recognise people’s land rights,” he explains. “Even though it’s outside the law, a kind of political relationship is built around this exchange.”
Field at the urban perimeter, Medan, Indonesia. Photo: Fachrizal Sinaga
Rule and Rupture State Formation Through the Local Production of Property and Citizenship
Rule and Rupture is an interdisciplinary research programme directed by Christian Lund. We aim to investigate how political authority is constituted after moments of rupture. The focus is on the global south.
The programme is funded by the European Research Council (ERC). ERC Grant: State Formation Through the Local Production of Property and Citizenship (Ares (2015)2785650 – ERC-2014-AdG – 662770_Local State).
Professor Christian Lund, Director
Eric Komlavi Hahonou, Associate Professor at Roskilde University
Mattias Borg Rasmussen, Assistant Professor at the Department of Food and Resource Economics, University of Copenhagen
Michael Eilenberg, Associate Professor of Anthropology at Aarhus University
Veronica Gomez-Temesio, Post Doc
Rune Bolding Bennike, Post Doc
Penelope Anthias, Post Doc
Prathiwi Widyatmi Putri, Post Doc
Kasper Hoffmann, Post Doc
Inge-Merete Hougaard, PhD student
Tirza van Bruggen, PhD student
Police tape marking out land in conflict, Indonesia. Photo: Christian Lund
Professor Christian Lund, Director of Rule and Rupture, Copenhagen University The programme is based at the Department of Food and Resource Economics, IFRO, at the University of Copenhagen, and runs for five years (2016-2020). T: +45 28 49 69 82 E: firstname.lastname@example.org W: http://www.ruleandrupture.dk
Professor Christian Lund
Christian Lund is Professor of Development, Resource Management, and Governance, at the Department of Food and Resource Economics, at Copenhagen University. He previously held a professorship in International Development Studies at Roskilde University for a number of years. He has been a Visiting Scholar at University of Leiden, University of California, Berkeley, London School of Economics, Centre for Development Research, Copenhagen, and École des Hautes Études en Sciences Sociales in Marseille.
These are the kinds of micro-dynamics that Professor Lund and his colleagues in the project are looking at, with researchers working on both rural and urban field sites. One postdoctoral researcher in the project is working in the Democratic Republic of Congo, a vast country with a history of political instability. “We’re looking at who authorises people’s land rights in urban areas. I doubt that we will find anything linked to the national government as such – there’s just a different kind of governance structure emerging, sidelining the statutory institutions. That’s what we are trying to explore,” says Professor Lund. The political culture is an important factor in this respect; Indonesia for example has a long history of very strong central control, but following Suharto’s resignation in 1998, there was a move towards de-centralisation. “There’s a strong will among local politicians in Indonesia to extend their jurisdictions and possibly exceed their mandate. It’s been a centrally governed country almost since it gained independence, but this has changed since 1998 and there’s now strong demand for the empowerment of local politicians,” outlines Professor Lund.
Political landscape
This points to a wider shift in how we think of state governance. While many of us commonly think of political authority as residing solely in national parliaments and statutory institutions, Professor Lund believes we should take a wider view. “When we look at the political landscape of a particular country, we shouldn’t limit ourselves just to looking at the institutions that have been assigned roles of governance. We should look at the institutions that actually do governance, even though they may not have been assigned specific roles,” he says. This means trying to remove preconceived ideas and look more deeply at how fundamental social contracts and property rights are governed. “What is the institutional configuration of this, in these settings where statutory government seems to be failing somehow?” asks Professor Lund. “Many different institutions are part of the governance of society. If we exclude them from our analysis, just because they haven’t been assigned an official role, then we are missing the point.”
For more information, please visit: www.euresearcher.com
Science and Politics: A Toxic Relationship?
Science and politics, when mixed, can skew, warp or misrepresent facts to extremes. That’s never stopped politicians leveraging arguments with ‘science’ this way, from justifying eugenics for ethnic cleansing to undermining facts about climate change to justify policy. Science is seen as a tool for influence, and that makes it powerful and vulnerable at the same time, depending on who is using it and the point they intend to make. By Richard Forsyth
Anyone tracking the news may feel like partisan policies are sometimes created from emotional decisions first, whilst the logic to justify them is worked out as an afterthought. There is certainly evidence that this is a way science has been abused throughout history. The death of facts, in favour of invented or warped ‘scientific’ justifications, is in no way a new phenomenon. Take Hitler’s eugenics programme, which was set to ‘biologically improve’ German citizens in the Aryan race, to become what he defined as the ‘master race’. This truly unscientific form of discrimination helped shape Nazi social policies and identify the hereditary human lines the party wished to extinguish. Measuring skulls, nose shapes – in fact, all sorts of physical measurements were made to define what was deemed ‘scientifically’ racially advantageous, to the point of forcibly sterilising and later murdering those who did not fit into the preferred categories. This is the most extreme example of the misuse of science, but today we see plenty of modern cases where science is tweaked, twisted or mocked for political advantage.
Replacing the evidence
The science of climate change and Man’s influence on it is confirmed by an array of scientific experts and institutions that unanimously conclude human activity is accelerating a global temperature rise. However, there are prominent politicians who not only disbelieve it but see such science as a challenge to industrial progress, President Donald Trump being the most high-profile antagonist. Trump promotes the idea that news which discredits him or his policies is Fake News. When such Fake News accusations are aimed at scientists who contradict his views, it stands to reason he is trying to make facts lose not only their power but their popularity. His original assertion was that China had invented the scenario of climate change as a conspiracy against the US, as he explained in a Tweet: ‘The concept of global warming was created by and for the Chinese in order to make US manufacturing non-competitive’. President Trump has long disbelieved in climate change and sees it as a barrier to economic growth. To this end he has surrounded himself with those with a similar world view and
Scientists need to be heard by politicians.
removed climate change specialists from government; for instance, he dismantled the 15-member climate change advisory panel in August 2017. Jessica Whitehead, a coastal communities hazards adaptation specialist on the committee, told the news channel CNN: “It’s now going to be a big challenge for government entities to easily understand how to use the science when making decisions on things like land use and infrastructure. If states or towns, for example, need to install new storm-water pipes, those pipes won’t be very effective if they make those decisions without a good understanding of the science of climate change and how it’s impacting that community.” The Trump administration has since continued to favour climate change sceptics for high positions associated with environmental decision making. For instance, Trump appointed Kathleen Hartnett White as White House Environmental Advisor. White had previously stated that carbon dioxide was ‘the gas of life on this planet’ and labelled renewable energy ‘unrealistic and parasitic’ whilst setting out the moral case for fossil fuels. Her performance at her hearing exposed a profound lack of knowledge of climate processes, as well as revealing views that clashed with existing scientific studies. When Senator Sheldon Whitehouse asked Ms White if she thought climate change was real
she responded that she was ‘uncertain’, and that she ‘didn’t have numbers’ on how much of Earth’s heat is stored in the oceans (93%). Under considerable pressure from opponents, Kathleen Hartnett White subsequently withdrew her nomination to lead the Council on Environmental Quality, but putting White up for the role in the first instance reveals how authorities can position key scientific advisors so they agree and align with their government’s policies. Whilst some see accepted science as useful, others see it as a barrier to what they intend to achieve, and will therefore undermine it and construct an opposing viewpoint.
When numbers lie
One way to create urgency in a political campaign, with a ‘scientific’ flavour, is to use data, surveys and all sorts of research numbers. Trend statistics can be subjective, and misused to great effect as a tool of persuasion and to lend credibility to a political argument. For example, a data sample may not be truly representative in size. A 1,000-respondent sample may have wildly different results to a sample of 1 million, or those surveyed may not represent communities objectively. What’s more, an ‘average’ can be misleading depending on how it is formed. For instance, if there are extremes in the data, they can skew the average towards those extremes. This may be the case
Fake news is a phrase in the US used to dodge accusations and facts.
Data is open to manipulation.
when looking for an average salary for a job – if there are spikes that are disproportionately high or low in the data. Scientists know this of course, but many citizens will not interrogate data presented to them via the media or official government channels. There are many ways to make data ‘look’ the way you want it to, so the viewer is left with an impression. What’s counted in the data is also important – for example, which data sets are in the unemployment numbers? Those on government schemes, in unpaid training, or those who do not claim benefits but are still without work may all be missing from the data. In the US there is a perfectly legal practice called gerrymandering, in which electoral district boundaries are drawn so that vote counts favour a political party. By drawing a borderline on a map around the areas that predominantly vote for your party, and excluding those you suspect will not vote your way, you can sway polls in your favour. On the surface it looks as though a district simply voted for you, but in reality the electorate has been sorted so that the voters likely to support you are made to appear to represent the area. Creating an impression with statistics is in every big political story, from attempts to link immigration to crime rates to efforts to prove an economy is better off than it was under a previous government.
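The salary example can be made concrete in a few lines of code. The figures below are invented purely for illustration: nine modest salaries plus one very large one are enough to drag the mean far above what most people in the sample actually earn, which is why the median is often the more honest ‘average’.

```python
# Sketch: how one extreme value drags the mean away from what
# "typical" feels like. All salaries here are hypothetical.
from statistics import mean, median

# Nine ordinary salaries plus one very large outlier
salaries = [28_000] * 9 + [500_000]

avg = mean(salaries)    # 75,200 -- pulled up by the single outlier
mid = median(salaries)  # 28,000 -- what most people actually earn

print(f"mean:   {avg:,.0f}")
print(f"median: {mid:,.0f}")
```

The headline ‘average salary is £75,200’ and the statement ‘most people earn £28,000’ describe exactly the same data set.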
Removing experts
Another way to get the political results you want is simply to prevent research, or remove the scientists.
In the USA there has been government resistance to any research that would undermine the second amendment right of citizens to own firearms, despite around 33,000 people being killed by guns annually, and high-profile mass shootings at American schools. The second amendment is a small line in the Bill of Rights saying: ‘A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed.’ It effectively means Americans can arm themselves with firearms, and that has been a point of debate in the country for a long time. The Annals of Internal Medicine said in a 2015 editorial that, because of pressure from the National Rifle Association, research into the high rate of gun violence – research that might bring greater understanding and solutions – has to all intents and purposes been banned by Congress. Governments can often find ways to distance themselves from awkward truths that don’t fit their policies. This distancing can mean that reputable scientists are sacked by their government paymasters for publishing research which did not align with government positions and policies. Professor David Nutt, an advisor to the UK Government, lost his job for expressing his research on drug harm – stating, for instance, that some drugs, like LSD, were less harmful than alcohol. He published a study in The Lancet on the harms of drug use which eventually led to his dismissal from the Advisory Council on the Misuse of Drugs. He became held up as a
modern example of how a scientist working for a government had to toe the line in ethically sensitive areas. Even more controversially, there are occasions when scientific evidence is simply ignored, even when commissioned and relevant to policy decisions. The UK Government’s environmental department has been under fire for more than one instance of research dodging. For example, it simply ignored the advice of its own £49m study, which recommended against a nationwide badger cull, when under pressure from farmers. It also initially refused to ban pesticides that were killing bees, in the face of environmental evidence. Public outcry and re-examination of the research did eventually lead Michael Gove, Secretary of State for Environment, Food and Rural Affairs, to overturn the decision, and he has since banned neonicotinoids – the main bee-killing pesticides – a decision that may well be adopted across the EU. Scientific definitions have also been hotly debated in relation to our farming practices, such as the argument around two words: animal sentience. This is an argument about whether we can say farm animals feel pain or not. When a small part of EU legislation recognising animal sentience was removed in the Withdrawal Bill (Brexit), a media storm erupted in which campaigners accused the UK government of manipulating the law so that animal pain would not be recognised, and would therefore not impede industrial farming practices. If anything, this shows a level of public mistrust of government motivations when dealing with some scientific concepts.
What can be done?
In a study, ‘Science, politics, and rationality in a partisan era’, James Kirchner says: “Science is just one of many voices in the policy process,
The UK Government ignored advice that a badger cull would be ineffective.
one that is at increasing risk of being marginalised and distorted. Nonetheless, there are things that scientists should do. First and foremost, we should do the best possible science, and portray it as clearly and honestly as we can, striving to make our work policy-relevant but not policy-prescriptive.” Good scientists are often naturally cautious and unassuming, and don’t quite comprehend the power of good PR when attempting to bridge the gap between a study and the media. Sometimes, when stonewalled and undermined by government pressure, the only way scientists can maintain the integrity of their research is to communicate better with wider, non-scientific audiences, in a language they can understand. Politics will continue to have some uncomfortable moments with science, as their goals do not always align. It is left to scientists and the public to guard the truths and monitor the ways politicians present their facts and figures.
Governments often back up arguments with facts or figures.
Research on oxygen reduction gives traction to solar power
Important insights can be drawn from the study of natural catalysts, which can then be applied in the development of artificial systems. We spoke to Dr Dennis Hetterscheid about the work of the Cu4Energy project in studying molecular copper catalysts for water oxidation and oxygen reduction, reactions which are central to the performance of fuel cells.
The majority of artificial catalysts have heterogeneous metal surfaces, which react via relatively simple mechanisms, yet typically energy is lost during the process. The underlying mechanisms need to be modified if these energy losses are to be reduced, as Dr Dennis Hetterscheid explains. “More degrees of freedom are required, more complexity, in order to reduce barriers. That cannot be achieved with a simple, flat metal surface,” he says. Nature builds catalysts in an entirely different way to artificial systems, as illustrated by an enzyme called laccase. “The active site of laccase contains three copper atoms, it’s called a trinuclear copper centre, and the environment of this copper centre is completely controlled. So it’s perfectly oriented, there are gas channels, water channels and polar channels, to and away from the active site,” explains Dr Hetterscheid. “That’s perfectly aligned. Researchers have previously shown that the laccase enzyme is an excellent electrocatalyst for the oxygen reduction reaction.” This is a central part of the motivation behind Dr Hetterscheid’s work in the Cu4Energy project. Based at the University of Leiden in the Netherlands, Dr Hetterscheid and his colleagues in the project are drawing inspiration from nature in the study of molecular catalysts. “We aim to understand how laccase does this, then look at how we can do this in the lab with simple molecules. If we can understand that, then at some point we could potentially implement that knowledge in the development of electrolysers and fuel cells,” he outlines.
Attention is currently focused primarily on fundamental research around two main reactions, namely water oxidation (WO) and oxygen reduction (OR), both of which are key reactions in terms of the performance of electrolysers and fuel cells. “A lot of energy loss in electrolysers and fuel cells is related to the oxygen reduction and water oxidation reactions in those systems,” says Dr Hetterscheid.
A lot of energy in research is currently devoted to improving the efficiency and overall performance of these kinds of artificial systems, reinforcing the wider relevance of the project’s work. Dr Hetterscheid believes much can be learned in this respect by studying the superior performance of natural catalysts. “We aim to understand how natural enzymes do it – and then to see whether we can make molecular catalysts that react in
Catalytic cycle
The relative inefficiency of artificial catalysts is typically attributable to one particular step in the process, which researchers in the project aim to address by treating and modifying molecules, looking to gain new insights into the mechanisms behind the catalytic reaction. The molecules themselves consist of copper atoms and a surrounding ligand, which both determines the electron density of the metal
very similar ways,” he continues. Researchers are investigating the fundamental processes involved in catalysis, aiming to build a deeper picture of the factors that influence the speed and efficiency of a reaction. “We’re looking at things like electron transfer, proton transfer, proton-coupled electron transfer, and at the overall catalytic cycle,” says Dr Hetterscheid.
and also imposes geometric constraints. “By changing the structure of the ligand, we can tune what happens in the metal,” outlines Dr Hetterscheid. Researchers aim to investigate laccase molecules, and to develop what Dr Hetterscheid calls functional models, which react in the same way. “We look at molecular compounds, and these are really developed so
A typical potential energy landscape of the water oxidation reaction mediated by a simple catalyst. For such a system it is difficult to reduce reaction barriers without creating new ones. The arrows symbolise that a thermodynamic sink is created at the *OH intermediate when one tries to reduce the potential energy of the *OOH intermediate.
Biomimetic Copper Complexes for Energy Conversion Reactions
that we can investigate different intermediates and look at what the mechanism is like,” he says. These factors will have a major influence on the speed and efficiency of a catalytic reaction, which are of course important considerations in terms of performance. Dr Hetterscheid says the speed and efficiency of a catalyst are often inversely related. “Catalysts that operate very fast typically require an additional driving force to do so. That driving force – that overpotential – results in energy losses, so a catalyst that shows high catalytic rates may not necessarily be a catalyst that is energy efficient,” he explains. Measuring the efficiency of a catalytic reaction is not entirely straightforward however, as every catalyst works differently. “We want to achieve the highest possible turnover frequency, at the lowest possible overpotential.” This depends to a significant degree on a deeper understanding of the structure of natural catalysts, which forms an important part of the project’s overall agenda. Alongside achieving enhanced catalytic rates, Dr Hetterscheid and his colleagues also aim to understand the underlying factors behind those higher rates. “It’s not only the high catalytic rates that are a deliverable from the project, but also the knowledge of how and why we get those high rates,”
he stresses. A major research objective for Dr Hetterscheid is to find a reversible catalyst for oxygen reduction and water oxidation, and also to understand the finer details, laying the foundations for future applications. “We want to understand how we can get such a catalyst, and what makes a natural system like laccase such a good redox catalyst,” he continues.
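To put the terms in this discussion in more concrete form: the notation below is standard electrochemistry rather than anything specific to the Cu4Energy project. Oxygen reduction and water oxidation are the two halves of one equilibrium, and the overpotential is simply the extra voltage a real catalyst needs beyond the thermodynamic value — energy that is lost as heat.

```latex
% Oxygen reduction (forward) and water oxidation (reverse)
% share a single equilibrium potential:
\mathrm{O_2 + 4H^+ + 4e^- \rightleftharpoons 2\,H_2O},
\qquad E^{0} \approx 1.23\ \mathrm{V}\ \text{(vs. RHE)}

% The overpotential is the excess driving force actually applied:
\eta = \left| E_{\mathrm{applied}} - E^{0} \right|

% An ideal, reversible catalyst -- the behaviour laccase approaches
% for oxygen reduction -- would run both reactions at \eta \approx 0.
```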
Catalytic activity
The Cu4Energy project itself received funding for five years, and Dr Hetterscheid says that there is still a lot to achieve over the remaining two years of the term. Nevertheless, he is also keen to explore other avenues of research. “I’m not just interested in the catalytic side of enzymes, but also in the wider ways enzymes tune catalytic activity. One of the things I find very interesting is how water molecules are perfectly arranged in hydrogen-bonding networks,” he says. “Those water molecules, which are effectively in a confined environment, will have a totally different reactivity, a totally different chemistry, to the water that other electrochemists are currently using in fuel cells and electrolysers. So I’m very interested in understanding and harnessing those types of features.”
The aim of the proposal is to significantly increase our fundamental understanding of the design principles for molecular oxygen reduction (OR) and water oxidation (WO) catalysts and to deliver new and very active molecular copper catalysts for OR and WO at the end of the project. Experiments will be carried out wherein the structure of the catalyst is linked to the observed catalytic activity and the potential energy surface of the catalytic cycle. The proposal is in particular focused on the rate-determining step of the catalytic reaction, as improvements here will directly lead to enhanced catalytic rates. A functional model system of the copper enzyme Laccase will be designed to study the rate limiting proton-and-electron-coupled O–O bond scission reaction, which is the rate limiting step in OR by Laccase.
Funded by the ERC-StG-2014 - ERC Starting Grant Cu4Energy.
Assistant Professor, Dr Dennis Hetterscheid, PhD. Mathematics and Natural Sciences Leiden Institute of Chemistry LIC/Catalysis & Surface Chemistry Science Campus Einsteinweg 55 2333 CC Leiden Room number EE4.19 T: +31 71 527 4545 E: email@example.com W: http://lic.leidenuniv.nl/spotlight/dennishetterscheid
Dr Dennis Hetterscheid, PhD.
Dennis Hetterscheid obtained his PhD at the Radboud University Nijmegen under the supervision of Prof. Bas de Bruin. He then moved to the Massachusetts Institute of Technology, where he worked in the lab of Prof. Richard R. Schrock, and then to the University of Amsterdam, where he worked with Prof. Joost N. H. Reek. Since 2013 he has been an assistant professor in the physical chemistry of sustainable energy at Leiden University. The main research theme in his group is to understand and mimic bioinorganic multi-electron processes that are relevant to a future energy infrastructure.
The thermodynamic pathway of the oxygen reduction reaction is the inverse of the water oxidation reaction. In the case of an ideal catalyst, where the potential energy surface is flat, one would expect to find activity for both reactions at a very low overpotential.
The success of Cyprus and Solar Energy
The European Research Area (ERA) Chair for the Eastern Mediterranean (CySTEM) project aims to enhance Cyprus’s research capacity in solar energy, putting in place the foundations on which tomorrow’s technologies can be built, as Professor Manuel Blanco and Professor Costas N. Papanicolas explain.
Solar energy, and in particular solar thermal, is viewed as a major enabling technology for the amelioration of the forecast harsh impacts of climate change in the area.
The geographical location of Cyprus is a major advantage in terms of research into solar energy, with consistently high levels of sunlight exposure allowing scientists to investigate key questions in the field and develop innovative technologies. The Cyprus Institute (CyI) has already established a strong reputation and knowledge base in this area, and now the ERA Chair CySTEM project aims to build on this further. “The CySTEM project is about enhancing the already existing capability of the Cyprus Institute in relation to solar technologies and desalination,” says Professor Costas Papanicolas, the project’s Principal Investigator and the President of CyI. This work is very much in line with the wider regional agenda throughout the Eastern Mediterranean and Middle East (EMME), where research into Concentrating Solar Thermal (CST) energy has been identified as a major priority. “CST technology is expected to play a major role in future energy provision, particularly to countries in Southern Europe, North Africa and the Middle East; it also has an important role to play in developing the green economy of Cyprus,” says Professor Papanicolas.
Climate change
This research takes on even greater importance in the context of concern around the impact of climate change. The EMME region in particular is expected to be severely affected. “The predictions look serious, including rising temperatures and declining rainfall.
The region around Cyprus is a large and densely populated area, so the challenge needs to be addressed to prevent serious consequences, some of global concern - like large-scale migration or destruction of agriculture,” outlines Professor Papanicolas. The CyI has
a major role to play in both heightening awareness of these challenges, and developing and promoting technologies to address them, in which it has wider support from the Cyprus government and the EU. “The Cyprus government initiated a strategy around the cogeneration of electricity and desalinated water, which is part of the country’s contribution to addressing challenges around climate change and water scarcity. The EU is supporting this research thrust in a major way,” says Professor Papanicolas. The main focus of research in the CySTEM project is on CST technologies, with the aim of reducing dependence on conventional power plants and, over the longer term, helping to de-carbonise the whole energy system. One particular area of interest is the development of smaller-scale systems suitable for islands and coastal environments. “For example, the CyI has been developing a very interesting cutting-edge technology based on a relatively small, concentrated solar-thermal polygeneration system utilising molten salt,” says Professor Manuel Blanco, who is the CySTEM ERA Chair holder. With deep experience in the solar energy field, Professor Blanco believes this technology is very much in line with wider priorities. “This technology and this approach is closely aligned with the need for CST technologies in Cyprus and in the island and coastal areas of this region and other sunny regions around the world,” he stresses. This particular system not only delivers electricity on demand at any time of the day or night, thanks to its energy storage, but also simultaneously delivers heat that has several possible applications, one of which is desalination. The system has been tested at the proof-of-concept level at PROTEAS, a research and development facility located just outside the coastal city of Limassol. The versatility of PROTEAS is an important attribute here: the facility gives the Cyprus Institute and its collaborating institutions the opportunity to test technologies related to CST alongside desalination, which is very important for islands and coastal regions. “The PROTEAS facility is recognised by the EU as part of ESFRI (European Strategy Forum on Research Infrastructures),” says Professor Papanicolas. A major research priority is to investigate how this technology can be developed at different scales, which would help more closely match energy production to demand. Historically, it was thought that CST technology was well suited to large plants, but now researchers are exploring the possibility of downscaling it. “Downscaling
CySTEM Cyprus Solar Thermal Energy Chair for the Eastern Mediterranean Project Objectives
The Cyrus Solar Thermal Energy Chair for the Eastern Mediterranean (CySTEM) aims in consolidating and upgrading the already substantial activity at the Cyprus Institute in Solar Energy, principally on Concentrated Solar Power (CSP) technologies for electricity production, desalination, air conditioning and heating, either in isolation or in multi-generation modes. This will be accomplished by attracting and installing a cluster of outstanding researchers and pursue a programme of excellence in Cyprus with local and regional focus in the region of Eastern Mediterranean and Middle East (EMME).
Horizon 2020 ERA Chairs (H2020WIDESPREAD-2014-2)
• The Cyprus Institute
Arial view of the PROTEAS solar research facility at the south coast of Cyprus. Novel solar energy and solar desalination technologies are being developed in collaboration with regional and European partners.
this technology, together with providing effective storage and generation capabilities, would be ideal for small and moderate-sized islands, such as Cyprus. It could mean bigger islands, or small islands like the Greek islands in the Aegean Sea, or the Spanish islands, or even certain isolated communities,” says Professor Papanicolas. The technology itself can be adapted for different communities, which Professor Blanco says represents an efficient approach to energy provision. “Distributed generation helps to resolve a lot of problems around power plants, including mitigating their environmental impact,” he outlines.
Global demand Beyond the EMME region, the potential demand for this type of technology is global, with climate change and water availability major concerns across large parts of the world. Researchers are now keen to explore the possibility of exporting this technology, not only to islands, but also communities in need of improved energy provision more generally. “This type of community-sized
technology will be very much in demand in future. We are already in discussions with some commercial enterprises,” says Professor Papanicolas. For tens, even hundreds of islands around the Mediterranean coastline, for example, this technology might represent an effective energy solution for their needs in electricity and desalinated water. This work still has some way to go, with the project set to run until the middle of 2020, and researchers are keen to further explore the wider potential of these technologies. Over the longer term, this will help to strengthen the research and innovation base in Cyprus, part of the wider goal of enhancing technological capacity beyond the EU’s traditional leaders, and of fostering a knowledge-based economy. The CyI is also leading the EU NESTER network, a twinning project funded by the EC’s Horizon 2020 programme, with the participation of some of Europe’s research leaders in the field: Aachen (D), CIEMAT (ES), CNRS(F) and ENEA (I), which helps build stronger relationships with other solar technology initiatives throughout Europe. Schematic drawing of the pioneering solar thermal cogeneration of desalinated sea water and electricity. Energy storage in molten salt endow it with 24/7 operational capability.
Professor Costas N. Papanicolas
The Cyprus Institute
20 Konstantinou Kavafi Street
2121 Aglantzia, Nicosia
Cyprus
T: +357 22208703
E: firstname.lastname@example.org
W: https://www.cyi.ac.cy/index.php/eewrc/research-information/ongoing-researchprojects/cystem.html
Prof. Costas N. Papanicolas / Professor Manuel Blanco
Prof. Manuel Jesus Blanco (European Research Area Chair): Holds a Ph.D. (Applied Physics) from the University of Massachusetts, USA, and a Doctorate in Engineering from the University of Seville, Spain, and has over 30 years of experience contributing to the advancement of Concentrating Solar Thermal (CST) technologies internationally. He has held positions at CIEMAT and CENER (Spain), the University of Texas (USA), and CSIRO (Australia). He is Vice-Chair of SolarPACES, the Technology Collaboration Programme of the International Energy Agency on CST and Solar Chemistry technologies.

Prof. Costas N. Papanicolas (President of CyI): Holds BSc (Physics) and Ph.D. (Nuclear Physics) degrees from MIT and has over 35 years of research experience in the fields of Hadronic Physics, Medical Physics, and Solar Energy and Energy Policy. He has held positions at CEA (France), the University of Illinois (USA), and the University of Athens (Greece). He is a Fellow of the American Physical Society and a member of the Academia Europaea.
A breakthrough in disposable fuel cells

ICREA Professor Neus Sabaté and her team at the Instituto de Microelectrónica de Barcelona are developing a new kind of fuel cell that can power diagnostic readers from the very bodily fluid being analysed.

The fuel cells and the diagnostic devices in the SUPERCELL project are made from paper.

Scientists often say
they have put ‘blood, sweat and tears’ into their work but with the SUPERCELL project this is almost a literal description of the research. That’s because the project’s objective is to extract enough energy from biological fluids like blood, urine and sweat to power biosensor devices. Such devices could have practical uses in healthcare, for example: pregnancy detection or diabetes management. To that end, the project is developing a new generation of single-use, disposable and low environmental impact fuel cells, representing a significant milestone in the fuel cell field. SUPERCELL is a project that drives the kinds of manufacturing and financial efficiencies that make it an irresistible innovation. As an alternative power source for diagnostic micro devices in healthcare, the potential is enormous.
Paper powered diagnosis

"This is about giving some power to already existing diagnostic paper devices. I thought maybe I can fabricate both in the same platform, so if we can use paper to fabricate a diagnostic device, I can also fabricate the fuel cells within the same platform, in the same materials," said Sabaté. "If people use this paper device to diagnose biological liquids, then I may also extract the energy needed to fuel the reader from the same fluids. The idea is to have everything integrated, so that the liquid you analyse is the same one that provides the energy to make the analysis."

Whilst you can extract power from blood, glucose and urea, not all body fluids have molecules that are suitable for generating electricity, and you only produce microwatts from them. The end-user applications therefore need to be carefully considered. However, the advances demonstrated in this research should not be underestimated. Traditionally there has been a reliance on bulky, battery-powered readers, which are only cost-effective if the device is used thousands of times, and are therefore often limited to the hospital environment.

Cheap and biodegradable

There are many advantages to a paper fuel cell, as Sabaté explains: "It's cheap to produce and environmentally friendly. You don't need much energy to produce these in the fabrication process and you don't pollute with them. What is motivating me a lot is that I use cheap technology and I hope to start by creating an industry here in Barcelona. I don't need to fabricate this in, say, China, because it is cheaper to do that; I can just as easily do this in Barcelona."

Less is more

One of the breakthroughs in the direction of the research for Neus came from finding a suitable, simple method to pump fuel in the cell. "We had to pump fuel to make fuel cells work and I wanted to do it with the least amount of components. We wanted to simplify the design and the fabrication techniques as we are very application oriented, so the concept we came up with was very practical. I realised a capillary could work as a natural pump. Of course, if you work with paper it will saturate and the flow will stop, so instead of fuel cells working for a long time, I started to think about them working for several minutes. If you think you can have a fuel cell working for several minutes then you are not going to focus on powering phones and things like that, and it makes no sense to power long-lasting lab-on-a-chip devices but instead to power disposable paper microfluidic devices. Overall, the aim was to fabricate a power source that follows the same lifecycle as the diagnostic test."

The devices, if further developed and mass produced, would answer a worldwide need for cheap, accessible healthcare technology which can be used in the home setting. The implications these devices would have for healthcare in the developing world are also exciting. Beyond healthcare, other industrial sectors could also benefit, such as the fitness industry, where sweat monitoring, for instance, might have benefits. Neus Sabaté is patenting designs and making inroads into industry, showing that sometimes the simplest technological innovation has the strongest, widest appeal. "This is basic science and basic technology but, you know, engineering will change the world," states Sabaté. "Scientists create the next industries."

Single-Use paPER-based fuel CELLS

The SUPERCELL project is developing a new generation of single-use, low environmental impact fuel cells that can run on the biological samples being analysed, such as urine or blood, which is useful for point-of-care healthcare devices.

Neus Sabaté, ICREA Research Professor
Instituto de Microelectrónica de Barcelona, IMB-CNM (CSIC)
C/ del Til·lers, Campus Universitat Autònoma de Barcelona (UAB)
08193 Cerdanyola del Vallès (Bellaterra), Barcelona (Spain)
T: +34 935 947 700 ext. 2116
E: email@example.com
W: http://www.speedresearchgroup.com/supercell/

Neus Sabaté is a Physicist and ICREA Professor who is passionate about the development of microsystem devices. She works at the Microelectronics Institute of Barcelona (CSIC). Over her career, she has transitioned from the development of industrial-oriented sensors in silicon technology to single-use point-of-care devices made with rapid fabrication techniques.
Fig. 1: (a) Schematic of the individual layers forming the packaged microfluidic paper-based device; numbers 1 to 8 indicate the layer number, and figures in μm give the layer thickness. (b) Front view of the sealed paper device. (c) Side view of the paper fuel cell. (d) External connections using a PMMA holder.
Detail from a Rāmcaritmānas manuscript (18th Century), Mehrangarh Fort, Jodhpur.
Mapping the religiosity, history and culture of yoga

Over the past thirty years or so yoga has become an increasingly mainstream activity, yet little is known about its historical roots in South Asia. Dr Jim Mallinson tells us about his research into the origins of haṭha yoga, and how it evolved into the practices that we see in yoga studios across the world today.

The practice of yoga is becoming increasingly popular across the world, with many people getting into the habit of attending regular classes for a variety of reasons, whether to counter the indulgences of modern life, to relax, or simply to socialise and meet new people. This is very different from earlier practitioners of yoga, says Dr James Mallinson. "All our evidence of yoga practitioners up until around the last 200 years is that they were full-time religious professionals. They were almost always celibate, ascetic men, who had basically given up normal family life," he outlines. Based at SOAS University of London, Dr Mallinson is the Principal Investigator of an EU-backed initiative investigating the origins of modern yoga, in particular the important branch of haṭha yoga, which lies at the root of contemporary practice. "We're trying to map out the development of yoga, from about the 11th to the 19th century," he explains.

Unidentified goddess (13th Century), Mahudi Gate, Dabhoi, Gujarat, India.

Yoga practice

This involves investigating the early evidence of yoga practice, with researchers both analysing historical manuscripts and spending time with yoga practitioners to gain deeper insights. While it is known that some of the physical elements of yoga practice date back to at least the time of the Buddha, it was only around 1,000 years ago that they were first written down. "We're specifically looking at texts on yoga in which the physical methods of practice predominate," says Dr Mallinson. Research is focused on ten core texts, an unusually small and self-contained corpus mostly written in Sanskrit, which Dr Mallinson and his colleagues plan to edit and translate. "We've been helped in establishing that corpus by a text from around the 15th century called the Hathapradipika, which means 'the light on hatha yoga'," he continues. "We've identified about twenty texts that the Hathapradipika borrowed from, quite a few of which are among the ten texts that we're editing in the project."

These texts reveal a process of evolution in the practice of yoga. Some of the early Buddhist texts refer to ascetics holding particularly difficult postures as part of their practice for hours, days, or sometimes even years on end. "Ascetics were known for standing up for years on end, for example," says Dr Mallinson. About 1,000 years ago came the first textual descriptions of specific postures, like balancing on the hands, which clearly cannot be held indefinitely. "There's a quantum change in the notion of postural practices as part of yoga, but it remains a very minor element until about the 15th or even 16th century, when we start seeing texts which give lists of large numbers of such postures," continues Dr Mallinson. "From the 16th/17th century onwards they are listed, described and analysed, and more and more emphasis is put on that. Then we start seeing dynamic postures, so we really have seen a process of development." The more ascetic practices, such as holding difficult postures for extended periods, are absent in the textual treatments, in which posture practice is for cultivating rather than mortifying the body.

However, these types of practices haven't completely disappeared, and in fact are still current amongst the living yoga traditions, a major area of interest to Dr Mallinson. "We've spent time in India talking to living yogis, amongst whom a lot of these ancient practices are still current," he outlines. The influence of globalised yoga is nevertheless leading to changes in traditional practice in India, underlining the importance of documenting these practices now. "There's a definite level of cultural exchange going on. Some of these wandering holy men are now incorporating sequential posture sequences into their yoga practice, which is not something they would have done around 20-30 years ago," acknowledges Dr Mallinson.

Gaur Lena Cave (containing 14th Century images of yogis), Panhale Kaji, Maharashtra, India.

A process of change was underway well before globalised yoga started to make its mark on traditional practice, however. The more esoteric teachings and practices originated amongst fairly extreme ascetics, who were often unconcerned by material comforts, yet the act of describing them in texts by its nature broadened the audience. "Once these practices are taught in texts, the audience becomes students, priests and scholars. The practices are re-fashioned in such a way that they're better suited to a less extreme lifestyle," explains Dr Mallinson. The benefits of yoga are another area of interest in the project; over the last 1,000 years there has been a shift in the texts towards describing more of the physical benefits of yoga practice. "One explanation we find in some texts is that the practices that bring spiritual benefits when you're healthy will also improve your health when you're unhealthy," says Dr Mallinson.

These practices predate the texts that describe them, raising the question of why these texts were written at that specific point in time. Dr Mallinson has gained new insights into this by looking into the wider context of the early period, around the 11th/12th century. "That basically coincides with the rise of monasteries in central and southern India, which seems to be where almost all of the texts were composed. So we believe that they were written within a monastic environment," he says. These monasteries were not just religious retreats, but also functioned a bit like universities. "There would be students travelling between them, and they didn't have to strictly adhere to the doctrines of whoever founded the university or monastery," continues Dr Mallinson. "So even though the texts themselves were produced by different religious traditions following different gods, they will also say that anyone can do the practices that are taught in them." This is a major factor in explaining why these practices have proved very adaptable over the years.

Yogis (16th Century), Hampi, Karnataka, India.

As these monasteries grew in power, they would write their own texts, a much more organised way of passing on yoga practice to the next generation than was previously the case. "What we see in the texts is a kind of domestication of the practices of these hitherto extreme and wild wandering yogis," explains Dr Mallinson. While the practices of these wandering yogis included some elements that we would recognise today as yoga, they also practised some more extreme techniques. "They might sit surrounded by fire in the hot sun, for example, or stand up for years on end. There are no texts on that kind of practice; it's always been passed down orally," says Dr Mallinson. "It's only the aspects of their practice that are more adaptable to a less extreme lifestyle that then get passed on in the texts."

The larger sects of holy men are doing well today, with many of their members practising traditional forms of yoga, but Dr Mallinson says that traditional practice is in some cases being influenced by recent developments in globalised yoga. "Their yoga techniques are changing under the influence of the ubiquitous modern globalised yoga, which in some ways is really quite different from traditional practice," he says. In many cases the yogis themselves recognise this, and so differentiate their yoga from modern forms of practice. "Yoga is a Sanskrit word, but in modern Hindi the final 'a' of Sanskrit words is dropped. So they'll say that what they do is yog, and meanwhile what people in metropolitan centres are doing is yoga, which is different," explains Dr Mallinson. "Nevertheless, they also recognise that they are seen to be the originators of these modern practices."

This underlines the importance of documenting the traditional practices now. As global trends exert an ever greater influence on traditional practice, it grows harder and harder to identify what's old and what's new. "As a historian I'm interested in the historical record. In order to document what's going on now, we need to identify what's old and what's new," stresses Dr Mallinson. The project will make an important contribution in these terms, producing critical editions of ten texts on haṭha yoga. "These ten texts have never been published before in any form, and are currently only available in manuscripts in libraries in India," says Dr Mallinson. "We've gathered as many manuscripts of each text as we can, which we will then compare in order to produce our editions, versions of each text at a particular point in its development, together with an introduction and an annotated translation. We're also going to produce four monographs."

Lokeśvara (11th Century), Kadri, Karnataka, India.

HYP: The Haṭha Yoga Project

The Haṭha Yoga Project (HYP) is a five-year (2015-2020) research project funded by the European Research Council and based at SOAS University of London, which aims to chart the history of physical yoga practice by means of philology, i.e. the study of texts on yoga, and ethnography, i.e. fieldwork among practitioners of yoga. The project team consists of four researchers based at SOAS and two at the École française d'Extrême-Orient, Pondicherry.
The project’s primary outputs will be critical editions and annotated translations of ten Sanskrit texts on haṭha yoga, four monographs, and a range of journal articles, book chapters and encyclopedia entries. In September 2016 a workshop for twenty scholars working on critical editions of Sanskrit texts on yoga was held at SOAS and All Souls, Oxford. A public conference will be held at SOAS in 2020.
Funded by the European Research Council.
• Mehrangarh Fort, Jodhpur, India • Ecole française d’Extrême-Orient, Pondicherry • Sahapedia (https://www.sahapedia.org)
Project Coordinator, Dr James Mallinson
SOAS University of London
Thornhaugh Street, Russell Square
London WC1H 0XG
T: +44 20 7898 4368
E: firstname.lastname@example.org
W: http://hyp.soas.ac.uk
Dr James Mallinson
Yogic adepts and the goddess Tripurasundarī (14th Century), Panhale Kaji, Maharashtra, India.
Dr James Mallinson is Senior Lecturer in Sanskrit and Classical Indian Studies at SOAS University of London. He is the Principal Investigator of the ERC-funded Hatha Yoga Project and the Chairperson of SOAS's Centre for Yoga Studies. He is the co-author, with Dr Mark Singleton, of Roots of Yoga (Penguin Classics, 2017), and the author of numerous books and articles on the texts and history of haṭha yoga.
Was There An Early Modern Religious Conjuncture?

Although the process of differentiation between Sunni and Shii Muslims started in the early days of Islam, it took on much sharper theological contours in the early 16th century, roughly coinciding with the Protestant-Catholic polarisation in Europe, and triggering similar social dynamics. Historians have dismissed this as a coincidence, but was it? Researchers in the Ottoconfession project are taking a fresh look at the topic, as Dr Tijana Krstić explains.

The concept of
confessionalisation, relating to the convergence of theological and political ideas in the overarching rationale of rule, has long been important to the historiography of post-Reformation Europe, but it is also a valuable heuristic device for the scholars of the Ottoman Empire, argues Dr Tijana Krstić. The process of polarisation between Sunni and Shia Islam started in the seventh century CE; however, it came to be articulated in much sharper theological and territorial terms in the early 16th century with the rise of the Sunni Ottoman and Shii Safavid Empires. This roughly coincided with a process of polarisation between Protestantism and Catholicism – and later Calvinism – in Europe, and triggered similar socio-political dynamics, from the persecution of dissenters and a greater focus on catechisation and social disciplining of believers, to state building. Yet, historians have been wary of drawing parallels. “The fact that we see similar confessionalizing initiatives in both early modern Europe and the Middle East is often dismissed as a coincidence on account of the supposedly profoundly divergent historical trajectories of Christendom and Islamdom,” says Dr Krstić. As the Principal Investigator of the Ottoconfession project, she and her team are taking a fresh look at the topic. “Our question was: what accounts for the striking parallels? Are they just a coincidence or is there more to them?” she explains.
Confessional Polarisation in Comparative and Entangled Perspectives The processes of confessional polarisation in Europe and the Turco-Iranian world resulted from different dynamics specific to the two regions. However, the subsequent initiatives at confession-building, and the evolution of discourse on religious orthodoxy in both Europe and the Ottoman Empire, specifically during the 16th and 17th centuries, were actually entangled, argues Dr Krstić. “At some point these processes converged, and they mutually affected one another,” she says. There were several reasons for this, primary among them the fact that the
(a) Profession of faith by an Armenian Patriarch from 1671; (b) treatise on blasphemous utterances in Ottoman Turkish (17th century); (c) detail from an attestation of faith by a Greek Orthodox Patriarch (17th century).
Sunni Ottomans were engaged in imperial competition with the Shiite Safavids in the east and the Catholic Habsburgs in the west, which resulted in the imperial enterprise becoming closely identified with the protection of the true faith. “This
imperial competition entailed shared political theologies (of universal empire, for instance), and the authorities used their power in similar ways in order to shore up their imperial claims (to be a messianic ruler, or mahdi, who will renew religion). It's not just that we as historians want to compare them, but it's the fact that they compared themselves to each other at the time," stresses Dr Krstić. Comparisons were also facilitated by the ease of mobility between empires. "Some Muslim populations were contested between the Ottoman and Safavid Empires, or weaved in and out of service and loyalty to them, while Ottoman Christians travelled, traded, and entered the service of the Ottomans' neighbours both to the east and west," explains Dr Krstić. This meant that particular religious sensibilities and ideas about orthodoxy, as well as resistance to it, were shared across territorial boundaries, often through the phenomenon of religious conversion. Another factor was the openness of the Ottoman authorities towards the activity of Christian missionaries from post-Tridentine Europe. "They were allowed to come in and proselytise among the Ottoman Christians, which was another way of disseminating specific concepts, ideas and styles of piety. These were perhaps extraneous to the Ottoman Empire, but they then became domesticated in new ways among both Ottoman Christians and Muslims," says Dr Krstić. "We're trying to look at being a Sunni Muslim, an Orthodox Greek or an Apostolic Armenian not as something fixed in time or divorced from political, social and legal dynamics, but rather to capture how the boundaries of those labels and pious sensibilities were shaped through the interaction of all these forces, as well as by the dialogue and polemics among the communities within an imperial and inter-imperial framework."
Sunni and Other Orthodoxies in the Early Modern Ottoman Empire This work centres on digging deeper into the historical background to investigate how the Ottomans used the legal and theological
resources of medieval Islam to develop their own interpretation of Sunni orthodoxy, and how that notion of orthodoxy in turn evolved over time under the impact of various contingencies the empire faced between the 1450s and the early 1700s, largely in Anatolia and the Balkans. "The Ottomans did not 'invent' Sunni orthodoxy; rather, they adapted various earlier legal and theological articulations of who belonged to the community of true believers to the needs of a growing empire experiencing a direct challenge to its legitimacy from a Muslim neighbour. But the discourse of Sunni orthodoxy was not uniform: in addition to the scholars and imperial administrators, many different individuals, groups, and institutions aspired to have a say in what constituted correct belief and practice, which they expressed in a variety of written genres, from legal opinions and treatises to catechisms, heresiographies and histories. Nor was it universally enforced or embraced, since different parts of the empire experienced different dynamics, and many groups, in particular among Sufis, resisted the imposition of strict confessional boundaries, as our Senior Researcher, Dr Derin Terzioğlu, shows in her research," Dr Krstić explains.
The Ottoman Empire was in some ways at the peak of its power during the mid-16th century, but it experienced numerous new challenges during the early 17th century, another topic of interest to Dr Krstić and her team. "There are major social changes in the Empire in this period, primarily growing social mobility starting around the second half of the 16th century, which begins to change the social order." A general perception of decline, moral, military and otherwise, was quite pervasive in the Ottoman Empire during the 17th century, which also affected religious discourse. Many views were offered on how the ills of the empire should be remedied, and in many cases they entailed greater knowledge of faith and a return to the roots of religious practice. "Debates come to be focused on what is pure tradition and what is innovation, what is bad innovation and what is good innovation. This very much echoed the contemporary debates among Christians in Europe, as well as within the Ottoman Greek Orthodox and Apostolic Armenian communities, who were dealing with the challenges of Catholic, Lutheran, and Calvinist proselytization. Especially in the 17th century, Istanbul became the centre for the exchange of confessional views, polemical literature and proselytizing strategies across communal boundaries," says Dr Krstić.

There have been many studies on how the Ottoman Empire shared in the dynamics of the early modern world, covering topics as diverse as trade, the spread of military technology and even climatic patterns, yet comparisons always stop short of considering religious politics and sensibilities. Now Dr Krstić and her colleagues aim to show that certain kinds of pious sensibilities and concepts were in fact very much comparable and mutually intelligible across the Muslim-Christian divide, focusing on the relationship between belief and unbelief, the importance of actions for faith, attitudes towards the nature of communal boundaries, and other topics. "There are many ways in which one can draw parallels. For instance, one thing I have been working on is the genre of catechism, which seems to become increasingly similar in form and organization of information, regardless of whether we are talking about Protestant, Catholic, Sunni or Shii primers from the late sixteenth and seventeenth centuries," Dr Krstić explains.

We approach religious identities not as something fixed in time or divorced from political, social and legal dynamics, but rather we aim to capture how their boundaries were shaped through interaction of all these forces, as well as polemics among various communities within and beyond the Ottoman Empire.

OTTOCONFESSION: The Fashioning of a Sunni Orthodoxy and the Entangled Histories of Confession-Building in the Early Modern Ottoman Empire, 15th-17th Centuries

Patriarch Kyrillos Loukaris (d. 1638) sparked one of the most important episodes of confessional strife within the Greek Orthodox church with his "Calvinist" profession of faith.
The OTTOCONFESSION project examines what prompted Ottoman statesmen and literati to begin to articulate and enforce the boundaries of what they understood as a Sunni orthodoxy in the early sixteenth century, against the backdrop of what has been described as “confessional ambiguity” or even “metadoxy” in the late medieval Turco-Iranian world. The project sets this phenomenon in a dialogue with confession-building initiatives among other Muslim and Christian communities both within the Ottoman Empire and in the polities connected to it, both in Europe and in the Middle East.
Funded under H2020-EU.1.1. - EXCELLENT SCIENCE - European Research Council (ERC) Consolidator Grant. Agreement number: 648498 - OTTOCONFESSION
OTTOCONFESSION project partner is Boğaziçi University (Istanbul, Turkey), and the team there is led by Professor Derin Terzioğlu, who is a Senior Researcher on the project.
Project Coordinator, Dr Tijana Krstić, PhD Central European University Nador u. 9 1051 Budapest Hungary T: +36 1 327 3000 E: email@example.com W: https://cems.ceu.edu/ottoconfession
Dr Tijana Krstić (PhD University of Michigan, 2004) is a historian of the early modern Ottoman Empire and Associate Professor in the Department of Medieval Studies at Central European University. She is the author of Contested Conversions to Islam: Narratives of Religious Change in the Early Modern Ottoman Empire (Stanford University Press, 2011) and numerous articles.
Getting to the root of corruption Many corrupt acts are built on cooperation between partners, who have a shared interest in bending or even breaking the rules. Researchers are using an approach built on behavioural ethics to probe the roots of corruption, which could help inform the development of policy that encourages ethical behaviour, as Dr Shaul Shalvi explains The ability of humans to work together and cooperate in shared endeavours is central to material and scientific progress, leading to important discoveries and new products that enrich our lives. However, there are also situations in which cooperation between different parties helps to embed corrupt relationships and abuses of power. “If we think about corruption in general, there are situations in which teams of people work together in order to succeed in unethical endeavours,” points out Dr Shaul Shalvi. Based at the University of Amsterdam, Dr Shalvi is the Principal Investigator of a project investigating the basis of corrupt collaborations. “Corruption is defined as the abuse of power for personal gain. So there may be situations in which a single public official uses their power for personal financial gain, without engaging others,” he says. “But, arguably more often than not, these actions are done by groups of individuals.”
Corrupt behaviour A company might bribe a public official to approve a development application, for example – an act in which both sides have engaged in corrupt behaviour. Researchers now aim to probe deeper into the roots of these corrupt collaborations, building on recent investigations in a field called behavioural ethics. “This is the scientific approach to studying ethical questions. For years ethics has been taught in business schools from a normative perspective. So you would go to a class on behavioural ethics, learn which acts are wrong and which are acceptable, and follow a normative philosophy, based on work by leading philosophers throughout history,” explains Dr Shalvi. “Over the past 10-15 years, researchers in behavioural ethics have started to look at questions like, when do people violate rules? So, it’s less about defining what’s right and wrong, but rather acknowledging that there are rules and asking: ‘when do people decide to violate them?’” This research involves looking at different settings and investigating which are more or less likely to push people to violate the rules. It may be that an individual comes into a setting in which they perceive other people as violating the rules, for example.
“There are organisations in which corrupt norms emerge. A new employee joining that organisation would be exposed to corrupt norms, and therefore would be likely to start adapting,” says Dr Shalvi. One of the projects Dr Shalvi and his colleagues are working on at the moment looks at the question of what happens when people move between different groups. “Think about job rotation – people work in one position, then every few years they move to another group, or unit, for a certain amount of time, and perhaps go back. So we were wondering – what happens when one person moves to another group and is exposed to corrupt norms?” outlines Margarita Leib, a PhD student on the project who contributed to this work.
The individual concerned may adapt to the situation in which they’re working and become corrupt themselves, or alternatively they might decide to reject those values. Researchers are running experiments in the laboratory to investigate how people respond in this type of situation. “The game that we often used in this project involves two players – the first and second mover. The first mover rolls a die on a computer screen and is asked to report the outcome of the roll into the computer. A second mover observes the report of the first, and also rolls a die on the computer screen and reports the outcome – if both report the same outcome, they win that amount,” explains Dr Ivan Soraperra, a post-doctoral fellow working on the project. The researchers know what the participants actually observed, but the individuals themselves see only the other player’s report. “After they’ve played the game once, the first and second mover learn about each other’s reports and their shared payoffs. Then they do the task again,” says Dr Soraperra. This gives researchers the opportunity to observe how the relationship between the participants develops and the extent to which they adapt to each other’s behaviour. Under the rules of the game, the participants have a shared financial interest in reporting the same high outcome, even if that wasn’t what they actually observed. “If they want to maximise financial profit, they should both report 6. We call these individuals brazen liars – they are there to make money and don’t care about the rules of the game,” says Dr Nils Köbis, another postdoctoral fellow from the team. There is also a further, ethical dimension to the experiment. “Before the experiment, we tell the participants that we are going to make a donation of 2,000 Euros to an organisation that reduces carbon footprint. Every time a person in an experiment earns money – based on an incorrect report – we subtract the amount earned by the participants from the total overall donation to the charity,” continues Dr Köbis.
This may deter some instinctively honest individuals from engaging in corrupt behaviour, while others may carry on regardless, motivated purely by self-interest. Researchers have adapted this experiment in different ways, with the wider goal of investigating what happens when participants in the experiment are given the opportunity to switch partners, to someone who maybe has a similar outlook to themselves. “After every three rounds we ask participants whether they want to stay with their current partner or switch and work with another,” outlines Dr Shalvi. Researchers expect the liars, the dishonest individuals who are there purely to make money, to respond in a fairly predictable way. “We expect that when a liar is linked with an honest person, they will be upset. Therefore they will switch until they find a partner in crime that will maximise profits with them, even at the expense of the charity,” says Ms. Leib. The response of the honest individuals is more difficult to predict. Some may feel uncomfortable about securing profit based on the corrupt acts of other people and so seek a different partner, others might decide they are responsible only for their own behaviour
and not that of their partner, while a third possibility is what Ms. Leib describes as ‘ethical free-riding’. “There are honest individuals who always report the outcome that they observe on the computer screen. However, if they are paired with a dishonest partner, they stay rather than switch – so they secure profit based on the acts of the corrupt partner, while remaining honest themselves,” she explains. This behaviour in itself perpetuates corruption however. “Anti-corruption strategies are commonly about encouraging whistle-blowing, about not tolerating actions that violate rules, laws and organisational norms,” points out Dr Shalvi. “However, sometimes people stay silent and believe it isn’t their business.”
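The incentive structure of the two-mover die game can be sketched in a few lines of code. This is purely an illustrative simulation, not the researchers’ experimental software; the strategy functions and payoff rule are assumptions based on the description above (matching reports pay each player that amount, a brazen pair always claims double 6):

```python
import random

def play_round(strategy_a, strategy_b, rng):
    """One round: each mover rolls privately, then reports per strategy.
    If the two reports match, both players earn that amount; otherwise nothing."""
    roll_a, roll_b = rng.randint(1, 6), rng.randint(1, 6)
    report_a = strategy_a(roll_a)
    report_b = strategy_b(roll_b, report_a)  # the second mover sees the first report
    return report_a if report_a == report_b else 0

honest_first = lambda roll: roll              # reports the true roll
honest_second = lambda roll, seen: roll       # ignores the first report
brazen_first = lambda roll: 6                 # always claims the maximum
brazen_second = lambda roll, seen: seen       # copies whatever the first mover said

rng = random.Random(0)
rounds = 10_000
honest = sum(play_round(honest_first, honest_second, rng) for _ in range(rounds))
brazen = sum(play_round(brazen_first, brazen_second, rng) for _ in range(rounds))

# An honest pair matches only 1 time in 6, so average earnings stay low
# (expected value 21/36, about 0.58 per round); a brazen pair reports
# "double 6" every round and earns the maximum.
print(honest / rounds)  # ≈ 0.58
print(brazen / rounds)  # exactly 6.0
```

The gap between the two averages is the financial pull towards corrupt collaboration that the experiment is designed to measure.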
Volkswagen scandal A good example is the recent scandal around Volkswagen’s efforts to get around vehicle emissions tests. While not all the details are known, it seems probable that knowledge of these methods was not limited solely to a small group of engineers. “It’s likely that some people at least understood that things were too good to be true, but they didn’t stand up and say that,” says Dr Shalvi. Combatting corruption is a major concern for policy-makers
across the world, so the project’s research holds wider relevance in terms of incentivising ethical behaviour; Dr Shalvi points to the example of recent research in Colombia. “We ran an experiment in Bogotá, where there are issues with corruption in education and teachers are bribed to give higher grades,” he explains. “One of the main insights we gained was around the fair wage hypothesis – that if teachers were paid better, they would not have an incentive to ask for bribes.” This might seem logical, yet increasing a teacher’s salary is not enough in itself to eradicate corruption, as the financial incentive to accept a bribe remains. A greater degree of competition between schools can help to encourage ethical behaviour, believes Dr Shalvi. “In our experiment, teachers earn a fixed salary, but on top of that they may earn more money if more students choose to study in their school. Since a bribe-free school produces a better quality diploma, students have an incentive to attend such schools. Allowing for competition between schools by paying teachers based on the number of students they have might establish a market for honesty – a market for schools that provide a bribe-free environment,” he says.
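The logic of the school-competition design can be made concrete with a toy payoff comparison. All numbers here are hypothetical illustrations, not the experiment’s actual parameters: the point is only that a per-student bonus can outweigh bribe income once a bribe-free school attracts more students.

```python
# Illustrative payoff model for the teacher-competition design.
# base_salary, per_student_bonus, student counts and bribe_income are all
# assumed values chosen to demonstrate the mechanism.

def teacher_income(base_salary, per_student_bonus, students, bribe_income):
    """A teacher's total income: fixed salary + enrolment bonus + any bribes."""
    return base_salary + per_student_bonus * students + bribe_income

# A bribe-taking teacher earns bribes but, because the school's diploma
# loses value, attracts fewer students than a bribe-free rival.
corrupt = teacher_income(base_salary=100, per_student_bonus=10, students=4, bribe_income=30)
honest = teacher_income(base_salary=100, per_student_bonus=10, students=9, bribe_income=0)

print(corrupt, honest)  # 170 190
```

Under these assumed numbers honesty pays: the enrolment bonus creates the “market for honesty” Dr Shalvi describes, whereas raising the fixed salary alone leaves the bribe incentive intact.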
Corruption Roots At the roots of corruption: a behavioral ethics approach
Cooperation is essential for completing tasks that an individual cannot accomplish alone, and deciphering the roots of human cooperation has been one of the major interdisciplinary scientific challenges of recent decades. However, much is still to be done, especially with respect to understanding the building blocks of the potential darker side of human cooperation – specifically, the tendency to join forces by bending ethical rules to achieve personal success at society’s expense. Such joint self-serving acts of dishonesty are at the roots of corruption and are the core of the current investigation. The societal impact of corruption – defined as the abuse of power for personal gain – was recently assessed in a European Commission anti-corruption report (EU report, 2014), which suggests that corruption costs the European economy more than 120 billion Euros annually. Home Affairs Commissioner Cecilia Malmström, who presented the report, claimed that the extent of European corruption is ‘breathtaking’, as the annual cost of corruption within the EU equals the bloc’s annual budget (BBC News, 2014). Gaining a basic scientific understanding of the (psychological) roots of corrupt behavior and the settings likely to remedy it, is the goal of this proposal.
Funded under H2020-EU.1.1. - EXCELLENT SCIENCE - European Research Council (ERC) Starting Grant
Shaul Shalvi, PhD Faculty of Economics and Business Section Microeconomics Roetersstraat 11 1018 WB Amsterdam The Netherlands T: +31 205 254 293 E: S.Shalvi@uva.nl W: http://www.uva.nl/profiel/s/h/s.shalvi/s.shalvi.html W: https://sites.google.com/site/morallabshalvi/
Fig. 1. (A) Simulation of reported outcomes assuming honest reports. Each dot represents the reports of player A and player B in a single trial. The simulation assumes that each number (1 to 6) is reported with a probability of 1/6 in any given trial. The position of dots is jittered to allow visibility of identical outcomes. (B) The observed distribution of reported outcomes in aligned outcomes. Each dot represents the reports of player A and player B in a single trial. The position of dots is jittered to allow visibility of identical outcomes. High values on the diagonal—especially pairs of 6’s—which yield the highest payoffs, are overrepresented.
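The honest baseline in panel (A) of Fig. 1 can be checked with a short simulation. This is a sketch under the assumptions stated in the caption (each player independently reports a uniform 1–6 roll); the jittered plotting itself is omitted:

```python
import random

rng = random.Random(42)
trials = 100_000
aligned = double_six = 0

for _ in range(trials):
    # Two honest players each report their true, independent die roll.
    a, b = rng.randint(1, 6), rng.randint(1, 6)
    if a == b:
        aligned += 1
        if a == 6:
            double_six += 1

# Under honesty, reports align 1 time in 6, and "double 6" occurs 1 in 36 -
# far below the overrepresentation of 6's observed in panel (B).
print(aligned / trials)     # ≈ 1/6 ≈ 0.167
print(double_six / trials)  # ≈ 1/36 ≈ 0.028
```

Comparing these baseline rates against the observed distribution is what lets the researchers quantify dishonesty at the pair level without knowing any single report was a lie.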
Weisel, O., & Shalvi, S. (2015). The collaborative roots of corruption. Proceedings of the National Academy of Sciences, 112(34), 10651-10656.
Professor Shaul Shalvi
Shaul Shalvi is an Associate Professor at CREED, the Centre for Research in Experimental Economics and political Decision-making at the University of Amsterdam. His main research interests are in experimental economics, moral psychology, cooperation and in particular behavioural ethics. Fig. 2. Four prototypical dyads. The horizontal axis represents the 20 trials; the vertical axis represents the die roll outcomes; an “O” represents player A’s report; and an “X” represents player B’s report. (A) A brazen dyad, reporting a “double 6” 20 times; (B) player A is brazen, player B appears honest; (C) player A appears honest, player B is brazen; (D) corrupt signaling. After mutual reports of 4 in the first five trials, A reported a 4 once more, but B replied with a 6, arguably to suggest to A that switching to higher numbers would be more profitable.
Path-breaking research on the role of the individual in war War has historically been waged by collective actors, yet the nature of armed conflict is changing, as normative and technological changes result in individuals playing an increasingly prominent role. Researchers in the IOW project are investigating the wider impact of this shift, as Professor Jennifer Welsh explains The course of history has been marked by wars waged by collective actors, referred to as states or ‘warring parties’, often causing significant collateral damage to civilian populations. However, the conduct of war is evolving, away from a practice dominated by groups to one that is becoming more individualised. “The two main factors driving this are normative and technological,” says Professor Jennifer Welsh. “They stem from the increased prominence of human rights norms, which has led some to argue that war no longer has its ‘own’ law and morality, but rather should be guided by our broader ‘ordinary’ morality that is built on the rights of individuals. In addition, individualisation arises from changes to technology that allow those who wage war to be more precise in their targeting.” On the one hand, this technological innovation enables actors to be morally progressive, in that precision technology helps to reduce collateral harms. But on the other hand, as Professor Welsh explains, individualisation is changing the nature of risk in war. “A weapon can now be fired from a location far from the actual theatre of conflict and precisely targeted at specific individuals, so it might not be necessary to commit large numbers of troops to achieve a strategic objective,” she points out. IOW project Based at the European University Institute in Florence, Professor Welsh is the Principal Investigator of the IOW project, an ERC-backed initiative bringing together researchers from different disciplines to analyse the impact of these changes.
Researchers in the project aim to investigate the impact of the shift towards individualisation on different actors in war, focusing on the post cold-war period. These actors include the militaries of nation states, international and security organisations, and humanitarian actors. “We’re investigating
the behaviour of these actors in response to different aspects of individualisation. For example, if there is an imperative to elevate the individual, how does this change the reasons for going to war, and how war is conducted? Does individualisation affect how militaries are trained, how they’re deployed, and what risks they take?”
outlines Professor Welsh. Humanitarian actors are also affected by the trend towards individualisation, claims Professor Welsh, often making it harder for them to “carve out a neutral space in the context of war.” In addition to mapping out the different ways individualisation plays out in contemporary armed conflict, a second dimension of the project’s research centres around the dilemmas that arise as a result of individualisation. Professor Welsh points to the changing role of peace-keeping forces as an example. “Peace-keeping was, for a long time, the practice of inserting armed but purely defensive forces between warring parties to monitor a ceasefire. Now, peacekeeping forces are sometimes inserted when armed conflict is still raging,” she explains. Moreover they may have a much more robust mandate and a fundamentally different role. “In many cases their mandate is not primarily to keep warring parties apart – it also includes the imperative to protect particular civilians from an imminent threat.” One result of this shift, suggests Professor Welsh, “is that peace-keeping operations may be negatively perceived by their host states as having abandoned impartiality. The claim could be made that peace-keeping forces are now either helping one side, penalising one side, or helping some civilians and not others.” Peace-keepers are therefore faced with a difficult balance: acting in ways that protect individual rights, while at the same time maintaining the support of the host state, whose consent is a necessary precondition for them to operate effectively.
Peace and justice? The on-going conflict in Syria offers an additional example of the dilemmas created by individualisation. In this case, the imperative to hold individual perpetrators of international crimes accountable – which stems from post Cold War advances in international criminal law – seemingly clashes with the objective of negotiating a peace to end the bloodshed. How does this ‘peace vs. justice’ dilemma get resolved by actors in armed conflict? “One of the key objects of study in the project,” Professor Welsh explains, “are the different ways in which tensions are addressed. In some cases, the solutions are ad hoc and case-by-case. In others, we see concrete institutional reforms that try to avoid the tensions or to minimise their impact. And, in other cases, there are efforts to reconcile two apparently conflicting goals by developing one overarching priority or concept. This latter strategy we often see used by international legal scholars or judiciaries.” In the case of Syria, Professor Welsh observes that “the UN Security Council’s power to refer situations to the International Criminal Court has not been acted upon. In May 2014, there was a veto on a resolution to do just that.”
IOW The Individualisation of War: Reconfiguring the Ethics, Law, and Politics of Armed Conflict Project Objectives
The research of our interdisciplinary team is directed at two main outcomes: • the first integrated conceptual framework for understanding individualisation (how it is manifest and the dilemmas/tensions to which it is giving rise); and • concrete recommendations for policy actors - both on how to respond to particular ethical, legal, or political challenges that arise from individualisation and on the likely longer term trajectory of individualisation.
One of the core reasons was that certain Council members believed it would be counter-productive to indict members of the Syrian administration, and that in fact it would be a barrier to resolving the crisis. Other states took a different view, and worked within the General Assembly to create the International, Impartial and Independent Mechanism (Triple I-M) to investigate possible prosecutions for war crimes. “It’s not going to deliver justice immediately,” Professor Welsh argues, “but the claim is that it might do over the longer term.” While the individual is playing an increasingly prominent role in armed conflict, collectives still matter, creating dilemmas that Professor Welsh and her colleagues are investigating. “What are the political ways of resolving these dilemmas, and what are the institutional and legal ways?” she asks. The project is devoting considerable attention to the jurisprudence that has developed over the last 15-20 years, that seeks to accommodate individuals’ claims for human rights – most notably detainees and soldiers – with the regime of international humanitarian law that applies in situations of armed conflict. Courts are grappling with the challenge of how to reconcile these two bodies of law, which has arisen as a consequence of this process of individualisation. This
demonstrates that individualisation is having an impact not just in the world of academia, generating debates among moral philosophers, legal scholars, and political scientists, but also affects ‘real world’ decision-making. A final dimension of the IOW project is an examination of how individualisation is being contested, and what shape this process might take going forward. “Not all theorists of armed conflict, and not all actors involved in armed conflict, believe that individualisation is a progressive move,” Professor Welsh argues. More specifically, the human rights norms that underpin significant aspects of individualisation are being challenged in multiple ways, as a result both of geopolitical shifts that see liberal states declining in their relative strength, and of push-back against attempts to elevate human rights in armed conflict. Professor Welsh and her team also observe contestation of the effects of technological innovation, particularly with respect to the increased use of ‘drones’ (unmanned aerial vehicles). “Technology has always been part of the changing character of war. But while that change has in some ways helped to reduce the severity of war, it has also arguably made the resort to the use of force easier and less costly. We are debating the pros and cons of that reality.”
The research leading to these results has received funding from the European Research Council under the European Union’s Seventh Framework Programme (FP/2007-2013) / ERC Grant Agreement n. . The European Commission funded the project through an Advanced Grant for the period 1 May 2014 to 30 April 2019.
The project is hosted at the European University Institute in Florence, with the University of Oxford as project partner and beneficiary, and assembles an international research team under the scientific guidance of Prof Jennifer Welsh.
Professor Jennifer M. Welsh European University Institute Via dei Roccettini 9 50014 Fiesole Italy T: +39 055 4685 436 E: Jennifer.Welsh@eui.eu W: https://iow.eui.eu Professor Jennifer M. Welsh
Jennifer M. Welsh is Professor and Chair in International Relations at the European University Institute (Florence, Italy). She was previously Professor in International Relations at the University of Oxford, co-founder of the Oxford Institute for Ethics, Law and Armed Conflict, and the Special Adviser to the UN Secretary General on the Responsibility to Protect (2013-2016). She is the author and editor of several books and articles on humanitarian intervention, the evolution of the notion of the ‘responsibility to protect’, the UN Security Council, and Canadian foreign policy. Professor Welsh sits on the Advisory Boards of the Peace Research Institute in Frankfurt, the Auschwitz Institute for Peace and Reconciliation, and the Global Centre for the Responsibility to Protect.
This path-breaking interdisciplinary project critically analyses the impact of the increased prominence of the individual in the theory and practice of armed conflict.
The stunning Atoms for Peace galaxy was given its nickname due to its superficial resemblance to an atomic nucleus, surrounded by the loops of orbiting electrons. “Atoms for Peace” was the title of a speech given by President Eisenhower in 1953, in an attempt to rebrand nuclear power as a tool for working toward global peace. Somewhat ironically this galaxy has had anything but a peaceful past — it was formed in a catastrophic merger between two smaller galaxies nearly 1 Gyr ago. Massive star clusters were formed in the merger. Credit: NASA & ESA, Acknowledgement: Judy Schmidt (Geckzilla)
Understanding the connection between galaxies and globular clusters Globular clusters can be found around almost all galaxies, yet questions remain about their formation and evolution and how they relate to their host galaxies. Recently, it has been found that the stars within globular clusters show chemical anomalies, not found in stars outside clusters. We spoke to Professor Nate Bastian about the Multi-Pops project’s work in studying globular clusters, which could lead to new insights into how galaxies are assembled. A type of
star cluster, globular clusters can often be observed in the night sky, and continued study of them could lead to new insights into the formation of galaxies. For around the past ten years, Professor Nate Bastian and his colleagues have been studying the formation and evolution of these clusters in nearby galaxies. “That includes the Milky Way, but also galaxies where we cannot resolve individual clusters into their constituent stars. In the local universe we can see, in detail, things like how they’re forming, how their properties depend on the local conditions, and how long they live,” he says. Globular clusters are formed fairly rapidly by astronomical standards, over a period of around one or two million years, the result of dense gases being brought together in large molecular clouds. “The star clusters that are formed could be very low-mass objects – they might have just a few hundred stars within them – or they could host 10 million stars, which would be a big globular cluster. The size of the cluster depends on the cloud properties themselves,” explains Professor Bastian.
E-MOSAICS Researchers in the Multi-Pops project, along with LJMU colleague Dr. Rob Crain and Dr. Diederik Kruijssen and his group in Heidelberg, now aim to use the information that has been gained about globular clusters to build a deeper understanding of their formation
and evolution. With E-MOSAICS, it is now possible to trace back the globular clusters to the extreme conditions under which they formed, and to see how globular cluster populations are shaped by the growth of their host galaxies. “We’ve taken what we’ve learned from these earlier studies, and used it to guide the development of a new suite of simulations called E-MOSAICS (Modelling Star Cluster Population Assembly in Cosmological Simulations within EAGLE). These simulations incorporate Kruijssen’s ‘MOSAICS’ model of cluster formation and evolution into the EAGLE simulations of galaxy formation, in whose development Crain played a leading role. The big leap of understanding is that the clusters that are forming today are essentially the same as the clusters that formed in the early universe,” says Professor Bastian. This is by no means fully established as fact in the field, yet the results of research so far broadly bear out the initial assumption. “With E-MOSAICS, we have found that we are able to broadly
reproduce the globular cluster populations that we see today, more than 9 billion years after their formation,” continues Professor Bastian. The simulations show that the peak of globular cluster formation occurred between 10 and 11.5 billion years ago, which was also the peak time at which stars formed in the universe, indicating that the formation of globular clusters is related to the formation of stars. “Our model suggests that globular clusters are not in fact particularly special,
Visualisation of the E-MOSAICS simulations. Ten Milky Way-like galaxies were chosen from the full box of the high resolution (“Recal”) EAGLE box. The main panel shows the dark matter distribution in the box, with yellow circles highlighting the positions of the galaxies resolved in the zoom-in simulations. The panels towards the right show for a single zoom-in simulation the gas density (top) and simulated optical images of the face-on (middle) and edge-on (bottom) views of the final galaxy. The five panels in the bottom row show the evolution of the gas density in the galaxy and its star cluster population from high redshift (z=10) to today (z=0).
but rather the culmination of the average star formation process, hence globular clusters are just tracing the star formation history of the Universe” says Professor Bastian. This contrasts sharply with other theories of globular cluster formation, which suggest that they were all formed even earlier in the Universe. These theories invoke special conditions only present in the early universe to form globular clusters, so that all globulars will have ages greater than 12 or 13 billion years. Professor Bastian says the project’s simulations show a different picture. “In our simulations, no special conditions are needed, in fact we find that some globular clusters are still being formed today,” he says. The E-MOSAICS project is jointly led by the LJMU team (including Drs. Joel Pfeffer and Rob Crain), Ms Meghan Hughes and the team of Dr. Diederik Kruijssen at Heidelberg University.
Abundance variations Another radical change in our understanding of globular clusters is that they have traditionally been thought of as the quintessential simple stellar populations (i.e., all stars within a cluster have the same chemical abundances and age, within some small tolerance). However, it’s now known that globular clusters are not in fact chemically homogenous, and that stars within a globular cluster have different chemical abundances; researchers in the project are also investigating the origin of these abundance variations, known as multiple populations. “In the vast majority of clusters iron is constant. But, we see star-to-star variations in very specific elements, in particular helium, carbon, nitrogen, oxygen and sodium,” explains Professor Bastian. “These variations are not random. For example, if a star is rich in sodium, it is poor in oxygen – if it is rich in nitrogen, then it’s also poor in oxygen.” Another branch of the Multi-Pops project is investigating the origin of these chemical anomalies and what insights can be drawn into globular cluster formation. This part of the project is observationally driven. Professor Bastian and his colleagues are using data gathered on globular clusters from the Hubble Space Telescope and the Very Large Telescope (VLT) in Chile, with the aim of testing different scenarios that have been put forward to explain these variations in abundance. “Certain predictions are derived from these scenarios, which we then test against the data,” he outlines. Some exciting results have been gained from this work. “The first is that none of the models work. So we have essentially ruled out all of the scenarios that have been put forward in the literature. This might seem disappointing, but it’s also exciting, as it opens up new avenues of research,” says Professor Bastian. “The other exciting finding is that we’ve seen a surprising age effect in the presence of these multiple populations. We see these chemical anomalies within all of the clusters above about 2 billion years old (Gyr), whereas we do not find this in clusters younger than 2 Gyr. However, the finding of chemical anomalies in a cluster as young as 2 Gyr is one of the strongest pieces of evidence that we have linking young and ancient globular clusters.” This part of the Multi-Pop project is being led by Ms. Silvia Martocchia (a PhD student at LJMU) and Drs. Ivan Cabrera-Ziri (Harvard) and Carmela Lardo (EPFL, CH). Future Directions A lot of the detail around these findings still needs to be filled in, and researchers continue to investigate the underlying causes behind the differences that have been observed.
Part of the Multi-Pops team (from left to right): Dr. Joel Pfeffer, Dr. Ivan Cabrera-Ziri, Dr. Carmela Lardo, Dr. Chris Usher, Ms. Silvia Martocchia, Dr. Sebastian Kamann and Prof. Nate Bastian (not pictured: Ms. Meghan Hughes, Ms. Hannah Dalgleish, and Drs. Maria de Juan Ovelar and William Chantereau).
Multi-Pops
Fulfilling the Potential of Globular Clusters as Tracers of Cosmological Mass Assembly
Globular clusters (GCs) are among the oldest luminous sources in the universe, bearing witness to the earliest stages of galaxy formation as well as galaxies’ evolution to the present day. While GCs have played a pivotal role in our understanding of the assembly of galaxies, their full potential remains unfulfilled due to our lack of understanding of how they form. One of the largest stumbling blocks has been the anomalous chemistry (both metallicity distributions and abundance patterns) of GCs relative to field stars within their host galaxies. This project turns the problem around, exploiting these differences to understand the co-evolution of GCs and their host galaxies.
The project is funded by an ERC Consolidator Grant and a Royal Society University Research Fellowship (both PI: N. Bastian). The work on “E-MOSAICS” is being carried out in collaboration with Dr Diederik Kruijssen, who is funded through an ERC Starting Grant and an Emmy Noether Independent Group Award, and Dr Rob Crain, who is funded by a Royal Society University Research Fellowship.
Professor Nate Bastian Royal Society University Research Fellow Head of Research - Astrophysics Research Institute Liverpool John Moores University T: +44 151 231 2933 E: N.J.Bastian@ljmu.ac.uk W: http://www.astro.ljmu.ac.uk/~njb/MultiPops_ERC.html W: http://www.astro.ljmu.ac.uk/~astjpfef/emosaics/
This will form an important part of Professor Bastian’s future research agenda, alongside running further simulations. “Together with Dr. Kruijssen’s group, we’ve been trying to simulate a Milky Way-type galaxy, and we’ve run well over 300 simulations, in which we have changed the parameters. We’ve explored parameter space, and how we implement the physics of cluster formation and evolution,” he outlines. “Every time we run a new simulation for E-MOSAICS, we can turn off different physical processes, in order to highlight which process is dominant. That’s really what we’re looking for here – what’s the major factor behind the observations that we see?” The next generation of cosmological simulations of galaxy formation will adopt higher spatial resolution and more detailed treatments of the interstellar medium. By incorporating these developments in E-MOSAICS, Professor Bastian’s team and colleagues at LJMU and Heidelberg will be able to probe deeper into the origin and evolution of globular clusters. “We’ll use those models, and insert our model on top of them,” he says. The broader goal in this research is to place globular cluster formation and evolution within the wider cosmological context, which is a long-held ambition in the field. “This has been tried many times in the past, with varying degrees of success, but this
study is breaking new ground,” says Professor Bastian. “We are following the formation of globular clusters – depending on the local conditions – and then we follow them through time. We trace where the globular clusters actually end up in the galaxy, so we have spatial information as well.” A number of collaborations have been established to investigate these predictions and compare them with observed data. While in the ideal case the simulations would correspond exactly with the observed data, discrepancies help researchers to improve the models further. “We’re trying to really push the models to their breaking point – to figure out exactly what’s going wrong and what’s going right – and then insert new physics,” explains Professor Bastian. On the observational side of the project, Professor Bastian plans to investigate the age effect that has been observed in the composition of globular clusters in more detail. “Our application for time with the Hubble Space Telescope next summer has been approved, and we also hope to get some time with the VLT in Chile,” he continues. “We have two relatively large proposals to explore the age effect in more detail. For example, what is the exact age? So we are going to sample that age range more finely, and try to figure out exactly when this multiple populations phenomenon comes in.”
©NASA, ESA, and Martino Romaniello (European Southern Observatory, Germany).
Professor Nate Bastian
Professor Nate Bastian gained his PhD from Utrecht University (The Netherlands) in 2005. He has held postdoctoral positions at University College London and the Universities of Cambridge and Exeter. He has been awarded an STFC Advanced Fellowship (now known as the Rutherford Fellowship) and a Royal Society University Research Fellowship, and spent two years as a Senior Scientist at the Excellence Cluster Universe in Garching, Germany. He is currently a Professor of Astronomy at Liverpool John Moores University.