EU Research Spring 2019
12 Years to save the Earth: An in-depth look at a startling climate change warning
Carlos Moedas on the importance of European innovation
Donald Trump to cut billions from research funding
Brexit latest: the effects on global science research and its funding
Air pollution a bigger killer than smoking
Disseminating the latest research under FP7 and Horizon 2020
Follow EU Research on www.twitter.com/EU_RESEARCH
Editor’s Note
The internet has changed the way we behave, the way we communicate, the way we buy things and the way we interact with each other. It is used by 55% of the world’s population, and for many people born into recent generations it would be a stretch to imagine life without it.
Initially it was perceived as the ultimate in connective freedom, transforming the speed at which we could access information, but recently we have seen it manipulated as a tool for influencing politics. It has also been linked with mental health problems, crime and bullying. All these darker developments burdened the inventor of the world wide web, Sir Tim Berners-Lee, who has set out to promote a contract for using the web, with the aim of thwarting fake news, the spread of hate and devious manipulation. It must be hard to see a creation that was meant to serve humanity turn into something of a monster. But Sir Tim Berners-Lee should not feel too burdened with guilt. If he had not invented the web, someone else would have. There is a sort of inevitability around successful innovation; as the saying goes, ‘necessity is the mother of invention.’ Our powerful imaginations conjure better ways of doing things and we build on each development in a predictable way.
As a seasoned editor and journalist, Richard Forsyth has been reporting on numerous aspects of European scientific research for over 10 years. He has written for many titles including ERCIM’s publication, CSP Today, Sustainable Development magazine, eStrategies magazine, and remains a regular contributor to the UK business press. He also works in Public Relations for businesses to help them communicate their services effectively to industry and consumers.
Way back in 1974, long before Tim Berners-Lee thought up the web, science fiction author Arthur C. Clarke stood in front of a TV camera in a massive room packed with bulky machines, which together made up an early computer. Confidently, he predicted not only the internet but how it would change our lives. Referring to the interviewer’s young son, who was with them in the giant computer room, he said: “The big difference when he grows up in the year 2001, he will have in his own house a console with which he can get all the information he needs for his everyday life, like his bank statements, his theatre reservations, all the information you need over the course of living in a complex modern society. This will be in a compact form in his own house … and he’ll take it for granted like a telephone.” The interviewer asked how it would affect us socially, with our whole life based around a computer. Arthur C. Clarke answered: “It will make it possible for us to live anywhere we like, any businessman could live anywhere on Earth and still do his business through a device like this.”
Innovation seeps into our lives because we strive to function more efficiently. If you see a slow process, a physically demanding task, a challenge we could manage without, any human problem, it will soon be dealt with through innovation, and that is a future prediction you can bet on. There will always be issues we need to guard against when technology and people meet in the middle, but nothing will change our unceasing drive to create the next best thing.
Hope you enjoy the issue.
Richard Forsyth Editor
Contents 4 Research News EU Research takes a look at the latest news in scientific research, highlighting new discoveries, major breakthroughs and new areas of investigation
10 MANGO Protein aggregation is an important factor in the development of many different diseases, now researchers in the MANGO project are investigating the underlying basis of the process, as Professor Joost Schymkowitz explains
13 PIPE Inflammatory bowel diseases and irritable bowel syndrome affect millions of people across the world, yet there is still no cure. We spoke to Doctor Nathalie Vergnolle about her work in investigating the physiology of the intestine
16 ChromAdict We spoke to Dr Genevieve Almouzni about the work of the ChromADICT project in investigating the general principles that control chromatin states, work which could lead to a deeper understanding of many pathological cases
18 MultiCO A relatively small proportion of young people choose science studies after leaving school. The MultiCO project aimed to heighten awareness of scientific careers through the introduction of career-based scenarios in lessons, as Professor Tuula Keinonen explains
21 Exchange Forensic DNA databases are an important tool for the control of crime, yet European nations have different approaches to regulating, gathering, using, and sharing forensic DNA data. This is a topic at the heart of Professor Helena Machado’s research
24 WALLWATCHERS Cell walls within a plant are highly dynamic, as they need to change shape to allow for growth and development. Professor Julia Santiago aims to dissect how the information that comes from a cell wall is integrated inside the cell
Fluid Highways Little is known about pore connectivity and how fluids flow through shale rock. We spoke to Dr Maartje Houben about her work in characterising fluid flow pathways in shale rock and the wider implications of her research
26 EPIFISH Effective breeding strategies are crucial to the sustainability of the aquaculture sector and its ability to meet growing demand for seafood products. Professor Jorge Fernandes tells us about his work in investigating the role of epigenetics in fish domestication
28 Climate Change Report Climate change is never far from the news, and debate continues about the extent and likely future impact of climate change. How is our climate changing? And what can we do to adapt?
By Richard Forsyth
32 DC FlexMIL Photons are an important resource in the development of quantum technologies, with researchers seeking to control and harness their properties. We spoke to Dr Michael Kues about his research into flexible on-chip light sources
35 SpinCaT Scientists are searching for new ways to control heat, charge and spin currents in nanostructures, topics central to the work of the SpinCaT priority programme, as Professor Christian Back explains
36 Market Design Market frictions currently limit access to beneficial technologies in some markets. We spoke to Professor William Fuchs about his research into these imperfections, which could lay the foundations for more effective regulation in future
37 MOEEBIUS Improving energy efficiency performance in buildings is a major priority for the European Commission. The MOEEBIUS project is introducing a totally new approach to reducing energy consumption in the building sector
40 Soft Photoconversion Conventional methods of making solar cells have some significant shortcomings which limit their effectiveness. Dr Micheál Scanlon is now exploring a new approach to solar energy conversion
42 DREEAM The DREEAM project is looking at energy efficiency in residential buildings on a larger scale than previously possible, which could make green renovations more feasible in future, as Rolf Bastiaanssen explains
The EC has set ambitious goals around retrofitting Europe’s housing stock to improve energy efficiency. We spoke to Maarja Meitern about the Revalue project’s work in investigating the relationship between the energy efficiency of a property and its value
Space Agency (Mars) With renewed interest in space exploration, and plans for a mission to Mars, the European Space Agency is set for an exciting future.
By Richard Forsyth
48 GALNUC Recent observations have shed new light on astrophysical dynamics and the behaviour of stellar systems, now Professor Bence Kocsis and his colleagues in the Galnuc project aim to build on these foundations
50 FIRST LIGHT Researchers in the First Light project are looking back into cosmic history, aiming to pinpoint the time at which the universe was bathed in starlight for the first time, as Professor Richard Ellis explains
53 DELPHI Researchers in the Delphi project are investigating the relationship between galaxy formation and reionization, while also addressing other major questions in physical cosmology, as Dr Pratika Dayal explains
56 GloBe There is a long history of migration between South Asia and Africa, but interactions intensified around the latter part of the 19th century. Dr Margret Frenz is investigating circular migration between South Asia and Africa in the GloBe project
60 HOM A lot of human behaviour is acquired through imitation, from language, to cultural tastes, to political opinions. Dr Nidesh Lawtoo spoke to us about his work in exploring the contemporary relevance of mimesis
61 JCR An increasing number of civil and criminal cases are resolved outside the courtroom, leading to a changing role for the judiciary. This is a topic of great interest to Professor Michal Alberstein and her colleagues in the JCR project
64 ChronHIB The early Middle Ages were a productive period in Ireland’s literary history, offering a window into the language used at the time. This is the focus of a lot of attention in the ChronHib project, as Professor David Stifter explains
66 DIGITALMEMORIES The disappearance of 43 students from the Ayotzinapa Rural Teachers college in Mexico prompted widespread protests. Professor Silvana Mandolessi is investigating the role of digital media in shaping how this case has been perceived and remembered
EDITORIAL
Managing Editor Richard Forsyth email@example.com
Deputy Editor Patrick Truss firstname.lastname@example.org
Deputy Editor Richard Davey email@example.com
Science Writer Holly Cave www.hollycave.co.uk
Acquisitions Editor Elizabeth Sparks firstname.lastname@example.org
PRODUCTION
Production Manager Jenny O’Neill email@example.com
Production Assistant Tim Smith firstname.lastname@example.org
Art Director Daniel Hall email@example.com
Design Manager David Patten firstname.lastname@example.org
Illustrator Martin Carr email@example.com
PUBLISHING
Managing Director Edward Taberner firstname.lastname@example.org
Scientific Director Dr Peter Taberner email@example.com
Office Manager Janis Beazley firstname.lastname@example.org
Finance Manager Adrian Hawthorne email@example.com
Account Manager Jane Tareen firstname.lastname@example.org
EU Research Blazon Publishing and Media Ltd 131 Lydney Road, Bristol, BS10 5JR, United Kingdom T: +44 (0)207 193 9820 F: +44 (0)117 9244 022 E: email@example.com www.euresearcher.com © Blazon Publishing June 2010
Cert no. TT-COC-2200
The EU Research team take a look at current events in the scientific news
Carlos Moedas says innovation vital for the future of Europe
The Commissioner for Research, Science and Innovation believes that science and innovation are vital for the future of Europe. Carlos Moedas, Commissioner for Research, Science and Innovation, has a number of important responsibilities. One is promoting the international excellence of research and science in the European Union (EU), as well as strengthening research capacities and innovation across all Member States. Another is ensuring that research funding programmes, in particular Horizon 2020, contribute to the European Commission’s jobs, growth and investment package.
Moedas says that education has the power to create a shift in society, and that today the same is needed from future political decision-makers. As we have seen, science and innovation are vital topics for the scientists, innovators and decision-makers of the future. Innovations such as the light bulb changed the world, and future innovations could have the same impact, but making sure we use the powerful tool of education is vital in such endeavours.
In addition, Commissioner Moedas is responsible for evaluating how EU-funded research can be used more effectively and for ensuring that proposals by the European Commission are based on scientific evidence. One way the many challenges faced by society can be met is to encourage private companies to apply research in this direction, which also creates more high-quality jobs. An example of the latter point came in early October 2018, when the European Commission and Breakthrough Energy signed a Memorandum of Understanding to establish Breakthrough Energy Europe (BEE). In essence, this is a joint investment fund to help innovative European companies develop and bring to market radically new clean energy technologies. Commissioner Carlos Moedas commented on the benefits of the BEE: “We are delivering on our commitment to stimulate public-private cooperation in financing clean energy innovation. The €100 million fund will target EU innovators and companies with the potential to achieve significant and lasting reductions in greenhouse gas emissions.” In a lecture at Sciences Po on 6th December 2018, Commissioner Moedas said that science and innovation are very important subjects for politicians today. He believes these topics are vital for the future of Europe, not only for scientists and innovators but also for the decision-makers of the future. Moedas then explained the link between the wider aims of the European Commission detailed at the beginning of this article and the policy area of science itself: “Jobs and growth are impossible without science at the centre of political priorities. They are about the direction that Europe will take in the future.” Moedas then underlined that in Europe, politics and science and innovation should work in the same way, by which he means they should strengthen each other to create the best outcome.
In the view of Moedas, three changes are needed in the science and innovation landscape of the future: (i) increased collaboration; (ii) more work at the intersection of disciplines; and (iii) more disruptive innovation.
© European Union
Post-Brexit research fund open to the world considered by UK Government
The potential loss of European funding has led the UK government to consider creating a new granting scheme. The UK government is considering creating an international research fund to fill a gap left by the loss of prestigious European Union funding after Brexit. Adrian Smith, director of the Alan Turing Institute in London, will lead a “major” project with the research community to look at establishing such a fund, UK science minister Chris Skidmore told a parliamentary science committee on 5 March. He said that such a fund, if established, would be open to international as well as British scientists. The move comes in response to concerns that, after Brexit, UK institutions may no longer be able to host prestigious European Research Council (ERC) grants, which scientists around the world can apply for and take up at an EU institution, or fellowships called Marie Skłodowska-Curie Actions that give researchers EU money to spend some time working in a lab in another country. Both funding streams are part of the EU’s major research funding programme, Horizon 2020, which ends in 2020, and are an important source of funding for UK researchers. UK scientists won 22% of ERC awards in the decade to 2016, and the Marie Skłodowska-Curie scheme draws thousands of overseas researchers to Britain. But it’s not yet clear whether UK-based researchers will have access to these schemes in the next EU research programme, Horizon Europe, which starts in 2021, because participation rules for non-EU nations are still being discussed. Skidmore said that the fact that the UK government is exploring an international research fund shows that despite Brexit, Britain “is not leaving its participation with its European scientific
partners behind”. “We have to look responsibly about what we do about ERC, what we do about those other grants that may not be covered, even in association to Horizon Europe,” Skidmore told the House of Lords Science and Technology Committee. Smith’s project is likely to last until the summer, although its exact details are yet to be agreed, said a spokesperson for Skidmore’s ministry, the Department for Business, Energy and Industrial Strategy. Skidmore said the assessment would feed into the department’s bid for cash as part of a spending review by the Treasury that informs how money will be allocated across the government in coming years. Skidmore also updated the Lords on his department’s preparations for a ‘no deal’ Brexit, the possibility that the UK crashes out of the EU without any trade agreements in place. In that scenario, EU research funding to Britain would cease overnight. The UK government has guaranteed that it would replace the money for existing EU research grants, and it is collecting information about who holds these grants. Skidmore said that around 6,700 of a total of about 8,200 UK-based grantees have registered with an online portal designed to allow UK funders to stand in for the EU if needed. The recipients of the 1,500 unregistered grants are likely to be in small businesses that are part of consortia receiving funding from Horizon 2020, and might not be aware of the funding’s source, he said.
Donald Trump proposes cutting billions from NIH funding in the USA
The Trump administration’s 2020 budget proposal suggests cutting $4.5 billion from NIH research funding. President Trump’s third budget request, released Monday, again seeks cuts to a number of scientific and medical research enterprises, including a 13 percent cut to the National Science Foundation, a 12 percent cut at the National Institutes of Health and the termination of an Energy Department program that funds speculative technologies deemed too risky for private investors. NIH would face a roughly $4.5 billion budget cut, according to an HHS document. Among the big losers, if Congress were to sign off on the budget request, would be the National Cancer Institute, dropping from $6.1 billion to $5.2 billion, and the National Institute of Allergy and Infectious Diseases, going from $5.5 billion to $4.75 billion.
The administration is highlighting its request for $1.3 billion for opioid and pain research “as part of the government-wide effort to combat the opioid epidemic.” The NSF, which funds roughly a quarter of all federally supported basic science and engineering research in the U.S., would see its budget fall from $8.1 billion this year to $7.1 billion in 2020.
NASA faces a modest cut, 2.3 percent lower than the agency’s 2019 funding, which was approved last month by Congress. The $21 billion for NASA is more than the Trump administration asked for last year, as administrator Jim Bridenstine pointed out Monday in a statement describing the FY2020 budget as “one of the strongest on record for our storied agency.” Bridenstine said the budget keeps NASA on track for putting humans on the moon again by 2028. But the proposed NASA budget does not include money for a new space telescope, WFIRST, which would look for distant planets and study the mysterious “dark energy” permeating the cosmos. Two Earth science missions aimed at understanding climate would be eliminated, as would an educational effort, the Office of STEM Engagement. The White House also sought to defer upgrades to NASA’s Space Launch System, a powerful new rocket that is still in development, and move some of its proposed payloads to other vehicles.
The Trump budget proposes to eliminate three environmental programs at the National Oceanic and Atmospheric Administration: Sea Grant, which supports environmental research on the coasts and in the Great Lakes; the National Coastal Zone Management grants, which provide incentives for states to restore and sustainably develop coastal resources; and the Pacific Coastal Salmon Recovery Fund, established by Congress 19 years ago to revive plummeting salmon populations in the Pacific Northwest.
The new budget request drew immediate criticism from the American Association for the Advancement of Science. “If enacted, the Trump administration’s proposed cuts to the fiscal year 2020 non-defense discretionary budget would derail our nation’s science enterprise,” said AAAS chief executive Rush Holt, a physicist and former Democratic congressman. Rep. Eddie Bernice Johnson (D-Tex.), the chair of the House Science Committee, called the cuts to science “unreasonably deep.” “This proposal is simply absurd and shows a complete disregard for the importance of civilian R&D and science and technology programs,” she said in a statement.
Trump has roiled the waters of the research establishment since he came into office, not only by embracing scientifically discredited theories and casting doubt on mainstream climate science, but also by proposing massive cuts to science and medicine programs funded by the federal government. His previous budgets have requested cuts to the National Science Foundation, the National Institutes of Health, the Centers for Disease Control and Prevention and the Energy Department. The leaders of the scientific and medical community were outraged. But Congress, which has the power of the purse, largely ignored the 2018 Trump budget requests and protected the agencies.
NASA’s budgets have been generally flat for years, adjusted for inflation, which has forced the agency to terminate major programs (such as the space shuttle) to begin new ones (such as building a new rocket and capsule that could explore deep space). The civilian space agency accounts for roughly half of 1 percent of the federal budget. But NASA has been relatively flush lately: under the budget recently passed by Congress, NASA got a 3.5 percent boost for fiscal 2019, to $21.5 billion. That’s 8 percent more than the $19.9 billion requested by the White House in its 2019 budget proposal.
This year the Environmental Protection Agency, one of the president’s favored political targets, was subject to some of the most severe cuts. Trump’s budget would reduce EPA funding by $2.8 billion, a 31 percent cut from its current budget. The budget describes the cuts as an effort to eliminate “lower-priority” EPA programs and return the agency to its “core mission of protecting human health and the environment.” The administration would eliminate the Global Change Research office, which helped produce the National Climate Assessment last fall, warning of growing impacts of climate change. The White House has proposed similar cuts at the EPA in the past two years, but even the Republican-led Congress refused to embrace the sweeping reductions. Now, the Democratic-led House is certain to reject the administration’s efforts to scale back the agency’s reach and ambition.
The White House request for $31.7 billion for the Department of Energy would be a cut of 11 percent if embraced by Congress. The administration seeks to reduce the department’s science budget from $6.6 billion to $5.5 billion. For the third year in a row, the administration is seeking to terminate the Advanced Research Projects Agency-Energy (ARPA-E), the Energy Department’s incubator of new technologies, which is dedicated to “high-potential, high-impact energy technologies that are too early for private-sector investment.” The president’s budget request says that killing the agency will “promote effective and efficient use of taxpayer funds,” and it adds that “positive aspects” of the agency will be integrated into other programs. Killing ARPA-E has long been a priority for small-government advocates, who think the private sector is fully capable of handling innovation. But ARPA-E, first proposed under President George W. Bush, has remained popular with Congress.
Scientists close to bringing mammoths back to life
Japanese scientists take a ‘significant step’ towards bringing the prehistoric giants back to life. The last woolly mammoth populations died out just over 4,000 years ago, but the prehistoric giants could soon be back and plodding about just as they did during the ice age. Scientists in Japan claim to have taken a “significant step” towards bringing the extinct species back to life, after they transplanted cells extracted from the carcass of a mammoth into a mouse, where they subsequently recorded positive biological activity. The cells were taken from the 28,000-year-old mummified remains of a woolly mammoth, named Yuka, found in Siberian permafrost in 2010. The animal, which died when it was about seven years old, is one of the best preserved mammoths known to science. The team extracted tissue samples from the animal’s bone marrow and muscle, which they described as “well preserved”. They then began searching for cell nuclei remains. In total, 88 nucleus-like structures were collected from the muscle sample.
This marks a “significant step toward bringing mammoths back from the dead”, researcher Kei Miyamoto, one of the study’s authors, told Japan’s Nikkei news outlet. “We want to move our study forward to the stage of cell division,” he added, but acknowledged “we still have a long way to go”. Most mammoth populations died out between 14,000 and 10,000 years ago. The last mainland population existed in the Kyttyk peninsula of Siberia until 9,650 years ago. But the species survived for another 5,000 years on Siberian islands, which became cut off from the mainland by retreating ice following the last ice age. The last known population remained on Wrangel Island in the Arctic Ocean until 4,000 years ago, well beyond the dawn of human civilisation, finally becoming extinct around the time of the construction of the pyramids of Giza in Egypt. There is no scientific consensus on the chief cause of the creatures’ demise, but climate change significantly reduced habitable parts of the globe for mammoths, and they were also hunted by humans.
Air pollution kills more people than smoking, researchers discover
Research finds polluted air is killing nearly 800,000 people a year in Europe, and urges the phasing out of fossil fuel burning. The number of people dying as a result of air pollution may exceed the number killed by smoking, a major new study suggests. German researchers estimate that as many as 8.8 million deaths per year globally can be attributed to dirty air. In Europe alone they estimate there are more than 790,000 additional deaths as a result, double the previous estimate, which did not properly account for the additional rates of cardiovascular disease. “To put this into perspective, this means that air pollution causes more extra deaths a year than tobacco smoking, which the World Health Organisation (WHO) estimates was responsible for an extra 7.2 million deaths in 2015,” said Professor Thomas Munzel, one of the authors from the University Medical Centre Mainz. “Smoking is avoidable but air pollution is not,” he added. Prof Metin Avkiran of the British Heart Foundation said: “Air
pollution is clearly a huge problem across Europe. We need to see WHO guidelines in UK law in order to drive decisive action to protect the nation’s health.” Penny Woods, the chief executive of the British Lung Foundation, said: “Toxic air doesn’t just cut lives short. It also seriously affects the health and quality of life of millions of people.” The scientists acknowledge there are large uncertainties in their early death estimates for Europe, which range from 645,000 to 934,000. Some deaths could have been misattributed to air pollution, but it is as likely that the true number of deaths could be even higher, they said. The effects of air pollution on infant deaths were not included, because the evidence is not yet as strong. The new work also only considered PM2.5 and ozone, and not other particles, nitrogen dioxide or other pollutants.
Third person ‘cured’ of the Aids virus Stem cell transplant makes people resistant to the virus, but the procedure is difficult and expensive.
The world has just heard about a person referred to as the “London” patient, who has no detectable HIV. While it is still too early to say whether he is cured, he is now only the second person in the world, joining the “Berlin” patient, to have no detectable virus anywhere in his body since he stopped taking antiretroviral drugs - HIV usually re-emerges within weeks when treatment is stopped. In HIV, there is no simple way to define cure, because HIV can hide in deep cells in the body for many years, and some people naturally have no detectable HIV in blood without treatment despite HIV continuing to grow and multiply in the body. For this reason, it is not yet possible to determine if a person has a “sterilising” cure, where HIV has been completely eliminated from the body. So an alternative, the “functional” cure, has been coined, where there is no evidence of HIV multiplying in the body and no evidence of the virus over a period without the use of ART (antiretroviral therapy). In a patient with a “functional” cure, the virus may reside in the deep cells (referred to as HIV reservoirs) but is not detectable with available laboratory tests. The “Berlin” patient is the only person in the world who has been functionally cured of HIV. This was achieved through a bone marrow transplant, a complicated and risky procedure similar to the one the “London” patient had about three years ago. This transplant replaces all the existing stem cells (that will become future blood cells) with a genetic variant that prevents HIV from entering these cells and infecting them. In scientific terms, HIV multiplies in CD4 cells by entering the cells through attaching to a co-receptor, which is usually the C-C chemokine receptor type 5 (CCR5). Some rare individuals, roughly one in 10,000, have a genetic mutation of the CCR5 gene.
Patients with the mutation, known as a delta 32 deletion (because 32 base-pairs are deleted from the CCR5 gene), are resistant to HIV infection because their CCR5 co-receptor is faulty. This mutation is harmless to people but lethal to HIV.
Both the “Berlin” and “London” patients required bone marrow transplants for blood cancers unrelated to HIV. The “Berlin” patient had leukaemia while the “London” patient had Hodgkin’s lymphoma. These bone marrow transplants, which were required for their cancers, provided an opportunity to see if bone marrow from donors with the delta 32 genetic mutation in the CCR5 gene could suppress the patients’ HIV infection. In short, the new blood cells in the “Berlin” and “London” patients, derived from their donor bone marrows, are not able to become infected with HIV because of the faulty CCR5 co-receptor. This is why HIV is no longer able to grow and multiply in their bodies. Bone marrow transplantation is a complicated and potentially life-threatening procedure and should only be done if there is a compelling medical reason other than HIV. Further, HIV-resistant bone marrow donors are rare. Hence, this procedure is not an option except in very unusual circumstances, and even then it has a high risk of failure. Regardless, the “Berlin” and “London” patients are providing us with important clues on the potential role of CCR5 genes in cure strategies and have inspired new research and advocacy toward an HIV cure. Globally, it is estimated that there were about 36.7 million people living with HIV and 1.8 million new HIV infections in 2017. South Africa is the worst-affected country in the world; about one in every five HIV-positive individuals in the world lives here. The world has made great strides in treating people living with HIV, with about 21.7 million people on antiretroviral therapy in 2017. With sustained treatment with antiretroviral drugs, HIV-positive people can have a lifespan similar to that of HIV-negative people. Besides treatment, substantial programmes are under way in South Africa to reduce the number of new HIV infections. Despite this, HIV continues to spread. South Africa is at the forefront of research on HIV prevention and vaccines and has an important research initiative to find a cure for Aids.
The Centre for the Aids Programme of Research in South Africa (Caprisa), in partnership with the South African Medical Research Council, the US National Institutes of Health, the national government Departments for Health and for Science and Technology, UCT and the National Institute for Communicable Diseases in Joburg, is undertaking research on HIV reservoirs. As a cure strategy, the team is using locally discovered, potent, broadly neutralising antibodies to neutralise HIV in the reservoir and render it non-infectious.
Recent promising data from monkeys indicate the potential value of this approach. Studies in humans are expected to start next year. At present, there is no cure for Aids. Treatment and prevention programmes need to continue with vigour. Research is our hope, as a long but essential road, to create better treatment and prevention strategies and eventually a cure for Aids.
Behind the rules of protein aggregation Protein aggregation is an important factor in the development of many different diseases; now the MANGO project is investigating the underlying basis of the process. This work could lead to new insights into neurodegenerative conditions and also open up opportunities to develop new therapies against several different diseases, as Professor Joost Schymkowitz explains. A protein is
first synthesised in the cells of our body as a chain of disordered amino acids, before it is folded to acquire its shape and structure. This process is not completely efficient, however, and it is thought that around half of all proteins never reach their folded state, but are simply degraded instead, which has knock-on effects. “These mis-folded states start accumulating and then stick together, in a process called aggregation,” says Professor Joost Schymkowitz, the Principal Investigator of the MANGO project. There are around 20,000 proteins encoded in the human genome, yet only 30 or so aggregate in a disease-causing way. Now Professor Schymkowitz and his colleagues in the project aim to probe deeper into the underlying reasons behind this. “Our project is concerned with understanding the rules of engagement of protein aggregation. Which two proteins will stick together and why? What are the rules behind this process?”
This process of protein aggregation continues throughout our lifespan, and over time, aggregated proteins are cleared from the body. However, as we age, that clearance process becomes less efficient, and the resulting accumulation of aggregated proteins can cause health problems. “If you look at post-mortem tissue of older people, you often find all sorts of aggregates. It seems like we accumulate aggregates as we age,” explains Professor Schymkowitz. Every type of dementia is essentially an aggregation-associated disease, characterised by the aggregation of different proteins. “They start aggregating in a specific part of the brain, which is where the symptoms arise,” says Professor Schymkowitz. “Alzheimer’s starts in the hippocampus, while Parkinson’s is associated more with the substantia nigra. Different parts of the brain are affected by the presence of these aggregates, as they spread from there to connected areas.”
By analysing the rules of engagement between proteins, Professor Schymkowitz hopes to gain new insights into how these conditions develop and progress. This could in the long run provide the basis for the development of novel, more effective therapies, a major motivating factor behind
the project’s work. “What you would like to be able to do is to look at a sequence like the ß-amyloid peptide associated with Alzheimer’s and say: ‘this transcription factor might stick to this peptide if you introduce it into a cell’,” outlines Professor Schymkowitz. This would give researchers a firmer basis on which to investigate the root cause of the disease. “It would give us a structured way of formulating hypotheses about what goes on in Alzheimer’s disease. We know that the ß-amyloid peptide is toxic, but we don’t understand why,” says Professor Schymkowitz.
MANGO project The hypothesis Professor Schymkowitz explores is that this toxicity is attributable to protein interactions. Researchers now aim to extract the sequence and structural determinants of co-aggregation, with the goal of developing a bioinformatics algorithm. The basis of aggregation activity is thought to be seeding. “When you add a small amount of pre-formed aggregates to a pool of fresh monomer protein, the addition of those seeds accelerates aggregation,” explains Professor Schymkowitz. By extracting aggregates from the brains of patients and injecting them into healthy mice, researchers can trigger the aggregation process. “The question is
whether only the protein identical to the one in the aggregate gets pulled into the aggregate? Or is other stuff also contributing, and perhaps modulating aggregation? That would be called cross-seeding,” says Professor Schymkowitz. This aggregation activity is not found along the entire polypeptide sequence; in fact, only small segments within a protein are really prone to aggregation. The composition of a region of between 5 and 10 amino acids in a protein plays a major role in determining whether that protein will eventually aggregate. “If another protein has that exact same sequence, you would expect them to co-aggregate,” outlines Professor Schymkowitz. Not many proteins have identical regions, but when cross-seeding and co-aggregation events are considered, then more possibilities for aggregation arise. Professor Schymkowitz and his colleagues are using a theoretical framework to investigate this in great depth: “Our theoretical framework is focused on the amino-acid sequence, and so are the computational models that we use.” The experimental work in the project largely centres around studying peptides and proteins both in vitro and in cells to gain deeper insights into protein aggregation. There are two main ways of approaching
this work. “One is to analyse systems that represent known aggregation diseases,” says Professor Schymkowitz. If the aim is to analyse co-aggregation in Alzheimer’s disease for example, then it is possible to work just with the purified Alzheimer’s ß-amyloid peptide, which will undergo aggregation in vitro. “We can then add fragments from other proteins that share a lot of sequence similarity to critical regions of the Alzheimer’s ß-amyloid peptide and simply ask the question – do they mix? Do they aggregate together? Or does ß-amyloid simply aggregate by itself and does it ignore all the stuff that’s around it? We can analyse that in vitro,” outlines Professor Schymkowitz.
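The idea that short shared segments drive co-aggregation lends itself to a simple sketch. The Python snippet below is purely illustrative and is not the project’s actual method: the sequences are invented, and real aggregation predictors score physico-chemical properties of sequences rather than relying on exact matches. It merely lists the 5-10 residue windows that two sequences have in common, the kind of candidate regions one might then test for cross-seeding in vitro.

```python
# Purely illustrative: find the short windows (5-10 residues) that two
# protein sequences share exactly, as candidate co-aggregation regions.

def shared_windows(seq_a: str, seq_b: str, min_len: int = 5, max_len: int = 10) -> set:
    """Return every subsequence of length min_len..max_len found in both sequences."""
    hits = set()
    for k in range(min_len, max_len + 1):
        # All windows of length k in the first sequence.
        windows_a = {seq_a[i:i + k] for i in range(len(seq_a) - k + 1)}
        # Keep the windows of the second sequence that also occur in the first.
        hits.update(seq_b[i:i + k] for i in range(len(seq_b) - k + 1)
                    if seq_b[i:i + k] in windows_a)
    return hits

# Invented toy sequences, for illustration only.
protein_1 = "MKVLAVLFWIAGQQNK"
protein_2 = "GSSAVLFWIAGTPLE"
for window in sorted(shared_windows(protein_1, protein_2), key=len, reverse=True):
    print(window)
```

A real predictor would also weigh how strongly each shared window favours the cross-beta structure of an aggregate, which is exactly the kind of sequence-to-energy mapping the project is trying to pin down.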
Synthetic models A second approach in the project involves creating synthetic models, where researchers essentially steer aggregation and look to induce the aggregation of bacterial proteins, or tumour proteins. This is an area in which significant progress has been achieved over the course of the project, opening up new perspectives on major contemporary problems like anti-microbial resistance, says Professor Schymkowitz. “Drug-resistant bacteria are a major concern at the moment, and we
Proteins contain aggregation-prone regions (indicated in red) that are normally buried inside the core of well-folded molecules (the blue and green shapes). When mutations or changes in circumstances lead the proteins to lose their proper organisation, these aggregation-prone regions may become exposed, causing the molecules to stick together. When two different proteins contain similar aggregation-prone regions, they can co-aggregate, causing a cascade.
The determinants of cross-seeding of protein aggregation: a Multiple TANGO
The key question that I aim to address in this proposal is how the beta-interactions of the amino acids in the aggregate spine determine the trade-off between the specificity of aggregation versus cross-seeding. To this end, I will determine the energy difference between homotypic versus heterotypic interactions and how differences in sequence translate into energy gaps. Moreover, I will analyse the sequence variations of aggregation-prone stretches in natural proteomes to understand the danger of widespread co-aggregation.
Programme Funding: Horizon 2020 / Sub Programme Area: ERC-2014-CoG / Project Reference: 647458 / From 01.06.2015 to 31.05.2020 / Budget: EUR 1 995 523 / Contract type: ERC-COG.
Project Coordinator, Professor Joost Schymkowitz Switch Laboratory VIB‐KU Leuven Center for Brain & Disease Research Department of Cellular and Molecular Medicine KULeuven T: +32 (0)16 37 25 70 E: firstname.lastname@example.org W: https://www.kuleuven.be/ samenwerking/lind/doc/2013-02-06-lindcv-joost-schymkowitz.pdf196857_en.html Khodaparast L, Khodaparast L, Gallardo R, Louros NN, Michiels E, Ramakrishnan R, et al. Aggregating sequences that occur in many proteins constitute weak spots of bacterial proteostasis. Nature communications. 2018;9:866. Gallardo, R., M. Ramakers, F. De Smet, F. Claes, L. Khodaparast, L. Khodaparast, J. R. Couceiro, T. Langenberg, M. Siemons, S. Nystrom, L. J. Young, R. F. Laine, L. Young, E. Radaelli, I. Benilova, M. Kumar, A. Staes, M. Desager, M. Beerens, P. Vandervoort, A. Luttun, K. Gevaert, G. Bormans, M. Dewerchin, J. Van Eldere, P. Carmeliet, G. Vande Velde, C. Verfaillie, C. F. Kaminski, B. De Strooper, P. Hammarstrom, K. P. Nilsson, L. Serpell, J. Schymkowitz and F. Rousseau (2016). “De novo design of a biologically active amyloid.” Science 354(6313).
Professor Joost Schymkowitz
need to look at new ways of killing them,” he stresses. The idea here is that if aggregation can kill neurons in the brain, then maybe it can be redirected to have a similar impact on drug-resistant bacterial proteins. “This approach has proved successful. You can cause massive aggregation in bacteria, and it has lethal effects,” continues Professor Schymkowitz.
This work has sparked a lot of interest, and some of the technology has already been patented and licensed to Aelin Therapeutics, a spin-off company founded in 2017. This is something Professor Schymkowitz plans to investigate further in the future. “We’re still exploring the application-side of things. We’re working on a number of examples in disease contexts,” he says. A lot of success has been achieved in applying this approach to bacterial infections and tumours, and Professor Schymkowitz says it could potentially be applied on a wider basis in future. “This synthetic system is a technology. So if we can induce the aggregation of bacterial proteins, or tumour proteins, this could be a means of causing problems for a pathogen, or a particular disease process,” he explains.
“We’re trying to learn from the underlying processes behind aggregation diseases, so that we can look into applying them to our benefit.” A lot of progress has been made in terms of possible applications, yet there is still more to learn about the underlying nature of protein aggregation, which will remain an important topic of research at the VIB Switch laboratory in Leuven, where Professor Schymkowitz is based. Researchers are continuing to gather data on the determinants and sequence of protein aggregation. “We are still very much engaged in understanding the details of what happens structurally. So we’re looking at what happens when these protein segments meet - which mismatches stop the interactions, which are tolerated, and why,” outlines Professor Schymkowitz. The goal is to develop an effective bioinformatics algorithm, which could greatly accelerate medical research. “The computational approach is faster than working experimentally. We want to have a good, precise predictor, so that experimentalists can focus on a narrower range of proteins,” explains Professor Schymkowitz.
The image shows cells expressing a target protein, alongside synthetic aggregates that are designed to enter the cell and cause the target protein to aggregate and lose its function. The background is a transmission electron micrograph of fibrillar aggregates of a synthetic aggregate that targets a tumour-specific protein.
Professor Joost Schymkowitz is a group leader at the Switch Laboratory of the KU Leuven-VIB Centre for Brain & Disease Research. He gained his PhD in 2001 from the University of Cambridge, working on protein folding at the Centre for Protein Engineering under the supervision of Laura Itzhaki, before moving to the European Molecular Biology Laboratory in Heidelberg.
Probing the constituents of homeostasis There is no known cure for inflammatory bowel diseases and irritable bowel syndrome, conditions which affect millions of people across the world. Researchers in the PIPE project are investigating the physiology of the intestine, which could lay the foundations for the development of new, more effective therapies, as Doctor Nathalie Vergnolle explains. The intestinal epithelium
is the first point of contact between the external world and the interior of the body, forming an important part of the mucosal layer. As the Principal Investigator of the PIPE project, Doctor Nathalie Vergnolle aims to investigate proteolytic homeostasis at mucosal surfaces, focusing specifically on the intestinal epithelium. “We’re looking at what kinds of proteases and inhibitors are produced and released by the intestinal epithelium, and under what circumstances,” she outlines. Homeostasis needs to be maintained at the intestinal epithelium so that it can function effectively, which is an important aspect of Dr Vergnolle’s research. “A large part of our work involves determining what are the constituents of homeostasis on the protease side, and also on the inhibitor side,” she says. This research holds important implications for the treatment of gastrointestinal diseases like inflammatory bowel diseases (IBD) or irritable bowel syndrome (IBS), chronic conditions that affect millions of people across the world. While the symptoms of IBD may be sporadic, and at times patients may experience remission, if left untreated it can have serious effects. “If we don’t prevent the recruitment of inflammatory cells it can get worse, to a point where it can even be fatal,” says Dr Vergnolle. “IBS is a low-grade inflammation pathology, which is mostly characterised by pain. It’s not a lethal disease, but it is extremely painful. There haven’t been many studies looking at IBS and how it develops as people age, so it’s difficult to say
whether it gets worse over time. But patients with the condition suffer all their lives.” A range of treatment options are available to mitigate the effects of these conditions, including immunosuppressants (for IBD), probiotics and antibiotics, while in some cases lifestyle changes such as dietary modifications might be recommended. While these measures may prove effective to a degree, Dr Vergnolle says current therapies have some significant limitations. “What is missing in existing therapeutic options is the ability to help repair tissues, and to help these tissues regain normal functioning,” she explains. The project will make an important contribution in these terms by developing a deeper understanding of the physiology of the intestine, which could help lay the foundations for the development of new therapies in future. “By understanding
how homeostasis is disrupted in those diseases, we could then understand how we could treat them more effectively,” says Dr Vergnolle.
Tissue samples The precise nature of homeostasis is likely to vary according to individual physiology, yet researchers have nevertheless still been able to gain more general insights through analysis of tissue samples from patients with IBD or IBS. Homeostasis can be broadly thought of as a kind of healthy equilibrium, and disrupting it can lead on to problems. “A broken equilibrium can mean that you have too many of some things, or not enough of others,” explains Dr Vergnolle. By analysing and characterising tissue samples, researchers aim to learn more about proteolytic homeostasis. “We try to understand the role of the factors that are either up-regulated or
Figure 1: Proteolytic Homeostasis warrants the Control of Mucosal Functions.
down-regulated, depending on what’s there and what’s absent, in patients suffering from IBD or IBS,” says Dr Vergnolle. “From this point, we can then look towards thinking about ways to interact with those factors, in order to gain new therapeutic options.” Researchers also used cell lines and animal models of diseases over the course of the project, as Dr Vergnolle and her colleagues probed deeper into the underlying nature of proteolytic homeostasis and the specific factors which lead to disruption. Researchers have observed that stimulating the intestinal epithelium leads to the release of different kinds of proteases, and so disturbs proteolytic homeostasis. “It does that under different circumstances, for example by an inflammatory stimulus, or an infectious stimulus. Even stress hormones can induce this release of proteases, that might interfere with homeostasis,” outlines Dr Vergnolle. “This is essentially the foundation of the idea that a disturbed epithelium reacts to that disturbance by releasing different proteases and protease inhibitors.” By investigating how the intestine reacts to the presence of these specific proteases and
inhibitors, Dr Vergnolle hopes to learn more about the physiology and pathophysiology of gastrointestinal diseases. The next step would be to build on these foundations by proposing new therapeutic interventions. “This could be drugs. It could be specific inhibitors, or maybe antibodies that block protease activity,” says Dr Vergnolle. A number of potential targets have been identified over the course of the project, which could
open up new therapeutic possibilities in future. “We’ve defined some proteases that are new targets for drug development. Also, some of the inhibitors that we found were very helpful in terms of regaining a healthy situation in tissues,” continues Dr Vergnolle. Researchers are investigating these different inhibitors and antibodies in great detail, and looking to assess whether they could have a therapeutic benefit for patients, while there are also many more possible targets that haven’t been thoroughly assessed yet. The wider goal in this research is to improve treatment of IBD and IBS. “With IBS, it’s currently more about managing the condition than treating it. Patients may have associated symptoms, like either diarrhoea or constipation. We can intervene in these circumstances and prevent diarrhoea
or constipation, but the pain symptoms have not really been addressed - that’s why we are investigating this area,” says Dr Vergnolle. By building a deeper understanding of the physiology of the intestine, Dr Vergnolle hopes to lay the ground work for improved treatment in future. “From there we hope to have the ability to propose new, more effective therapeutic interventions,” she says.
Physiology of the Intestine: Proteases from the Epithelium
The aim of the PIPE project is to: 1) characterize the protease/anti-protease balance issued from the intestinal epithelium; 2) study the impact of intestinal epithelium-derived proteolytic machinery on epithelial physiological functions; and 3) study the impact of intestinal epithelium-derived proteolytic factors on enteric nervous system activation. It is hoped that the results of this study will lead to the identification of previously unknown actors in intestinal physiology and pathophysiology, and of the neuro-epithelial crosstalk in the gut. Figure 2: Ruptured Proteolytic Balance impairs Mucosal Functions.
Funded under: FP7-IDEAS-ERC • Overall budget: € 1 287 000
Nathalie Vergnolle Institut de Recherche en Santé Digestive Inserm - INRA - Univ Tlse3 - INP ENVT CHU Purpan - Place du Docteur Baylac CS 60039 31024 TOULOUSE Cedex 3 T: +33 6 78 44 13 61 E: email@example.com W: https://www.inserm.fr/rechercheinserm/portraits-chercheurs/laureats-erc/ nathalie-vergnolle-bacteries-alimentairespour-soigner-mici Nathalie Vergnolle Figure 3: The Intestinal Epithelium is a Major Source of Proteases and Inhibitors Controlling Mucosal Functions.
Dietary changes A lot of attention in the project has been focused on drug development, yet a deeper knowledge of intestine physiology would also help researchers assess the impact of diet, an important factor in both IBD and IBS. Dietary changes can mitigate the symptoms of both conditions, yet the picture in this area is not entirely clear. “Changing diet might be helpful in some cases, but this is something that we don’t fully understand. There is no clear indication of the dietary changes that each individual patient should make,” explains Dr Vergnolle. One question Dr Vergnolle and her colleagues have been looking at is how proteolytic homeostasis is affected by dietary interventions. “Diet might change homeostasis, but we cannot say with complete confidence that it changes it in a particular way,” she continues. This research could also hold broader relevance beyond the initial focus of the PIPE project. Mucosal surfaces within the body are often very much alike in the way they react to aggressions and damage, which could represent another avenue of
investigation in future. “Similar events could take place at these other mucosal surfaces, like the lungs, bladder, or vagina, so similar targets might be found there. It’s difficult to say this with confidence right now though, because not enough research has been done on those other mucosal surfaces from a proteolytic homeostasis point of view,” outlines Dr Vergnolle. Disruption of proteolytic homeostasis might also have an effect on other pathologies. “We focused on inflammatory diseases and pain in the project – but there might also be some important effects on other pathologies, such as cancer, upon long-term disruption of the mucosal proteolytic homeostasis,” says Dr Vergnolle. There are many possible avenues of research arising out of the project’s work, both in terms of developing new therapies to treat disease and more fundamental investigation into mucosal surfaces. For the moment though, Dr Vergnolle says the priority is to characterise the functions of all the different targets. “We will carry on investigating each of the targets that we have identified,” she says.
Nathalie Vergnolle, research director at INSERM, has been head of the Research Institute for Digestive Health since 1 January 2016. She also leads the team “Pathophysiology of the intestinal epithelium.” Before conducting her research in Toulouse, Dr Vergnolle spent 10 years at the University of Calgary (Canada) in the Department of Pharmacology. Her work has highlighted the role of several mediators involved in inflammation and pain. In collaboration with researchers from INRA and the Pasteur Institute, her team produced bacteria expressing a human protein, elafin, that can protect the body from intestinal inflammation. Her research has earned her international recognition, is supported at European, national and regional levels, and has led to very active academic and private collaborations.
Deciphering the mechanisms of chromatin states Our genome is packaged into chromatin in a precise manner, which plays a critical role both during development and throughout our lifespan. We spoke to Doctor Genevieve Almouzni about the work of the ChromADICT project in investigating the general principles that control chromatin states, work which could lead to a deeper understanding of many pathological cases. The primary role
of chromatin is to compact the genome in the cell nucleus through a complex of DNA and proteins. Histones are key proteins, as they act as an elementary ‘brick’, while the histone chaperones act as ‘escorts’ of the histones that help to build up chromatin. In addition, chromatin organization also helps to regulate genome function, both for its expression and stable transmission through cell divisions. Chromatin structure changes over time and in different tissues; this is a critical factor in biological development over the course of the lifespan, a topic central to the work of the ChromADICT project. “We are trying to look at the general principles that control chromatin states with their dynamics during the cell cycle, but also in transition from stem cells to a differentiated state,” says Dr Genevieve Almouzni, the project’s Principal Investigator. Using a genus of African frog called Xenopus as a model system, Dr Almouzni and her colleagues are looking deeper into chromatin organization. Xenopus is well-suited to this purpose, as researchers can easily access the eggs laid by the
CHROMADICT Chromatin Adaptations through Interactions of Chaperones in Time Dr Geneviève Almouzni Research center at the Institut Curie Head of the Chromatin Dynamics team Member of the Science Academy in France Co-Chair of the LifeTime Initiative T: +33 1 56 24 67 01 E: firstname.lastname@example.org W: https://lifetime-fetflagship.eu
Dr Geneviève Almouzni is head of the Chromatin dynamics team at the Institut Curie, a position she has held since 1999. She is a world leader in understanding genome organization and function during development and disease, particularly in cancer. She combines biochemistry, cell biology and physical approaches with advanced imaging to explore chromatin dynamics.
female in large quantities, and these eggs can also be manipulated at high levels of precision to monitor all stages of development, starting from the very first cell up to the formation of a complete organism. “We manipulate the network of histone chaperones and histone variants to explore how they impact on each other,” explains Dr Almouzni. Interactions between histones - the major protein components of chromatin - and their chaperones
are an important factor in controlling chromatin assembly and the formation of specific domains. “The placement of the histone variant CENP-A at the centromere is key to defining a unique organization. For other variants, their placement is also indicative of genome function, but the details of how still have to be characterized. This is part of the project,” continues Dr Almouzni. The dosage of histone variants is also an important consideration in questions around chromatin organization, alongside the position in which they are placed in the chromatin structure, and at what point during development or the cell cycle, a topic researchers are paying a lot of attention to in the project. Storage of a maternal pool of histones in oocytes has been observed in the amphibian model, and it is accompanied by the concomitant presence of the appropriate chaperones. “Histones and chaperones come hand in hand,” explains Dr Almouzni. The organisation of chromatin changes over time, something Dr Almouzni and her colleagues aim to investigate in the project. “We wish to explore how the distinct features evolve over time at various scales and under various conditions during the cell cycle, upon genotoxic stress, and in a differentiation system,” she says.
This research holds wider relevance to our understanding of certain diseases involving chaperone-variant dosage imbalances. Changes in chromatin dynamics have been associated with a number of pathological cases including cancer, as well as neurological and hematological diseases, underlining the importance of the project’s work. “Our findings will undoubtedly impact these areas,” says Dr Almouzni. The project itself will reach the end of its funding term in 2021, yet this will not mark the conclusion of Dr Almouzni’s research in chromatin biology, and she hopes to continue her work in this area over the coming years. “I believe that our discoveries will be a foundation to build new proposals,” she stresses. “We hope to provide a springboard for future research, not only for us, but for others too.”
Educating students on the importance of science
Science teachers discussing good practice in using scenarios.
A scientific qualification can be the passport to an interesting and well-paid career, yet only a relatively small proportion of young people choose science studies after leaving secondary school. The MultiCO project aimed to heighten awareness of scientific careers through the introduction of career-based scenarios in lessons, as Professor Tuula Keinonen explains.
The employment market increasingly places a premium on technical knowledge, and scientific skills are essential to addressing major contemporary social challenges, including climate change, food security and water sustainability. Despite this wider importance, only a relatively small proportion of young people study scientific subjects after leaving secondary school, an issue at the heart of the MultiCO project. “We aim to change this situation – we want to make young people more aware of scientific careers,” says Professor Tuula Keinonen, the project’s Principal Investigator. A key part of this is understanding why more students don’t choose scientific subjects in the first place, whether it’s because they view them as too hard, too abstract, or they aren’t aware of the opportunities that a science qualification could open up to them in future. “It might be that some students find science not as interesting as other subjects,” continues Professor Keinonen.
MultiCO project
These are perceptions that the MultiCO project aims to change by investigating a new approach to teaching scientific subjects, including physics, chemistry, biology and geography, and evaluating its effectiveness. The aim is to introduce specific scenarios
to teaching, which are designed to enhance awareness of the wider opportunities open to those with a scientific background and technical knowledge. “Our scenarios are career-based, so students are more aware of what different career opportunities are available to people with scientific skills,” explains Professor Keinonen. Researchers are
looking to assess the impact of this approach on the attitudes of secondary school students aged 13-15 in five European countries: the UK, Finland, Estonia, Germany and Cyprus. “We are doing longitudinal studies – so we follow the students in each country over three years, and look at their attitudes towards science,” says Professor Keinonen. The wider goal here is to engage students’ interest in science more generally and highlight the importance of technical knowledge in the employment market, whether that’s in engineering, finance, academia or any other area. Science teaching will by necessity always
The MultiCO consortium during a brief break in the project meeting in Cyprus.
involve introducing abstract concepts and ideas; however, relating them to contemporary issues and possible career opportunities could help engage students more effectively, believes Professor Keinonen. “It’s important to present information and ideas to students in context, in a way that engages their interest,” she stresses. More than 50 scenarios have been developed in consultation with industry, companies, communities, and other stakeholders as well as with teachers and students. Now researchers are looking to assess their effectiveness in terms of the project’s wider goals, comparing them to a group of students who didn’t experience these scenarios. “Different career opportunities are represented in our scenarios,” continues Professor Keinonen.
Many secondary school students are starting to think about their future careers, and that’s often an important factor in an individual’s decision on what to study, but a simple curiosity in the subject is also essential. The project is therefore not just about publicising future career opportunities and potential financial rewards, but also stimulating students’ individual interest in the topic and instilling a spirit of inquiry. “The primary aim in the project is to engage students’ interest in science,” says Professor Keinonen. Teachers of course are responsible for teaching, so they have played a central role in deciding on the content of lessons, and how the scenarios should be set up, while the students have also been involved. “Teachers have been considering scenarios and working on interventions, setting up problems for students to solve. In some cases, we have had input from companies who have presented specific technical problems,” outlines Professor Keinonen. “The hope is that the students are more deeply invested in this work, as they have a problem to resolve.” A high degree of collaboration is required to address demanding technical tasks and major contemporary challenges. Addressing issues around climate change isn’t just about technical knowledge for example, but also the ability to work collaboratively in a team, share knowledge, and identify the right approach; Professor Keinonen says this is reflected
In Botanical Garden Tartu after project meeting session.
in the project. “Social skills, creativity, collaboration and reasoning are seen as important,” she stresses. This runs contrary to the common perception of scientific research as quite a solitary activity, demanding intense concentration and personal seclusion, so alongside developing their technical knowledge, the project aims to encourage students to work together effectively. “The students work in close collaboration in making inquiries. Different scenarios require different kinds of collaboration,” continues Professor Keinonen. “For example, in one scenario students were asked to contact people in a company and interview them.” This approach also places new demands on teachers, who may have become used to a particular methodology, and may now have to adapt to a new approach. While this may be
disruptive to some degree, education always needs to evolve in line with wider social change, and Professor Keinonen believes many teachers are open to adopting new approaches. “There are many teachers who want to diversify their teaching,” she outlines. Secondary school students are approaching the end of their time in compulsory education, so while a teacher’s primary responsibility is of course to teach, it may be helpful if the curriculum also gives them the flexibility to play a role in advising students about possible career options, together with careers advisers. “We looked at this issue in the project and it is important to provide students with balanced advice, to guide them towards careers that suit their skills and personal strengths,” says Professor Keinonen.
Impact of interventions
The project has only recently concluded, and while it’s too early to draw definitive conclusions, the results so far are positive, with researchers finding that these interventions helped to engage students’ interest. Researchers found that students who experienced the interventions showed more interest in science – or at least their interest did not decrease – during their time in secondary school. “We asked students about their plans for their future careers, and saw a significant increase in interest in science-related careers,” outlines Professor Keinonen.
Discussions about possible scenarios in the Tartu project meeting.
MULTICO Promoting Youth Scientific Career Awareness and its Attractiveness through Multi-stakeholder Co-operation
The project aimed to identify modern scientists’ work/career stories; determine and analyse perceptions, related to scientific careers, among different stakeholders and students’ perceptions of careers and working life skills; design a collection of scientific career-related scenarios and determine students’ views related to the value of these scenarios in promoting science education; obtain evidence on students’ interests and experiences gained and career choices.
The MultiCO project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 665100. • Overall budget: € 1 393 750
Tuula Keinonen University of Eastern Finland Joensuu Campus The School of Applied Educational Science and Teacher Education PO Box 111, 80101 Joensuu T: +358 50 528 8818 E: email@example.com W: www.multico-project.eu
Professor Tuula Keinonen
The MultiCO consortium in a break at the Botanical Garden during the Bonn project meeting.
There were some significant variations within this however, with differing levels of interest in different subjects; biology was the most popular subject in Finland and Germany for example, while students in the UK showed a high level of interest in scientific careers. “Interest in science increased most in Finland – but it also increased in the other countries. We found a high number of students in the UK were considering studying scientific subjects,” continues Professor Keinonen. These findings do not point conclusively towards a need to change the way science is taught in schools, but it remains important to consider how to stimulate the interest of students in science, particularly given its importance to the wider economy and major social challenges. While this is a complex area, and there is no one-size-fits-all solution, Professor Keinonen says the project’s research could inform teaching methodologies. “We will make
some recommendations about what can be drawn from each scenario to help stimulate students,” she outlines. Further research is required to build a more complete picture, so Professor Keinonen is keen to gather more information on the students’ attitude towards science. “We would like to follow the students after a year, and assess the extent to which their attitudes have changed,” she says. “Our goals are evolving.” There are also other possible avenues of research arising out of the project’s work, and Professor Keinonen is looking to explore the possibility of establishing a successor initiative to build on what has been achieved so far. The wider picture in this is an awareness of the economic and social importance of a strong science and research base, which is reflected by the strong level of commercial interest in the project. “Many companies are interested in promoting scientific education,” stresses Professor Keinonen.
Coordinator Tuula Keinonen is Professor in Education at the University of Eastern Finland and the head of the School of Applied Educational Science and Teacher Education. Her research focuses on teaching and learning approaches which promote students’ interest towards science and science careers as well as on education for sustainability.
One of the MultiCO project meetings was held in the Tartu University.
Sharing data in the control of crime
Forensic DNA databases are an important tool for enhancing transnational cooperation in the control of crime, yet European nations have different approaches to regulating, gathering, using, and sharing forensic DNA data. We spoke to Professor Helena Machado about the Exchange project’s work in investigating the ethical, social, political and operational issues around transnational sharing of DNA data in the EU.
The Prüm convention is a central pillar of efforts to strengthen police and judicial cooperation across Europe in the fight against terrorism and cross-border crime, enabling signatories to exchange DNA profiles and other information from national databases. However, while DNA databases can play an important role in identifying, exonerating and convicting suspects, countries may have very different approaches to gathering, storing and using this data. “For example, the French DNA database has grown quite a lot in the past five years, and around 70 percent of the profiles stored in their national DNA database are from suspects. By contrast, the national DNA database in Portugal cannot hold profiles from suspects – it can only hold profiles from individuals convicted of a crime and who are serving a sentence of three years or more,” explains Professor Helena Machado.
Exchange project
As the Principal Investigator of the Exchange project, Professor Machado is probing deeper into the issues around sharing DNA data, including concerns that collecting more data
on citizens will lead to the erosion of civil liberties, as well as understanding the role of DNA evidence in the criminal justice system. This work is built on two distinct yet complementary approaches. “One part of our work involves conducting interviews in all the EU member states connected to the Prüm network,” outlines Professor Machado. The Prüm convention was initially signed by seven EU Member States in 2005 and currently 24 member states are operating in the Prüm network. “We have almost finished our empirical study: we have already conducted interviews with Prüm national contact points in 22 EU Member States,” says Professor Machado. A number of interviews have also been conducted with different stakeholders related to the uses of DNA technologies in the criminal justice system, including representatives of the police and judiciary, as well as forensic geneticists and legal specialists. This formed the basis of a comparative study looking at Germany, the Netherlands, Poland, Portugal and the UK, assessing the views of professionals involved
in criminal investigations about the risks and benefits of the Prüm convention. “We’re looking at views on the value of DNA in criminal investigation. One interesting point we’ve found is that professionals working in forensic laboratories tend to have more positive expectations about the impact of DNA evidence in criminal investigation than the criminal investigators themselves,” says Professor Machado. The staff from national contact points were typically more cautious about the potential impact of the uses of DNA profiles as evidence in an investigation. The individuals from contact points tend to see DNA profiles more as a source of intelligence than evidence. “In general, criminal investigators tend towards the view that traditional forms of criminal investigation are still more important than DNA profiles,” explains Professor Machado. While DNA evidence may be just one piece of the picture, forming part of a case, it is often the most heavily highlighted by the media. “If there is a line in the criminal investigation involving DNA evidence, then the media will tend to emphasise that. There is often a much
more complex and nuanced scenario in which other pieces of evidence were assembled and used,” says Professor Machado. There are however other cases in which Professor Machado says DNA evidence can be particularly crucial. “DNA can be very important in certain criminal cases where there is a clear picture of biological samples left at crime scenes, or on victims’ bodies, as in the case of sexual crimes,” she outlines. A well-maintained, rigorous DNA database might help police rapidly identify the perpetrator in these kinds of cases, yet the expansion of data collection, storage, and analysis that enables the rapid sharing of resulting information must be balanced with concerns around protecting individual privacy and the rights of citizens. “Sometimes there is a tension between security and the values of individual privacy and the presumption of innocence,” points out Professor Machado. “This is related to the issue of social control and the technological apparatus involved, which holds important implications for our current understanding of genetic privacy and citizenship.” Researchers in the project have also looked at the level of public engagement in discussions around the risks and benefits of the Prüm convention, and the debate about the use of technological systems to monitor citizens. Public legitimacy is essential to the use of DNA databases, a topic central to Professor Machado and her colleagues’ work in the project. “In our empirical work we investigated the different national
positions in relation to the public debate about forensic DNA databases and public engagement,” she outlines. This is a pressing issue in some countries, as governments seek to improve security while also protecting personal privacy. “Citizenship was addressed in terms of public engagement and public trust on the one side. Then on the other, we considered the balance between privacy and security in different jurisdictions,” says Professor Machado.
Privacy and security
The relationship between privacy and security varies according to the legal traditions in different European countries. For example, if the criminal DNA database in one country is under the aegis of the police, while in another it is regulated by the judiciary, then they will have different rules and conventions around sharing data. “For example, Belgium is more restrictive in sharing data than France. In recent years, France has moved to a position in which police forces are very active in investigating cross-border crime,” explains Professor Machado. “Some of our interviewees claim that the police will exert pressure to get information as quickly as possible, while the judiciary has a tradition of collaboration, which is much slower and which has more safeguards with respect to the exchange of data.” There is a tension here between different traditions and professional cultures, while the political circumstances are another important consideration. For example, in countries where terrorism is a major domestic concern, there may be more of an emphasis on sharing information rapidly. “They will make more effort to speed up information sharing processes and accelerate investigations of serious crimes,” says Professor Machado. The police in some countries have also acquired new investigative tools, partly in response to growing concerns around the level of serious criminality. “One clear example is the use of forensic DNA phenotyping. This is a technology that has the potential to allow scientists to infer physical appearance from DNA samples and to provide intelligence to the police about the probable externally visible characteristics
EXCHANGE Forensic Geneticists and the Transnational Exchange of DNA data in the European Union: Engaging Science with Social Control, Citizenship and Democracy
of the suspect – like hair, skin, and eye color,” explains Professor Machado. The use of forensic DNA phenotyping is highly controversial in many countries, as observers argue that it represents a serious risk of discrimination against vulnerable ethnic groups when such technologies generate statements about the likely race or ethnicity of suspects. However, it is in use in some jurisdictions and this represents a significant shift in forensic genetic technologies, says Professor Machado. “This is just one instance of the differing positions of stakeholders across European countries on the type of data that can be held on citizens and the way in which it can be used,” she says. There are also many other points of divergence. “Stakeholders across European countries have different positions and priorities in relation to the fight against crime. They also have different levels of economic resources, different institutional arrangements related to the incorporation of science at the service of justice, as well as diverse traditions related to regulation, protection of data, and ethical oversight of criminal DNA databases,” stresses Professor Machado. A transnational oversight body may be required to deal with the complexities and differences arising from the way data is shared between countries under the Prüm convention, believes Professor Machado. An oversight body may help to assess the
efficiency and efficacy of the data exchange, but could also be approached in situations of irregularities which affect data-subjects – those individuals whose data is exchanged. It is up to each country to decide the basis on which they will share data automatically; however, tensions between countries may emerge once a match has been identified. “When there is a DNA match then a case proceeds according to national legislation and judicial practices. Some tensions may arise between countries which are more eager to have quick information exchange – and countries that do not necessarily prioritise exchanging data for those trans-national cases,” outlines Professor Machado. The project will make an important contribution in these terms by helping to build the foundations of shared practices at a transnational level. National police and judicial forces need to share information to deal with cross-border crime and terrorism, underlining the wider importance of the project’s work. “It is absolutely essential to establish transnational practices and regulations with respect to the best practices to conduct this kind of sharing and exchanging of data,” stresses Professor Machado. “It is really important to continue with this dialogue between different professions and disciplines. In the project we hope to obtain results that inform governance and policy-making.”
The EXCHANGE project seeks to address the challenges to citizenship, democracy and social control posed by technological systems of surveillance and control of criminality and terrorism. The EXCHANGE project focuses on the particular role of forensic genetics technology in the implementation of an ‘area of freedom, security and justice’ in the European Union.
Funded by the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (Consolidator Grant, agreement N.º ), within the project “EXCHANGE – Forensic geneticists and the transnational exchange of DNA data in the EU: Engaging science with social control, citizenship and democracy”, led by Helena Machado and hosted at the Communication and Society Research Centre (CECS), Institute for Social Sciences of the University of Minho, Portugal.
Professor Helena Machado University of Minho CECS (Communication and Society Research Center) Institute for Social Sciences Campus de Gualtar 4710-057 BRAGA Portugal T: +351 253 601 752 E: firstname.lastname@example.org W: http://exchange.ics.uminho.pt Professor Helena Machado
Helena Machado is Full Professor of Sociology at the University of Minho, Portugal. Her research critically engages sociology and social studies of science and technology to explore the challenges emerging from the uses of genetics in contemporary modes of management and control of crime. Helena’s current work focuses on the transnational surveillance operated by sociotechnical systems for exchange of forensic DNA data and intelligence information in the EU. She is author of more than 180 academic works. In 2015, she was awarded a Consolidator Grant from the European Research Council (ERC).
Behind the signals in cell walls
Cell walls in plants are highly dynamic, and they need to change shape and composition in order to allow for growth and development. Researchers in the WallWatchers project are using a multi-disciplinary approach to build a deeper picture of cell signalling, which could open up new possibilities in terms of enhancing crop resilience, as Professor Julia Santiago explains.
The cell walls within a plant are highly dynamic, as they need to change shape to allow for growth and development. Cell walls provide mechanical support to the plant, while they also play an important role in communicating both with other cells and the outside world, a topic of great interest to Professor Julia Santiago, the Principal Investigator of the WallWatchers project. “We work with signalling pathways, and we’re trying to dissect how the information that comes from a cell wall is integrated inside
WALLWATCHERS Plant cell wall communication and remodelling: the wall watchers Julia Santiago, Assistant Professor Department of Plant Molecular Biology Biophore Building, University of Lausanne 1015, Lausanne, Switzerland T: +41 21 692 42 10 E: email@example.com W: https://www.unil.ch/dbmv/en/home/menuinst/research-and-teaching/recherche/prof-julia-santiago-cuellar.html
Julia Santiago Cuellar is Assistant Professor in the department of plant molecular biology at the University of Lausanne. She holds a PhD in biotechnology and molecular biology, and has worked in research positions at different institutions across Europe, before taking up her current position in 2016.
the cell, so that the cell grows or develops in a certain way,” she outlines. “We’re then looking at how this information is translated into metabolic changes that will change the cell wall again, chemically and mechanically, so that it can adapt to a new situation.” Researchers are using the Arabidopsis plant as a model species in this work, with Professor Santiago and her colleagues taking a multidisciplinary approach to build a deeper picture of cell signalling. This includes using structural biology, quantitative biochemistry and plant genetics techniques. “We look at these processes from the nano scale, meaning atomic resolution, to the macro scale, using cell biology and genetics, to dissect what happens to different plant tissues.
Quantitative biochemistry techniques help us to understand the relationship of the different proteins within complexes,” explains Professor Santiago. “With our biochemical and atomic models at hand we then go back to the plant and we look at how this communication system works in vivo.” The cell wall itself is rich in carbohydrates, with membrane receptors that play an important role in receiving information and transmitting signals elsewhere in the cell. A lot of attention in the project is centered on the interactions between receptors and ligands in the plant. “This molecule will bind to a receptor, which will then lead to the expression or activation of specific genes inside the cell,” says Professor Santiago. A deeper understanding of these structures
inside a plant could eventually help scientists enhance crop resilience, which is an important aspect of the project’s work. “We want to look at how we can make plants more productive and accelerate growth, and how we can make them more resistant to pathogens and different ground conditions,” continues Professor Santiago. This could open up the possibility of genetically modifying plants to enhance resilience in future, or of using other techniques to improve productivity. While researchers have looked at Arabidopsis during the project, Professor Santiago says the project’s work holds broader relevance beyond this particular plant. “Arabidopsis is a plant that we basically use for fundamental science, and then we can
translate results to other plants,” she says. This work holds important implications for crop resilience, yet Professor Santiago is focused on more fundamental work at this stage. “We hope to build a clearer picture of how cell signalling works, and how the cell modifies the cell wall to adapt to different circumstances and develop differentiated tissue for cell function,” she says. The priority at the moment is uncovering the ligands involved in cell wall sensing receptors, yet Professor Santiago is also fully aware of the wider relevance of this work. “With more detailed structural information on the cell signalling system we can look towards gene-editing and modulating the function in the development of the plant to improve production,” she outlines.
Shale is a very common sedimentary rock, yet little is known about pore connectivity and how fluids flow through it, questions central to its potential as a means of storing radioactive waste in future. We spoke to Dr Maartje Houben about her work in characterising fluid flow pathways in shale rock and the wider implications of her research beyond the academic sector.
Photo by Patrick Hendry on Unsplash
Deeper underground to probe shale permeability
A number of countries across Europe are investigating the potential of shale rock as a means of storing radioactive waste and captured CO2 underground, while these rocks also host valuable reserves of shale gas. Shale rock itself has a low level of permeability, so it’s difficult to transport fluids through it, yet there is still more to be learned in this respect. “We know that fluids flow through these rocks, but we don’t know how, or where it goes,” explains Dr Maartje Houben, a researcher at Utrecht University. A deeper understanding of fluid transport within shale rock is clearly essential if radioactive waste is to be stored safely underground, a topic central to Dr Houben’s research. “We aim to image fluid flow in shale rock,” she outlines. “We take rocks from the UK, then we bring them back to the laboratory for analysis.”
Imaging rocks
The rocks are placed in a triaxial machine to effectively replicate underground conditions, with researchers aiming to reach a hydrostatic pressure level corresponding to a certain depth. Two main techniques are used to image the rocks, namely ion-beam milling and scanning electron microscopy. “With the milling we’re essentially polishing the rocks, as they’re very fine-grained. We use ion-beam milling – we’re shooting argon or gallium ions onto the surface, which gives us a perfectly flat surface,” explains Dr Houben. A scanning electron
Segmented microCT image of Opalinus Clay, where the arrows indicate possible fluid flow directions through the sample. (a.) Fluid flow parallel to the bedding is biased by the microcracks present parallel to the bedding, and the microcracks could in this case form highways for fluid flow. (b.) Fluid flow perpendicular to the bedding does not show any highways because the microcracks are not connected, hence fluid flow is expected to go through the matrix following a byway model. (c.) SEM matrix image showing that even at higher resolution pores form isolated islands that may or may not be connected through smaller pore throats. (d.) High resolution SEM matrix image showing isolated pores.
transport in her research. “Horizontal transport is a lot faster in shale, so the most likely way for fluids to travel is to go along the bedding,” she says. These are important considerations in terms of both the storage of waste and hydraulic fracturing – fracking – the process by which shale gas is extracted from rocks underground; current fracking methods are relatively imprecise. “The rocks are fractured, and then they see how much gas comes out,” explains Dr Houben. “It would be very helpful to be able to predict what a fracture will look like.”
microscope is then used to image the surface. “We use electrons to image the polished surface. The scanning electron microscope has a resolution on the nanometre scale, so a very high level of detail,” continues Dr Houben. A variety of different parameters affect the speed at which fluids are transported through a rock, including whether they interact with the rock and any pressure differences between different areas. A fluid could also flow in different directions of course, so Dr Houben is considering both vertical and horizontal
This depends to a large degree on the bedding direction of the shale. If a rock is fractured parallel to the bedding, then its permeability does not seem to change much; however, it’s very different if it is fractured perpendicular to the bedding. “In this case, the permeability of the rock goes up quite significantly,” says Dr Houben. These rocks might already be fractured, which could affect both a fracking company’s ability to extract gas, and their suitability as a site for the storage of radioactive waste. “Do
these fractures increase the permeability by five orders of magnitude, or by less?” continues Dr Houben. “We aim to identify how fluids flow through shale rocks, and the actual pathways that they use. We want to pinpoint which layers are more permeable than others.” Mapping fluid highways and byways in shales Dr Maartje Houben Geosciences Utrecht University Aardwetenschappen Budapestlaan 4 3584 CD Utrecht T: +31 30 253 5095 E: firstname.lastname@example.org W: https://www.uu.nl/ staff/mehouben
Dr Maartje Houben is a postdoctoral researcher at the Faculty of Geosciences at Utrecht University (NL). After her PhD project pioneering a new approach to quantifying porosity in clay-rich rocks at RWTH Aachen University (DE), her research now focusses on fluid-flow pathways through low-permeability rocks.
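The bedding-controlled anisotropy Dr Houben describes can be illustrated with Darcy’s law; every number below (permeabilities, viscosity, pressure, sample length) is a hypothetical value chosen only for the sketch, not a measurement from her samples.

```python
def darcy_flux(k_m2, viscosity_pa_s, dp_pa, length_m):
    """Volumetric flux (m/s) through a sample via Darcy's law:
    q = (k / mu) * (dP / L)."""
    return (k_m2 / viscosity_pa_s) * (dp_pa / length_m)

MU_WATER = 1.0e-3          # Pa*s, roughly water at 20 degrees C
DP = 1.0e6                 # Pa, applied pressure difference (assumed)
L = 0.05                   # m, sample length (assumed)

# Hypothetical shale permeabilities: parallel vs perpendicular to bedding.
k_parallel = 1.0e-19       # m^2
k_perpendicular = 1.0e-21  # m^2

q_par = darcy_flux(k_parallel, MU_WATER, DP, L)
q_perp = darcy_flux(k_perpendicular, MU_WATER, DP, L)

# Bedding-parallel flow is faster by exactly the permeability ratio,
# echoing the "highways along the bedding" picture in the article.
print(f"parallel flux:      {q_par:.2e} m/s")
print(f"perpendicular flux: {q_perp:.2e} m/s")
print(f"ratio: {q_par / q_perp:.0f}")
```

With these made-up numbers the two orders of magnitude between the permeabilities carry straight through to the flux, which is why pinpointing which layers are more permeable matters so much for predicting transport.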
Hooking the bigger fish through epigenetics
Traditional fisherman in the river Nile.
Effective breeding strategies are crucial to the long-term sustainability of the aquaculture sector and its ability to meet the growing global demand for seafood products. We spoke to Professor Jorge Fernandes about the EPIFISH project’s work in investigating the role of epigenetics in fish domestication, which holds important implications for aquaculture biotechnology. The aquaculture industry as a whole will need to increase production in future to meet growing global demand for fish and shellfish. The ability to domesticate and selectively breed fish and shellfish is central to the sustainability of the industry, a topic at the heart of the EPIFISH project’s work. “We are looking at phenotypic changes in fish removed from the wild and taken into captivity. We are looking at these changes over three generations,” explains Professor Jorge Fernandes, the project’s Principal Investigator. This work was prompted to a large degree by earlier research which showed that selective breeding in domesticated fish could lead to large increases in size, even over relatively short timescales. “An 85 percent increase in average fish weight was recorded in Nile tilapia (Oreochromis niloticus), which would represent a huge gain for the industry. What amazed me was that this happened so quickly, over just five generations,” outlines Professor Fernandes.
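The pace of the gain Professor Fernandes mentions is easier to appreciate with a line of arithmetic; the even-compounding assumption below is purely illustrative, not a claim from the project.

```python
# The article quotes an 85% increase in average Nile tilapia weight over
# five generations of selective breeding. Assuming (for illustration only)
# that the gain compounded evenly, the implied per-generation gain is the
# fifth root of the total ratio.

total_ratio = 1.85   # final average weight / starting average weight
generations = 5

per_generation = total_ratio ** (1 / generations)
print(f"implied gain per generation: {(per_generation - 1) * 100:.1f}%")
# roughly a 13% weight increase in each generation
```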
EPIFISH project
This rapid increase cannot be attributed solely to genetic factors, believes Professor Fernandes, and now he and his colleagues in the project are investigating the importance of epigenetics in fish domestication. Researchers are using the Nile tilapia as a model species, a fish which is commercially very important. “The Nile tilapia is the second-most important farmed fish worldwide in terms of the volume of production, while it also has a very short generation time,
reaching sexual maturity at the age of 4-5 months. This means we can look at several generations within the timeframe of the project,” says Professor Fernandes. The Nile tilapia itself is native to Egypt and central Africa, but it is now found across the world, in a range of different environmental conditions. “They can deal with a wide range of temperatures, tolerate different salinities, and eat pretty much anything, both animals and plants,” continues Professor Fernandes. “They do very well on very little.”
The hypothesis behind this project is that if we include epigenetic markers then we will be able to cover a larger proportion of phenotype variability in the traits we want – like disease resistance, growth, and age at sexual maturity.

The wild fish were initially collected from Egypt and taken to a research station at Nord University in Norway, where Professor Fernandes is based. Researchers are observing the domestication of these fish over several generations, with Professor Fernandes and his colleagues looking at the epigenetic markers of this process. “Epigenetic changes occur beyond the genome, or genetics, and help to regulate the differentiation of the cells and how they function,” he explains. A lot of attention in the project is focused on miRNA variants, small, non-coding RNAs that are involved in a lot of biological processes. “They are known to be important players in epigenetic regulation. This is not only because they can be epigenetically modified themselves, but also because they can regulate the expression of a number of genes involved in epigenetic regulation,” outlines Professor Fernandes. A second major priority in the project is to investigate the role of DNA modifications in fish domestication. Here, Professor Fernandes and his colleagues are looking at the methylation and hydroxymethylation of cytosine bases. “We want to see what happens to the expression of these miRNAs when you put a wild fish in captivity,” he says. A selective breeding programme has been established at the research station, from which researchers aim to gain new insights in this respect. “We put these wild larvae that we collected from Egypt in a re-circulating aquaculture system, and monitored them as they grew,” explains Professor Fernandes. “When they reached sexual maturity we separated them into groups; one of averagely sized fish – the control lines – and another that were about 15 percent bigger – the selected lines.” The fish from this selective breeding programme were subjected to two sets of comparisons. One involves looking at changes
EPIFISH
Innovative epigenetic markers for fish domestication

Project Objectives
Second generation of Nile tilapia (Oreochromis niloticus) reared in our recirculating aquaculture system at Nord University.
from one generation to the next in these groups, both the selected lines and the control lines. “We want to investigate how the new environmental conditions in captivity (e.g., feed, water temperature and photoperiod) affect the expression of miRNAs, and how their DNA is modified. We plan to look at whether the modifications change across generations,” says Professor Fernandes. The other comparison researchers are making is between the selected and control groups within each generation. “The idea behind this is to find out which microRNAs and DNA modifications can explain differences in size between the selected fish and the control fish,” continues Professor Fernandes. “We’re looking to see what happens to the fish during domestication, and at how this changes across generations.”
Aquaculture industry This research holds important long-term implications for the aquaculture industry and how fish are bred for the commercial market. Existing selection programmes are based on genetic factors, yet these markers do not explain all the variations that can be seen between fish of the same species. “The hypothesis behind this project is that if we include epigenetic markers then we will be able to cover a larger proportion of
phenotype variability in the traits we want – like disease resistance, growth, and age at sexual maturity,” says Professor Fernandes. Finding and validating these epigenetic markers could lead to the identification of new parameters in selective breeding, which would be enormously beneficial for the aquaculture industry, yet Professor Fernandes says this is not an immediate prospect. “A lot of fundamental research needs to be done before we get to that stage,” he acknowledges. Researchers have nevertheless made progress over the course of the project, for example in identifying differences between small and big fish in terms of miRNA expression, DNA methylation and hydroxymethylation. Professor Fernandes and his colleagues aim to establish the proof of concept in the ERC-funded Epimark project. “We will try to validate these results by looking at more fish, including fish of different sizes and ages and from different locations,” he says. In future, Professor Fernandes hopes to build further on these findings, while also pursuing more exploratory research. “If we find some promising markers, it would be sensible to continue in an applied direction, to validate them and see if we can develop them further to use in aquaculture. I plan to combine this with continued fundamental research,” he outlines.

Fisherman using a trap to catch tilapia in Luxor.
The overarching aim of EPIFISH is to ascertain the importance of epigenetics (DNA methylation and microRNAs) in fish domestication using the Nile tilapia as model species. The identification of epigenetic markers will enable the development and application of epigenomic selection as a new feature in future selective breeding programmes.
EPIFISH has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement no 683210) and from the Research Council of Norway under the Toppforsk programme (grant agreement no 250548/F20).
• Professor John Liu, Syracuse University, USA
• Professor Francesc Piferrer, CSIC, Spain
• Professor Pål Sætrom, NTNU, Norway
• Alex Alvarez, Genomar, Norway
• Leonidas Papaharisis, Nireus, Greece
Project Coordinator, Jorge Manuel de Oliveira Fernandes
Faculty of Biosciences and Aquaculture
Nord University
8026 Bodø, Norway
T: +47 75 51 77 36
E: email@example.com
W: http://www.jmofernandes.com
Professor Jorge Fernandes
Jorge Fernandes is a Professor in Genomics and Molecular Biology at Nord University in Norway. His main research interests are domestication, antimicrobial peptides, microRNAs and the epigenetic regulation of gene networks by environmental factors. His research is aimed at commercially relevant fish species, such as Nile tilapia, Atlantic cod and Atlantic halibut.
Is this the moment before global environmental catastrophe?
The UN body, the Intergovernmental Panel on Climate Change (IPCC), says we have 12 years before we are likely to witness a cataclysmic decline in nature, unless we do more to dramatically change our methods. With the original aim of limiting global temperature rise to 1.5°C increasingly looking like an unachievable target, what difference can each half degree of rise make? If we’re heading toward a 2°C temperature rise, surely it wouldn’t be the end of the world, would it? By Richard Forsyth
As part of the decision to adopt the 2015 Paris Agreement to limit climate change, the IPCC was invited to produce, in 2018, a Special Report on global warming of 1.5°C above pre-industrial levels and related global greenhouse gas emission pathways. It was arguably the most alarming report to date on our global climate change challenge. It has moved serious, unflappable scientists to rare emotional outbursts, and stirred up a public outcry that has galvanised populations into campaigning and into being more selective in their consumer choices. The facts, as they come out, are giving most of us a glimpse of a path that could lead to our own eventual demise, and a great many of us are, frankly, scared. The science is not far-fetched, not alarmist for its own sake, nor is it, as President Trump once suggested, a Chinese conspiracy. “With more than 6,000 scientific references cited and the dedicated contribution of thousands of expert and government reviewers worldwide, this important report testifies to the breadth and policy relevance of the IPCC,” said Hoesung Lee, Chair of the IPCC. In total, 91 authors and review editors from 40 countries prepared the report. The report’s full name is Global Warming of 1.5°C, an IPCC special report on the impacts of global warming of 1.5°C above pre-industrial
levels and related global greenhouse gas emission pathways, in the context of strengthening the global response to the threat of climate change, sustainable development, and efforts to eradicate poverty. It’s the first in a series of Special Reports to be produced in the IPCC’s Sixth Assessment Cycle. In October 2018, when the report was published, the news headlines could at last attach a deadline to the tipping point, beyond which it would be too late to reverse the damage if we as a species fail to change our ways and continue to pump nearly 40 billion tonnes of CO2 into the atmosphere each year. It is estimated we have until 2030 before 1.5°C becomes an unstoppable consequence of our industrial actions. There were hopes in recent years that global CO2 emissions were plateauing, but 2018 is likely to set a new record high – at 37.1 billion tonnes. At 1.5°C the changes would be dramatic; for starters it would mean flooded lowlands, mass migration, increased poverty and inevitable deaths as land becomes uninhabitable. With such a short time before it’s too late to stop the escalation of global temperature, if we continue with business as usual beyond this point, a 2°C temperature rise will have even more shocking and devastating consequences for our world.
“Right now, we are facing a man-made disaster of global scale. Our greatest threat in thousands of years. Climate change. If we don’t take action, the collapse of our civilisations and the extinction of much of our natural world is on the horizon.”
“Every extra bit of warming matters, especially since warming of 1.5°C or higher increases the risk associated with long-lasting or irreversible changes, such as the loss of some ecosystems.”

So long coral reefs, and thanks for all the fish
The details of the impacts of seemingly tiny rises in global temperature have rarely been exposed as so desolating as in the latest report. Impending biodiversity disasters are being highlighted. “Every extra bit of warming matters, especially since warming of 1.5°C or higher increases the risk associated with long-lasting or irreversible changes, such as the loss of some ecosystems,” said Hans-Otto Pörtner, Co-Chair of IPCC Working Group II. For instance, coral reefs would decline by 70-90 percent with global warming of 1.5°C, whereas virtually all (>99 percent) would be lost with 2°C. The very real threat of the complete loss of coral reefs has been observable in the last few years; for example, a third of the Earth’s largest reef system, the Great Barrier Reef, died off in the 2016 heat wave. This one global environmental impact, the total destruction of reefs, is a game changer for nature as we know it. Coral reefs harbour the most diverse marine ecosystems and provide nutrients for marine food chains. Around 25% of fish species spend part of their life in reefs. Coral reefs could be thought of as the rainforests of the ocean, in that they are ideal habitats for thousands of species of fish to co-exist, as well as other marine animals. These built-up reefs also act as a wave buffer for coastal regions, beneficial during tropical storms as they reduce wave energy by up to 97%. One study estimated that if only the top metre of reefs were lost, annual flood damage would leap from $4 billion to $8 billion, affecting Indonesia, the US, the Philippines, Malaysia, Mexico, and Cuba with flooding damage every year. Coral reefs produce more oxygen than carbon dioxide.
As a benefit for humans, coral can be used as a calcium supplement to treat some chronic health problems. It has been estimated that we are up to 400 times more likely to find new drugs from coral reef ecosystems than
land-based ones, if they survive the next century. Around half a billion people also depend on reefs for food and employment. Reef extinction would have a knock-on effect on societies that would be hard to miss. And this is just one example of biodiversity being threatened to the point of destruction. Rich biodiversity is the lifeblood of a strong ecosystem, where all the species play a part in the machine of the system – maintaining balance in species numbers, their impacts and contributions. High biodiversity is the benchmark of a healthy planet. A strong ecosystem will generate the essential resources to nurture its component parts. The more plant species there are, the higher the variety of crops; the more animal species there are, the greater the interplay of animals, plants and environment. Animals help fertilise land and pollinate plants, plants provide homes for creatures, and species can live together in symbiotic systems, recycling each other naturally – for example, animals eat each other, which manages overpopulation, and seeds from fruit are eaten and distributed in animal droppings – all to maintain natural balance in a system. Animals and plants effectively become gardeners and caretakers of the wider environment. When ecosystems thin out they can eventually crumble and die, and the environment becomes barren. If this were a worldwide phenomenon, eventually life itself would become unsustainable. Extinction is a word well used in 2018. We are already confirmed to be in the midst of an accelerated extinction era. To understand how bad things are: in May 2019 the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES), which compiled research from scientists representing 130 governments, will present a global assessment on biodiversity and ecosystems, and it’s going to be a bleak outlook. As the IPBES Chair, Sir Robert Watson, said: “The loss of species, ecosystems and genetic diversity is already a
global and generational threat to human well-being.” Extreme impacts are predicted if we look ahead to a 2°C scenario. The likelihood of an Arctic Ocean free of sea ice in summer would be once per century with global warming of 1.5°C, but at least once per decade with a 2°C rise. In summary, such a global temperature rise would dramatically raise sea levels and leave many coastal areas underwater. The contrasts in impact between small global temperature rises are huge. Coastlines and the borders of waterways would be overcome; for example, we would potentially lose 14 of the 17 largest cities in the world. Our maps would be redrawn, and our populations reorganised.
Is there a will?
The greatest problem is that serious political will is still a prerequisite for making potentially planet-saving changes. For this reason, the latest IPCC report was a rallying cry for the UN-supported COP24 meeting held in Katowice in Poland in the closing days of 2018. It was a crucial gathering to increase commitments for climate action, bolstering the previous international decisions for positive action that came out of the Paris accord. Sir David Attenborough, the famous British naturalist, spoke at the climate talks in Katowice and didn’t pull his punches, stating: “Right now, we are facing a man-made disaster of global scale. Our greatest threat in thousands of years. Climate change. If we don’t take action, the collapse of our civilisations and the extinction of much of our natural world is on the horizon.”
To emphasise the urgency, UN Secretary-General António Guterres indicated climate change was ‘a matter of life and death’ for many countries. In response, on 15 December 2018, negotiators from nearly 200 nations reached an agreement at COP24. In the spirit of the 2015 accord, countries set out to limit global warming to 2°C above pre-industrial levels by 2100, with a preferred target of 1.5°C. Whilst goalposts are inevitably shifting, there is still a fervent hope that damage can be limited and, importantly, that the runaway acceleration of temperature rises can be stopped. The challenge cannot be overstated, as countries like China and India are seeing rising levels of CO2 emissions. The Trump era of US politics also plays down climate change concerns whilst ramping up reliance on coal. Trump declared in 2017 he would withdraw the US from the accord, fuelling fears for the rest of the world and our shared climate future. There is also Brazil’s new far-right President, Jair Bolsonaro, who is concerning environmentalists with his appetite for increasing exploitation and deforestation of the Amazon rainforest. Since the 1980s the Amazon has absorbed about 430 million tonnes of carbon a year; it is often called ‘the lungs of the world’. The report was clear that it would require an unprecedented international effort, with many solutions reliant on technologies that are new and still being developed, and that the world has just over a decade to put a revolutionary coordinated strategy into action. Factor these serious issues into the over-arching picture and a creeping pessimism is hard to fight.
The 24th session of the Conference of the Parties to the United Nations Framework Convention on Climate Change (UNFCCC COP24) in Katowice. Credit: cop24.gov.pl.
“The decisions we make today are critical in ensuring a safe and sustainable world for everyone, both now and in the future.”
What do we need to do?
The report examines pathways available to limit warming to 1.5°C, what it would take to achieve them and what the consequences could be. “The good news is that some of the kinds of actions that would be needed to limit global warming to 1.5°C are already underway around the world, but they would need to accelerate,” said Valérie Masson-Delmotte, Co-Chair of Working Group I. The report finds that limiting global warming to 1.5°C would require “rapid and far-reaching” transitions in land, energy, industry, buildings, transport, and cities. Global net human-caused emissions of carbon dioxide would need to fall by about 45 percent from 2010 levels by 2030, reaching ‘net zero’ around 2050. This means that any remaining emissions would need to be balanced by removing CO2 from the air. Science, as usual, is on the front lines of the fight, innovating and discovering new ways to combat some of the challenges. Take one example, relating back to the prediction that we are on course to lose the world’s coral reefs. A technique called micro-fragmenting was discovered – by accident – by US scientist Dr David Vaughan that allows coral to grow 40 times faster than in the wild. Instead of taking up to 75 years to grow to maturity, coral grown with this technique can mature in three years. Many countries are making huge leaps and bounds with renewable energy. Costa Rica has a 98% reliance on non-fossil fuel sources. Sweden is aiming for 2040 as the year it will rely on 100 percent renewable energy. The UK wants to shut down its last coal-fuelled
plant by 2025. Morocco, Cyprus, Denmark, France and Scotland are all making headlines for their progress in sustainability, weaning themselves off fossil fuels. In addition, industries such as electric transport are making progress in a transition from fringe to mainstream. As an incentive, the World Bank has stated it will supply funding of around $200 billion over five years to support countries taking action against climate change. Time is the real issue in all of this, because it’s widely believed that it’s the action within the next two years that is going to make the difference. The report does acknowledge that beyond simple government policies it is also wider action from ‘societal choices’ that will define what happens to our emissions output in the near future. ‘Everyone’ needs to be on board for a change of this magnitude. Undeniably, slashing global emissions in half by 2030 is a big ask. Awareness and belief in the problem is only the start of what is needed. Can we turn our collective fate around? It would be unprecedented, if so. One thing is for sure: we’ll know the answer soon, as the countdown to the point where climate impacts become irreversible is in its closing stages. “The decisions we make today are critical in ensuring a safe and sustainable world for everyone, both now and in the future”, said Debra Roberts, Co-Chair of IPCC Working Group II. “This report gives policymakers and practitioners the information they need to make decisions that tackle climate change while considering local context and people’s needs. The next few years are probably the most important in our history,” she added.
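The scale of the cuts the report calls for can be put into rough numbers. The 2018 emissions figure below is the one quoted in this article; the 2010 baseline is an assumed round number for illustration only, not the official value.

```python
# IPCC pathway sketched above: CO2 emissions roughly 45% below 2010
# levels by 2030, then 'net zero' around 2050.

baseline_2010 = 33.0    # Gt CO2/yr (assumed round number, illustration only)
emissions_2018 = 37.1   # Gt CO2/yr (figure quoted in the article)

target_2030 = baseline_2010 * (1 - 0.45)

# Constant annual percentage cut needed from 2018 to hit the 2030 target:
years = 2030 - 2018
annual_cut = 1 - (target_2030 / emissions_2018) ** (1 / years)

print(f"2030 target:  {target_2030:.1f} Gt CO2/yr")
print(f"required cut: {annual_cut * 100:.1f}% per year, every year to 2030")
```

Even with these rough inputs the point stands: emissions would need to fall by several percent every single year, after decades in which they have mostly risen.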
http://www.ipcc.ch/report/sr15/

www.euresearcher.com
Flexible light sources for quantum technologies
Entangled frequency comb (artistic painting by K. M. Kues).
Photons are an important resource in the development of quantum technologies, with researchers seeking to control and harness their properties. We spoke to Dr Michael Kues about his Marie Skłodowska-Curie Individual Global Fellowship, in which he investigated flexible on-chip light sources, which could open up new possibilities in both the academic and commercial sectors. A higher degree
of control over the production of photons could open up new possibilities in both the academic and commercial sectors, for example in bringing the prospect of quantum computing a step closer. Highly sophisticated photonic on-chip/fiber-based systems are central to both producing photons and controlling their properties, a topic that lies at the heart of the DC FlexMIL project. “We are working to enable flexible integrated pulsed light sources, for applications in both classical and quantum technologies,” says Dr Michael Kues. High-Q microring resonators, which consist of a waveguide bent into a ring in which light can propagate, are of particular interest. “The crucial point is that only certain frequencies can oscillate in this microring resonator – namely the frequencies that are resonant in this structure, enabling access to many discrete frequency components or colors,” explains Dr Kues. “Benefiting from a nonlinear phenomenon called four-wave mixing within these microring resonators, we’re trying to exploit these systems for the realization of controllable light sources with novel and unique properties.”
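The resonance condition Dr Kues describes (only frequencies that fit a whole number of wavelengths into the ring’s optical round trip can oscillate) can be sketched numerically. The ring radius and effective refractive index below are assumed, typical-looking values, not parameters of the project’s actual devices.

```python
import math

C = 299_792_458.0         # speed of light in vacuum, m/s
n_eff = 1.6               # assumed effective refractive index
radius = 120e-6           # assumed ring radius, 120 micrometres
L = 2 * math.pi * radius  # ring circumference, m

# Light resonates when an integer number m of wavelengths fits the optical
# round trip: m * lambda = n_eff * L, i.e. f_m = m * c / (n_eff * L).
fsr = C / (n_eff * L)     # free spectral range: spacing between resonances
print(f"resonance spacing (FSR): {fsr / 1e9:.1f} GHz")

# A few resonance orders near the 1550 nm telecom band:
m0 = round((C / 1550e-9) * n_eff * L / C)  # nearest integer order
for m in range(m0, m0 + 3):
    print(f"m = {m}: f = {m * C / (n_eff * L) / 1e12:.3f} THz")
```

The evenly spaced comb of resonances this produces is exactly the set of discrete “colors” the article refers to, which four-wave mixing then links together.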
Microring resonator based laser
The microring resonator can be used in various different experimental configurations. For a specific classical laser scheme, it is placed directly into a laser cavity, which then enables unique emission characteristics, namely very long, mode-locked laser pulses. “We have been looking into generating the longest possible laser pulses together with realizing novel characterization techniques, enabling the possibility to resolve the full laser spectrum in the radio-frequency domain, which could lead to spectroscopy applications for example,” says Dr Kues. “Also, from a fundamental perspective, this laser system could contribute to the understanding of temporal laser dynamics.”

We use integrated/fiber-based components to construct photonic quantum systems for different purposes; for example for the realization of very complex quantum states that could in future be used for quantum computation.

On-chip light sources at the single photon level
Along with using the microring resonator for these classical laser concepts, Dr Kues and his colleagues found that it could also be used in the quantum domain to realize novel light sources at the single photon level, opening up a new avenue of investigation. “We are using the microring resonator to generate – through the nonlinear frequency conversion process of four-wave mixing – correlated photon pairs. Benefiting from the multi-frequency characteristic of the system, from there we can look to then generate different complex and unique quantum states, e.g. composed of many quantum bits, known as qubits,” he outlines.
Time-bin entangled multi-photon state generation.
A classical bit can take one of two possible values, typically either 0 or 1, while a qubit by contrast can be both at the same time. These qubits are at the core of the wider field of quantum computing, which promises to significantly increase computational speed, underlining the wider relevance of the project’s work in both generating these qubits from an excitation laser pulse and controlling them. “When exciting the microring resonator with laser pulses we were able to generate these qubits by exploiting the photons’ time and frequency degrees of freedom” explains Dr Kues. A processing scheme to control these quantum states is also being developed in the project. “This is based on components commonly used in fibre-optic telecommunication networks, making this approach cost-effective and reliable,” continues Dr Kues. “We use these components to transform these quantum states for different purposes; for example into very complex quantum states that could be used for quantum computation, or for quantum key distribution.”
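The bit/qubit contrast above can be made concrete in a few lines of plain Python (a toy state-vector sketch, not the project’s actual processing scheme): a qubit is a normalised pair of amplitudes, and genuinely carries both basis values at once until measured.

```python
import math

# A classical bit holds exactly one of two values.
bit = 0

# A qubit is a normalised pair of complex amplitudes over the basis
# states |0> and |1>; here, the equal superposition (|0> + |1>)/sqrt(2).
qubit = [1 / math.sqrt(2), 1 / math.sqrt(2)]

# On measurement, each outcome occurs with the squared magnitude of
# its amplitude.
p0, p1 = (abs(a) ** 2 for a in qubit)
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")

assert math.isclose(p0 + p1, 1.0)  # probabilities sum to one
```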
Entanglement is an important resource in the context of quantum technologies, where the properties of two photons in a pair are strongly correlated. Part of the project’s work in this area has involved using the arrival time of a photon to generate entangled states. In this case, the qubit is encoded into different discrete time bins, corresponding to the arrival time of a photon on a detector. In combination with the multi-frequency characteristics of the microring resonator approach, the researchers were able to generate multi-photon states in an integrated platform for the first time. Another important aspect of the project’s research is the dimensionality of the system. “A qubit is a two-dimensional system, for example a 0 and a 1. A qudit is a higher-dimensional system, where we can have 0, 1, 2, 3, 4, etc.,” explains Dr Kues. By exploiting the frequency degree of freedom of a photon, so to speak its colour, the researchers were able to generate entangled qudit states in an integrated format, where the photon is e.g. in a superposition of the colors red, blue and yellow.

Micro-resonator based mode-locked laser with frequency beating.
High-dimensional cluster state generation scheme.
This is an important attribute of the system that Dr Kues and his colleagues in the project are developing. Highly complex quantum states are required to effectively exploit the quantum algorithms that have already been developed. “These complex quantum states are either composed of several photons to increase the information capacity of the state, or they go into a higher dimensionality,” outlines Dr Kues. The impact of these two approaches on the processing rate is an important consideration. “During the project, we achieved the first on-chip generation of a four-photon state. However, due to optical losses the processing rate diminishes as you increase the number of photons – so if you have six photons, the detection rate is lower,” continues Dr Kues. “However, if you add dimensions to the photon in the form of different frequencies, then the processing rate remains the same. We think that for photonics a combination of both is an interesting approach to follow in the future, so we can reach the quantum resources required for meaningful tasks.”
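The trade-off Dr Kues describes can be put into rough numbers; the per-photon transmission efficiency and event rate below are hypothetical values, chosen only to show the scaling.

```python
import math

eta = 0.5           # assumed per-photon survival probability (losses)
pair_rate = 1000.0  # assumed generated events per second

# An n-photon state registers only when every photon survives, so the
# detected rate falls geometrically with photon number:
for n_photons in (2, 4, 6):
    detected = pair_rate * eta ** n_photons
    print(f"{n_photons}-photon state: {detected:.1f} events/s")

# Raising the dimension d of a single photon (a qudit) instead adds
# log2(d) qubits' worth of information with no such loss penalty:
for d in (2, 4, 8):
    print(f"d = {d}: {math.log2(d):.0f} qubit(s) per photon")
```

This is why combining both routes is attractive: photon number buys state complexity at an exponential cost in rate, while dimensionality buys it essentially for free.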
Multi-colored entangled photon states from an on-chip system.
High-dimensional entangled photon states.
Development and Control of Flexible Mode-locked Integrated Laser
DC FlexMIL is an individual global fellowship research action focusing on the development and control of novel integrated classical and non-classical light sources for applications in e.g. metrology and quantum technologies. It merges fundamental scientific investigations with recent advances in integrated photonics and fiber-based telecommunications technology to reach its goals.
The project received funding from the European Union’s Horizon 2020 Research and Innovation programme under the Marie Skłodowska-Curie grant agreement number 656607.
• INRS-EMT (Institut national de la recherche scientifique Centre – Energie Matériaux Télécommunications), Canada. • University of Glasgow, UK.
Project Coordinator, Professor Michael Kues
Hannover Center for Optical Technologies
Nienburger Str. 17
D-30167 Hannover
E: firstname.lastname@example.org
W: http://dcflexmil.bplaced.net/DC_FlexMIL/Results.html
Dr Michael Kues

Multi-coloured entangled photon states in a fiber-based/integrated photonic system.
Miniaturised photonic system
Dr Michael Kues is conducting research on a broad and interdisciplinary range of topics at the intersection of photonics and quantum science. In his current research, he focuses on the development and realization of compact on-chip optical quantum systems, and studies new and scalable optical approaches for present and future practical quantum information processing.
Researchers are not only looking into the generation of these quantum states in the project, but also how this can be achieved efficiently, through a small, practical and easily controllable system. Usually fairly large and complex systems are required, involving expensive free-space optical setups; Dr Kues and his colleagues in the project looked at an alternative approach. “Our approach is to miniaturise this and integrate it, making it easier to set up and use,” he says. The microring resonator is integrated on a chip, so it is more practical to use than previous systems. “We aim to slim it down, so that you can put these integrated light sources in a small box, instead of relying on large free-space optical systems. This means it could be used not only in a special laboratory environment, but also in out-of-lab scenarios,” continues Dr Kues. “We will continue working on further miniaturisation in the future, with the final goal of having the whole system integrated on a chip.” The project’s research could hold important implications for the future of quantum computing, as well as metrology,
telecommunications, spectroscopy and several other areas. The project itself recently concluded, yet there is still scope for further research, and in the future Dr Kues and his colleagues plan to explore the potential of these systems with respect to certain applications. “This could include sensing applications, for example,” he says. “The EU has established the Quantum Flagship to support continued investigation and help translate research into commercial development, which could propel this research further.” This type of research is very much collaborative in nature, and scientists from all over the world have made important contributions. As the recipient of this individual fellowship, Dr Kues has benefitted from the opportunity to spend time at INRS-EMT in Canada and the University of Glasgow in the UK, and the opportunity to share knowledge and ideas with international colleagues was central to the achievements of the project. “This research has been undertaken in a large collaboration, together with a core team at INRS-EMT, Canada and other places, who made important contributions to this work,” he stresses.
On-chip photonic system.
A new window on caloric effects in spin transport

With Moore’s law expected to break down in the relatively near future, researchers are searching for new ways to control heat, charge and spin currents in nanostructures. The SpinCaT priority programme supports research into novel spin caloric effects, which could lead to interesting new functionalities, as Professor Christian Back explains.

It is expected that Moore’s law will break down with respect to CMOS technologies in the relatively near future due to the thermodynamic bottleneck, prompting a renewed focus on research into thermodynamic transport, spin caloric effects and other related topics. With circuit designers approaching physical limits in terms of the number of components that will fit on an integrated circuit, scientists are searching for new ways to control heat, charge and spin currents in nanostructures, topics central to the work of the SpinCaT priority programme. “The aim of SpinCaT was to develop the new research field of caloric effects in spin transport,” explains Professor Christian Back, the programme’s coordinator. Over 40 projects were supported under the programme, focussing on four areas. “These were spin caloric effects and spin mediated heat transport in planar geometry, thermal conductivities across interfaces in nanopatterned devices, spin currents induced by large temperature gradients and materials for spin caloric applications,” outlines Professor Back.
Spin caloric effects

The projects were selected for inclusion in the programme solely according to scientific value, with scientists looking to build a deeper understanding of novel spin caloric effects, which can modify thermal transport, magneto-resistance and possibly even magnetic states. Based at the Technical University of Munich, Professor Back has been working on a project looking at the spin-dependent Seebeck effect, as well as thermal spin transfer torque. “We have clarified inconsistent evidence from different experiments in the literature, by showing that the so-called transversal spin-Seebeck effect in metals is unobservably small when compared to the competing (established) magnetothermoelectric effects,” he outlines. “In the course of these efforts, we have established an experimental platform for the simultaneous measurement of the electric, thermoelectric and thermal transport coefficients of metal films. The platform was applied to the alloy system FeCo, which allowed us to systematically tune the Fermi energy through the band structure, and to investigate the evolution of the transport properties with respect to composition.”

A number of projects within the programme have focused on other novel spin-caloric effects, the spin counterparts of well-known thermoelectric effects like the Peltier effect. A deeper understanding of the electronic band structure and electron transport can help researchers to optimize these effects. “A good example is the design of materials in ferromagnetic alloy/tunnel junction/ferromagnetic alloy elements based on e.g. Fe/MgO/Fe, where the spin-dependent Seebeck effect can be optimized by tuning the electronic band structure and thus improving electron transport across such a device,” says Professor Back. Research within the programme holds a lot of interest for industry, for example with respect to harvesting energy from waste heat using spintronic materials, while Professor Back says there are also other examples. “Heat management in nanoscale spintronic devices, such as read sensors or nano-oscillators, is another area of interest for industry. Then there’s energy harvesting in wearable electronics, employing the spin Seebeck effect,” he outlines. The primary motivation in research is scientific interest, however, rather than the possibility of commercial applications, and the focus now is on
building on the progress that has been made. Professor Back and his colleagues are applying many of the technical advances achieved within their project, using spin Hall effects as efficient detection and characterization techniques, yet the focus of their research has shifted. “We are working on skyrmion-hosting materials, as a new priority programme has just started on this topic. Skyrmions are non-collinear, nanosized topological magnetic objects that can be controlled and manipulated by spin currents, and they hold a lot of potential for future magnetic storage devices,” says Professor Back.
SPINCAT PP 1538 Spin Caloric Transport The priority programme was funded by the DFG for six years with a total amount of about 12 million euros. Professor Christian Back TUM Physics Department James-Franck-Str. 1 85748 Garching T: +49 89 289-12401 E: email@example.com W: http://www.spincat.info/index.php Christian Back is a Full Professor of experimental physics at the Technical University of Munich. He previously worked as a postdoctoral researcher at the Stanford Linear Accelerator Center and IBM San Jose, before taking up a research position at ETH Zurich. He subsequently held a Professorship at the University of Regensburg before taking up his current position in 2017.
Redesigning the market for tomorrow’s technologies

Technologies like solar lights and more efficient cooking stoves could bring significant social and economic benefits to developing economies, yet market frictions currently limit access to them. We spoke to Professor William Fuchs about his research into these imperfections, which could in the long run lead to more effective market design and regulation.

Many people in
developing economies still use kerosene lanterns as a source of light, devices which consume large quantities of fuel while also representing a health and fire hazard. Solar lights are a safer, healthier, and ultimately cheaper alternative; however, this technology has not yet been adopted as widely as might have been expected. “People are not investing in this technology to the extent that an outside observer, looking at the economic and health benefits it offers, would expect,” says William Fuchs, a Professor at Universidad Carlos III de Madrid. In his research, Professor Fuchs is looking at the barriers and frictions that hinder the adoption of not only solar lights, but also other technologies that could boost welfare in developing economies, such as more efficient cooking stoves, anti-malaria nets and water
Theory and Applications in Development Funded under H2020-EU.1.1. - EXCELLENT SCIENCE - European Research Council (ERC) ERC-CoG-2015 - ERC Consolidator Grant Project Coordinator, Dr William Fuchs Associate Professor McCombs School of Business Investigador de Excelencia Universidad Carlos III Madrid W: https://sites.google.com/site/wfuchs/research/erc
Professor William Fuchs got his PhD at Stanford GSB in 2005. He then joined the Economics Department at the University of Chicago as an Assistant Professor and Thornber Research Fellow. In 2009 he moved back west to join the Finance Group at UC Berkeley’s Haas School of Business. He is now a distinguished researcher at Universidad Carlos III Madrid. His work has been published in the most prestigious Economics and Finance journals and has been supported by prestigious grants.
Training solar light vendors.
filters. “There are many frictions, for example the fact that most consumers don’t have $25 to pay for a solar light up front, and there’s not really a developed lending market that they can easily access, partly because contracts aren’t enforceable,” he outlines.
This is to some extent a failure of supply and demand, so researchers are exploring potential supply-side reforms to address these frictions, which could give more consumers in developing economies the opportunity to purchase these types of goods. Researchers ran a randomized controlled trial in Uganda, in which they essentially set up a medium-sized firm and entered into contracts with vendors, which acted as the firm’s agents to sell technologies to consumers. “We varied the terms with these agents, looking for relationships that produced better results,” outlines Professor Fuchs. There were three main randomization elements in the experiment, from which Professor Fuchs and his colleagues aimed to learn more about how a market can be more effectively regulated. “The first randomization was about the amount of credit available to vendors. We looked at whether extending trade credits to vendors helped them overcome liquidity constraints, essentially a lack of cash. The idea was that vendors would then extend credit to their consumers, and this would help the market grow. This had a big effect in terms of sales,” says Professor Fuchs.

A second randomization, which involved giving vendors the right to return unsold inventories, did not have a significant impact; however, the third was found to have a major effect on the level of sales. This randomization was directed at consumer constraints, in particular the fear of purchasing sub-standard imitation products, a common problem in some developing economies such as Uganda. “We trained vendors to charge the light, then they would leave it with a household for an evening. The household could then get a taste for a product before deciding on whether to buy it,” outlines Professor Fuchs. Greater access to credit, together with the ability to give potential consumers the opportunity to test the light before purchase, had a very
big impact on the vendor’s results. “While on average a vendor with no credit and no test light sold less than 2 lights over the duration of the experiment, one that got access to credit and used a test light sold around 16 lights,” explains Professor Fuchs. This research holds important implications, as products like solar lights and more efficient cooking stoves could have a significant impact in developing economies like Uganda, where consumers often spend a lot of time and energy gathering fuel for cooking and lighting. A household with a solar light would save $1.50 a week on kerosene costs, while it would also help improve health and give people more time to focus on education and generating their own income, so Professor Fuchs believes that overcoming these market frictions could lead to wider social benefits. “The hope to some extent is to try and figure out ways in which firms or institutions can overcome market frictions, and help the market develop, so that people can get access to these products,” he outlines. “I am also optimistic that new technologies and advances in mobile payments will further enable these markets.”
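The headline result of such a trial can be read off as a simple difference in mean sales between treatment arms. A minimal sketch follows; the per-vendor figures are hypothetical illustrations, chosen only to be consistent with the averages quoted above (under 2 lights for vendors with no credit and no test light, around 16 for vendors with both):

```python
# Toy difference-in-means comparison between two arms of a randomized trial.
# The individual vendor sales figures below are hypothetical.
from statistics import mean

sales = {
    "control": [1, 2, 1, 3, 1],                  # no credit, no test light
    "credit_and_test_light": [14, 17, 15, 18, 16],  # trade credit + demo light
}

# Average treatment effect estimate: difference in mean sales per vendor.
effect = mean(sales["credit_and_test_light"]) - mean(sales["control"])
```

In a real evaluation one would also report a standard error or confidence interval for this difference; randomization is what licenses reading it as a causal effect rather than a correlation.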
Plotting a path to reduce the energy performance gap

Improving energy efficiency performance in buildings is a major priority for the European Commission, with a target of achieving 20 percent energy savings by 2020. The EU promotes solutions which reduce energy consumption in the building sector to achieve this, an area which forms the primary research focus for the MOEEBIUS project.

The MOEEBIUS project is introducing a totally new approach to reducing energy consumption in the building sector, based on modelling the optimization of energy efficiency in buildings for urban sustainability. MOEEBIUS has been developing its work since the latter part of 2015, and the initial results are very promising. The goal of the project was to elaborate products and services, such as the Integrated Energy Performance Optimization Framework, an Application for Consumers, and the multi-sensing NOD device, that will enable the minimization of the ‘performance gap’ and promote customer confidence in the effectiveness of Energy Performance Contracting (EPC). The project also looked at the ability of ESCOs (Energy Service Companies) to guarantee results and mutually agree with customers on savings targets, thus reducing the business risks that have hindered the growth of the ESCO market, especially at the EU level.

Technically, the system as a whole appears quite complicated. Built on a white-box modelling approach of buildings, systems and distributed energy resources, a diverse range of components have been developed that take advantage of the energy simulation of those models and the real-time monitoring of key performance indicators in pilot sites. Those components are combined in different ways, depending on the business scenario, use case and end-users’ needs.
Selected applications are: the Building Energy Performance Simulation tool, Demand Flexibility Engine, Dynamic Assessment Engine, Occupants’ User Interface, Predictive Maintenance Advisor, Retrofitting Advisor Tool, Facility Manager & ESCO Management Tool, and Decision Support System.
MOEEBIUS-NOD, a wireless multi-sensor device.
From the data acquisition side, the project has introduced innovations in a multi-sensor wireless device, MOEEBIUS NOD, for indoor monitoring and distributed data acquisition and management middleware. In the last stage of the project the partners focused on the validation of specific solution components at large-scale pilot sites, located in Portugal, the UK and Serbia, incorporating diverse building typologies, heterogeneous energy systems and spanning diverse climatic conditions. These components started from different Technology Readiness Levels (TRL) and have been developed by a diverse range of project partners, from universities to technological centres and large companies. As Dr Pablo De Agustin, representing the MOEEBIUS project coordinator, mentions, the potential of these models, algorithms, tools, etc. is impressive. However, there is still work to be done to prove that they can work together. One of the biggest technical difficulties in the project is undoubtedly integrating all the components within the MOEEBIUS framework and adapting its
Building Energy Model for a residential building in Belgrade, Serbia.
architecture to the needs and constraints of each business scenario and use case.

Living Labs (LL) activities played a crucial role in elaborating solutions tailored to the needs of potential end-users, and were an element which differentiated MOEEBIUS from ostensibly similar EU projects on energy efficiency. This is an environment for experience sharing and exchange towards user-driven open innovation of products and services. The activities carried out within the MOEEBIUS Living Labs were oriented towards widely disseminating the project outcomes, creating opportunities for exploitation and replication of the project results. Most importantly, the Living Labs served as a channel for gathering feedback from the end-users and interested stakeholders throughout the whole project. This provided a basis to optimize all project developments, so as to directly address the critical needs of end-users, building occupants and relevant stakeholders involved in the operation of the MOEEBIUS optimization framework. The pilot sites included a diverse range of building types and uses, where the contact people included professionals (ESCOs, Aggregators, Building Managers) with a solid knowledge of energy and technology aspects. Occupants of residential buildings also acted as contact points, as Dr De Agustin explains: “These interactions with end-users convinced us that, in order to ensure that the MOEEBIUS framework answered to stakeholders’ needs in each pilot site, the solution had to be flexible and adaptable, and their requirements should be understood.” Indeed, the Living Lab community included both end-users of the pilot sites and external experts from academia and energy business sectors, who provided feedback during the project

External view of a residential building in Belgrade, Serbia.
External view (left) and Building Energy Model (right) of a primary school and sport hall complex in Mafra, Portugal.
lifetime through questionnaires and workshops. This approach helped project partners in the co-creation process and in adapting the solution to each country and use case. Moreover, the general reaction of Living Lab participants to the proposed solutions was very positive. “We really appreciate LL members’ attitude, both in answering online questionnaires and actively participating in the workshops,” Dr Pablo De Agustin adds. “In exchange, our intention was to propose useful solutions to diverse stakeholders in the energy market, from ESCOs and aggregators right through to consumers. Solutions are currently being evaluated in the pilot sites, and we can already confirm that project partners are considering MOEEBIUS framework components as solutions that will help them in their decision-making processes and in their day-to-day operations, working towards the goal of achieving energy savings. These partners include a district heating company in Serbia, an aggregator in the UK, and building managers in Portugal. On the other hand, consumers are already enjoying an informative consumption and billing mobile app, which is expected to raise awareness of energy efficiency among the general public.”

The main challenge faced by the MOEEBIUS project remains the energy performance gap: the deviation between the predicted energy performance of buildings and their actual performance. This ‘performance gap’ is reflected in the inaccuracy of business-as-usual modelling techniques as a method of representing the
realistic use and operation of buildings. A variety of factors are usually simplified through assumptions, not updated, or even not included in static/passive models. These factors include occupant behaviour, the complexities of the building (e.g. thermal bridges, infiltration), alterations to the building fabric, services, usage and controls during the building’s lifetime, non-efficient control strategies, loss of performance due to low-quality building practices, and environmental inaccuracies (e.g. local weather, modelling of the surroundings). All these multiple and heterogeneous aspects are considered in the holistic MOEEBIUS approach. Continuous calibration of the models is necessary in order to adjust the energy models to reality and keep them aligned over time. To this end, the building energy models have been developed in the EnergyPlus open-source software and standardized in order to allow dynamic modification of multiple parameters. A Building Energy Performance Simulation tool has been established on a server to run the models and update them according to inputs from other components, such as occupancy or user behavioural profiles, or weather forecasts. In addition, a Dynamic Assessment Engine launches simulation requests, which result in Key Performance Indicators. These are compared with the real data measured at the building, modifying multiple calibration parameters and repeating the simulation requests until, through Bayesian models, the loop converges. An analogous approach has been implemented for the district-level energy
models, in this case based on Modelica.

The reduction of the ‘performance gap’ is also a real challenge for ESCOs. As Post-Occupancy Evaluation studies in built and occupied buildings have demonstrated, the measured energy use can be as much as 2.5 times higher than the predicted energy use (more than 70 percent higher in the retail sector, 100 percent in residential, 150 percent in offices, and over 250 percent in the education sector). So current predictions tend to be unrealistically low, whilst actual energy performance is usually unnecessarily high. In more detail, the ‘performance gap’ generates a consequent gap between payback estimates and techno-commercial Return-On-Investment calculations in ESCO projects which still use previous energy audits based on simplistic and inaccurate calculations. Because of this, ESCOs are not able to provide customers with appropriate simulations of acceptable paybacks of 2-5 years using low-cost interventions. This constitutes a significant barrier to the development of the ESCO market, believes Ander Romero, coordinator of the MOEEBIUS project: “ESCOs are forced to add installations and commissioning services, project management, man hours, measurement and verification costs to hedge the risk induced by prediction uncertainty and inaccuracy.” This makes many contracts totally unattractive, including in cases where the ESCO takes responsibility for the full implementation of a refurbishment project (from auditing to design and implementation). This introduces extra risks for ESCOs and significantly reduces their profit margins.
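The calibration loop described above (simulate, compare the resulting Key Performance Indicators against measured data, adjust the uncertain parameters, repeat until convergence) can be sketched in miniature. This is an illustrative toy, not MOEEBIUS code: the linear stand-in for an EnergyPlus run, the chosen parameter and all numbers are hypothetical, and a simple damped correction step stands in for the project's Bayesian update.

```python
# Minimal sketch of a model-calibration loop in the spirit described above.
# All functions, parameters and values are hypothetical placeholders.

def simulate_kpi(infiltration_rate: float) -> float:
    """Stand-in for a building energy simulation run: annual heating demand
    (kWh/m^2) as a toy linear function of infiltration rate (air changes/h)."""
    return 80.0 + 45.0 * infiltration_rate

def calibrate(measured_kpi: float, guess: float,
              tol: float = 0.01, max_iter: int = 100) -> float:
    """Nudge the uncertain parameter until the simulated KPI matches the
    measured one to within `tol`, using a damped correction each iteration."""
    rate = guess
    for _ in range(max_iter):
        error = simulate_kpi(rate) - measured_kpi
        if abs(error) < tol:
            break
        rate -= 0.5 * error / 45.0  # damped step against the local slope
    return rate

# Calibrate against a (hypothetical) measured demand of 107 kWh/m^2.
calibrated = calibrate(measured_kpi=107.0, guess=0.2)
```

In the real framework the "simulation" is a full EnergyPlus (or, at district level, Modelica) run, several parameters are calibrated jointly, and the update is Bayesian rather than this simple feedback step; the control flow, however, follows the same simulate-compare-adjust cycle.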
District Energy Model of Stepa Stepanovic neighbourhood in Belgrade, Serbia.
MOEEBIUS Modelling Optimization of Energy Efficiency in Buildings for Urban Sustainability Project Objectives
MOEEBIUS introduces a Holistic Energy Performance Optimization Framework that delivers innovative tools which deeply grasp and describe real-life building operation complexities in accurate simulation predictions. The system reduces the ‘performance gap’ and enhances the optimization of building energy performance.
Hence, through significantly reducing the ‘performance gap’, the holistic approach introduced in MOEEBIUS enhances the ability of ESCOs to guarantee attractive energy savings. “This will, in turn, eliminate the need for the addition of risk-hedging costs on top of pure energy services, consequently increasing the payback attractiveness of energy performance contracts and reinforcing the confidence of customers regarding EPC effectiveness. This is crucial for the growth of the ESCO market, especially on the EU level,” says Ander Romero. It is also possible to indicate potential beneficiaries of the new solution. ESCOs and DSM (Demand Side Management) Aggregators have been identified as the main system stakeholders and as core business entities in the project, but building occupants have an interest too. According to the assumptions of the different innovative
business models elaborated within the framework of MOEEBIUS, there are four approaches tailored to the needs of the novel system’s users. In the business model proposing new energy management based on enhanced Energy Performance Contracts, owners of specific buildings (public or commercial) could benefit from energy savings of up to 50 percent per site, which could be boosted by the incorporation of contextual and operational building parameters into the optimization process. Another efficiency business model for ESCOs assumes that revenue per service will be offered by the ESCO to building managers (so that savings are shared). Under the third proposed business model, raising occupants’ awareness is proposed as a tool for generating energy savings. Finally, the valorisation of buildings through energy certification would serve as a revenue source for all system stakeholders.

In conclusion, MOEEBIUS enables the development of different business models in two areas: ESCO and demand response. The project offers the possibility of providing users with fast, clear, easy, and always available insights into their own consumption and the cost of the energy used for heating. It enables users to learn, train, and make progress in their behaviour and demand response. It also enables remote monitoring of conditions in the apartment and control of the heating system. At the same time, it enables ESCO firms to calculate more accurately the savings achieved by raising awareness and educating users about ways to save energy, which reduces the risk of poor EPC contracts and enables development in this area. So MOEEBIUS creates a win-win situation for all the stakeholders involved.
After nearly four years of international collaboration, the MOEEBIUS project will soon conclude its activities. The culmination will be the project’s final conference on 28th February 2019 in Wels, Austria, in the framework of the World Sustainable Energy Days, which bring together more than 650 delegates from over 50 countries from business, the research community and the public sector. This international event is one of the largest annual conferences on sustainable energy in Europe. Every year it features policies, technologies, innovation and market development through a unique combination of conferences and interactive events. We encourage you to follow the MOEEBIUS project website, www.moeebius.eu, to keep up to date with the project’s achievements and the details of the final MOEEBIUS event.
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 680517.
Fundacion TECNALIA Research & Innovation Parque Cientifico y Tecnologico de Bizkaia - C/Geldo, Edificio 700 E- 48160 Derio, Bizkaia (Spain) Project coordinator: Ander Romero E: firstname.lastname@example.org T: +34 946 430 069 W: www.moeebius.eu Pablo De Agustin E: email@example.com Dissemination leader: Mrs. Agnieszka Kowalska E: firstname.lastname@example.org Dr Pablo De Agustin (Left) Ander Romero (Centre) Agnieszka Kowalska (Right)
Dr Pablo De Agustin is an energy efficiency researcher at TECNALIA, where he has worked on energy efficiency in buildings and the integration of renewable energies since 2011. His professional background includes simulation and experimental work focused on energy efficiency, electric and thermal metering, buildings as energy storage systems, and trigeneration and self-consumption solar heating and cooling systems. Agnieszka Kowalska holds the roles of International Cooperation Department Director and Senior Project Manager within ASM. She has been involved in over 25 international research projects, and is an expert in market research, socio-economic analysis, the perception of innovative solutions by end users, and the development of business models and exploitation plans. Ander Romero is a Project Manager in the Sustainable Construction Division of TECNALIA. He joined TECNALIA in 2007 as a senior researcher in the field of energy efficiency in building design and retrofitting, focusing on energy modelling and the integration of innovative and sustainable solutions to optimize urban and building energy performance.
Soft materials for solar energy conversion

Solar energy has an important role to play in meeting growing global demand for energy, yet conventional methods of making solar cells have some significant limitations. We spoke to Dr Micheál Scanlon about his work in investigating a new approach which could lead to the emergence of a new type of solar conversion device.

The conventional method of making solar cells is to use inorganic materials to build solid-state architectures, through which light is harvested and converted into chemical energy. However, inorganic materials are expensive and production is energy-intensive, while this approach has other shortcomings which limit the effectiveness of solar cells. “When light is shone on a material an exciton particle is created. It then has to be separated into a positive charge and a negative charge, and this leads to different problems. For example, if there is an impurity in the material, the charge may get stuck,” explains Dr Micheál Scanlon, a lecturer in chemistry at the University of Limerick. As the Principal Investigator of the Soft Photoconversion project, Dr Scanlon is now exploring a new approach to solar energy conversion, based on the use of a liquid-liquid interface. “We’re trying to convert solar energy without using solid materials,” he says.
Water-oil interface

This research is built on Dr Scanlon’s expertise in controlling an electric field at a water-oil interface. Photosynthesis is a good example of nature efficiently converting light energy into stored chemical energy, so Dr Scanlon draws inspiration in his research from the natural world. “In a biological cell you have a fatty, oil-like membrane in the middle, and water on either side. We’re trying to mimic a membrane-cell structure in the water-oil interface,” says Dr Scanlon. The oil used in the interface has to be extremely hydrophobic, meaning that it is incapable
of mixing with water. “The oil that we have picked is very immiscible with water, and the oil molecules only mix with the water in a 1 nanometre region, meaning the interface is just a nanometre thin,” outlines Dr Scanlon. The photoproducts are separated at the interface based on their affinity to water, with one side of the interface
very hydrophilic, while the other is very hydrophobic. Researchers take advantage of this to convert light energy into chemical energy, using a dye. “We shine light on the interface, and an electron is transferred from the molecule in the oil to the molecule in the water,” explains Dr Scanlon. One major issue to consider here is back electron transfer, which Dr Scanlon says can limit conversion efficiency. “You need inputted energy to move the electron from the low-energy molecule to the high-energy molecule,” he outlines. “We shine a light on the dye at the liquid-liquid interface. The dye uses that energy to effectively move an electron from the low-energy molecule in the oil to the high-energy molecule in the water.” A lot of attention in the project is now focused on modifying the interface to improve the efficiency of the energy conversion process. The key point about the system is that everything happens at the liquid-liquid interface. “Any molecule more than 2 nanometres away from the interface is lost, as electrons can travel only a tiny distance between molecules. So the dye on the interface, the molecule in the water, and the molecule in the oil, all have to be within a few nanometres (at most) of
each other when the light hits,” explains Dr Scanlon. The concentration of dye at the interface is an important factor in solar conversion efficiency, so a lot of attention in the project has focused on optimising these strategies of dye-sensitising the liquid-liquid interface. “The photoconversion efficiency increases linearly with the amount of dye on the liquid-liquid interface, so the more dye we can concentrate there, the better the performance of our system” says Dr Scanlon. This is a very novel approach to solar conversion, so Dr Scanlon and his colleagues have created a number of their own customised techniques during the project as they aim to improve photoconversion efficiency. A range of techniques have been used to characterise a liquid-liquid interface, including electrochemical, spectroscopic and surface tension measurements methods. “We’ve also just started doing confocal
Photo by Andreas Gücklhorn
SOFT-PHOTOCONVERSION Solar Energy Conversion without Solid State Architectures: Pushing the Boundaries of Photoconversion Efficiencies at Self-healing Photosensitiser Functionalised Soft Interfaces Project Objectives
The objectives of Dr Scanlon’s research group are to: • develop methodologies to functionalize soft interfaces with photoactive materials, e.g., dyes or semiconductors, • develop in situ methodologies to characterise materials at soft interfaces, • use these photoactive soft interfaces in all-liquid-based solar cells, • optimize the efficiency of these novel solar cells.
The concept of “SOFT-PHOTOCONVERSION”: The oil phase contains a molecule capable of being oxidised and reduced, but of low energy. The water phase also contains such a molecule, but of much higher energy. To drive electron transfer “up-hill” from the molecule in the oil to the molecule in the water, we trap solar energy using a layer of concentrated dye at the interface. This input of energy from light is converted to chemical energy in the form of a reduced molecule in the water and an oxidised molecule in the oil.
Raman spectroscopy at the liquid-liquid interface. We’re using these techniques to look at our dye at the liquid-liquid interface,” outlines Dr Scanlon. One of the principal goals of the project – maximising the concentration of dye at the interface – has now been achieved, and Dr Scanlon is looking towards the next steps. “We plan to move forward and optimise the kinetics of the photo-electrochemistry at the liquid-liquid interface,” he says. There are a lot of kinetic elements to consider here, and it’s important to gain deeper insights into each of the steps involved. With the project approaching the halfway point of its funding term, Dr Scanlon plans to make further calculations over the coming year. “How fast is electron transfer? How fast is recombination? What’s the ratio of electron transfer to recombination?” he outlines. These are important issues in terms of conversion efficiency. “We aim to maximise the rate of electron transfer. We also want to maximise the rate of photoproduct separation, which is essentially the opposite of recombination,” continues Dr Scanlon. “If the charge separates then you’ve effectively converted light energy into chemical energy. But if they recombine then this is undone.”
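The competition between photoproduct separation and recombination, together with the linear dependence on dye coverage described above, can be sketched as a simple branching-ratio toy model. This is an illustrative calculation only, not the project’s actual kinetics: the function names and rate constants below are assumptions for the sake of the example.

```python
# Toy model of photoconversion at a liquid-liquid interface.
# Assumptions (illustrative, not SOFT-PHOTOCONVERSION project data):
# - separation and recombination compete as first-order processes
# - photocurrent scales linearly with interfacial dye coverage

def separation_yield(k_sep: float, k_rec: float) -> float:
    """Fraction of photo-excited electrons stored as chemical energy
    rather than lost to recombination (simple branching ratio)."""
    return k_sep / (k_sep + k_rec)

def relative_photocurrent(dye_coverage: float, k_sep: float, k_rec: float) -> float:
    """Photocurrent in arbitrary units: linear in dye coverage (0..1),
    scaled by the charge-separation yield."""
    return dye_coverage * separation_yield(k_sep, k_rec)

# Doubling the dye coverage doubles the output at fixed kinetics...
low = relative_photocurrent(0.2, k_sep=1e9, k_rec=1e8)
high = relative_photocurrent(0.4, k_sep=1e9, k_rec=1e8)
assert abs(high / low - 2.0) < 1e-9

# ...while faster recombination erodes the yield at fixed coverage.
print(round(separation_yield(1e9, 1e8), 3))  # 10:1 ratio favours separation
print(round(separation_yield(1e9, 1e9), 3))  # 1:1 ratio wastes half the light
```

The two prints make the article’s point numerically: concentrating more dye raises output linearly, but only suppressing recombination relative to separation raises the yield itself.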
Looking to the future

The long-term goal would be to apply this approach more widely in solar energy conversion, an objective very much in line with the EU’s Renewable Energy Directive and its 2050 Energy Strategy, which set ambitious goals around the future provision of energy from renewable sources. While this technology is not yet ready for wider application, Dr Scanlon is considering how it could be used in traditional solar cells. “One approach is through self-assembly at the liquid-liquid interface, where the molecules use the water-oil interface as a template. When they adsorb at the interface, each molecule will adsorb in a specific orientation,” he says. “The point with the system is that you have a set of molecules in the oil, and a set of molecules in the water. When you pour them out everything self-assembles and you don’t have to do anything further. We’re really working on that intensively.” This research is largely exploratory at this stage, rather than being directly concerned with practical applications. However, research into renewable energy is widely recognised as a major priority, and the project’s work provides strong foundations for further development. “A future project would take this proof-of-concept, develop strategies to make the dye more concentrated, and then really start engineering them with respect to useable solar cell devices,” says Dr Scanlon.
Dr Scanlon’s research is funded by a European Research Council Starting Grant (agreement no.716792) and Science Foundation Ireland Starting Investigator Research Grant (grant number 13/SIRG/2137).
Dr Micheál Scanlon Department of Chemical Sciences AD3-017 Analog Devices Building Bernal Institute University of Limerick (UL) Limerick, Ireland T: +353-61-237760 E: email@example.com W: https://www.ul.ie/research/blog/ulresearcher-awarded-€15m-pioneering-solarenergy-research Dr Micheál Scanlon
Dr Micheál D. Scanlon is a Principal Investigator in the Bernal Institute, and lecturer in the Department of Chemical Sciences, at the University of Limerick. His research involves nanomaterial self-assembly and electrochemistry at immiscible liquid-liquid or “soft” interfaces for solar energy conversion, electrocatalysis and sensor development.
A green outlook on property renovations

As more people and organisations consider installing their own solar panels, ICT plays an increasing role in connecting energy supply and demand across communities. The DREEAM project aims to help identify the right combination of technologies to manage this energy effectively and reduce energy bills, as Rolf Bastiaanssen explains.

A lot of attention in research is currently centred on improving energy efficiency in residential buildings, with scientists developing innovative technologies to reduce consumption. Only a relatively limited set of technologies can be applied when the focus is on individual buildings, however, due to both economic and practical constraints, so the DREEAM project is looking at the issue on a larger scale. “We are working with large social housing providers to plan renovations at scale, as you can get costs down through simple operational efficiencies,” says Rolf Bastiaanssen, the project’s Principal Market Developer. The project is working at pilot sites across Europe with the aim of identifying the right combination of technologies to improve energy efficiency, taking local circumstances into account. “For example, in urban environments, there may be solar panels on the roof to generate energy,” outlines Bastiaanssen. “There’s not a lot of space on a roof. There is also the question of who owns that energy and what you do with it.”
DREEAM Demonstrating an integrated Renovation approach for Energy Efficiency At the Multi-building scale With a focus on social and public housing, the DREEAM project aims to show how renovating housing stock on a larger scale is an opportunity for better integration of renewable energy and is generally more cost-effective. The project demonstrates a multi-building and single owner renovation approach that can achieve a 75% reduction of total energy demand. Rolf Bastiaanssen C/ Casp 118-120, 5o-2a, 08013 Barcelona, Spain T: +34 93 476 04 44 E: firstname.lastname@example.org W: www.baxcompany.com W: http://dreeam.eu/
On average, as little as 45 percent of the energy generated by photovoltaic panels on a residential building is actually used by the household that installed them, with the surplus being sold back to the national grid at low rates. Creating a local smart grid opens up the possibility of selling that energy to neighbours in an apartment block, benefitting both parties. “Maybe a neighbour would pay 80-90 percent of the commercial price for that energy,” explains Bastiaanssen. Housing associations are able to do this on a larger scale, as they often manage thousands of apartments, which Bastiaanssen says could lead to wider benefits. “We work with large housing providers to demonstrate the value that could be delivered to people in the affordable housing sector through the adoption of energy flexibility services,” he continues. “It’s about using renewable energy efficiently. These flexibility services could lead to reductions in household bills of between 10-20 percent.” This approach is very much in line with the wider shift towards the decentralisation of energy provision and the goal of bringing supply closer to demand. The aim in the project is to work with different networks of housing associations and connect them with solution providers that could help improve efficiency. “We provide housing providers with insights and technology, and we invite solution providers that we see could add value,” explains Bastiaanssen. While the amount of energy generated by renewable sources can vary according to the local climate and the time of year, the methods by which that energy can then be effectively distributed are more broadly applicable, and Bastiaanssen says the market is set to evolve further. “Over the coming years, as technology continues to develop, we can expect that there will be many more energy producers than we see now,” he stresses.
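The smart-grid arithmetic can be sketched with the article’s own figures: 45 percent self-consumption and a neighbour paying roughly 80-90 percent of the commercial rate. The per-kWh prices and annual output below are illustrative assumptions, not project data.

```python
# Back-of-the-envelope economics of a local smart grid.
# Figures from the article: 45% self-consumption, neighbour pays ~85% of
# the commercial rate. Prices and output are illustrative assumptions.

GENERATED_KWH = 1000       # assumed annual PV output of one household
SELF_USE = 0.45            # share used by the household itself
COMMERCIAL_PRICE = 0.25    # EUR/kWh a neighbour would otherwise pay (assumed)
FEED_IN_RATE = 0.05        # EUR/kWh the grid pays for exports (assumed)
NEIGHBOUR_SHARE = 0.85     # neighbour pays 85% of the commercial price

surplus = GENERATED_KWH * (1 - SELF_USE)

# Option A: sell the surplus back to the national grid at a low rate.
grid_revenue = surplus * FEED_IN_RATE

# Option B: sell it to a neighbour over a local smart grid.
neighbour_revenue = surplus * COMMERCIAL_PRICE * NEIGHBOUR_SHARE
neighbour_saving = surplus * COMMERCIAL_PRICE * (1 - NEIGHBOUR_SHARE)

print(f"surplus exported:        {surplus:.0f} kWh")
print(f"grid export revenue:     EUR {grid_revenue:.2f}")
print(f"neighbour sale revenue:  EUR {neighbour_revenue:.2f}")
print(f"neighbour's own saving:  EUR {neighbour_saving:.2f}")
```

Under these assumptions the producer earns several times more by selling locally than by exporting to the grid, while the neighbour still pays less than the commercial rate, which is the "benefitting both parties" point made above.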
The focus of attention in the project has been on social housing, yet this approach could potentially be applied by private housebuilders and companies in future. New buildings will have to meet ever-tighter energy efficiency standards, and Bastiaanssen believes that flexibility services will have an important role to play in this respect. “Regulations in some countries state that buildings should be energy-neutral, so they will need to generate energy, for example through solar panels. From here, the situation becomes more complicated, and you have to distribute the energy that has been generated,” he says. This points to a growing need for flexibility services; Bastiaanssen says that by developing an effective methodology for energy-efficient renovations, the project will make an important contribution. “We will take our methodology and help housing associations see how green renovation at large scales is cheaper in the long term, while it is also environmentally friendly and makes good business sense.”

Pilot site in Berlin – Netelbeckplatz
Rolf Bastiaanssen is a partner and senior consultant at Bax & Company. He has been involved in creating and leading international collaborative R&D projects and in business planning for start-ups as well as for industry.
Getting behind the value of energy efficiency The EC has set ambitious goals around reducing carbon emissions and retrofitting Europe’s housing stock to improve energy efficiency. We spoke to Maarja Meitern of Bax & Company about the Revalue project’s work in investigating the relationship between the energy efficiency of a property and its value A range of
factors are typically taken into account in valuing a property, including location, size and access to local amenities, yet energy efficiency has not historically been a major consideration. This topic is at the heart of the Revalue project, an initiative which brings together six partners from across Europe. “We are looking at the relationship between the energy efficiency of a property and its value,” explains Maarja Meitern, the manager of the project. This work is focused on affordable housing and housing associations, with researchers gathering and analysing data from five European countries: the UK, the Netherlands, Germany, Sweden and Spain. “We found that in most markets, energy efficiency has a very marginal effect on value. For example, in Amsterdam we found some evidence that there was a premium attached to more energy-efficient properties. But this was still very marginal in most cases,” outlines Meitern. This means housing associations don’t have a strong immediate financial incentive to retrofit properties and improve energy efficiency, despite recognition of its importance to the wider goal of reducing carbon emissions. On the other hand, Meitern says the project did find evidence that investing in energy efficiency can lead to financial benefits in some markets. “For example, in the UK market we found that houses with double-glazed windows had a higher final valuation than similar houses in the same neighbourhood with single-glazed windows,” she says. Nevertheless, the regulatory environment and the availability of subsidies are even more important considerations in this respect, as governments across Europe look to accelerate the transition towards a low-carbon economy. “Policy-makers are looking at changing the current minimum energy efficiency standards – a clear example would be the UK’s Minimum Energy Efficiency Standards (MEES), introduced in early 2018, which require all privately rented new lets to have a minimum EPC label of E. That will lead to change in the market,” says Meitern. The regulatory environment is evolving as governments seek to meet CO2 reduction goals, something of which housing associations need to be aware. As energy-efficiency standards improve, those housing providers that do not invest in retrofitting properties risk being left behind. “If building owners do not act now, they risk losing money in the long run through having stranded assets – a property that they cannot let because there is no demand in the market for that quality of building,” points out Meitern. Lenders across Europe have started factoring energy efficiency into their long-term risk models, and are even adjusting interest rates for more sustainable portfolios.
Many housing providers and organisations nevertheless tend to have quite a short-term outlook on retrofitting properties, often working on a house-by-house basis. By gathering and analysing relevant data, the project aims to help these organisations identify how they could scale this up and retrofit houses more efficiently. “We need to get better quality business cases, which will allow housing associations to negotiate better financial terms with banks,” explains Meitern. This is a key part of the project’s overall message. “We would encourage property owners and housing associations to collect data on their assets, as banks are starting to look in greater detail at retrofit projects. So data collection is very important,” stresses Meitern. “Many banks already ask their valuers to collect data on the energy-efficiency performance of buildings. For housing associations, it is about long-term investment planning and smarter decision-making.” This could encourage more housing associations to invest in retrofitting houses,
which is very much in line with goals set by the European Commission around limiting carbon emissions. The residential sector is an important part of this, so energy efficiency could become a more prominent consideration in property valuations in future. “One of the project partners, the Royal Institution of Chartered Surveyors (RICS) in the UK, is developing an insight paper on how property valuers can take energy efficiency into account. In the updated edition of their Red Book, they also urge valuers to collect and record sustainability data,” says Meitern. “We want to help inform valuers about the importance of energy efficiency, and to get the message across to building owners that they really need to look today at how they retrofit properties. They need to think about the investments they need to make in order to avoid being left with stranded assets in future.”
REVALUE Scaling Energy Renovation REVALUE aims to help valuers to reflect the value of energy efficiency in their valuations of both social and private housing stock. Introducing the right guidance and identifying additional income streams would hopefully encourage buyers and owners to consider EE retrofit decisions. Maarja Meitern, Consultant C/ Casp 118-120, 5o-2a, 08013 Barcelona, Spain T: +34 93 176 31 10 E: email@example.com W: http://revalue-project.eu Maarja Meitern is a consultant at Bax & Company, working on smart energy systems, future energy markets and institutional sustainability strategies.
The Underground Lakes of Mars

The European Space Agency (ESA) announced in February that it had found the first geological proof of a system of interconnected lakes under the surface of Mars, five of which may contain minerals crucial to life. By Richard Forsyth
As our planetary neighbour, Mars has long tantalised us with its mysteries. With successions of sophisticated hardware deployed on the surface and in orbit around the Red Planet, we are finding out astounding new facts about this world on a regular basis. In total there are currently six active satellites orbiting Mars, and the 15-year-long exploration of the NASA rover Opportunity only recently came to an end, in February – but all the time we are planning new missions and machines to ‘take the baton’ for pioneering research. Scientists are confident that Mars once had substantial, large bodies of surface water and may have had opportunities for sustaining life. Features on the surface that look like they were shorelines have been identified.
However, it is the underground water systems that have been under scrutiny of late. Climate models for early Mars reveal temperatures that rarely rise above freezing, so wet periods may not have been prolonged (relatively speaking), which is not the ideal environmental scenario for surface life. The subsurface is a more promising proposition for hosting some form of Martian life. Incredibly, there is evidence that water under Mars remains today. In 2018, a pool of liquid brine (about 1.5 kilometres below the surface and measuring about 20 kilometres in length) was detected beneath the Red Planet’s South Pole. Models had suggested an underground, connected system would exist, but hard proof was missing, as was an understanding of the mechanics of such a system.
This is the ExoMars 2020 rover, which will soon be deployed to drill into the Martian surface for samples. © ESA/ATG medialab
Evolution of water-filled basins over time. This diagram shows a model of how crater basins on Mars evolved over time and how they once held water. This model forms the basis of a new study into groundwater on Mars, which found that a number of deep basins – with floors sitting over 4,000 m deep – show signs of having once contained pools of water. There are three main stages: in the first (top), the crater basin is flooded with water and water-related features – deltas, sapping valleys, channels, shorelines, and so on – form within. In the second stage (middle), the planet-wide water level drops and new landforms emerge as a result. In the final stage (bottom), the crater dries out and becomes eroded, and features formed over the previous few billion years are revealed. ©NASA/JPL-Caltech/MSSS; Diagram adapted from F. Salese et al. (2019)
The European Space Agency’s latest discovery around the underground water systems of ancient Mars therefore provided a welcome revelation, offering fresh insights and confirming what scientists had long suspected.
Beneath the Martian Surface

Seeing beneath the surface of another planet is, in itself, a feat of human ingenuity that deserves to be celebrated. It was all thanks to ESA’s Mars Express orbiter, which has been circling the planet since 2003, using a radar instrument called MARSIS (Mars Advanced Radar for Subsurface and Ionosphere Sounding). The orbiter is able to help find answers to all sorts of questions relating to geology, atmospheric conditions, the surface and the history of water on Mars. It relies on three radar booms, two of which are 20 metres long, to gather data as it circles the planet from above. The latest research, now published in the Journal of Geophysical
Research, indicates that there was a groundwater system on Mars that fed into the lakes. By examining imagery sent back from the orbiter, researchers looked at 24 deep, enclosed craters in the northern hemisphere whose floors lay around 4,000 metres below the estimated sea level at the time. The features in these craters that could only have been caused by water included channels etched into crater walls, valleys carved out by sapping groundwater and various other physical indicators. These features showed that some of the craters once had pools and flows of water that changed and receded over time. “Early Mars was a watery world, but as the planet’s climate changed this water retreated below the surface to form pools and ‘groundwater’,” said lead author Francesco Salese of Utrecht University, the Netherlands. “We traced this water in our study, as its scale and role is a matter of debate, and we found the first geological evidence of a planet-wide groundwater system on Mars.”
This image from ESA’s Mars Express shows a network of dried-up valleys on Mars, and comprises data gathered on 19 November 2018 during Mars Express orbit 18831. The ground resolution is approximately 14 m/pixel and the images are centered at 66°E/17°S. This image was created using data from the nadir and colour channels of the High Resolution Stereo Camera (HRSC). The nadir channel is aligned perpendicular to the surface of Mars, as if looking straight down at the surface. North is to the right. ©ESA/DLR/FU Berlin
This colour-coded topographic view shows the relative heights of the terrain in and around a network of dried-up valleys on Mars. Lower parts of the surface are shown in blues and purples, while higher altitude regions show up in whites, yellows, and reds, as indicated on the scale to the top right. This view is based on a digital terrain model of the region, from which the topography of the landscape can be derived. It comprises data obtained by the High Resolution Stereo Camera on Mars Express on 19 November 2018 during Mars Express orbit 18831. The ground resolution is approximately 14 m/pixel and the images are centered at 66°E/17°S. North is to the right. ©ESA/DLR/FU Berlin
The Search for Signs of Life

The water levels seem to align with the shorelines of a Martian ocean that has been proposed to have existed three to four billion years ago. The researchers speculate that this ocean may have connected to the system of underground lakes around Mars. In addition, signs of minerals linked to the emergence of life on Earth – such as clays, carbonates and silicates – were found in five craters. This is exciting new evidence that Mars may have supported life, as the ingredients to support it were in place. Zones beneath the Martian surface have been a promising focus for some scientists who suspect life could have been prevalent on the
planet. For example, a study by researchers at Brown University in the USA, published in Earth and Planetary Science Letters, suggested that ancient subsurface areas could have been home to substantial amounts of microbial life over long periods of Martian history. The idea was that within these habitable zones, microbes could draw energy from electrons from hydrogen derived from water. This idea was taken from a study of underground microbes on Earth that were deprived of sunlight, whose source of energy proved to be electrons from hydrogen produced from water molecules that had drained into the subsurface. These communities of microbes thrived in this unusual way. In theory, with flowing water,
Express Delivery

Mars Express gained its name from being put together quicker than any other comparable planetary mission. It arrived at Mars at the same time as Beagle 2 – a lander that made it to the Martian surface but sustained damage that rendered it unable to transmit data. The orbiter was tasked with imaging the entire surface of Mars at a high resolution (10 metres per pixel) and
What ancient Mars may have looked like billions of years ago, based on MOLA data. © Ittiz
some parts of the surface at very high resolution (2 metres per pixel). It was also tasked with understanding how the atmosphere interacts with the solar wind, mapping the composition of the atmosphere, as well as the atmosphere’s effect on the surface. Finally, as it has now aptly demonstrated, its mission involved determining the structure of the subsurface to a depth of a few kilometres.
there could have been the same system for microbial life on Mars. The researchers concluded that 4 billion years ago, the Martian subsurface was absorbing enough hydrogen to energise microbes for hundreds of millions of years. The underground lakes are an exciting find on which to base future Mars missions. The findings can also aid the machines currently investigating Mars: another Mars orbiter, the ExoMars Trace Gas Orbiter, is analysing the atmosphere in great detail, looking for gases related to biological or geological activity and aiming to identify subsurface locations where water-ice or hydrated minerals are present. The orbiter will be working in tandem with research relayed to it by
an ExoMars rover on the surface. This rover is a joint venture between ESA and the Russian space corporation Roscosmos, and is named Rosalind Franklin after the scientist whose work was central to the discovery of the structure of DNA. It is due to begin research in 2021, and will be capable of drilling down to around two metres into the surface to analyse the composition of the soil. The confirmation of interconnected subsurface lakes has created headlines in both tabloid newspapers and scientific publications as a milestone in our understanding of Mars. As Dmitri Titov, ESA’s Mars Express project scientist, puts it: “Findings like this are hugely important; they help us to identify the regions of Mars that are the most promising for finding signs of past life.”
Mars Express discovered water buried under the South Pole of Mars. Copyright Context map: © NASA/Viking; THEMIS background: © NASA/JPL-Caltech/Arizona State University; MARSIS data: © ESA/NASA/JPL/ASI/Univ. Rome; R. Orosei et al 2018. ESA’s Mars Express has used radar signals bounced through underground layers of ice to find evidence of a pond of water buried below the south polar cap. Twenty-nine dedicated observations were made between 2012 and 2015 in the Planum Australe region at the south pole using the Mars Advanced Radar for Subsurface and Ionosphere Sounding instrument, MARSIS. A new mode of operations established in this period enabled a higher quality of data to be retrieved than earlier in the mission. The 200 km square study area is indicated in the left-hand image and the radar footprints on the surface are indicated in the middle image for multiple orbits. The greyscale background image is a Thermal Emission Imaging System image from NASA’s Mars Odyssey, and highlights the underlying topography: a mostly featureless plain with icy scarps in the lower right (south is up). The footprints are colour-coded corresponding to the ‘power’ of the radar signal reflected from features below the surface. The large blue area close to the centre corresponds to the main radar-bright area, detected on many overlapping orbits of the spacecraft. A subsurface radar profile is shown in the right-hand panel for one of the Mars orbits. The bright horizontal feature at the top represents the icy surface of Mars in this region. The south polar layered deposits – layers of ice and dust – are seen to a depth of about 1.5 km. Below is a base layer that in some areas is even brighter than the surface reflections, highlighted in blue, while in other places it is rather diffuse. Analysing the details of the reflected signals from the base layer yields properties that correspond to liquid water.
The brightest reflections are centred around 193°E/81°S in the intersecting orbits, outlining a well-defined, 20 km-wide zone.
Getting to the heart of stellar systems

Recent observations have shed new light on astrophysical dynamics and the behaviour of stellar systems, and researchers now aim to build on these foundations. The GalNUC project is developing sophisticated dynamical models with the wider aim of investigating the properties of dense stellar systems, as Professor Bence Kocsis explains.

Regions at the centres of galaxies, galactic nuclei host supermassive black holes and are densely populated with stars and other compact objects. Based at Eötvös Loránd University in Budapest, Professor Bence Kocsis and his colleagues in the GalNUC project aim to probe deeper into the nature of these stellar systems. “We are trying to understand the physical properties of these dense stellar systems,” he explains. The number of stars and other objects in these systems is much higher than in other regions of the universe. “For example, the nearest star beyond our own Sun is more than a light year away. It’s 1.3 parsecs away, which is approximately 4.2 light years,” outlines Professor Kocsis. “Whereas in these regions that we’re interested in, you have millions of stars within just a few light years. So that’s a very high number of stars in a very small volume.”

Dense stellar systems

The stars in a dense stellar system are distributed in both spherical and counter-rotating disk-like structures, a major topic of interest to Professor Kocsis. In one of these structures, stars have been observed to rotate in a clockwise direction, while in the other a separate set of stars rotates in a counter-clockwise direction. “This observation was made a couple of years ago. Since then there have been new observations, and neutron stars and black holes have been discovered in the Galactic nucleus,” says Professor Kocsis. Researchers now aim to develop a model from which more can be learnt about galactic nuclei.
“We’re trying to develop some simple models, and to understand which types of models lead to which type of activity,” continues Professor Kocsis. “From there, we can then look to see whether the model matches some of the features that have been observed.” This research brings together statistical physics and astrophysics. Typically statistical physics is used to understand quite small-scale objects, but Professor Kocsis says it can also be applied to astrophysical observations. “The way stars interact in the galactic centre is very similar, in a mathematical sense, to the interaction among liquid crystal molecules,” he explains. These molecules have an axisymmetric shape, and the statistical behaviour of the
Black holes (shown with blue) settle in a disk in a galactic nucleus simulation.
system as a whole can be described by deriving its Hamiltonian, which helps to determine the energy of the system. “The stars in the Galactic centre do not move around like bees in a beehive; their respective orbits cover disks with a geometry that resembles that of liquid crystal molecules. Liquid crystals exhibit a phenomenon called phase transition, which is very interesting in physics,” says Professor Kocsis. “If the liquid crystals are cooled down to very low temperatures, then an ordered state forms.” A phase transition can be thought of as the point at which a system assumes a completely different form, such as when ice changes into water, while a system will eventually enter a disordered state if heated above a certain temperature. All that is needed to describe the phase transition of a system is its Hamiltonian; researchers have found that the Hamiltonian of stars orbiting in a galactic nucleus is actually very similar to that of liquid crystals. “This discovery led us to hypothesise that maybe this model would have similar phases. So maybe at low temperatures a coherent, ordered phase can exist, where the stellar orbits align in a disk, while at high temperatures the distribution will be disordered, resembling a sphere,” explains Professor Kocsis. “We’re very interested in finding out what the order/disorder transition depends on, and whether this can be applied in the galactic nuclei.” Researchers have found that the transition between order and disorder depends on the mass of the object, or the population. Professor Kocsis points to the example of a stellar system with a large population of stars – some with a high mass, some with a lower mass – in which there is also a distribution of black holes. “These black holes are typically more massive than regular stars – they
are usually somewhere between 5 and 50 solar masses,” he outlines. The more massive objects tend to settle in a more ordered state than lower mass objects. “So higher mass objects may represent ordered states and low mass objects will represent the disordered states in many cases,” continues Professor Kocsis. “This also ties in with observations, which show that high-mass stars are in a disk and lower-mass stars are distributed spherically.”
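The order/disorder transition described here can be illustrated with the simplest textbook analogue: a mean-field self-consistency relation of the form m = tanh(Jm/T). This is a generic toy model – it is not the project’s actual stellar or liquid-crystal Hamiltonian – but it shows the behaviour in question: an ordered solution exists below a critical temperature and disappears above it.

```python
import math

def order_parameter(T, J=1.0, tol=1e-10, max_iter=10_000):
    """Solve the mean-field self-consistency relation m = tanh(J*m/T)
    by fixed-point iteration, starting from a fully ordered state."""
    m = 1.0
    for _ in range(max_iter):
        m_new = math.tanh(J * m / T)
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m

# Below the critical temperature (T_c = J) an ordered solution survives;
# above it the iteration collapses to the disordered solution m = 0.
print(order_parameter(0.5))   # ordered: m close to 1
print(order_parameter(2.0))   # disordered: m close to 0
```

In the stellar analogue described in the article, the role of temperature is played by the randomness of the orbits: the ordered phase corresponds to orbits settling into a disk, the disordered one to a spherical distribution.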
Gravitational waves This work could also lead to interesting new insights into the distribution of black holes. Globular clusters, a type of star cluster which does not have a supermassive black hole at its centre, are of great interest in this respect. “Calculations show that these clusters are expanding. They lose mass, because stars are ejected beyond the escape velocity. The cluster expands, while the inner region gets denser and denser,” outlines Professor Kocsis. These stellar systems hold a lot of interest in the wider field, as it has been hypothesised that they may be the sites of stellar mergers and the mergers of black holes. “It is now possible to detect the merger of black holes using gravitational wave observations. We’re trying to make predictions about the origin of pulses of gravitational waves,” says Professor Kocsis. Researchers from the LIGO-VIRGO collaboration recently managed to detect the merger of two black holes at a distance of at least one billion light years from the Earth; Professor Kocsis now aims to build further on these findings. The aim is to make specific theoretical predictions about the types of gravitational wave sources that can be expected and at what rate, which can then be
GALNUC Astrophysical Dynamics and Statistical Physics of Galactic Nuclei Project Objectives
GALNUC strives to develop a comprehensive model to describe the long-term behavior of galactic nuclei using revolutionary multidisciplinary methods. GALNUC explains the astrophysical origin of electromagnetic and gravitational waves from these systems, which host a central supermassive black hole and the densest population of stars and compact objects in the Universe.
Overall budget: € 1 511 436
The GALNUC team at the Eotvos Lorand University observatory in Budapest.
compared with observed data. “If the theory does not match the observations, then we will know that some important pieces are missing,” explains Professor Kocsis. Models may predict the presence of gravitational waves which have not yet been observed, yet these waves could be detected in future with more sophisticated instruments, so Professor Kocsis says theoretical models have an important role to play in informing how observatories operate. “One of the predictions we have made is that intermediate mass black holes should be observed in the future,” he outlines.
These black holes have a mass of above 100 solar masses; however, they are not supermassive black holes, which have a much larger mass. It has also been predicted that these black holes will probably merge with other black holes. “We can even predict the rate at which they will merge as a function of redshift,” says Professor Kocsis. Current facilities are not capable of observing these objects, so the question then arises of how they could be observed in future. “The frequency of the gravitational waves that they emit is too low to be detected above the noise in current instruments,” explains Professor Kocsis. “As the noise at frequencies between 10 and 30 hertz is reduced by ongoing upgrades, these instruments may detect intermediate mass black holes. To detect supermassive black holes, an instrument capable of detecting frequencies between around 0.01 and 100 millihertz is required. This type of instrument would need different technologies than those which are currently used.”
The next star beyond our own Sun is more than a light year away from the Sun – it’s 1.3 parsecs away. Whereas in these regions that we’re interested in, you have millions of stars within just a few light years.
LISA gravitational wave detector A lot of energy is currently being devoted to this work, while research is also ongoing into the development of space-based instruments to detect gravitational waves. LISA, a space-based gravitational wave detector, is currently being developed by the European Space Agency. “This instrument is currently planned to be launched in 2034. A lot of predictions are being made about its possible impact,” says Professor Kocsis. This will form an important part of Professor Kocsis’ future research agenda. “We are planning to make specific predictions for gravitational wave observatories,” he continues. “Current gravitational wave detectors – namely LIGO-VIRGO – are observing much higher numbers of mergers than originally expected. The big question is: what is the astrophysical origin of these types of mergers? What can these observatories see?” Researchers will also look to further improve the theoretical models. It has been established that there is a connection between statistical physics and astrophysics; now Professor Kocsis and his colleagues aim to build further on these findings. “We’re trying to make use of this connection, to make better and better models,” he outlines.
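The frequency bands quoted in this section follow from a standard back-of-the-envelope estimate: the gravitational-wave frequency at a merging binary’s last stable orbit scales inversely with its total mass. A minimal Python sketch using textbook constants (an illustration, not the project’s own models):

```python
import math

# Physical constants (standard values):
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8         # speed of light, m/s
M_SUN = 1.989e30    # solar mass, kg

def isco_gw_frequency(total_mass_solar):
    """Approximate peak gravitational-wave frequency (Hz) for a binary
    of the given total mass: f ~ c^3 / (6^1.5 * pi * G * M)."""
    M = total_mass_solar * M_SUN
    return C ** 3 / (6.0 ** 1.5 * math.pi * G * M)

# Stellar-mass binaries radiate in LIGO's band; intermediate-mass black
# holes at tens of hertz; million-solar-mass binaries in the millihertz
# band targeted by space-based detectors such as LISA.
for mass in (10, 100, 1e6):
    print(f"{mass:>9.0f} solar masses -> {isco_gw_frequency(mass):.3g} Hz")
```

This simple scaling is why a detector sensitive around 0.01–100 millihertz is needed for supermassive black holes, while upgrades at 10–30 hertz open the window on intermediate-mass objects.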
Project Coordinator, Bence Kocsis Assistant Professor, Department of Atomic Physics Institute of Physics Eötvös Loránd University Pázmány Péter sétány 1/A Budapest H-1117 Hungary T: +36 1-372-2500/6342 E: firstname.lastname@example.org W: http://galnuc.elte.hu/
Bence Kocsis obtained a PhD from Eotvos University (Hungary). He has held prestigious independent postdoctoral positions at the Harvard Center for Astrophysics as a NASA Einstein Fellow and at the Institute for Advanced Study in Princeton. He is currently an assistant professor at Eotvos University, where he has led the GALNUC project since 2015.
ALMA (ESO/NAOJ/NRAO), NASA/ESA Hubble Space Telescope, W. Zheng (JHU), M. Postman (STScI), the CLASH Team, Hashimoto et al. The expanded square image shows the very distant galaxy MACS1149-JD1, seen as it was 13.3 billion years ago and observed with ALMA. ALMA (ESO/NAOJ/NRAO), Hashimoto et al.
Illustration of how we pinpoint the time when the galaxy JD1 “switched on”.
Probing for the dawn of light in the universe The first stars formed around 250 million years after the Big Bang, producing the chemical elements that we see all around us today. Researchers in the First Light project are looking back into cosmic history, aiming to pinpoint the time at which the universe was bathed in starlight for the first time, as Professor Richard Ellis explains. The Universe is believed to be around 13.8 billion years old, yet there were no stars during the early part of cosmic history. In fact, it was not until around 250 million years after the Big Bang that the first stars formed, as hydrogen clouds began to collapse under their own weight. “As these gas clouds collapsed, the energy that had kept them up was converted into heat. As the centre of these gas clouds became very hot, hydrogen was synthesised by nuclear burning into helium, and so the universe was bathed in starlight for the first time,” explains Richard Ellis, Professor of Astrophysics at University College London. As the Principal Investigator of the ERC-funded First Light project, Professor Ellis aims to pinpoint when this event occurred, a quest which he says holds profound implications. “We are all made up of
material that was synthesised in stars,” he stresses. “We are thus searching for our own origins when we follow this exciting quest for `First Light’.” Researchers in the First Light project are using images and data from both space and ground-based telescopes to look back into cosmic history and observe early galaxies, formed at a time when the universe was barely 3 percent of its current age. Images from the Hubble and Spitzer Space Telescopes allow researchers to find where early galaxies are located, then more detailed insights can be gained through analysis of complementary data from ground-based telescopes. “Large ground-based telescopes such as ALMA, the VLT in Chile and the Keck telescopes in Hawaii, are ideal for getting spectra. These spectra are crucial for revealing how far away different galaxies are, and hence – taking into
account the expansion of the universe – at what period in cosmic history we’re seeing them,” explains Professor Ellis. This allows researchers to build a more complete picture.
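The arithmetic linking a spectrum to a cosmic epoch is direct: the expansion of the universe stretches every emitted wavelength by a factor (1 + z). A minimal sketch, using as an illustration a far-infrared oxygen line with a rest wavelength of 88 micrometres (a standard value for such ALMA detections) seen from a galaxy at redshift 9.1:

```python
def observed_wavelength(rest_wavelength_um, z):
    """Wavelength (micrometres) at which a line emitted at redshift z is
    observed: light is stretched by the expansion factor (1 + z)."""
    return rest_wavelength_um * (1.0 + z)

# A line emitted at 88 micrometres by a galaxy at z = 9.1 arrives
# stretched roughly ten-fold, into the submillimetre band:
print(f"{observed_wavelength(88.0, 9.1):.1f} micrometres")
```

Measuring where a known line lands in the observed spectrum therefore fixes the redshift, and with it the epoch at which the galaxy is being seen.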
The first glimpse of `First Light’ A number of important advances have been made over the course of Professor Ellis’ project, including exciting spectroscopic observations of MACS1149-JD1, one of the furthest known galaxies, seen at a time around 500 million years after the Big Bang. His team confirmed that this galaxy is at a redshift (z) of around 9.1, meaning the universe has expanded more than ten-fold since light left it. “The universe is expanding, and as it expands, light rays from galaxies are stretched,” he explains. The team detected oxygen – the most distant ever detection of this element – from which new insights can be drawn on the previous
Detecting dust in the early Universe: Celestial dust comprises minute grains of silicon and other elements produced in supernova explosions (left). Dr Laporte and Professor Ellis detected the glowing radiation from warm dust in the galaxy YD4 (zoom) which is gravitationally-lensed by the massive cluster A2744.
history of this remote galaxy. “Oxygen is only synthesised by nuclear burning in stars. When a star collapses at the end of its life, it explodes in a supernova,” outlines Professor Ellis. “Supernovae are key to research on cosmic evolution because all the chemical elements that have been produced during the lifetime of a star are then expelled into space, and are ready as source material for any new stars that form.” The presence of oxygen at z=9.1 suggests prior nuclear processing, which indicates even earlier periods of star formation. “We’re inferring that there must have been starlight at earlier epochs in an object that we’ve studied at redshift 9.1,” says Professor Ellis. However, the most significant discovery made by Ellis’ team for MACS1149-JD1, in collaboration with Japanese colleagues at Osaka Sangyo University, is the clear signature of mature stars. Detailed models of the energy spectrum of the galaxy indicate that it is already 290 million years old at z=9.1, implying that `First Light’ occurred at a redshift of z~15, when the Universe was only 250 million years old. “Through this analysis, we have our first exciting glimpse of when the Universe emerged from darkness,” explains Professor Ellis. Dust observations are similarly valuable in terms of assessing the ages of specific galaxies. “Like oxygen, dust is also produced in supernovae, and if we know roughly how much dust there is in a galaxy at early times, then it tells us how many supernovae must have exploded to produce that dust. That allows us to age-date a galaxy at early times,” explains Professor Ellis. “This then enables us to determine when First Light might have occurred, even if we can’t yet observe it directly.” One of Professor Ellis’ postdoctoral assistants, Dr Nicolas Laporte, made an
important contribution in this area through observations of dust in A2744_YD4, a very distant young galaxy, using the Atacama Large Millimetre Array (ALMA), an interferometer based in the Atacama desert in Chile. Professor Ellis explains how ALMA can take two types of measurements. “It can scan across a certain frequency range, just like turning the dial on a radio. If there’s suddenly a signal as you scan in frequency, that would tell you that there’s a spectral feature, for instance from oxygen. We found an oxygen emission in A2744_YD4, which gives us the redshift of that galaxy at z=8.38,” he outlines. But ALMA is also sensitive to glowing dust, similar to the background noise you might hear when tuning a radio. “This background noise represents the glow from interstellar dust heated by stars. Interstellar dust is composed of tiny particles of silicon that are only produced in supernovae at early times. They are sub-millimetre sized nuggets of silicon,” says Professor Ellis. “These particles are spinning in space and heated by starlight – then they glow, just like embers in a fire.” Given that Professor Ellis and his team have found that these dust particles existed just 600 million years after the Big Bang, they aim to build further on these findings. One major objective is to survey a larger population of early galaxies, see how much dust each contains, and build up a census. “We can observe very early galaxies in their infancy, estimate their age, then from that assess when they were born,” outlines Professor Ellis. The final piece of progress Professor Ellis’ team has made is in determining exactly how these early galaxies drive cosmic reionisation – the transformation of hydrogen atoms in deep space into their constituent protons and electrons. “When the first stars switched on they were very hot, and they had enough energy to ionise the hydrogen around them. A very hot star produces very energetic ultraviolet light rays which have the energy
to break hydrogen back into a proton and an electron. We call this process cosmic reionisation,” explains Professor Ellis. “In this context, we can estimate how numerous early galaxies were, but can we be sure all this energetic ultraviolet radiation escapes into deep space?” The team has come up with a novel technique to address this important question by correlating the presence of early galaxies with fluctuations in the degree to which space is ionised as probed by hydrogen absorption seen in the detailed spectrum of a luminous background source, a quasar. “We are using the quasar, an intensely bright signal, as a beacon which illuminates the clouds of hydrogen along the line of sight towards us. Linking the presence or absence of hydrogen clouds
FIRST LIGHT Unveiling first light from the infant Universe
Several hundred million years after the Universe was born, the first stellar systems began to shine. Energetic photons from early hot stars reionised the hydrogen in deep space. Ambitious observational facilities can directly chart this final frontier in cosmic history. The programme has three complementary themes: (i) tracing the duration of the reionisation process by analysing the spectra of early galaxies; (ii) determining whether star-forming galaxies are the sole agent of reionisation by addressing the number of ionising photons they produce and the fraction that escape; (iii) inferring the abundance of the earliest galaxies whose direct detection is beyond the reach of current facilities. Masses and ages of galaxies will be used to plan surveys for the James Webb Space Telescope.
The First Light programme is entirely funded by the ERC award which commenced on Oct 1 2015 and ends on Sept 30 2021.
Collaborators include astronomers at the Universities of Tokyo, Arizona, California (Santa Cruz, Davis), Lyons and Be’er Sheva (Israel).
Project Coordinator Professor Richard Ellis Professor of Astrophysics Department of Physics & Astronomy University College London Gower Street London WC1E 6BT T: +44 20 3108 7912 E: email@example.com W: https://www.ucl.ac.uk/astrophysics/research/cosmology/first-light
Professor Richard Ellis
Drs Kakiichi and Bosman together with Ph.D. student Mr Meyer work with Professor Ellis to gather spectra of galaxies along the line of sight to a quasar (far right) luminous enough to highlight intervening absorbing clouds of neutral hydrogen. In this way they can determine how much ionizing radiation escapes a typical galaxy.
to the proximity of galaxies allows us to examine the strength of the radiation that escapes from galaxies,” adds Professor Ellis. This idea, first promoted by Dr Koki Kakiichi, one of Professor Ellis’ postdoctoral assistants, is a major step forward in building the picture of how this cosmic reionisation occurred.
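The energy threshold behind this ionising radiation is a standard physical constant: a photon needs at least 13.6 eV to strip the electron from a hydrogen atom. A short sketch converting that energy into the corresponding wavelength limit (textbook constants, not project code):

```python
# A photon must carry at least 13.6 eV, the ionisation energy of hydrogen,
# to split the atom into a proton and an electron. The corresponding
# wavelength threshold (the Lyman limit) follows from E = h*c / lambda.
H_PLANCK = 6.626e-34   # Planck constant, J s
C_LIGHT = 2.998e8      # speed of light, m/s
EV = 1.602e-19         # joules per electronvolt

def lyman_limit_nm(ionisation_energy_ev=13.6):
    """Longest wavelength (nm) a photon can have and still ionise hydrogen."""
    wavelength_m = H_PLANCK * C_LIGHT / (ionisation_energy_ev * EV)
    return wavelength_m * 1e9

print(f"{lyman_limit_nm():.1f} nm")  # roughly 91 nm, extreme ultraviolet
```

Only very hot stars radiate strongly at such short wavelengths, which is why the first hot stars were able to drive cosmic reionisation.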
James Webb telescope The James Webb telescope, currently scheduled for launch in March 2021, will allow astronomers to look further back than is currently possible and possibly observe `First Light’ directly, an enormously exciting prospect. Competition for observing time with the telescope will be intense, so Professor Ellis and his colleagues are already making preparations. “We’re reaching out to collaborators, so that we improve our chances of getting observing time,” he explains. The James Webb telescope could also play an important role in other areas of research, such as the question of how quasars form. “As material falls into a black hole, it accelerates as it enters, and it produces radiation that is not related to starlight. Although these extraordinarily luminous quasars have been known about since the 1960s, we now know
Star forming regions (yellow) produce ionising radiation but only a fraction of it reaches outer space due to re-absorption and scattering within each galaxy. Determining this fraction is a major challenge being addressed by the team.
that quasars existed right back when the universe was about a billion years old, so about 8 percent of its present age,” continues Professor Ellis. “So an obvious question is: how did these quasars form?” The answer must be that the black holes they contain grew over time, and there must therefore be black holes in the early universe as well. This was the subject of a recent paper led by Dr Laporte. “We published a paper looking at the spectra of several of the most massive galaxies in the reionisation era. And indeed we found evidence that they may contain black holes,” outlines Professor Ellis. This is an important consideration in terms of the future operation of the James Webb telescope. “When James Webb is launched, we should be open to the idea that we have another problem to solve, and that is: where do these quasars come from? And how fast did these black holes grow over time?” says Professor Ellis.
Richard Ellis is Professor of Astrophysics at UCL with previous professorial positions at Durham, Cambridge and Caltech. He studies dark matter, the cosmic expansion and the first galaxies. His awards include the Gruber Cosmology and Breakthrough Foundation Prizes and the Gold Medal of the Royal Astronomical Society. He is a Fellow of the Royal Society and the Australian Academy of Sciences.
New light on galaxy formation The epoch of reionization began roughly 200 million years after the Big Bang, as photons from the first stars broke up hydrogen atoms into their constituent protons and electrons. Researchers in the Delphi project are investigating the relationship between galaxy formation and reionization, while also addressing other major questions in physical cosmology, as Dr Pratika Dayal explains.
According to our current understanding, immediately after its inception in the Big Bang, the Universe underwent a period of accelerated expansion (“inflation”), after which it cooled adiabatically. Roughly 400,000 years later, the Universe for the first time became cool enough for electrons and protons to recombine into hydrogen and helium (“recombination”); at this point matter and radiation also decoupled (“decoupling”), giving rise to the Cosmic Microwave Background (CMB; shown by the coloured dots). This was followed by the cosmic “Dark Ages”, when no significant radiation sources existed. These cosmic dark ages ended with the formation of the first stars a few hundred million years after the Big Bang. These first stars produced the first photons that could reionize hydrogen into electrons and protons, starting the “Epoch of cosmic Reionization”, which had three main stages: the “pre-overlap phase”, when each source produced an ionized region around itself; the “overlap phase”, when nearby ionized regions started overlapping; and the “post-overlap phase”, when all of the hydrogen in the Universe was effectively completely ionized.
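The ages attached to these epochs follow from integrating the expansion history. A rough numerical sketch under a simple flat matter-plus-dark-energy cosmology – the parameter values below are commonly used figures adopted here for illustration, not the project’s own – converts a redshift into a cosmic age:

```python
import math

# Flat matter + dark-energy cosmology; illustrative parameter values.
H0 = 67.7 * 1000 / 3.086e22   # Hubble constant, km/s/Mpc converted to 1/s
OMEGA_M, OMEGA_L = 0.31, 0.69
GYR = 3.156e16                # seconds in a billion years

def age_at_redshift(z, steps=100_000, z_max=10_000.0):
    """Cosmic age (Gyr) at redshift z, from t = integral dz' / ((1+z') H(z'))."""
    total, dz = 0.0, (z_max - z) / steps
    for i in range(steps):
        zp = z + (i + 0.5) * dz   # midpoint rule
        H = H0 * math.sqrt(OMEGA_M * (1.0 + zp) ** 3 + OMEGA_L)
        total += dz / ((1.0 + zp) * H)
    return total / GYR

print(f"z = 9.1: {age_at_redshift(9.1):.2f} Gyr after the Big Bang")  # ~0.5 Gyr
print(f"z = 15:  {age_at_redshift(15):.2f} Gyr after the Big Bang")   # ~0.27 Gyr
```

This is how a measured redshift of around 9 translates into a galaxy seen some 500 million years after the Big Bang, and a redshift of around 15 into the few-hundred-million-year epoch of the first stars.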
The first stars are thought to have formed around 200 million years after the Big Bang, following the end of the cosmic dark ages. The first galaxies produced photons which led to the start of the epoch of cosmic reionization, when hydrogen was broken apart into protons and electrons, marking an important point in the evolution of the universe. “As the universe was re-ionized, it also heated up. When reionization started, the universe had an ambient average temperature of around 60 kelvin. But in regions which were reionized, the temperature rose to up to 20,000 kelvin,” explains Dr Pratika Dayal. Based at the University of Groningen in the Netherlands, Dr Dayal is the Principal Investigator of the Delphi project, an ERC-backed initiative developing a model to investigate the relationship between cosmic reionization and the rate of galaxy formation. “If there are small or low-mass galaxies inside these ionized regions, their gas may be boiled out. These low-mass galaxies are the most numerous in the universe, so we think in general that they were one of the key drivers of reionization,” she continues.
These tiny galaxies which formed in the first billion years of cosmic history are the key building blocks of all structure, and essentially the major sources of reionization. The question of how these tiny galaxies evolved into the galaxies that we see today is another major area of interest to Dr Dayal. “Galaxies form hierarchically, and over time smaller galaxies merged to form larger and larger systems,” she outlines. The third question being addressed in the project surrounds the nature of dark matter, which is thought to account for the majority of matter in the universe. “Everything done in this area currently is based on the cold dark matter paradigm (CDM), where dark matter is very cold and made up of heavy particles. This is an assumption though – we’re not sure whether dark matter is cold, or if it’s actually warm,” explains Dr Dayal. “If these galaxies that formed in the first billion years of cosmic history are the key building blocks of all structure and the key sources of reionization, then changing the kind of dark matter that they are embedded in would probably change the formation of structure and the history of reionization.”
Cosmic reionization The processes of galaxy formation and reionization were closely inter-linked during the epoch of reionization, with galaxies forming and in turn driving reionization. However, reionization may also have hindered star formation to some degree. “We think that when galaxies formed, they reionized the surrounding space. That could essentially have stopped some or all of the star formation in the reionized space,” outlines Dr Dayal. The epoch of reionization ended when all the hydrogen had been reionized; now Dr Dayal and her colleagues are looking back into cosmic history to learn more about how reionization affected galaxy formation. “We have data from a number of telescopes, for example the Hubble Space Telescope, which allows us to look at these low-mass galaxies in the first billion years of cosmic history,” she says. “Mapping out the epoch of reionization is one of the key aims of cutting-edge facilities including LOFAR (Low Frequency Array), MWA (Murchison Widefield Array) and the forthcoming SKA (Square Kilometre
Array). All of these aim to map the presence, and evolution, of neutral hydrogen in the first billion years of cosmic history.” This work involves analysing the 21 centimetre emission, from which researchers can look to assess how much hydrogen was left neutral during that period, indicating the extent to which reionization had proceeded. The wavelength from the 21 centimetre emission enables scientists to identify the redshift – or the era – which this gas relates to, with Dr Dayal focusing on the epoch of reionization. “What we expect to see is that at the beginning of the epoch of reionization, all of the hydrogen should basically be neutral. So we should see this 21 centimetre emission from everywhere in the universe,” she outlines. However, this changes as galaxies start to ionize the surrounding hydrogen. “As galaxies ionize the surrounding hydrogen, we start to see little holes appearing in this 21 centimetre emission. These holes grow over time, until the 21 centimetre signal disappears,” explains Dr Dayal. “This data will come on-line over the next 5-7 years from the SKA, which is going to be built in South Africa and Australia.” The research community looking at this 21 centimetre signal from the epoch of reionization has historically been separate from that looking at galaxy formation. However, these two topics are actually quite closely
related, says Dr Dayal, who is now looking to enable deeper collaboration. “We are trying to build a framework to relate the 21 centimetre emission to the underlying galaxy population,” she says. The project is making full use of the latest cosmological simulations to investigate the relationship between reionization and the rate of galaxy formation. “We need extremely large simulations for this, while we also need a very high resolution so that we can look at
the tiny galaxies which provided most of the photons for reionization,” continues Dr Dayal. “We’re running high resolution simulations, over a very large volume. The novel point about our approach is that we are combining dark matter simulations with galaxy formation models, which will be coupled with a full radiative transfer code which can actually follow the process of reionization. Dr Anne Hutter, a postdoctoral fellow in the Delphi team, is the driving force behind this numerically intensive project.”
CDM framework A further important dimension of Dr Dayal’s work centres around investigating the nature of dark matter, which is thought to comprise around 80 percent of the matter in the universe. The CDM framework has proved fairly accurate in predicting the large-scale structure of the universe, yet Dr Dayal says a number of problems become apparent at smaller scales, one of which is the so-called
missing satellite problem. “In the CDM framework, dark matter can form structure over all scales. Conventionally, simulations of cold dark matter have predicted the presence of thousands of low-mass satellites of the Milky Way. But then when we go to look for these satellites, we find only around 20,” she explains. Changing the nature of dark matter itself could lead to new insights in this respect. “What if you change the kind of dark matter? What kind of constraints are current or future
The Square Kilometre Array (SKA) project is an international effort to build the world’s largest radio telescope, with eventually over a square kilometre of collecting area. Delphi aims at building a framework to combine galaxy data, obtained from facilities including the JWST and E-ELT, with the 21cm data that will be provided by the SKA to build a complete picture that links the progress of reionization to the underlying sources driving the process.
DELPHI DELPHI: a framework to study Dark Matter and the emergence of galaxies in the epoch of reionization
To answer three crucial questions in Astrophysics and Cosmology: (i) how did the interlinked processes of galaxy formation and reionization drive each other?; (ii) what were the physical properties of early galaxies and how have they evolved through time to give rise to the galaxy properties we see today?; and (iii) what is the nature (mass) of the mysterious Dark Matter that makes up 80% of the matter content in the Universe?
H2020 ERC starting grant (agreement number 717001).
Rendition of the ELT and its enclosing structure or dome. Here, it is shown with respect to the London Eye; the small dots on the ground next to the telescope are cars. The overall structure will be huge – about the same size as a football stadium! The 39-metre primary mirror of the ELT will consist of almost 800 hexagonal segments, each 1.4 metres wide and only 50 mm thick.
observations able to give us?” asks Dr Dayal. “We’re carrying out a lot of calculations to link those properties to things that we can actually measure.” This could mean looking at the metallicity of galaxies, to take one example. Currently the metal content of galaxies can be mapped to a redshift of 3.5, covering all but the first 2 billion years of cosmic history; the aim is to develop a model that provides a baseline for comparison. “We want to be able to ask: what was the metal content of early galaxies in different cosmologies? How did it evolve over time in different dark matter models? We’re trying to figure out the properties of early galaxies – and how they evolved into the systems which we can see at later times in a range of dark matter models,” outlines Dr Dayal. The model itself is being built up in steps, and although the project still has around three years to run, Dr Dayal says significant progress has already been made. “We have completed our dark matter simulation; it took a couple of months at the Leibniz supercomputing facility in Munich. Now we’re coupling it with our galaxy formation model. We’ll then couple this fully and self-consistently with a semi-numerical reionization model developed by Dr. Hutter, before we carry out a full radiative transfer calculation for reionization,” she continues.
maximise the scientific value of the data that it generates. “We’re trying to relate the 21 centimetre signal that we get from neutral hydrogen with the underlying galaxy population. The question is; how can we optimise this correlation with the SKA? Should we survey large patches of the sky with a low resolution, or should we survey small patches with a high resolution?” she outlines. Several other major observational facilities are currently under development, including the James Webb Telescope and the European Extremely Large Telescope (E-ELT); Dr Dayal aims to encourage collaboration between them. “We are trying to combine these different data sets and get these different facilities talking to each other. From this, we can then look to work out whether it is best to have synergy between the SKA and the E-ELT or JWST for example, or say the SKA and the existing Subaru telescope,” she outlines.
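The observing frequencies at stake here follow from redshifting the hydrogen line: the 21 centimetre transition has a rest frequency of about 1420 MHz, so emission from the epoch of reionization arrives at a few hundred megahertz or below. A minimal sketch:

```python
# Neutral hydrogen emits at a rest frequency of about 1420.4 MHz (the
# 21 cm line); cosmic expansion lowers this by a factor (1 + z).
REST_FREQ_MHZ = 1420.4

def observed_21cm_frequency(z):
    """Observed frequency (MHz) of 21 cm emission from redshift z."""
    return REST_FREQ_MHZ / (1.0 + z)

# During the epoch of reionization (roughly z ~ 6-10) the signal lands
# in the low-frequency band that LOFAR, the MWA and the SKA target:
for z in (6, 8, 10):
    print(f"z = {z:>2}: {observed_21cm_frequency(z):.0f} MHz")
```

Because each observed frequency maps to a single redshift, scanning in frequency effectively slices the neutral hydrogen into a sequence of cosmic epochs.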
• Professor Stefan Gottloeber, Leibniz Institute for Astrophysics, Potsdam, Germany • Professor Gustavo Yepes, Universidad Autónoma de Madrid, Spain
Pratika Dayal Principal Investigator Faculty of Science and Engineering Astronomy — Kapteyn Astronomical Institute Landleven 12 9747 AD Groningen The Netherlands T: +31 50 363 4089 E: firstname.lastname@example.org W: https://pratika24.wixsite.com/delphi
Dr Pratika Dayal
Dr Pratika Dayal is an Assistant Professor and Rosalind Franklin Fellow at the Kapteyn Astronomical Institute, part of the University of Groningen. She became an ERC Starting Grant laureate in 2016, and was awarded the 2017 Young Scientist Medal in Astrophysics by the International Union of Pure and Applied Physics.
Square Kilometre Array The results of this work could also help inform the operation of future observational facilities, such as the SKA. As co-lead of the SKA epoch of reionization synergy group, Dr Dayal aims to help
Artist’s view of the JWST, which will be located at the Lagrange point L2, 1.5 million km from the Earth in the direction opposite the Sun; the JWST will make it possible to probe the very young Universe. (Credit: NASA/ESA)
Globalization from below Migrants from South Asia contributed to building East Africa’s infrastructure and administrative apparatus during the second half of the 19th century, and they continued to play an important role throughout the 20th century. We spoke to Dr Margret Frenz about her research into circular migration between South Asia and Africa, work in which she combines historical and anthropological methods. There is a long history of migration between South Asia and Africa, but interactions across the Indian Ocean intensified around the latter part of the 19th century as colonial powers increasingly sought control. This is a topic of great interest to Dr Margret Frenz, who aims to build a deeper understanding of migratory movements across the Indian Ocean between 1850 and 2000 in the GloBe project. “The idea is to compare different types of migration movements. Some South Asians came voluntarily as traders, while others came as indentured or contract labourers, for example to build the Uganda Railway from Mombasa to Kampala,” she outlines. Others may have chosen to move to Africa for better work prospects. “A group of people were recruited to work in the colonial administration. I aim to compare types of movement, and people who came from different areas of India, and went
to different areas in Eastern and Southern Africa,” says Dr Frenz. The main focus of Dr Frenz’s attention is migrants who travelled to Kenya, Tanzania, Uganda, Mozambique and South Africa, countries with natural resources that European colonizers were keen to exploit. The major European colonial powers, including Great Britain, France, Germany and Portugal, effectively divided up Africa between themselves at the Berlin Conference in 1884-85, formalizing the ‘scramble for Africa’, which led to an increased demand for labour in these countries. “The colonial states needed people to run their administrations,” says Dr Frenz. These migrants have been characterized as the backbone of the colonial administrations, and they put in place the foundations of the globalized economy. “Through maintaining relationships, through the production of social
and cultural space, they enabled globalization from below,” points out Dr Frenz. “Without those people, none of the European empires could have built up their administrations and acted on a global level.” As a global historian, Dr Frenz now investigates the historical trajectory of South Asian migrants to Africa. This work involves delving into national archives across three continents to reconstruct a deeper picture of how migration affects families. “The official archives mostly offer the government perspective on migration. They include government reports and census documents, as well as information on laws, regulations and perhaps court cases,” she outlines. The European powers ran different countries in the region; Great Britain was able to recruit people from India, the ‘jewel in the crown’, to work and live in its African colonies.
“Tanganyika (modern day Tanzania) was part of German East Africa until 1918 and then became a British mandate, Mozambique was under Portuguese rule, while Kenya and Uganda were British,” says Dr Frenz. “The British recruited Indians actively, as India was also under the British crown. So it made sense for them to actually recruit people from India to East Africa.” This analysis of archival sources provides official or top-down views of migration. It is much more difficult to get material on outlooks of individuals, or bottom-up perspectives. Diaries, autobiographies and other publications can provide valuable insights in this respect. “I combine these multifarious types of sources to get an insight into the story from different angles,” explains Dr Frenz. In addition, Dr Frenz also conducts oral history interviews with both recent migrants and the descendants of migrants, to probe personal memories and investigate the importance of family heritage to individual identity. “After their family has lived in a destination country for five or six generations, individuals don’t see themselves as migrants, they are at home where they live,” says Dr Frenz. “Frequently, there’s a distinction
between an individual’s social or cultural self-perception, and the pragmatic decisions they need to make to be successful.” Migrants face many challenges when they arrive in a new country, not least adjusting culturally, socially and economically. This is a central part of Dr Frenz’s research. “It’s important to understand how individuals actually made a new life,” she says. In her research, Dr Frenz investigates how they established themselves not only economically, but also socially and culturally in wider society. “I highlight how they founded a church, temple or mosque for example, or sports or cultural clubs, in their new home,” she outlines. “It’s crucial to be aware that society was tightly regulated in colonial times, particularly in British East Africa. Administrative structures and neighbourhoods were arranged along racial lines.”
My goal is to highlight the history of south-south connections, the workings of ‘globalization from below’, and to reach a fuller understanding of today’s world, which is characterized by a culture of migration.
A level of cultural exchange was, however, established with these migratory movements, which to some degree helped plant the seeds of independence movements. Many people moved between East Africa and India in the 1920s, 30s and 40s, not only between colonies within the British Empire, but also across empires. “Goa was a Portuguese colony for instance, as was Mozambique. In both, there were discussions on how countries could gain independence,” says Dr Frenz. India and Pakistan achieved independence in 1947, which sent a signal to colonized countries that it was possible to end colonial rule. It took another two to three decades, however, for East African countries to become independent nation-states; Mozambique, for example, could not celebrate independence until 1975. “There was a lot of pressure from Frelimo
GLOBALIZATION FROM BELOW Circular Migration between South Asia and Africa, c. 1850-2000
This project is path-breaking, because it conceptualizes global history as a connected history, taking into account experiences, trajectories, and perspectives of different social actors from the ‘periphery’. GloBe investigates South Asian migrants to Africa, their historical trajectories, the continuities and transformations of their movements, as well as similarities in and differences between their migrations to different parts of Africa. It differentiates the various types of migration in order to create a novel understanding of circular migration movements between South Asia and Africa.
Heisenberg Position plus research budget (called ‘Sachbeihilfe’), both funded by the DFG (German Research Foundation).
Dr Margret Frenz Historisches Institut Universität Stuttgart Keplerstr. 17 70174 Stuttgart T: +49-711-68583847 F: +49-711-68573847 E: email@example.com W: http://www.margretfrenz.de/Research W: http://www.uni-stuttgart.de/hi/globe/ forschung/index.en.html
Dr Margret Frenz
Margret Frenz holds a Heisenberg Position at the University of Stuttgart. Her research interests are in connected and comparative histories of South Asia, the Indian Ocean, East Africa, and Europe between the 18th and 20th centuries. Her publications include Community, Memory, and Migration in a Globalizing World. The Goan Experience, c. 1890-1980 (OUP, 2014); From Contact to Conquest: Transition to British Rule in Malabar, 1790-1805 (OUP, 2003), and (edited with James Belich, John Darwin, and Chris Wickham) The Prospect of Global History (OUP, 2016).
for Mozambique to liberate itself from Portuguese rule after the Estado Novo broke down,” outlines Dr Frenz. “In all East African countries and South Africa, Indians were part of the independence movements.” Migration between the two regions continued during the period in which colonial regions and newly established nation-states existed simultaneously, and the number of South Asians in East Africa is thought to have peaked in the 1960s. Subsequent political developments led many to leave, with the decision of some new governments to nationalize property, for example. “Many Tanzanians, Ugandans, Kenyans, Mozambicans and South Africans of Indian descent didn’t see how they could survive economically if they stayed, and so decided to move on,” outlines Dr Frenz. “Seismic events such as the revolution in Zanzibar in 1964 or the expulsion of South Asians from Uganda in 1972 affected migration patterns, and in some cases they resolved to set off to new shores. Some went back to South Asia, others went on to Canada or the UK, while today Australia seems to be the most attractive destination,” continues Dr Frenz. The situation has since changed again, and in the 1990s President Yoweri Museveni invited Asians to return to Uganda, and a ‘new’ community has been established in the country. This is a good example of migration between countries in the southern hemisphere, a topic which has yet to receive adequate scholarly attention. “My work demonstrates that south-south connections have been very significant, and they have often been overlooked. A lot of researchers have focused on looking at movements between the global north and the global south, in both directions,” she says. There is a high level of migration nowadays from
Norman Godinho School, Kampala © MF
the global south to the global north, yet Dr Frenz says it’s important to consider migration between countries in the southern hemisphere as well. “We need to look at movements between countries of the global south, to understand what these south-south movements meant then, and what they could mean today,” she stresses. This forms a central part of Dr Frenz’s research agenda. The next step within the project will be to go to relevant archives across the world and bring the material from different sources together to create a novel understanding of migration, which will be a demanding task. “My methodology is multi-sited archival and field work, carried out across three continents. So I’m integrating complementary analytical methods,” says Dr Frenz. This covers data and documents on migratory patterns between several countries. “Working on the history of numerous countries is more time-intensive than if you go to just one place. The next step will be to write and disseminate findings through workshops and presentations,” outlines Dr Frenz. “My goal is to highlight the history of south-south connections, analyse the workings of ‘globalization from below’, and reflect on narratives, perceptions, and memories of multiple migrants to reach a fuller understanding of today’s world, which is characterized by a culture of migration.”
Retailshop Souza Figueiredo & Co. © private
For more information, please visit: www.euresearcher.com
The philosophy of imitation in focus
We might not notice it, but a lot of human behaviour is acquired through imitation: from language, to cultural habits, to aesthetic and political tastes. This human propensity to imitate is central to the arts but also shapes subjectivity, culture and politics, as Dr Nidesh Lawtoo, Principal Investigator of the ERC-funded project, Homo Mimeticus (HOM), explains.
The concept of ‘mimesis’ emerged at the dawn of philosophy with Plato and Aristotle, yet it escapes unitary definitions. “The dominant tendency has been to translate it in terms of visual or aesthetic representation,” says Dr. Lawtoo. “So, for example, we can think of a realist painting or novel or a photograph that represents or copies reality, as a mirror does. However, there is much more to say about the human tendency to imitate, mimic and identify with others.” Based at the University of Leuven in Belgium, Dr. Lawtoo and his team in the HOM project are working to explore the contemporary relevance of mimesis thanks to funding from the European Research Council. This work brings together researchers from disciplines as diverse as philosophy, literary theory, musicology and film studies, as Dr. Lawtoo and his colleagues aim to study Homo mimeticus from multiple perspectives.
Mimesis
The HOM project shifts thinking about the concept of mimesis toward more embodied,
Homo Mimeticus: Theory and Criticism
This project has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement n°716181).
Nidesh Lawtoo, Assistant Professor Institute of Philosophy / Faculty of Arts KU Leuven, Kardinaal Mercierplein 2-box 3200 3000 Leuven, Belgium E: firstname.lastname@example.org W: http://www.homomimeticus.eu/ Twitter: https://twitter.com/HOM_Project?lang=en Facebook: https://www.facebook.com/HOMprojectERC/
Nidesh Lawtoo is currently Assistant Professor of Philosophy and English at KU Leuven and adjunct director of the MDRN research center. Prior to joining KU Leuven, he held positions at the University of Lausanne, Johns Hopkins University, and the University of Bern. He is the author of The Phantom of the Ego (2013), Conrad’s Shadow (2016), and (New) Fascism (2019).
performative, and behavioural forms of imitation. “The models that surround us, be they real or fictional, true or simulated, good or bad, have formative and performative effects that inform what we think, how we act, and ultimately transform who we are. This is not only a scholarly problem; it’s an all too human problem that the humanities should urgently address,” says Dr Lawtoo. Crowd behaviour offers a clear example of the urgent political implications of imitation. A leader’s slogan can be repeated by a crowd of supporters, indicating that a form of imitation or reproduction is taking place. But Dr. Lawtoo adds that “the power of mimesis goes much deeper – it’s not simply the slogan that is
Alongside looking at the political and philosophical implications of imitation, HOM team members also pay attention to forms of affective participation in the theatre (Niki Hadikoesoemo), opera (Dr. Daniel Villegas Velez) and film (Dr. Lawtoo). “The arts have a lot to teach us about imitation,” says Dr. Lawtoo. “And since mimesis manifests itself via different aesthetic media, within the HOM team we adopt a trans-disciplinary approach,” a method in line with both the ERC mission and KU Leuven’s promotion of cross-disciplinary collaborations to address contemporary problems that do not fit within neat disciplinary boundaries. By considering different perspectives on mimesis, Nidesh Lawtoo and his team members
reproduced; it’s the idea behind it and the emotions it expresses – from fear to anger – that spread contagiously.” New media like Twitter and Facebook play a key role in generating phenomena of mimetic contagion, something Lawtoo addresses in (New) Fascism: Contagion, Community, Myth (Michigan State UP, 2019). But the focus of the project is not only political. This diagnostic of Homo mimeticus has deep philosophical implications, for it invites academics and citizens to reassess the idea that we are primarily rational creatures, or Homo sapiens. “As humans we have knowledge, reasoning and rationality, for sure, but we should not forget that we are also irrational, embodied and emotionally volatile creatures,” Dr. Lawtoo stresses. This is again an ancient philosophical lesson, but each generation has to rethink the problem in the light of contemporary challenges. The proliferation of violence via (new) media like film, the Internet and computer games, for instance, calls for new diagnostic investigations that revisit the ancient debate between katharsis and contagion in a contemporary key.
contribute to the contemporary revival of interest in mimetic behaviour with a series of articles and monographs on topics that go from (new) fascism to violence, (neo)baroque opera to theatricality, mirror neurons to science fiction films, available at www.homomimeticus.eu. To make research on mimesis accessible to a broader public, the project has also started a series of video-interviews with influential international thinkers on mimesis, in fields like political theory, literary theory, anthropology, and philosophy, among others (see https://www.youtube.com/channel/UCJQy0y0qCxzP4QImG2YWqpw?view_as=subscriber). Humans imitate in different ways, consciously and unconsciously, actively and passively, and the effects can be good and bad. Dr Lawtoo and his team are building on a long tradition in philosophy and the arts to sketch a double-faced picture. “It’s a question of discerning between different forms of mimesis and developing what I call a diagnostic or genealogical approach. In the way that a doctor cures the body, philosophers and critics can help diagnose the good and bad forms of imitation that we see massively at play today,” he says.
Looking to the future of the judiciary
An ever greater number of both civil and criminal cases do not reach the trial stage, with conflict resolution methods increasingly used to build a consensus between parties and reach a settlement. We spoke to Professor Michal Alberstein of Bar-Ilan University in Israel about the JCR project’s work in investigating the changing role of the judiciary in an age of vanishing trials.
An increasing number
of civil and criminal cases are resolved outside the courtroom, with 97 percent of cases in England and Wales not reaching trial. This phenomenon of ‘vanishing trials’ is a topic of great interest to Professor Michal Alberstein, the Principal Investigator of the Judicial Conflict Resolution (JCR) project, an ERC-backed initiative in which researchers are analysing the changing role of the judiciary, building first on clear data. “We cannot really capture the reality of the changing role of judges in an age of vanishing trials without first of all understanding: to what extent are trials vanishing? And where are judges playing a role?” she outlines. The first stage of the project centred on a quantitative study, in which researchers aimed to assess the extent of the ‘vanishing trials’ phenomenon. “Assuming that 100 cases enter the system, how many of them reach a judge? At what stage does the judge intervene? And what mode of disposition will result from such an intervention?” continues Professor Alberstein. “We’re looking at how cases proceed as they enter the system and move towards a resolution.”
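The funnel Professor Alberstein describes can be illustrated with a small tabulation. The sketch below is purely hypothetical: the stage labels and numbers are invented for illustration (chosen so the toy rate echoes the 97 percent figure quoted for England and Wales), and are not JCR categories or data.

```python
from collections import Counter

# Hypothetical dispositions for 100 cases entering the system.
# Stage labels are illustrative, not JCR research categories.
dispositions = (
    ["settled pre-trial"] * 55
    + ["withdrawn"] * 20
    + ["resolved at preliminary hearing"] * 15
    + ["judgment without full trial"] * 7
    + ["full trial"] * 3
)

counts = Counter(dispositions)
total = len(dispositions)

# Print the funnel, most common disposition first.
for stage, n in counts.most_common():
    print(f"{stage:32s} {n:3d} ({n / total:.0%})")

# Share of cases that never reach a full trial.
vanished = 1 - counts["full trial"] / total
print(f"vanishing-trial rate: {vanished:.0%}")
```

Tracking where each case leaves the funnel, rather than only whether it reached trial, is what lets the study ask at which stage judges intervene.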
Legal systems
This research is focused on three legal systems – Israel, Italy, and England and Wales – each with its own traditions and conventions. The system in England and Wales is traditionally quite adversarial, with legal teams making oral arguments and presenting supporting evidence, while Italy has a different model for civil cases. “Judges in Italy have a different role to their counterparts in England and Israel in civil cases. They sit at the same level as the parties, their office is less formal, and they decide on a case after they have seen the evidence. There is no separate stage of oral presentation, like in the adversarial system,” explains Professor Alberstein. All legal systems represent a fusion of different elements to some extent, as the authorities seek to balance the considerations of transparency, due process and efficiency, yet they are also evolving in line with modern priorities. “In civil cases in England and Israel today, much more control is given to the judge, which is not common in the adversarial system. They have more freedom to manage the case,” outlines Professor Alberstein. “Most processes end during the preliminary stage, which means that trial is rare.”
The result of this shift is that the legal process itself becomes more inquisitorial rather than adversarial, with judges playing a prominent role in assessing the evidence and helping to resolve the conflict. Conflict resolution techniques are already commonly applied in some areas of the law, for example employment or family disputes, but Professor Alberstein says these types of cases are not the focus of attention in the project. “We are looking at more mainstream cases,” she outlines. Judges are involved at various levels across the three legal regimes that the project is examining. “In Israel, judges are very much involved in the pre-trial phase of a civil case, even when they are presiding on it. They have a less significant role in criminal cases, while in England and Italy judges are involved in civil cases, but to a lesser degree than they are in Israel,” explains Professor Alberstein. A judge may seek to build a consensus between the parties, yet with the authority to impose a judgment if necessary. “There’s a certain threat that if one of the parties doesn’t accept the suggestions that are made, then they could be penalised later on. In England there is a formal sanction of cost-shifting
JCR International Research Group tour of Jerusalem and Israeli High Court, 25th November 2018. (Left to Right): Dr Yosef Zohar, Prof. Linda Mulcahy, Dr Edite Ronen, Dr Beatrice Coscos Williams, Dr Hadas Cohen, Adv. Sari Luz-Kanner, Dr Dana Rosen, Prof. Michal Alberstein, Dr Laura Ristori, Prof. Paola Lucarelli, Adv. Elisa Guazessi.
in cases where one of the parties refuses a reasonable settlement. In Israel, more discretion is given to the judge in deciding on cost sanctions. In Italy, the costs are also less explicitly imposed, yet parties may be ordered to attend mediation, and participate significantly in the sessions,” explains Professor Alberstein. “So there are incentives for the parties to reach an agreement. It’s a situation that requires careful ethical consideration, and there are some questions about whether it’s the right way to deal with conflicts.” A criminal case is of course different to a civil case, with different procedures and legal conventions. However, Professor Alberstein and her colleagues have found that in Israel, judges are still more likely to intervene at preliminary stages than was previously the case. “Judges in Israel exercise powers that are similar to inquisitorial judges. Within the preliminary stage they receive all the evidence and they can reach their own assessment of the case,” she says. The judges in Israel assigned for preliminary hearings do not preside on the case if it continues to trial. “In these cases, a form of abbreviated trial is
more the rule,” says Professor Alberstein. “We have found that abbreviated trials, rather than plea bargains, have become the main cause of vanishing trials in inquisitorial countries. The full criminal trial, including full presentation of evidence and the formal procedural stages, is a rare phenomenon across legal cultures.” The more high-profile criminal cases may attract a lot more attention and publicity, with the prosecution and defence presenting different arguments, yet conflict resolution techniques can still be relevant in this type of situation. In Israel, for example, an alternative legal proceeding called criminal mediation can be used, sometimes in parallel to the actual trial, which continues according to the normal rules. “The non-presiding judge attempts to help the parties reach a plea. This is confidential and separate from the actual case,” outlines Professor Alberstein. This may prove to be a more effective way of dealing with complex cases and processing large amounts of information and evidence than a more adversarial model. “These judges don’t act just as mediators, they also know how to manage a trial. This interplay between the work of judges and the work of mediators is sometimes helpful in concluding a case,” continues Professor Alberstein. “In Israel, judges have a bit more scope to offer deals than in England and Wales.”
Conflict resolution
There are many different ways of resolving conflicts, and judges today play an increasingly important role in identifying the appropriate
mechanism for individual cases. A full legal trial may be necessary in some circumstances, but other cases may require mediation, while other techniques are also available to try and reach a conclusion. “Judges can use modified modes of conflict resolution,” says Professor Alberstein. This might mean something as simple as encouraging one of the parties to apologise for a mistake that they acknowledge they made, which can encourage dialogue, although Professor Alberstein says it’s important to manage this type of situation carefully. “It’s about trying to see how the parties respond to it, whether it gives them any incentive to move forward,” she explains. “Judges need to know how to manage their emotions, to regulate the courtroom effectively and to constructively engage with the conflict. These are the soft skills that we want to help develop, that we will translate into training scenarios.” This is part of a wider shift in the role of judges as some countries look to modernise their justice systems, with the Briggs Report in the UK, for example, recommending the increased use of online courts for certain types of cases. Judges will still have an important role to play in the justice system as technology advances, yet they will need a wider range of skills. “Judges will need advanced conflict resolution skills,” stresses Professor Alberstein. The way in which these skills can be applied may vary between different countries and legal cultures. “Legal culture in Israel has gone through some decline of formalism in the past decades and judicial discretion, including the use of policies and principles, is considered broad and significant, whereas England and
Wales has a more formal culture,” continues Professor Alberstein. In Italy, where the system is based on a code, the idea is to apply the law, so the judge does not have a high degree of discretion. “Judges apply the law as they read it from the codex,” explains Professor Alberstein. “We found that there is some correlation between perceiving the law as an open texture and applying more conflict resolution tools. So as a general point the less formal we are – both in terms of procedure and legal rules – the more discretion the judge will have to include more conflict resolution techniques.” The phenomenon of vanishing trials is not limited to the three countries covered by the project, with pre-trial settlement increasingly common across the world. Given this backdrop, Professor Alberstein believes it’s important to help the lawyers and judges of the future acquire not only legal knowledge, but also conflict resolution skills. “We need to train lawyers and legal professionals in negotiation skills. We should also teach judges how to do settlement work, with this new combination of conflict resolution and legal skills,” she says. Alongside contributing to the literature, Professor Alberstein also plans to publicise her research and to participate in the wider debate around the evolving role of the judiciary. “I intend to continue writing about this phenomenon of vanishing trials, to examine it in different contexts, and to see whether we can really establish a broader perspective on the law,” she continues. “We want to look towards addressing disputes and conflicts in a more holistic and relational way.”
Dr Ayelet Sela, Research fellow in the JCR Project at the International Conference held at Bar-Ilan University, 26th November 2018.
JCR Judicial Conflict Resolution: Examining Hybrids of Non-adversarial Justice
The Judicial Conflict Resolution (JCR) project explores the changing roles of judges in the era of “vanishing trials”, wherein settlements and plea bargaining far outnumber full and final verdicts. The five-year long comparative study is taking place in three countries: Israel (project headquarters), England & Wales, and Italy. The activities include theoretical and regulatory legal research, as well as quantitative and qualitative empirical research.
Total EU funding: 1,272,534 euros
We are official partners with the London School of Economics, where our partner is Professor Linda Mulcahy. We also have an unofficial connection with the University of Firenze and Professor Paola Lucarelli. They are third-party contractors.
Project Coordinator, Professor Michal Alberstein Faculty of Law Head, ERC Research Team on Judicial Conflict Resolution (JCR) Bar-Ilan University Ramat-Gan Israel T: +972 03-5317098 E: Michal.Alberstein@biu.ac.il W: www.jcrlab.com W: https://www.youtube.com/playlist?list=PLXF_IJaFk-9C5MN9cT2CJa3hKW28fQSXX W: http://ssrn.com/author=111666
Prof. Michal Alberstein
Michal Alberstein is a professor at the Faculty of Law, Bar-Ilan University, Israel, where she teaches jurisprudence and conflict resolution. She is the academic director of eight legal clinics at the Bar-Ilan Faculty of Law. She is also the academic chairperson of the “Israeli Hope” project, supported by the President of Israel and the Council for Higher Education. Her current research includes theories of law and conflict resolution and their intellectual roots, as well as representations of conflict resolution in literature and film.
Chronologicon Hibernicum: in search of Old Irish
The Irish language changed considerably during the medieval period, with morphological and phonological shifts marking the transition between Old Irish and Middle Irish. The ChronHib project aims to build a deeper picture of the Irish language in the 7th to 10th centuries, opening up new insights into medieval culture and language, as Professor David Stifter explains.
The early Middle Ages were a productive period in Ireland’s literary history, yet in many cases the authorship of texts in the Irish language from the time, and the circumstances of their composition, are not clear. This leaves gaps in our understanding of the nation’s cultural history and how the Irish language evolved over time, something which Professor David Stifter and his colleagues in the ChronHib project are working to address. “Our main interest is in very fine changes from the older stages of the language to the more recent stages of the language,” he outlines. The project is looking at the period between roughly the 7th and 10th centuries, which covers the transition from Old Irish to what is known as Middle Irish. “The usual definition of Old Irish is that it dates from the 8th-9th century. The 10th century is thought of as the start of the Middle Irish era,” explains Professor Stifter. There is a vast amount of literature from this period across various different genres, including poetry, sagas and narratives, yet in most cases it exists in the form of much later manuscripts copied by scholars in subsequent centuries who adapted the language to the later period. This is a challenge in terms of the project’s goal of building a clearer picture of the language in the 8th and 9th centuries. “There are many changes in the grammatical forms of words and the way the text is organised over time,” says Professor Stifter. There is nevertheless some undisturbed Old Irish material that survives in contemporaneous codices from the 8th and 9th centuries, mainly in the form of glosses, a kind of marginal or inter-linear note. “There are three key manuscripts from the period and many smaller ones. The content of these manuscripts was mainly written in Latin, but the glosses were very often written in Irish, for instance to explain difficult Latin passages,” says Professor Stifter.
The researchers in the ChronHib project are developing computational tools and statistical methods to analyse these glosses and gain new insights into how the language changed over the period. While there is an abundance of information available on modern languages like
English, material on Old Irish is much scarcer, so Professor Stifter says innovative methods are required. “We need to develop methods to ensure our analysis is as comprehensive as possible. We want to draw as much information as possible out of the limited data available,” he explains. This approach could hold wider relevance to the linguistics field, beyond the specific case of Old Irish. “This could be a model for the analysis of other languages – how to analyse and present data on ancient or medieval languages that are no longer spoken in that form,” outlines Professor Stifter.
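One simple way to draw chronological signal out of sparse material is to compare the relative frequency of an older and a newer form across sources of roughly known date. The snippet below is a hypothetical sketch of that general idea, not ChronHib’s actual methodology: the attestations, dates and the “old”/“new” labels are invented for illustration.

```python
# Hypothetical attestations: (approximate date of source, variant observed).
# "old" / "new" stand in for an Old Irish form and its Middle Irish successor.
attestations = [
    (750, "old"), (760, "old"), (780, "old"), (800, "old"),
    (820, "old"), (840, "new"), (860, "old"), (880, "new"),
    (900, "new"), (920, "new"), (940, "new"), (960, "new"),
]

def new_form_share(records, start, end):
    """Proportion of 'new' variants among attestations dated within [start, end)."""
    window = [variant for (year, variant) in records if start <= year < end]
    if not window:
        return None  # no data in this window
    return sum(variant == "new" for variant in window) / len(window)

# Track the change century by century.
for start in (700, 800, 900):
    share = new_form_share(attestations, start, start + 100)
    if share is not None:
        print(f"{start}s: new form in {share:.0%} of attestations")
```

With real glosses the dating is far less precise than this toy example suggests, which is why the annals, tied to specific years, are such a valuable complement.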
Our main interest is in very fine changes from the older stage of the language to the more recent stage of the language. We want to get a picture of what the Irish language looked like in the 8th or 9th century.
Morphological and phonological changes
Morphological and phonological changes are among the most important topics for Professor Stifter and his colleagues in order to assess how the Irish language evolved over the period. “Phonology means what a word sounds like – which is reflected through the orthography. Of course we don’t have tape recordings from that period, so we have to look at the spelling to assess how it would sound,” he outlines. The morphology of words – their form – is another major topic of interest in the project, with researchers looking at Old Irish verbs. “The verb system of Old Irish is extremely complicated and encodes a wide range of semantic and formal categories and dimensions within a single word. There are examples in which it takes over ten words to express something in English that can be expressed by a single word in Old Irish,” continues Professor Stifter. A lot of attention in the project is focused on morphological analysis of these verbs, with researchers creating minutely annotated databases to build a chronological framework of linguistic changes. The wider aim here is to build a reliable resource and reference point
The ChronHib team demonstrating their databases at the 10th Celtic Linguistics Conference 2018.
for linguistic dating, potentially allowing researchers to identify the period that a specific text dates from through the language used, yet Professor Stifter says there are some significant hurdles to overcome in this respect. “A lot of the information included in the Old Irish form needs to be analysed, characterised, annotated, and re-worded on this database,” he explains. The challenge here is in representing this information as clearly as possible. “These morphological changes in the language over time are of great interest to us,” says Professor Stifter. Historical writing from the period is also invaluable for gaining insight into linguistic changes over time. There are annual records from the early medieval period – annals – which briefly record the events that occurred in that specific year. “Over many centuries, a lot of text accumulated,” points out Professor Stifter. In principle, these records reflect changes in the language over time. “The annals are clearly associated with a year, so you have an idea of what the language may have looked like at that time,” says Professor Stifter. “If we compare the form in two years and see that there is a difference, then we can take that as an indication that a change is taking place. The dates are fairly close in the case of annals, as they are really associated with a specific year, which is not the case with the glosses.”
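The annal-based reasoning Professor Stifter describes can be sketched computationally: given securely dated attestations of an older and a newer variant of the same form, a rising share of the new variant across time windows indicates the period in which the change was under way. The following is a minimal illustration in Python; the variant labels and years are invented for the example, not drawn from the ChronHib corpus:

```python
# Toy illustration: detecting a linguistic change from dated annal entries.
# Each record is (year, variant), where "old" / "new" stand in for
# competing spellings or grammatical forms of the same word.
attestations = [
    (700, "old"), (710, "old"), (725, "old"), (740, "old"),
    (755, "old"), (760, "new"), (775, "old"), (790, "new"),
    (805, "new"), (820, "new"), (840, "new"), (860, "new"),
]

def share_of_new_variant(records, start, end):
    """Proportion of 'new' forms among attestations dated start..end."""
    window = [v for (y, v) in records if start <= y <= end]
    return sum(1 for v in window if v == "new") / len(window)

# Compare fifty-year windows: the change takes hold where the
# share of the new variant climbs from 0% towards 100%.
for start in (700, 750, 800):
    share = share_of_new_variant(attestations, start, start + 49)
    print(f"{start}-{start + 49}: {share:.0%} new forms")
```

On this invented data the new variant rises from absent (700–749) to dominant (800–849), placing the change in the mid-8th to early-9th century, which is the kind of inference the annals make possible because each entry is tied to a specific year.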
CHRONHIB
Chronologicon Hibernicum – A Probabilistic Chronological Framework for Dating Early Irish Language Developments and Literature

Project Objectives
The aim of the Chronologicon Hibernicum (ChronHib) project is to refine the methodology for dating Early Medieval Irish language developments (between approximately the 6th and 10th centuries) and to build a chronological framework of linguistic changes that can then be used to date literary changes within the Early Irish period. This goal will be achieved by combining philological and linguistic analysis with advanced statistical methods.

Funded under: H2020-EU.1.1. • Overall budget: € 1 804 229,62

Early Irish texts survive only as glosses or marginal notes in contemporary Latin manuscripts, like this Middle Irish poem on the manuscript final page.

Indo-European language family
This research opens up new insights into medieval culture and language, while Professor Stifter says Old Irish is also an important source for our understanding of linguistics more generally. Analysis of Old Irish will add to a more complete picture of the Indo-European language family. “The Indo-European family is one of the largest language families. It covers most of the European languages, as well as many languages in the Middle East,” explains Professor Stifter. Ultimately, all of these languages are related and descend from a common ancestor, about 6,000 years ago, and Professor Stifter says analysis of Old Irish will help researchers gain a fuller picture. “Old Irish forms part of Celtic, one of the 12 main known branches of the Indo-European language family. The Celtic branch is rather small, but it still has very important contributions to make to our understanding of Indo-European languages in general,” he stresses. The project will make an important contribution in these terms, using technology to improve linguistic dating and strengthen links between the past and the present. In the remaining time of the project’s five-year funding term, Professor Stifter and his colleagues intend to apply Bayesian statistical methods to the collected and annotated linguistic material and to improve the database, which he says will be an invaluable resource for linguists looking to date historical texts. “There’s an enormous amount of literature from the 8th and 9th centuries. If we can understand it more accurately, it will greatly add to our understanding of the period,” he says.
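The idea behind Bayesian linguistic dating can be shown in miniature. If the frequency of competing variant forms per century is known from securely dated material, Bayes’ rule gives a posterior probability over centuries for an undated text from the forms it contains. This is only an illustrative sketch: the variant names and frequencies below are invented, and the project’s actual statistical framework is far more sophisticated.

```python
# Miniature Bayesian dating sketch (illustrative numbers only).
# P(variant | century), in practice estimated from securely dated texts.
likelihoods = {
    "8th": {"old_form": 0.9, "new_form": 0.1},
    "9th": {"old_form": 0.5, "new_form": 0.5},
    "10th": {"old_form": 0.1, "new_form": 0.9},
}
prior = {"8th": 1 / 3, "9th": 1 / 3, "10th": 1 / 3}  # uniform prior

def posterior(observed_forms):
    """P(century | observed forms), treating observations as independent."""
    scores = {}
    for century, p in prior.items():
        for form in observed_forms:
            p *= likelihoods[century][form]
        scores[century] = p
    total = sum(scores.values())
    return {c: s / total for c, s in scores.items()}

# An undated text containing two old forms and one new form:
# the posterior favours the 9th century on these toy numbers.
print(posterior(["old_form", "old_form", "new_form"]))
```

The more annotated forms the database contributes, the sharper such a posterior becomes, which is why the minute annotation work described above matters for the dating goal.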
Professor David Stifter Professor of Old Irish Maynooth University Department of Early Irish Maynooth, Co. Kildare, Ireland. T: + 353 01 7083710 E: email@example.com W: https://www.maynoothuniversity.ie/ people/david-stifter Fangzhe Qiu, David Stifter, Bernhard Bauer, Elliott Lash, Tianbo Ji, ‘Chronologicon Hibernicum: A Probabilistic Chronological Framework for Dating Early Irish Language Developments and Literature’, in: Marinos Ioannides, Eleanor Fink, Raffaella Brumana, Petros Patias, Anastasios Doulamis, João Martins, Manolis Wallace (Eds.) [= Lecture Notes in Computer Science 11196], Cham: Springer Nature Switzerland 2018, 731–740. doi 10.1007/978-3-030-01762-0_65
Professor David Stifter
David Stifter is Professor of Old Irish at Maynooth University. He was previously a lecturer in Indo-European linguistics at the University of Vienna, where he played a major role in establishing and managing the Celtic studies programme. He is founder and editor of the interdisciplinary Celtic studies journal Keltische Forschungen, and a founding member of the Societas Celtologica Europaea, the European Association of Celtic Studies.
In Chronologicon Hibernicum, the Old Irish language is transferred from vellum manuscripts into a digital environment.
Museo Casa de la Memoria Indómita, Mexico City (Mexico)
Collective memory in the digital age
The disappearance of 43 students from the Ayotzinapa Rural Teachers college in Mexico prompted protests and put the human rights situation in the country into sharper focus. Over 40,000 people have disappeared in Mexico since 2006, yet it was the Ayotzinapa case that caught public attention. Now Silvana Mandolessi is investigating the role of digital media in shaping collective memory of the case. The disappearance of
43 Mexican students from the Ayotzinapa Rural Teachers college in September 2014 attracted a lot of attention across the world, and their fate is still unclear. While violence is not uncommon in Mexico’s ongoing war on drugs, the disappearance of the Ayotzinapa students struck a particular chord, both nationally and internationally. “It triggered a lot of protest around Mexico for several months, with people denouncing the situation. It also became a global event – people became increasingly aware of the situation of disappearances in Mexico. There have been more than 40,000 disappearances since 2006, but this was the first case that went global,” says Silvana Mandolessi, an Assistant Professor of Literary Theory and Cultural Studies at KU Leuven in Belgium. As the Principal Investigator of the DigitalMemories project, Professor Mandolessi is investigating the role of digital media in shaping how this case has been perceived and remembered. “One part of the project focuses on this case. We’re investigating the situation of disappearances in Mexico,” she explains.
Disappearances in Mexico Over the last ten years or so, disappearances in Mexico have been committed not only by the state but also by organised crime. This is distinct from the image of enforced disappearances, a crime which is traditionally thought of as being committed by the state, for example by the military dictatorship
Footprints of Memory/Huellas de la memoria Collective Project / International Campaign Against Enforced Disappearance.
which ruled Argentina between 1976-83. “So, situations where organised criminal groups are the perpetrators cannot be defined as enforced disappearances, but rather as disappearances,” outlines Professor Mandolessi. Researchers in the project are aiming to analyse the meaning of the involvement of organised crime and the state in these disappearances; the overall picture is very complex, as there are many cases in which local police collude with criminal organisations. “It’s very difficult because there is a deeply embedded culture of impunity in Mexico. There are also very different kinds of perpetrators and victims,” continues Professor Mandolessi. “For example, some migrants to the US have disappeared and people don’t know what happened, and there are also more conventional political disappearances committed by the state. We try to look at the situation from a legal, sociopolitical and historical perspective.”
This is further complicated in the Ayotzinapa case by the fact that the fate of the students is still unclear, even with all the tools of modern technology at our disposal. While in the digital age we are often overloaded with information on all manner of topics, there is still a gap in our understanding of what happens in disappearance cases, raising some important questions. “How does the omission of information affect the representations that are created around the case?” asks Professor Mandolessi. The war on drugs itself officially began in 2006, when the Mexican military was deployed to fight drug trafficking organisations and engage in public security functions. These are long-standing problems however, and human rights organisations have since documented the involvement of state forces in enforced disappearances. “One of the slogans that became prominent in the protest was ‘it was the state’. One part of our research regarding the legal perspective is: to what extent is the state responsible for crimes committed by organised crime?” asks Professor Mandolessi. “There are some cases where the state is directly responsible, because it was a perpetrator and it was involved in the disappearances. But what happens in cases where disappearances are committed by agents of organised crime?” The Ayotzinapa case holds particular interest in this respect, as it was one of the first to attract global attention. Digital media played an important role in keeping the case
in the public eye and putting the human rights situation in Mexico into sharper focus, something which Professor Mandolessi and her colleagues are now investigating. “We are interested in how digital activism works in this case, in order to understand in which way digital media and memory function today,” she explains. The project’s work centres on investigating how collective memory of an event – in this case the human rights violations committed against the Ayotzinapa students – has been affected by the growth of digital media. “We’re looking at the way in which people disseminate messages and participate in campaigns. Specific phrases are associated with digital activism and political representation in the digital age, which differ from those used in collective campaigns in the past,” says Professor Mandolessi. “This affects the way we remember the event itself.” There have been more than 40,000 disappearances in Mexico since 2006, and while many earlier cases faded from the collective memory relatively quickly,
the Ayotzinapa case remains extremely prominent, with online activism and images of the disappeared helping to keep it in the public consciousness. There are certain differences here with how earlier disappearance cases were represented and remembered. “The way in which people engage, the images that
are used and the way in which the discourses are constructed are very different,” says Professor Mandolessi. For example, while the Grandmothers of the Plaza de Mayo also used images of their missing sons and daughters to protest about disappearances during Argentina’s dictatorship, the onset of the digital age has opened up new possibilities in terms of how victims are represented. “There is a lot of protest art and art installations about the 43 students. For example, there is a facial recognition test, where a camera tests to what extent your face matches those of the students,” outlines Professor Mandolessi. “These kinds of objects are mixed up in an intersection of our digital paradigms.” This particular example also serves to bring home any personal similarities and make people feel more closely connected to the students themselves. Digital activism, for example through online campaigns or petitions, also deepens engagement and enables the personalisation of demands, even among people and communities far removed from the actual case. “This is expressed in sentences like ‘we are all Ayotzinapa’, for example, a paradigmatic slogan to show solidarity in global connective movements. I want to understand whether this is useful or effective with respect to today’s concerns. I’m also interested in how this affects whether an event is remembered by someone in a country far away from Mexico,” says Professor Mandolessi.

The disappearance of the Ayotzinapa students triggered a lot of protest around Mexico for several months, with people denouncing the situation. It also became a global event – people became increasingly aware of the situation of disappearances in Mexico.

43 / Francisco Mata Rosas – Felipe Victoriano. The picture from the book “43” can be accessed at this link: http://www.casadelibrosabiertos.uam.mx/contenido/contenido/Libroelectronico/flip/43/

DIGITALMEMORIES
We are all Ayotzinapa: The role of Digital Media in the Shaping of Transnational Memories on Disappearance

The main objective of the project is to provide a theoretical model for analysing digital memory. The model will contribute to answering the question of how new media forge new instruments for fighting against violations of human rights, and will contribute to understanding the dynamics of networked social movements in the digital age.

Funded under: H2020-EU.1.1. • Overall budget: € 1 444 125
ERC ‘Starting Grant’ Programme: Horizon 2020 / Call: ERC-2015-STG / ERC project number: 677955 / From 01.07.2016 to 30.06.2021 / Based at: KU Leuven (Belgium)

Project Coordinator, Silvana Mandolessi
Assistant Professor Cultural Studies
Department of Literary Theory and Cultural Studies
Blijde-Inkomststraat 21-3311 (3000) Leuven - Belgium
T: +32 16 32 48 32
E: firstname.lastname@example.org
W: http://digitalmemories.be
W: https://cordis.europa.eu/project/rcn/204436/factsheet/en

Silvana Mandolessi

Archivo Provincial de la Memoria, Córdoba (Argentina).
Silvana Mandolessi is Assistant Professor of Cultural Studies at the KU Leuven. She is the co-editor of El pasado inasequible (Eudeba 2018) and the special issue Transnational Memory in the Hispanic World (European Review 2014) and numerous articles on Latin American collective memory, literature and culture.
These objects and representations are central to the collective memory of the case, and also to building a clearer picture of what actually happened to the students. The work of Forensic Architecture, a research agency based in London, is of great interest in this respect. “What Forensic Architecture do is combine digital techniques to construct evidence around these cases, to reconstruct the scene,” outlines Professor Mandolessi. The agency has reconstructed the abduction of the students in an interactive platform, through which different explanations and theories around the event can be assessed; this could eventually be used as evidence in a trial, while Professor Mandolessi says it is also important in terms of public memory. “It contributes to keeping the case alive,” she points out. “Forensic Architecture have organised all this information in a new platform in which we can explore what happened, looking at different moments, different factors, and different versions of events.”
The focus of the project for the moment however is on developing a theoretical model to analyse the digital objects that help shape transnational memories on disappearances. Another area of interest is in reflecting theoretically on how collective memory has changed under the impact of digital media. “Many people today speak about connected memory instead of collective memory,” says Professor Mandolessi. Communities can come together to campaign about human rights violations without necessarily needing to be in close geographical proximity, or share a common history, which is an important consideration in Professor Mandolessi’s research. “The way in which we define a collective or community is essential to understanding what happens with collective memory in today’s connected age,” she says.

Ilustradores con Ayotzinapa, Collective project. (http://ilustradoresconayotzinapa.tumblr.com)