EU Research Winter 2020


2020: A history-making year
The true impact of Covid and Brexit

Horizon Europe gets €4 billion boost
Climate change warning: this autumn was the hottest ever recorded

A focus on the science of wellbeing
Supramolecular chemistry driving paradigm shifts in sustainable science

Disseminating the latest research from around Europe and Horizon 2020

Editor’s Note

We need science to be trusted again. It is upsetting and frustrating for real scientists to see that a few characters and websites promoting conspiracy theories can have their messages accepted as truth by so many.

Often it does not take much investigation to undermine the arguments of such theories. 5G networks being responsible for spreading the coronavirus was a particularly poor argument, ironically usually promoted on social media viewed on 3G and 4G phones. Covid-19 is rife in areas that don’t have 5G, for one. Cell towers have been burned down and construction workers threatened, all because of this kind of nonsense. Then there are those convinced Bill Gates is plotting our demise, using mass vaccination to microchip us and track our movements. This kind of slur, aimed at a foundation that in November alone gifted 70 million dollars, on top of a previous donation of 350 million dollars, towards making Covid-19 vaccinations affordable and accessible for low-income countries, is despicable. That does not sound like the work of a supervillain to me. Others believe vaccination is dangerous and unnecessary, or somehow alters our DNA to genetically modify us, which is not how vaccines work at all.

As a seasoned editor and journalist, Richard Forsyth has been reporting on numerous aspects of European scientific research for over 10 years. He has written for many titles including ERCIM’s publication, CSP Today, Sustainable Development magazine and eStrategies magazine, and remains a prolific contributor to the UK business press. He also works in Public Relations for businesses to help them communicate their services effectively to industry and consumers.

The trials that vaccines go through before they are rolled out are rigorous. With Covid-19 vaccine candidates, the development stages and testing have overlapped, rather than running in sequence, to speed them up, but corners are not being cut on safety. In reality, it is the so-called anti-vaxxers who are truly dangerous to us all at this moment in history. At the time of writing, two viable vaccines are in final-stage tests before we can use them against Covid-19. Many more in earlier stages will be ready soon, but the first two have results indicating they successfully prevented infection in more than 90% of cases, which is the best possible news. This is real hope, but only if the majority of people embrace it as a solution when the time comes. Yet as good vaccines come through, we understand that a large percentage of people will not take them, for reasons based on unfounded ideas on the internet. This is a tragedy. Every effort must be made to help people understand the difference between real science and invented fantasy. If we can do that, we will save more lives from this terrible virus.

Hope you enjoy the issue.

Richard Forsyth Editor




Research News

EU Research takes a closer look at the latest news and technical breakthroughs from across the European research landscape.


WELLWAYS Stressful life events can affect individual wellbeing, and the effect may be magnified if several occur in a short period. Professor Laura Bernardi aims to build a deeper picture of how these major life events affect wellbeing.


Role of the CREB coactivator CRTC1 in mood disorders and antidepressant response Depression affects millions across the world. We spoke to Dr Jean-René Cardinaux about his research into the molecular mechanisms and regions of the brain linked with depressive behaviour, which could in future lead to improved treatment.


IllegalPharma An increasing number of people are turning to illegal pharmacies to buy drugs. We spoke to Professor Luis Diestre about his work in analysing the competitive dynamics of the informal economy and its implications for regulation.


MECHANO-CONTROL Mechanical forces may affect whether a cell differentiates, proliferates or adopts a malignant phenotype. Understanding how these forces are transmitted in cells could open up interesting new therapeutic avenues, as Pere Roca-Cusachs explains.




Prof. Oren A. Scherman, Director of the Melville Laboratory for Polymer Synthesis, reveals how the ERC-funded CAM-RIG project is providing state-of-the-art innovation to understand hydrogel behaviour at a molecular level in real time.

20 COVID-19 VACCINE Around the world, research teams have been working tirelessly on different vaccines which can prevent infection from Covid-19. What can we expect from these vaccines, and is an end to the pandemic now in sight? Richard Forsyth reports.


We spoke to Professor Alexandre Roulin and Professor Jérôme Goudet about their work in investigating the genetic basis of variability in the number, size and colouration of black spots on barn owls, which could open up new avenues of investigation.

24 plan4res Renewable sources of energy like wind and photovoltaics produce a variable output. We spoke to Sandrine Charousset and Danny Pudjianto about the plan4res project’s work in developing new tools for supporting the integration of renewable energy sources within the system.

26 SLIM Reducing environmental impacts from mining is an important aim for Europe, as is pushing efficiencies to bolster supply, streamlining operations and reducing costs. Professor José Sanchidrián coordinates the SLIM project, addressing these concerns with new technologies.

30 BioMates Using biomass in the production of transport fuels would help reduce dependence on fossil resources and also de-carbonise the European economy, topics central to Dr. Stella Bezergianni’s work as Principal Investigator of the BioMates project.

32 VEEP Recycling and re-using concrete and demolition waste would bring both environmental and economic benefits. The VEEP project is developing new technological solutions that could help increase the use of recycled materials in the construction sector, as Anna Paraboschi explains.

33 ICC Taking out high-cost credit can have long term financial consequences. We spoke to Professor Daniel Paravisini about his research into how the growth of new institutions like online marketplaces is affecting the consumer credit market.

34 PROSANCT Financial sanctions are a powerful foreign policy tool, but they can also have unintended effects. Financial sanctions are at risk of being over-used, which poses significant challenges to global banks under pressure to comply, as Professor Gregoire Mallard explains.


AGRARIAN REPUBLICS We spoke to Professor Béla Kapossy about the eighteenth-century Swiss intellectual context in which the Genevan political thinker Jean-Jacques Rousseau warned of a major European revolution caused by agricultural crisis and violent urban unrest.

EU Research


DISAGREEMENT IN PLATO AND XUNZI Richard King and Anders Sydskjør are looking at the use of definitions in the writings of Plato and Xunzi, from the Greek and Chinese philosophical traditions, which represents an important contribution to the emerging field of Sino-Hellenic studies.

42 US / EU Relations How will President Biden’s policies and approach affect Europe? Will we now see closer collaborations in science, from climate change policies to rejoining the World Health Organisation? Richard Forsyth reports.

46 FBD_BModel Over recent years a large amount of textile production capacity has shifted away from Europe. The FBD_BModel project is developing a new interactive digital design platform that could help rejuvenate the European textile industry, as Dr Xianyi Zeng explains.

47 WorkFREE The WorkFREE project aims to build a deeper picture of circumstances around exploitative labour on the ground, investigating how the combination of cash transfers and participatory action research can help, as Dr Neil Howard explains.

48 ODYCCEUS We spoke to Dr Eckehard Olbrich about the work of the Odycceus project in seeking ways to both tap into the information swirling around on social media, and also use it to help resolve conflicts.

50 METRO-HAUL We spoke to Professor Andrew Lord and Dr Daniel King about how the METRO-HAUL project is designing and building scalable infrastructure that services the needs of the new applications that could emerge with the development of new 5G networks.

53 EOPEN Dr Guido Vingione and Maria Gabriella Scarpino are working on the EOPEN project, an initiative developing an open platform to enable non-expert users in areas like flood risk assessment and food security to make full use of satellite data.

56 The Weight of the World By the end of 2020, the combined weight of all man-made objects will likely exceed the weight of all the living things on Earth. By Richard Forsyth

60 LOW-END INNOVATION Low-end innovations can prove highly profitable for businesses, yet many decision-makers remain biased towards high-end innovation. We spoke to Professor Sebastian Gurtner about his research into the characteristics of low-end innovators and how organisations can effectively support them.

We spoke to Dr Aylin Tschoepe and Susanne Käser about their work in using an interdisciplinary approach to explore the possibility of enabling participation in urban planning processes through multi-authored images as a way of communicating future urban visions.


PETER Electron paramagnetic resonance (EPR) opens a window into the structure and function of materials, yet classical methods have relatively low levels of sensitivity. The PETER project is developing a new, more sensitive method, as Professor Tomáš Šikola explains.

62 ExaQUte Simulations can play an important role in civil engineering projects. We spoke to Professor Fabio Nobile and Professor Riccardo Rossi about the work of the ExaQUte project in constructing a framework to help address complex engineering problems.

64 SABRE Dr Benjamin Woods, Principal Investigator of the SABRE project, explains the complexities and significant benefits of developing a new approach to helicopter design, which uses blade morphing technologies to optimise performance.

66 LMCat It is difficult to produce graphene on large scales with sufficient quality. Researchers in the LMCat project are developing a new method of producing 2-D materials using liquid metal catalysts, as Dr. Irene Groot explains.

EDITORIAL
Managing Editor Richard Forsyth
Deputy Editor Patrick Truss
Deputy Editor Richard Davey
Junior Editor Adam England
Science Writer Holly Cave
Acquisitions Editor Elizabeth Sparks

PRODUCTION
Production Manager Jenny O’Neill
Production Assistant Tim Smith
Art Director Daniel Hall
Design Manager David Patten
Illustrator Martin Carr

PUBLISHING
Managing Director Edward Taberner
Scientific Director Dr Peter Taberner
Office Manager Janis Beazley
Finance Manager Adrian Hawthorne
Account Manager Jane Tareen

EU Research
Blazon Publishing and Media Ltd
131 Lydney Road, Bristol, BS10 5JR, United Kingdom
T: +44 (0)207 193 9820
F: +44 (0)117 9244 022
E:
© Blazon Publishing June 2010

Cert no. TT-COC-2200




The EU Research team take a look at current events in the scientific news

11th-hour Brexit deal leaves scientific research on a cliff edge

Britain’s association with Horizon Europe hangs by a thread amid cost concerns and opposition from senior UK Cabinet ministers. Brexit is set to become a reality, but its likely consequences for researchers are still emerging. The United Kingdom left the European Union on 31 January, but has remained part of EU trade and travel agreements while the final Brexit deal is negotiated. On 1 January 2021, those ties will be severed, and a Brexit deal has yet to materialize. The future of research funding, international collaboration, and wrinkle-free supply chains of lab stocks hinges on the details of the final deal, but those issues are “some way back in the queue” behind the sticking points of a trade agreement, says James Wilsdon, a science policy expert at the University of Sheffield.

For U.K. scientists, the biggest question is whether they can be part of the €85 billion 2021–27 Horizon Europe research funding programme. Non-EU members can participate, but the United Kingdom is wavering over a potentially hefty price tag. The EU offer would see the United Kingdom pay in about £15 billion, plus a top-up payment if U.K. applicants win more than that in research grants. But because U.K. success rates in winning Horizon grants have fallen by almost one-third since the vote to leave Europe in 2016, it is likely to pay in much more than it gets out. Vivienne Stern, director of Universities UK International, estimates the 7-year premium at about £3 billion. Although Stern supports joining Horizon Europe, she told a parliamentary committee on 22 October that the price is too high. The benefits of international collaboration and access to diverse funding schemes could justify paying extra, but not such an “eyewatering” sum, says Kieron Flanagan, a science policy expert at the University of Manchester. U.K. research advocates have proposed a cap on any premium paid into Horizon Europe by non-EU countries, but the European Union has shown no sign of budging since its initial offer in March, says Martin Smith, a policy manager at the Wellcome Trust, a U.K. philanthropic research funder. “I’m currently optimistic that a way forward on costs can be found,” he says, “but all this is moot if the wider negotiations collapse.” Wilsdon is more pessimistic and suggests Horizon participation could stall even if a trade deal squeaks through.


Another issue that could disrupt research collaborations is data privacy. EU negotiators may deem U.K. data protection laws inadequate because of the country’s broad use of surveillance, says Rosie Richards, head of digital policy at the NHS Confederation, which represents U.K. health providers. That could disrupt studies including those from the COVID-19 Genomics UK Consortium, which is working with the European Bioinformatics Institute to track clinical data and changes in the coronavirus genome, to establish whether they are linked to easier transmission of the virus or more severe disease.

No matter what deal emerges, Brexit is sure to change the flow of researchers themselves. EU citizens wishing to work in the United Kingdom will now need a visa, with requirements for a job offer, a salary above a certain threshold, and English language testing. The U.K. government is offering a new “global talent visa” that eliminates some bureaucratic hoops for researchers and technicians named on grants, including ones from Horizon Europe. But the cost of a 5-year visa for a family of four is nearly £15,000. “It’s really quite prohibitively expensive,” and might persuade researchers to land elsewhere in Europe, says James Tooze, a policy officer at the U.K. Campaign for Science and Engineering. The visa requirements will also affect essential university workers, like building cleaners. NWUPC’s suppliers are helping European employees with the paperwork that allows pre-Brexit residents to remain, but in the future they might not be able to recruit that labor so easily, Dodd-Williams says.

Brexit talks are still ongoing, with stubborn disagreements over rules that would prevent businesses on either side having an unfair advantage because of labor and environmental standards, or state subsidies. But even if a deal is struck and ratified by both sides before the end of the year, many details will linger. If the United Kingdom does not join Horizon Europe now, it could still choose to join months or years down the line. And the fallout from the new immigration restrictions and customs bureaucracy will take time to unfold. Flanagan says Brexit means a permanently different relationship between the United Kingdom and Europe that will demand constant shifts and negotiations. “This is what people don’t really understand,” he says. “It will never be over.”


Horizon Europe budget gets unexpected €4 billion boost

Next research programme can start on time, and MEPs also manage to claw back money for health and student exchanges.

Amid a new wave of pandemic lockdowns, prospects for the European Union’s flagship research programme have brightened slightly. In a final round of intensive budget talks this week, policymakers agreed to give €85 billion (US$100 billion) to the 7-year Horizon Europe programme due to start in January – €4 billion more than previously proposed.

The last-minute increase is part of an agreement between the union’s 27 members and the European Parliament on the bloc’s overall 2021–27 budget, a record €1.8-trillion package that includes a €750-billion COVID-19 recovery fund. Governments and the parliament also agreed on 10 November to raise the budgets for health and education: together, Horizon Europe, the student-exchange programme Erasmus+ and the COVID-19 response package EU4Health will get an extra €15 billion.

The agreement is a “victory for researchers, scientists and citizens alike”, says Christian Ehler, the European Parliament’s spokesperson on research and development, innovation, industry and energy. “It was a very tough fight, but at least we can say that we managed to claw back €4 billion for Horizon Europe, which compensates, to some extent, for the cuts pushed so hard by the EU member states,” he says. The European Commission had previously proposed a €94.4-billion budget for the programme, but in July, negotiations had scaled that back to €81 billion.

The European Parliament is expected to formally approve the budget deal before the end of the year. Meanwhile, the European Commission and the member states must decide how to spread the extra money over Horizon Europe’s various pillars, including basic science funded by the European Research Council and mission-oriented research in fields such as climate, cancer, soil health and food. Research organizations that had lobbied hard for more generous research funding say the final deal is underwhelming at best: Horizon Europe is still at least €10 billion short of what they had hoped for. “A symbolic €4-billion top-up for Horizon Europe,” says Kurt Deketelaere, secretary-general of the League of European Research Universities in Leuven, Belgium. “What a disappointment after a campaign of two years.”

Nektarios Tavernarakis elected Vice President of the European Research Council

Nektarios Tavernarakis, Professor of Molecular Systems Biology at the Medical School of the University of Crete and Chairman of the Board of Directors at the Foundation for Research and Technology, has been elected Vice President of the ERC. The ERC’s mission is to encourage the highest quality research in Europe through competitive funding and to support investigator-driven frontier research across all fields, on the basis of scientific excellence.

Tavernarakis said he was “deeply honored” by the decision. “Having witnessed first-hand, as a grantee, the transformative impact of ERC on European science, I am wholeheartedly committed to contributing to its mission in this new role. The ERC is not merely a success story in the framework of the European ideal; it stands as a radiant paradigm of how investment in frontier research can reap enormous benefits for society at large. I am looking forward to working closely with my Scientific Council colleagues, across all three domains, to serve science by building on the ERC’s legacy of fostering excellent research. Today, the ERC’s support for bottom up, blue skies research is more relevant than ever. This strategy can effectively protect us in the face of unpredictable threats, and address diverse and complex global challenges,” he added.

Nektarios Tavernarakis studied Biology at the Aristotle University of Thessaloniki and holds a PhD degree in Molecular Genetics from the University of Crete, Greece. His research focuses on the molecular mechanisms of necrotic cell death and neurodegeneration, the interplay between cellular metabolism and ageing, and the mechanisms of sensory transduction and integration by the nervous system. He is an elected member of the European Molecular Biology Organization and Academia Europaea. He is the recipient of two ERC Advanced Investigator Grants, an ERC Proof of Concept Grant, the EMBO Young Investigator award, the Alexander von Humboldt Foundation Friedrich Wilhelm Bessel Research Award, the Bodossaki Foundation Scientific Prize for Medicine and Biology, the Empeirikeion Foundation Academic Excellence Prize, and the BioMedical Research Award of the Academy of Athens.

Visit of Nektarios Santorinios to the EC, meeting Corina Creţu, Member of the EC in charge of Regional Policy in January 2019. © European Union


Astronomers produce most detailed 3D map yet of the Milky Way

With over 2 billion stars, it could help shed light on the galaxy’s origin and future.

Scientists from Cardiff University have helped produce a brand-new, three-dimensional survey of our galaxy, allowing them to peer into its inner structure and observe its star-forming processes in unprecedented detail. The large-scale survey, called SEDIGISM (Structure, Excitation and Dynamics of the Inner Galactic Interstellar Medium), has revealed a wide range of structures within the Milky Way, from individual star-forming clumps to giant molecular clouds and complexes, that will allow astronomers to start pushing the boundaries of what we know about the structure of our galaxy.

SEDIGISM has been unveiled today through the publication of three separate papers in the Monthly Notices of the Royal Astronomical Society, authored by an international team of over 50 astronomers. “With the publication of this unprecedentedly detailed map of cold clouds in our Milky Way, a huge observational effort comes to fruition,” says Frederic Schuller from the Max Planck Institute for Radio Astronomy (MPIfR), lead author of the publication presenting the data release. Dr Ana Duarte Cabral, a Royal Society University Research Fellow from Cardiff University’s School of Physics and Astronomy, was lead author on one of the papers and has provided a catalogue of over 10,000 clouds of molecular gas in our Milky Way.

The Milky Way, named after its hazy appearance from Earth, is a spiral galaxy with an estimated diameter of between 170,000 and 200,000 light-years, which contains between 100 and 400 billion stars. Our galaxy consists of a core region surrounded by a warped disk of gas and dust that provides the raw materials from which new stars are formed.
For Dr Duarte Cabral, the new catalogue of gas clouds will allow scientists to probe exactly how the spiral structure of our own Milky Way affects the life cycle of clouds, their properties, and ultimately the star formation that goes on within them. “What is most exciting about this survey is that it can really help pin down the global galactic structure of the Milky Way, providing an astounding 3D view of the inner galaxy,” she said. “With this survey we really have the ability to start pushing the boundaries of what we know about the global effects of the galactic structures and dynamics, in the distribution of molecular gas and star formation, because of the improved sensitivity, resolution, and the 3D view.”

The catalogue of molecular gas clouds was created by measuring the rare isotope of the carbon monoxide molecule, 13CO, using the extremely sensitive 12-metre Atacama Pathfinder Experiment telescope on the Chajnantor plateau in Chile. This allowed the team to produce more precise estimates of the mass of the gas clouds and discern information about their velocity, therefore providing a truly three-dimensional picture of the galaxy.

Dr Duarte Cabral and colleagues are already beginning to tease out information from the vast amount of data at their disposal. “The survey revealed that only a small proportion, roughly 10%, of these clouds have dense gas with ongoing star formation,” said James Urquhart from the University of Kent, the lead author of the third publication. Similarly, the results from the work led by Dr Duarte Cabral suggest that the structure of the Milky Way is not that well defined and that the spiral arms are not that clear. They have also shown that the properties of clouds do not seem to depend on whether a cloud is located in a spiral arm or an inter-arm region, where the researchers expected very different physics to be playing a role. “Our results are already showing us that the Milky Way may not be a strong grand design type of spiral galaxy as we thought, but perhaps more flocculent in nature,” Dr Duarte Cabral continued.
“This survey can be used by anyone that wants to study the kinematics or physical properties of individual molecular clouds or even make statistical studies of larger samples of clouds, and so in itself has a huge legacy value for the star formation community.”



Hottest autumn on record in 2020

With just over 1 degree Celsius of warming so far, Earth is already dealing with the devastation caused by more frequent and stronger extreme weather events such as wildfires.

The world just experienced its hottest November on record, while Europe had its warmest autumn, according to an alarming report from the European Union’s Copernicus Climate Change Service. Temperatures were most elevated in a large region across northern Europe, Siberia and the Arctic Ocean, where sea ice was at the second-lowest level ever seen in November. The United States, South America, southern Africa, the Tibetan Plateau, eastern Antarctica and most of Australia also saw temperatures well above average. Globally, November was almost 0.8 degrees Celsius (1.44 Fahrenheit) above the average for 1981–2010, and 0.1C (0.18F) higher than last year. And this unusual heat comes despite the cooling effect of La Niña. In Australia, a bushfire has been burning out of control for six weeks in the popular tourist spot of Fraser Island as parts of the country swelter through a record-breaking heatwave.

“These records are consistent with the long-term warming trend of the global climate,” said Carlo Buontempo, director of the Copernicus Climate Change Service at ECMWF. He said November was “an exceptionally warm month” globally, and that temperatures in the Arctic and northern Siberia remained consistently high while sea ice was near its lowest extent. “This trend is concerning and highlights the importance of comprehensive monitoring of the Arctic, as it is warming faster than the rest of the world,” he added. Buontempo said policymakers who prioritize mitigating climate risks “should see these records as alarm bells” and think more seriously than ever about how best to comply with the 2015 Paris Agreement.

The US pulled out of the Paris accord last month, with President Donald Trump claiming it was “designed to kill the American economy,” but President-elect Joe Biden has promised to re-enter the pact after he is sworn in.

Copernicus data shows that 2020 could be the hottest year on record. It is on a par with 2016, the warmest year so far, and is likely to equal the record or marginally exceed it unless the mercury drops this month. The World Meteorological Organization’s (WMO) annual climate report last week said 2020 was on course to be one of the three warmest years on record, just after 2016 and 2019. The WMO said the average global temperature was about 1.2 degrees Celsius above pre-industrial levels. Global temperatures must be kept from rising more than 1.5 degrees Celsius above those levels to avoid major impacts on the climate, the United Nations’ Intergovernmental Panel on Climate Change (IPCC) has said.

While December’s data will be decisive, it is almost certain that 2020 will be the warmest calendar year for Europe, Copernicus reports. The January–November period was 0.5 degrees Celsius warmer than the same period in 2019, the warmest year on record, and at least 0.4 degrees warmer than the same period in any other year. In September, October and November, average European temperatures were 1.9 degrees Celsius above the 1981–2010 average and 0.4 degrees higher than in 2006, the previous warmest autumn. Most of Europe saw above-average heat, with temperatures soaring the most in the northern and eastern regions. In the Arctic region and large parts of northern Siberia, temperatures have been substantially above average for all of 2020, not just the autumn. Sea ice cover has been particularly low since the beginning of summer, and Siberian wildfires released record emissions.


Bucharest to host new EU cybersecurity research hub

Romania’s flourishing IT sector gets a new boost from the European Union, which will host its new Cybersecurity Competence Centre in the country’s capital. The Romanian capital Bucharest has been selected as the site of the new European Cybersecurity Competence Centre, a hub that will distribute EU and national funding for cybersecurity research projects across the bloc. Bucharest had been in competition with six other cities: Brussels, León, Luxembourg, Munich, Warsaw and Vilnius, and was chosen during a meeting of EU ambassadors late on December 9. “Romania’s capital will take on this task in a responsible and dedicated manner, to the benefit of the entire European Union,” said Luminiţa Odobescu, Romania’s permanent representative to the EU, after the vote. One of the key criteria set by the EU had been the presence of an active cybersecurity ecosystem in the host city. Bucharest also has some of the fastest and most robust internet infrastructure in Europe. Romania ranks third in the EU for women employees in IT, and 24 per cent of IT graduates in the country are female.

The European Commission first announced plans for the centre in 2018, funding pilot projects to test how such a centre and network would operate. It is initially expected to host around 30 staff, increasing to 80 in subsequent years. It has also been suggested that the centre will help boost business for local cybersecurity companies and burnish the host country’s reputation on cyber. “This is a great honour for Romania, and demonstrates the trust that other European countries have in us,” said Siegfried Mureșan, a Romanian MEP and vice-president of the European People’s party. “It’s a great chance to develop the IT sector in Romania.” IT has long been one of the main drivers of the Romanian economy. The value of the sector’s turnover is forecast to reach 6.3 billion euros by the end of 2020, according to the country’s investment promotion agency, Invest Romania. The European Commission, Parliament and Council are still finalising the details of how the centre will be governed. An agreement on a new law establishing the centre is expected to be struck before the end of the year.

Beer and crisps to be used to tackle climate change

Crisps firm Walkers has adopted a technique it says will slash CO2 emissions from its manufacturing process by 70%.

Emissions from beer fermentation in a brewery, which usually produces a large amount of carbon dioxide (CO2), will be captured and mixed with potato waste so it can be turned into fertiliser. The idea has been developed by UK firm CCm Technologies, which trialled the fertiliser on potato seed beds earlier in the year, and has been adopted by crisps giant Walkers. The UK-based crisps firm will install the carbon capture equipment at its factory ahead of the 2022 crop, with its owners looking for ways to source the gas from within the company. Walkers is expected to reduce its carbon emissions by 70% once the project is rolled out more widely, and could even become a carbon-negative potato producer by 2030. David Wilkinson from US snack conglomerate PepsiCo, which owns Walkers, said: “From circular potatoes to circular crops, this innovation with CCm Technologies could provide learnings for the whole of the food system, enabling the agriculture sector to play its part in combating climate change.


“This is just the beginning of an ambitious journey, we’re incredibly excited to trial the fertiliser on a bigger scale and discover its full potential.” “This initiative is a step in the right direction, and we will continue working hard to lower the carbon impact of our products from field, through manufacturing sites, to consumption.” Pawel Kisielewski, from CCm Technologies, said: “CCm is delighted that PepsiCo has chosen our technology to demonstrate the huge potential that innovative approaches can have in promoting sustainable agriculture across the UK. “By enabling the sustainable reuse of waste resources and the locking of captured carbon back into the soil, our partnership represents a significant step forward in proving that agriculture can play a role in carbon reduction and the circular economy.”

EU Research

EU Commissioner Mariya Gabriel shares her ambitious research policy plans on the coronavirus pandemic

According to the Commissioner for Innovation, Research, Culture, Education and Youth, the only way for Europe to recover from the coronavirus crisis is to work together.

The scientific community has produced over 6,000 journal articles and counting since COVID-19 emerged as a global problem. This body of work comes from scientists and researchers who have been working non-stop to understand how to stop and diminish this virus. Looking to the EU mechanisms that exist to make this process easier, Commissioner Gabriel commented: “As early as January, we mobilised Horizon 2020 (the EU’s current research funding programme) for urgently needed coronavirus research and innovation. All this gives us hope, as we continue to combat the virus and work hard to make Europe more resilient and sustainable.”

Reflecting on that policy-changing fiscal disagreement now, Commissioner Gabriel said: “Like President von der Leyen, I have expressed that the budget reduction is a regrettable decision. We believe that the original Commission proposal was ambitious, but also realistic. Nevertheless, after the July European Council, Horizon Europe still keeps a robust budget envelope to prepare the competitiveness of the European economy for the widest social benefit of the citizens. And we need to recognise that additional funds will also be allocated from Next Generation EU, the €750 billion recovery instrument for Member States, to Horizon Europe.”

“At the end of September, I presented my key initiatives for a European research area, a European education area and a digital education action plan, which contribute to a more inclusive, greener and digital Europe when it comes to research and education.”

“On 29 September, Member States came to agreement on the last remaining open issues of Horizon Europe concerning budget, international cooperation and synergies with other EU programmes. The Commission will act as an honest broker in the forthcoming trilogues (negotiations between the Council, Parliament and Commission) on the new budget. A timely agreement on Horizon Europe is in the best interest of our researchers, innovators, companies but most importantly citizens.”

When Commissioner Gabriel took office, one of the major criticisms was that she lacked digital experience – so the now urgent migration of services to the digital landscape became a major learning curve. These new initiatives were also welcomed by President von der Leyen, who stressed the importance of the EU keeping ahead of the pulse on “digital technologies and geopolitics” in Gabriel’s 2019 appointment letter. In the context of vaccines, Dr Tedros, the WHO Director-General, has consistently warned the world about equal access to any available vaccines, decrying “vaccine nationalism” as a detriment to defeating COVID-19. Speaking at a conference in Geneva, he said: “If and when we have an effective vaccine, we must also use it effectively. In other words, the first priority must be to vaccinate some people in all countries, rather than all people in some countries.” To make this possible, the COVAX global vaccine allocation plan was launched. It currently has 184 countries signed up to donate money and resources to ensure the vaccine reaches developing countries. On 12 November, the European Commission further raised its contribution by €100 million, taking its total input to €500 million and making the EU one of the largest donors to the programme. Jutta Urpilainen, Commissioner for International Partnerships, said: “The EU is demonstrating we are serious about our commitments to leave no one behind and make the COVID-19 vaccine a global public good.” The COVAX facility will attempt to purchase 2 billion doses of the vaccine by 2021. Commissioner Gabriel proposes a “sustainable and inclusive” EU recovery from COVID-19, which means funding for a range of scientific endeavours, alongside aid to developing countries.

“We need to be concrete about what we want to achieve in the short term: to bridge the innovation divide (the difference in innovation capacity between different parts of Europe), to foster a fair and just transition, to develop innovation ecosystems in every region in Europe, and to provide the necessary resources for all our talent.” Whilst it is yet to be seen how research, innovation and all attempts to protect the economy will come to fruition, it seems that Commissioner Gabriel remains ambitious and determined to make this recovery inclusive enough to make the EU stronger post-pandemic. “To be ready for the next crisis, we must support researchers and innovators to work together, share results and data openly and acquire the skills they need to provide us with solutions for our societal challenges. The EU can only succeed if everyone progresses.” EU Commissioner Mariya Gabriel © European Union

However strong the overall economic investment being made by the EU, the summer of fierce budget negotiations between the Commission and the Council remains close to the surface of R&I discussions. There was turbulence with the proposed 2021-2027 budget in May. While President von der Leyen aimed for a lofty, all-encompassing €1.1 trillion Multiannual Financial Framework (MFF), the European Council ended up passing a smaller, €1.074 trillion MFF on 21 July. When the MFF was negotiated, especially in light of COVID, countries were bitterly divided about how to finance some of the economic recovery packages that went to countries like Italy and Spain. While some asked for those to be grants, others insisted that these investments be treated as loans. Pushing for a bigger, more flexible budget back in April, von der Leyen commented: “The next seven-year MFF budget has to adapt to the new circumstances, post-corona crisis. We need to increase its firepower to be able to generate the necessary investment across the whole European Union.”


Critical event configurations shape wellbeing throughout life

Stressful life events like the birth of a child, divorce, or the loss of a job can have a significant impact on individual wellbeing, and the effect may be magnified if several occur in a relatively short period of time. Researchers in the Wellways project aim to build a deeper picture of how these major life events affect wellbeing, as Professor Laura Bernardi explains.

An individual’s wellbeing tends to vary over time, and is greatly affected by life events like marriage, the birth of a child or bereavement. Based at the University of Lausanne, Professor Laura Bernardi is analysing data from both Switzerland and France as part of the Wellways project, aiming to build a deeper picture of how wellbeing fluctuates over the course of our lifespan, and of the factors that influence it. “For Switzerland, we’re using data gathered from the Swiss Household Panel (SHP), a longitudinal survey that was established in 1999. There are always new household cohorts coming into the survey, and we can follow people over long periods of time,” she explains. “For France, we draw on two other panels, the epidemiological cohort Constances, with a sample of 200,000 adults, and the Health and Professional Trajectories survey (SIP), collected in two waves in 2006 and 2010,” continues Professor Bernardi.

Wellways project

The aim now is to use the data from these surveys to explore the relationship between major life events and different dimensions of wellbeing over the course of the average lifespan. Typically, wellbeing, when measured as how satisfied a person is with life in general, follows a U-shaped curve; young people report high wellbeing, middle-aged individuals report lower wellbeing, whereas wellbeing is again higher among older people, which might seem paradoxical. “Generally, health deteriorates as we age, and there are lots of other reasons why we would be less satisfied as we grow older,” says Professor Bernardi. The project will probe deeper into the questions around how wellbeing evolves as people age, while Professor Bernardi and her colleagues will also investigate the impact of work and family life on wellbeing. “The SHP has data on family history, on employment, and it follows people over these domains, while the French surveys provide more detail on health indicators,” she outlines. “We take a baseline level of happiness or wellbeing and investigate the impact of different life events.”

The life events examined in Wellways are fairly common and related to work or family, and can have positive or negative effects on wellbeing. One major event of interest in the project is the birth of a child, while researchers are also looking at the impact of less happy events. “We also consider union disruption, so separations or divorce, as well as the death of a relative or partner,” says Professor Bernardi. Researchers are also considering the effects of changes in people’s working lives, such as labour market exit, or upward and downward mobility, and are looking at the cumulative impact of these stressful events on individual wellbeing. “Is there a threshold beyond which you cannot overcome that stress, a kind of vulnerability limit? And does it matter whether events occur in both the work and family domain?” asks Professor Bernardi. “Maybe if an individual has to deal with three or four stressful events, they get a bit down, but then recuperate and recover. But if they have to deal with seven of them, and from different domains, then it may be more difficult to recover. Then it’s a cumulative disadvantage.”

If an individual has to deal with two or three stressful events, most will recuperate and recover. But if they have to deal with many more, in different life domains, it may be more difficult to recover.

The timespan over which these events occur may also affect an individual’s wellbeing, an issue Professor Bernardi plans to investigate. The starting point here is to look at positive and negative events over the course of a year, and then to assess whether these events have a stronger impact on wellbeing if they are concentrated in a short period of time. “We have seen that on average people experience about 10 critical events in their adult life (between 20 and 50 years old), mostly related to health, family, work and residential trajectories. When the number of events is higher or several events occur in a short period of time, we also observe that the life satisfaction curve goes down,” explains Professor Bernardi. This suggests that it is particularly the combination and concentration of events that affects wellbeing; when they occur in a short period of time it can be difficult to cope. “This is why, when we use cross-sectional data on wellbeing, we see that it is the middle years of life that are difficult, given that this is often the rush hour of life, when many things happen in multiple life domains,” says Professor Bernardi. This is often a point in life at which many events occur and pressures accumulate, for example taking care of elderly parents or coping with bereavement while at the same time raising a family and holding down a job.

The wider social and economic picture also affects individual wellbeing, something Professor Bernardi is taking into account in the project. “We include a control for a given period where a recession occurs,” she says. The impact of a recession tends not to be uniform, however, and those who were in more precarious employment to start with are likely to be more severely affected than those who are more established. In addition, data on people’s personality traits have been recorded in the SHP, another area of interest to Professor Bernardi. “It will be interesting in the near future to look at whether the impact of these period effects varies according to these psychological characteristics,” she continues.

Photo by Tyler Nix on Unsplash

WELLWAYS Critical events and transitions in family and work and multidimensional wellbeing

Project Objectives

Fig 1: Life satisfaction follows a U-shaped curve: young adults and the elderly report higher wellbeing than mid-aged individuals.

Fig 2: When individuals experience several events concentrated in a short time window life satisfaction is lower than when individuals experience fewer or more spread events.

Wider comparisons

The project’s research is centered on Switzerland and France, but in the future Professor Bernardi intends to widen the comparative context and look at data from other countries with different approaches to social security. The thresholds may be very different in the US, for example, where there is relatively limited social security, or in the Scandinavian countries, where there is more than in Switzerland or France. “The aim would be to see whether we can generalise about threshold effects, and to look at how the context can affect those thresholds,” says Professor Bernardi. While this work can be thought of as fundamental research, it also holds wider relevance in terms of public policy. “If we see that specific events and specific combinations of events lead to a non-recoverable pattern in terms of wellbeing, then I think that would be of interest for policy-makers. They could then look to try and counteract that,” outlines Professor Bernardi. Researchers are not yet at a point where they can draw firm conclusions, however, and they are still working on several papers for different journals. The initial results of the project’s research indicate that experiencing more than three critical life events in a short period of time can have a dramatic impact on wellbeing, and Professor Bernardi hopes to publish more results by the end of this year.
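The idea of a vulnerability threshold tied to how many critical events fall within a short time window can be illustrated with a simple sliding-window count. This sketch is purely illustrative: the event dates, the 12-month window and the three-event cut-off are assumptions for demonstration, not parameters from the Wellways analysis.

```python
from datetime import date

def max_events_in_window(event_dates: list[date], window_days: int = 365) -> int:
    """Largest number of critical life events falling within any
    rolling window of `window_days` days."""
    dates = sorted(event_dates)
    best = 0
    start = 0
    for end in range(len(dates)):
        # shrink the window from the left until it spans <= window_days
        while (dates[end] - dates[start]).days > window_days:
            start += 1
        best = max(best, end - start + 1)
    return best

# Illustrative event history: three events clustered in 2014,
# plus two isolated events years apart.
events = [date(2010, 5, 1), date(2014, 1, 10), date(2014, 6, 2),
          date(2014, 11, 20), date(2019, 3, 15)]

VULNERABILITY_THRESHOLD = 3  # purely illustrative cut-off
concentrated = max_events_in_window(events) >= VULNERABILITY_THRESHOLD
```

With longitudinal panel data, a flag like `concentrated` could then be related to subsequent life-satisfaction scores, which is the spirit of the concentration effect described above.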

WELLWAYS aims to analyse, using longitudinal data from Switzerland and France, how work and family trajectories jointly affect wellbeing. The overall objective of the project is to understand the risks of a deterioration of wellbeing throughout the course of life, by taking a dynamic approach to the life course and a multidimensional approach to wellbeing. The aim will be to analyse the role of individual resources in preventing the risks or limiting their impact.

Project Funding

Duration: 01.02.2019 - 31.05.2021
Funding source: Project funding in humanities and social sciences (Division I: CHF 464’558)

Project Participants

• Please see website for details

Contact Details

Laura Bernardi
Professor of Life Course Demography and Sociology
Member of the SNSF Research Council
Co-Editor-in-Chief of Advances in Life Course Research
University of Lausanne
Géopolis Building
1015 Lausanne
T: +41-21-692-3846
E:
W: interpub/noauth/php/Un/UnPers.php?PerNum=1071358&LanCode=37

Comolli, C.L., Bernardi, L. and Voorpostel, M. (submitted to EJP, May 2020). Joint family and work trajectories and multidimensional wellbeing.
Comolli, C.L., Bernardi, L. and Voorpostel, M. (submitted to SSM, June 2020). Multidimensional wellbeing over the life course.
Barbuscia, A. and Comolli, C.L. (submitted to Vienna Yearbook, May 2020). Gender and socioeconomic health and wellbeing inequalities across age in France and Switzerland.

Professor Laura Bernardi

Laura Bernardi is Professor of Demography and Sociology of the Life Course at the University of Lausanne and a member of the Swiss National Research Council. She is also an Editor of the journal Advances in Life Course Research. Her current research focuses on family and migration, and she has led several research projects in these areas. She has published widely in international high-ranking journals in demography and sociology and has edited several collective volumes published by Springer.


Uncovering new insights into depression

Depression affects millions of people across the world, yet the underlying neurobiological causes of the condition are still not fully understood. We spoke to Dr Jean-René Cardinaux about his research into the molecular mechanisms and regions of the brain linked with depressive behaviour, which could in future lead to improved treatment.

A number of hypotheses have been developed to identify the root cause of depression, yet the underlying factors behind the development of mood disorders are still unclear. Most of the current hypotheses are based on the response to antidepressants. “Most conventional antidepressants act on monoamine neurotransmitters, mostly serotonin and noradrenaline. The monoamine hypothesis suggests that a depletion of those neurotransmitters leads to depression,” explains Dr Jean-René Cardinaux, a researcher in molecular psychiatry and epigenetics based at Lausanne University Hospital. However, it typically takes several weeks before antidepressants become active, suggesting that other factors are also involved, beyond the rapid restoration of the level of these neurotransmitters. “So that’s why other hypotheses have been put forward,” says Dr Cardinaux. The neurotrophic hypothesis of depression, for example, suggests that a specific class of stress hormones called glucocorticoids have a negative impact on neurotrophic factors, a topic at the heart of Dr Cardinaux’s research. Neurotrophic factors are a class of biomolecules that support neurons in the brain. “These neurotrophic factors support neuron survival as well as neurogenesis. One factor that has been shown to be important is called brain-derived neurotrophic factor (BDNF),” says Dr Cardinaux. As the Principal Investigator of an SNSF-funded project, Dr Cardinaux aims to help build a clearer picture of the etiopathogenesis of mood disorders, which could have important implications in terms of treatment. “It was previously shown that a transcription factor called CREB can regulate the expression of BDNF,” he outlines. “We’ve found that a coactivator of CREB called CRTC1 seems to also play an important role in this regulation.”

Model of depression

By studying a mouse model in which CRTC1 has been inactivated, Dr Cardinaux and his colleagues aim to gain deeper insights into both depression and the associated pathological conditions. A variety of symptoms are associated with depression, including anhedonia, or an inability to feel pleasure, while there are also physical symptoms. “For instance, weight gain or weight loss can occur depending on the depression subtype. Psychomotor retardation can also be observed, where people feel tired and lethargic, with a visible slowing of physical and emotional reactions,” says Dr Cardinaux. It has been shown that mice in which CRTC1 has been inactivated show certain symptoms associated with depression and mood disorders, and Dr Cardinaux is now looking to probe deeper. “In our mouse model we see a decrease in BDNF expression, but also of other genes. The phenotype of these mice has similarities with depression – we call it a depressive-like phenotype,” he explains.

Possible role of the transcription coactivator CRTC1 in the pathogenesis of depression and associated disorders.


The aim now in Dr Cardinaux’s lab is to study these mice in different situations, further identify the brain regions and circuits that are involved in depressive behaviour, and uncover more detail about the underlying mechanisms involved in mood disorders. Several behavioural tests have been approved for investigating depression in animals, one of which is the sucrose preference test. “You present two bottles – one with just water, and the other with a little bit of sucrose mixed in the water – and you monitor what the mice drink,” explains Dr Cardinaux. Usually there’s a marked preference for the sucrose solution, but this is reduced in mice that are experiencing anhedonia. “This test is quite subtle, but a reduced preference is recognised as a measure of anhedonia,” continues Dr Cardinaux. “We are also looking at social behaviour. We’ve seen in our mouse model that mood is affected at different levels in the males. It can lead to aggressive behaviour, while it may also affect social interaction. These symptoms are very interesting in terms of understanding the neurobiological basis of depression.”
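The readout of the sucrose preference test is conventionally expressed as the share of total fluid intake taken from the sucrose bottle. The sketch below shows that calculation; the intake volumes are invented for illustration and are not measurements from Dr Cardinaux’s lab.

```python
def sucrose_preference(sucrose_ml: float, water_ml: float) -> float:
    """Percentage of total fluid intake taken from the sucrose bottle."""
    total = sucrose_ml + water_ml
    if total == 0:
        raise ValueError("no fluid consumed")
    return 100.0 * sucrose_ml / total

# A control mouse typically shows a marked preference for the sucrose
# bottle, while a value near 50% suggests anhedonia (illustrative figures).
control = sucrose_preference(sucrose_ml=8.2, water_ml=2.1)   # ~79.6%
anhedonic = sucrose_preference(sucrose_ml=5.0, water_ml=4.8)  # ~51.0%
```

A preference close to 50% means the animal drinks from the two bottles indiscriminately, which is why a reduced score is read as a loss of interest in the rewarding solution.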

Role of the CREB coactivator CRTC1 in mood disorders and antidepressant response

Project Objectives

Project team (from left to right): Jean-René Cardinaux, Laetitia Guiraud, Clara Rossetti, and Antoine Cherix.

We’ve seen in our mouse model that mood is affected at different levels in the males. It can lead to aggressive behaviour, while it may also affect social interaction.

Mood disorders are often associated with metabolic syndrome and circadian rhythm disturbances. “We’ve found that male mice lacking CRTC1 not only display depressive-like behaviours, their circadian locomotor activity and feeding behaviour are altered as well, leading to obesity,” says Dr Cardinaux. “Our mouse model is very interesting to study as it associates many symptoms of major depression and its related disorders.”

Translational research

The wider aim is to establish clearer links between these findings and human depression, which could then open up new possibilities in terms of translational research. Regarding CRTC1 levels in the brain, Dr Cardinaux hopes in future to look at post-mortem tissue from the human brain, which could help establish links between the animal models and human depression. “We want to see if there’s a decrease of CRTC1 in regions like the hippocampus or the prefrontal cortex,” he outlines. Research is also continuing into gene polymorphisms that could be linked to depression. “We’ve been collaborating with a group here at our institute who are working on the genetics of obesity, linked to psychiatric disorders,” continues Dr Cardinaux. “It’s been shown that there’s a link between atypical depression and obesity, and our colleagues have discovered that CRTC1 polymorphisms may be playing a role in this association.”

A lot of attention in research is also focused on the antidepressant response. Researchers have found that mice in which CRTC1 is inactivated do not respond to conventional antidepressants. “This suggests that CRTC1 is involved in the response to conventional antidepressants; it needs to be there,” says Dr Cardinaux. A new antidepressant called ketamine is now emerging, which Dr Cardinaux says could overcome some of the problems associated with conventional antidepressants. “It takes several weeks before conventional antidepressants become active. The advantage of ketamine is that it can be active after just a few hours,” he explains. “We’ve tried treating our mice with ketamine, and we’ve found that the rapid response to a single dose of ketamine is preserved in CRTC1 knockout mice. This suggests that CRTC1 is not necessarily required for the rapid antidepressant effect of ketamine, but we plan to investigate its role in the long-lasting effects of ketamine.” This is a topic Dr Cardinaux plans to explore further in future, while he is also interested in helping to develop new antidepressants that would act on the CRTC1 pathway. Increasing, or restoring, CRTC1 levels could represent a route towards more effective antidepressants, yet Dr Cardinaux says new therapeutic approaches need to be built on a detailed understanding of the underlying basis of depression. “We would have to study and know how those CRTC1 levels are decreased by chronic stress, and find the mechanisms. If you understand the mechanisms, maybe then you can help restore those levels,” he says.

Major depression is a still poorly understood psychiatric condition that affects millions of people worldwide. Preclinical studies with animal models of depression provide a better understanding of the mechanisms involved in its etiopathogenesis. Mice lacking a transcription coactivator called CRTC1 constitute a new and interesting animal model of depression, exhibiting several behavioral and molecular alterations related to human depression and associated disorders. The project focuses on further characterizing the consequences of CRTC1 deficiency on mood disorders-related symptoms and antidepressant response.

Project Funding

Funded by the Swiss National Science Foundation. Grant 31003A_170126. “Role of the CREB coactivator CRTC1 in mood disorders and antidepressant response”.

Contact Details

Dr Jean-René Cardinaux, PhD
Centre de Neurosciences Psychiatriques
Département de Psychiatrie
Route de Cery 11b
CH-1008 Prilly
T: +41 (0)21 314 3596
E:
W:

Saura, C.A. and Cardinaux, J.R. (2017). Emerging Roles of CREB-Regulated Transcription Coactivators in Brain Physiology and Pathology. Trends Neurosci. 40, 720-733. doi:10.1016/j.tins.2017.10.002

Dr. Jean-René CARDINAUX, PhD

Dr Jean-René Cardinaux is a senior lecturer and group leader at the Center for Psychiatric Neuroscience and Service of Child and Adolescent Psychiatry of the Lausanne University Medical Center (CHUV), Switzerland. He holds a PhD in molecular biology from the University of Lausanne (UNIL) and gained postdoctoral experience in molecular neuroscience, first at UNIL and then at the Vollum Institute of the Oregon Health & Science University in Portland, USA. His main research interests focus on the transcriptional control of neuroplasticity genes and the mechanisms underlying their altered expression in depression.


Lifting the lid on illegal pharmacies

Legal and illegal pharmacies offer different benefits to customers, and an increasing number of people are turning to illegal channels to buy drugs, potentially putting themselves at risk. We spoke to Professor Luis Diestre about his work analysing the competitive dynamics of the informal economy and its implications for regulation.

The pharmaceutical industry is an important part of the global economy, with major companies investing large sums to develop, manufacture and distribute more effective drugs to treat disease. Outside the formal part of the economy, pharmaceutical products are also bought and sold in the informal economy, yet little is known about competitive dynamics in these environments beyond the reach of laws and regulations. “We know a lot from the management literature about how legal firms compete, but not so much about competition in illegal environments,” says Luis Diestre, an Associate Professor at IE Business School in Madrid. This is a topic Professor Diestre is addressing in the ERC-funded ILLEGALPHARMA project, using data gathered from the US, which has some important differences from the European market. “Drug prices are much higher in the US than in Europe. It seems that more people rely on illegal sources for their drugs, because of the price differential,” outlines Professor Diestre.

IllegalPharma

A second important difference is that it is relatively easy to identify a benchmark for comparison in the US market, whereas each European country has different regulations and there are wide price variations. Data has been gathered from a variety of sources, including both illegal and legal online pharmacies, which Professor Diestre is analysing in search of deeper insights. “The basic idea here is that the legal and the illegal channels provide different benefits. The legal channel provides safety – a customer going to a physical pharmacy will know that that pharmacy is regulated and that the drug they are buying is safe. So if you are getting a prescription drug, you know that this drug is good for you,” he says. This is not the case, however, with an illegal pharmacy, at which it is possible to buy drugs without a prescription, which entails risk. “Is this drug right for me? And how do I know that the drug has been produced and assessed in a clean way? So there are some risks involved,” points out Professor Diestre.

The legal and the illegal channels provide different benefits. The legal channel provides safety – a customer going to a physical pharmacy will know that that pharmacy is regulated and that the drug they are buying is safe – whereas the illegal channel provides greater levels of privacy.

The legal pharmacy has an advantage in this respect, so on this basis we might expect prices for drugs from legal sources to be higher than those from illegal ones. Price is an important commercial consideration, and while on average illegal drugs are less expensive than legal drugs, there is a high degree of variability. “We found that in a significant proportion of cases the price of the illegal drug was higher than the legal one, even though the illegal pharmacy does not guarantee the safety of the drug,” explains Professor Diestre. One area in which an illegal pharmacy does have an advantage, however, is in maintaining individual privacy, which may affect drug prices. “If you don’t need a prescription then you don’t need to go to a doctor to be tested or diagnosed, and you don’t need to go to the counter to disclose what may be very sensitive information. You also don’t need to disclose information to your insurance company about your health record,” says Professor Diestre. “In that sense an illegal pharmacy gives a degree of privacy which you cannot get through legal channels.” This may be an important consideration for some individuals, maybe because they want to keep sensitive information private, or to keep their health insurance premiums as low as possible. In the project, Professor Diestre is looking at the value that patients with certain stigmatised conditions – like mental health


conditions or sexually transmitted infections – attach to privacy. “Our prediction here is that these types of customers will value the benefits that the illegal channel provides more strongly, and this should translate into two things. First, we should see more of these drugs being offered in illegal pharmacies. And secondly, the price should be relatively higher in comparison to the legal one,” he outlines. It is not easy to gather data in this area; researchers are analysing forums commonly used in the US in which people express their feelings about being diagnosed with certain conditions. “We’re trying to capture the extent to which the way people talk or write about these diseases implies that there is some sort of stigma associated with them,” says Professor Diestre. The aim here is to identify which diseases are linked to certain stigmas, from which researchers can then investigate the impact of this on drug prices. Through careful statistical analysis, Professor Diestre hopes to see whether higher degrees of stigma lead to higher illegal prices relative to those in legal channels. “Sometimes we find that there is a premium, that people have to pay a higher price for the illegal drug than they would pay in legal channels,” he says. The local context matters in this respect, and geographical, demographic and socioeconomic factors can affect which conditions are stigmatised, issues Professor Diestre has taken into account. “We’re using data from the Medical Expenditure Panel Survey, which is performed in the US every year. From this survey we can get information about geography and demographics, such as the gender, age and socio-economic status of consumers of particular drugs,” he continues. “We control for these variables and try to explore the data in a richer way, to see if we can identify certain patterns.”

European market
This research has been conducted using data gathered from the US, yet illegal pharmacies are also a problem in Europe, where the amount of counterfeit and illegal drugs being purchased is estimated to be growing. This trend is likely to be accelerated further by the Covid-19 pandemic, underlining the importance of effective regulation, which is a major motivating factor behind the project. “There is evidence that some of the people purchasing illegal drugs are suffering serious side-effects that could be avoided. This represents a big threat to public health – we believe that one of the ways in which we can attack this problem is through better regulation,” explains Professor Diestre. Regulation needs to be underpinned by a deep understanding of why people buy drugs from illegal sources in the first place. “If we can somehow diminish the main advantage of illegal pharmacies – privacy – then maybe this is a way to address this,” says Professor Diestre.

Many people are becoming increasingly used to buying products online, yet it can be difficult to distinguish between legal and illegal pharmacies. This is an issue Professor Diestre plans to investigate in future. “We’re very interested in seeing how these websites try to deceive customers and regulators, to avoid being shut down, and to convince customers that they’re legitimate and trustworthy,” he says. There are also plans to investigate other issues around illegal pharmacies. “We want to understand the incentives for an illegal pharmacy to sell low-quality drugs. The fact that a source is illegal does not necessarily imply that its products are low-quality or contaminated – we suspect there will be a lot of variability,” outlines Professor Diestre.

IllegalPharma
Competitive Dynamics in the Informal Economy: The case of Illegal Pharmaceutical Drugs

Project Objectives

Our objective is to explore the competitive dynamics between legal and illegal businesses. We look at the sale of medicines in online pharmacies to estimate (1) the probability that a medicine is offered in an illegal pharmacy and (2) the gap between legal and illegal prices.

Project Funding

This project has received funding from the European Union’s Horizon 2020 research and innovation program under Grant agreement ID: 715536 (EU contribution: € 1 374 185).

Research Team

Antonella Fazio E:
Yasser Fuentes E:

Contact Details

Project Coordinator
Professor Luis Diestre
Associate Professor, Strategy Department
IE Business School
T: +34 915689600
E:
W:

Professor Luis Diestre

Luis Diestre is an Associate Professor at IE Business School. He received his PhD in Strategic Management from the University of Southern California in 2009. His research interests revolve around two distinct topics: non-market strategy and R&D activities in the biopharmaceutical industry. He currently serves as an Associate Editor of the Academy of Management Journal.


How do forces affect cell behaviour?

Mechanical forces greatly influence the behaviour of cells, for example in determining whether they differentiate, proliferate, or adopt a malignant phenotype. Understanding how mechanical forces are transmitted in cells could open up interesting new therapeutic avenues, as Pere Roca-Cusachs of the Mechano-Control project explains.

The cells inside our body continuously exert mechanical forces on each other and on the scaffold of fibres that surrounds them, the extra-cellular matrix. Cells are linked to the extra-cellular matrix through molecules called integrins, while another type of molecule, called cadherins, binds cells to each other. “Whenever forces are applied – either between one cell and the next, or between one cell and the matrix – these and other molecules are subjected to force,” says Pere Roca-Cusachs, the coordinator of the Mechano-Control project. The balance between the forces exerted on integrins and cadherins, and the collective action of these bonds, has a major influence on the behaviour of cells. “It may affect whether a cell proliferates or differentiates into a certain type of cell, or whether it adopts a malignant phenotype,” outlines Roca-Cusachs.

Invasion of tumoroid cells in 3D. Intermediate filaments in yellow, actin in magenta. Scale bar: 20 µm.

Whenever forces are applied – either between one cell and the next, or between one cell and the matrix – integrins and cadherins are subjected to force.

Mechano-Control project
This issue is central to the work of the Mechano-Control project, a multi-disciplinary project which brings together researchers from across Europe. Researchers in the project are investigating how cells transmit and detect forces, with Roca-Cusachs and his colleagues developing methods to monitor cellular behaviour at different scales, which could lead to new insights into how diseases progress. “On one level, we isolate cells and look at them one-by-one,” he says. Cells don’t exist on their own in tissues, yet researchers can still gain important insights from this work. “What happens if we change the composition of the extra-cellular matrix to look more like cancer? What changes?” asks Roca-Cusachs. “We are developing mimics of this matrix, where we can dynamically change their properties. We can make the matrix get stiffer or softer by applying different kinds of light.”

Researchers are also looking at how cells respond to forces in more complex situations, such as in an epithelial or endothelial monolayer, as well as in 3-dimensional environments, which are more representative of how a tumour grows in vivo. One part of the project involves understanding how mechanical force is transmitted to the nucleus of a cell and activates gene transcription. “We’re developing tools to monitor that, so that we know when it’s happening,” says Roca-Cusachs. This research could hold important implications for the diagnosis and treatment of certain diseases, such as breast cancer. One current technique to screen for breast cancer is to palpate the breast and look for hard lumps. “Breast cancer tissue is stiffer than normal tissue, and it is well known that mechanical forces affect this,” outlines Roca-Cusachs. Understanding this process and the scales at which it occurs could open up exciting possibilities in developing new therapies or diagnostic tools, something Roca-Cusachs is exploring. “A molecule has been identified that unfolds when a force is exerted, which exposes a domain that was previously hidden. This triggers a biochemical cascade which eventually leads to defects and breast cancer progression,” he explains. “We are looking into drugs to block these interactions.”

A lot of progress has been made in this respect, and some drug candidates have been identified, which researchers are now looking to test. These drugs would be designed to essentially fool cells into believing they’re not in an abnormally stiff tissue. “Cells change their behaviour when they are in a stiff place; they further stiffen the tissue by secreting more matrix. If you convince the cells that they’re not in a stiff place, then they will stop secreting this, and this may help restore the normal stiffness of the tissue,” says Roca-Cusachs.

MechanoControl Summer School, September 2019.

MECHANO-CONTROL
Mechanical control of biological function
Funded under H2020-FETPROACT-2016-2017 (FET Proactive – Boosting emerging technologies): FETPROACT-01-2016 (grant agreement No. 731957, MECHANO-CONTROL)
Partners: King’s College London (KCL), INM – Leibniz Institute for New Materials, University Medical Center Utrecht (UMCU), Universitat Politècnica de Catalunya (UPC) and Noviocell BV.
Clara Civit, Events and Communications Officer
Institute for Bioengineering of Catalonia (IBEC)
: @Mechanocontrol E: W:

Pere Roca-Cusachs is currently Associate Professor at the University of Barcelona, and Group Leader at the Institute for Bioengineering of Catalonia (IBEC). He is also an EMBO Member. His group studies the physical and molecular mechanisms by which cells detect and respond to mechanical signals. In 2019 he received the Young Investigator Award of the European Biophysical Societies Association (EBSA).


Fluorescent reporting guests switch on or change colour upon binding with the host.

In pursuit of deeper understanding of dynamic hydrogels

Prof. Oren A. Scherman, Professor of Supramolecular and Polymer Chemistry and Director of the Melville Laboratory for Polymer Synthesis, reveals how the ERC-funded CAM-RIG (Confocal Microscopy and real-time Rheology of dynamic hydrogels) project is providing state-of-the-art innovation to understand hydrogel behaviour at a molecular level in real time.

Hydrogels, in the simplest of descriptions, are water-based jellies. They have a range of potential real-world uses depending on their composition. For example, hydrogels can be used for applications including soaps, cosmetics, soft contact lenses or even just for filling empty spaces. They are also used as 3D support structures for cells, as a means of delivering bioactive molecules in the form of targeted medicines, in wound-healing applications, and to study biological processes. The versatility of their properties means they have huge potential in myriad areas. In particular, their tunable material properties make them ideal candidates for new biomaterials, for example to replace human parts like ligaments. In short, they are a ‘wonder material’, but by the very nature of their versatility they can be ‘slippery’ when it comes to studying them.

The power of dynamic behaviour
Before comprehending CAM-RIG’s achievements, we first need to differentiate between types of hydrogels. A hydrogel is a network of polymer chains held together by crosslinks. These crosslinks can be classed as either physical (dynamic: they break and reform) or chemical (permanent: they do not break).

“There are two different types of hydrogels – dynamic and covalent (static). We’re interested in studying dynamic hydrogels,” explains Professor Scherman. “With dynamic hydrogels you have materials held together by interactions that can readily break and reform. For example, if you have a dynamic hydrogel in a syringe and you push the plunger, the shear force of pushing the mass of material out through a tiny orifice means all of those dynamic interactions that are holding together the big chains can be broken in an instant, and then they immediately fall back into place when the material exits the needle. The gel goes from being a network to a flowing fluid, and then immediately goes back to being a network. You can completely change the structure of the gel. In contrast, if you loaded a static hydrogel in a syringe you couldn’t push it through a needle.”

This property is important, especially as we want to use these gels for biomedical applications. The reversible nature of the crosslinks means that the hydrogel can be readily formulated and, for instance, a drug can be added to the gel, which will then diffuse out of the gel to the surrounding area as the crosslinks break and reform. The hydrogels can also be used to deliver or grow cells, and recently we have even begun to grow organoids in the gels. In these cases, the dynamic crosslinks allow the cells to move and grow without restrictions.

Seeing hydrogels up close
The CAM-RIG project has pioneered a new state-of-the-art experimental set-up. It couples together two characterisation methodologies – super resolution microscopy and rheology – to allow tracking of changes in the hydrogels at a molecular level in real time. The crosslinks within the gels used by Scherman are specially designed to be fluorescent when the crosslink is together (the ‘on’ state) and optically dark when the crosslink is broken (the ‘off’ state). The super resolution microscope images the crosslinks within the hydrogels while the rheometer simultaneously probes the material properties of the gel. The combined information allows for insight into how to fine-tune the hydrogel design towards a specific function or application, key when developing materials for use in the biomedical realm.

(a) Schematic to highlight covalent vs. dynamic hydrogel networks. (b) Supramolecular ‘on’-‘off’ reporting model hydrogel used within CAM-RIG.

“We are trying to get both spatial and temporal information of what’s actually happening with the crosslinks in the hydrogel, rather than an ensemble average of what’s happening. Basically it’s about seeing a higher level of detail, as without the detail you’re just left with an average and it doesn’t give you any real information,” says Scherman, adding an analogy for clarity. “It’s like looking at a telescopic image of the Milky Way without any type of temporal or spatial resolution, where you just see a smudge of light. But if you’re able to get a fixed point in time – a snapshot – you would be able to see all of the stars at specific locations and how bright they are. So it’s like taking a smudge off a telescope lens and giving you crystal-clear information at a point in time.”

Pioneering new methods
To achieve the aims of the project the team of researchers needed a piece of equipment that did not yet exist, so they were tasked with building a device that could reveal spatial and temporal information at extremely high resolution for the time scales and distance scales that are relevant. “At the start we knew what we wanted to be able to see, but had to think about how to put together a set-up that would allow us to do this. We decided to couple a super resolution microscope to a rheometer, to allow us to visualise the crosslinks whilst interrogating the material properties. The challenge came when we had to make these two pieces of equipment ‘speak’ to each other; in equipment terms you could say we have melded together a magnifying glass with a stopwatch and a hammer. These are tools that don’t normally work together in unison, so making those connections was key.”

To further complicate the development of this hybrid set-up, there were nuances that the equipment needed. Rheometers typically have a lag time before they output information about strain rates, and this missing window was where most of the meaningful data was located. The other problem was that typical microscopy is not super resolution, and is limited to distance scales that are not below the diffraction limit of light. “We needed to find a way to put together super resolution microscopy with rheological measurements whose time scales could be immediate, because the timescale of the ‘on-off’ transient crosslinks can be as short as sub-second. In collaboration with Dr Steven Lee, an expert in super-resolution microscopy, we found a way to do this and have spent the last two years building and testing our set-up, which we call CAM-RIG. CAM-RIG includes spatial resolution that has a sub-diffraction limit on the order of 10 nanometres or less, and in addition it gives us time resolution on the order of thousands per second.”

The science that will come out of the CAM-RIG project will have direct implications for the advancement of the biomaterials that the Scherman group are developing. They are world leaders in the development of biocompatible hydrogels for localised drug delivery, and the insight from CAM-RIG will feed directly into how to design and develop new hydrogels for specific applications. The knowledge and insight gained by visualising these crosslinks within dynamic hydrogels can lead directly to advanced biomaterial applications in a wider sense, by opening up the ability to precisely tune hydrogel physical properties. Halfway through the project, there are already many interested parties seeing great potential for industrial, healthcare and commercial applications. “We work with Cancer Research UK in Cambridge and the Brain Tumour Charity, and of course we have a large amount of interaction with industry.”

By solving the intricate, microscopic mysteries of dynamic hydrogels a whole world of exciting and sometimes life-changing applications will be possible for future generations to benefit from.

CAM-RIG
ConfocAl Microscopy and real-time Rheology of dynamIc hydroGels

Image showing the CAM-RIG setup, which combines super resolution fluorescence microscopy with piezo axial vibratory rheology.

Project Objectives

CAM-RIG pioneers the combination of state-of-the-art characterisation techniques into a unique experimental setup, namely super-resolution microscopy imaging modalities with simultaneous rheological measurements, to investigate fundamental structure-property relationships of polymer networks and dynamic hydrogels. Using CAM-RIG makes it possible, for the first time, to deconvolute the molecular-level dynamics of supramolecular physical crosslinks from chain entanglement of the polymeric networks, and to understand their relative contributions to the resultant properties of the hydrogels. We are using the knowledge gained through CAM-RIG to design and develop new supramolecular (bio)materials for a range of real-world applications.

Project Funding

Funded under the European Union’s Horizon 2020 research and innovation program under grant agreement 726470. (EU contribution: €2,038,120).

Contact Details

Project Coordinator
Oren A. Scherman
Professor of Supramolecular and Polymer Chemistry
Director of the Melville Laboratory
University of Cambridge
T: +44 1223 748850
E:
W:

Professor Oren A. Scherman

Photo by Gabriella Bocchetti, © University of Cambridge

Prof. Oren A. Scherman is a supramolecular and polymer chemist with a PhD in Chemistry from the California Institute of Technology (Caltech). He is the Director of the Melville Laboratory for Polymer Synthesis in the Chemistry Department in Cambridge. His research focuses on dynamic supramolecular self-assembly at interfaces through the application of macrocyclic host-guest chemistry, using cucurbiturils in the development of novel supramolecular systems. The Scherman group exploits control over these molecular level interactions to design and fabricate soft materials with integrated function, exploring topics that include biocompatible hydrogels for drug-delivery applications, sensing and catalysis, tough supramolecular polymer networks and bioinspired supramolecular fibres.


Beating Covid19 with vaccination

There is a race on, a scientific race of discovery, more urgent and pressing than any other for this generation. Around the world, research teams have been working tirelessly on different vaccines which can prevent infection from Covid19. In the last moments before 2021, the results of those efforts are just coming to fruition for some of the scientists. What can we expect from these vaccines, and is an end of the pandemic now in sight?

By Richard Forsyth


The London School of Hygiene and Tropical Medicine has a Covid19 vaccine tracker which states there are 260 Covid19 vaccine candidates, with 56 in clinical testing as of November 2020. Of these, two are currently in the latter stages of the process before being available for use on the population.

A vaccine has a singular aim: to make sure you are protected from infection without making you ill in the process. A vaccine injection tricks your body into thinking it is being attacked by a virus, stimulating your immune system to counterattack with proteins called antibodies. Once the strategy of counterattack is learned, your body remembers how to deal with the specific threat and you are protected. How long you are protected is not always clear with new viruses. An injection does not necessarily mean protection for life; for instance, a flu jab is only effective for three to six months. If reinfection is a possibility, a vaccination strategy may require annual re-inoculation. With Covid19 there have been recorded cases of reinfection, but they are rare.

The key advantage of vaccines is that they should not make you sick in the process. Although with approved vaccines there is sometimes a risk of small rashes, headaches and fatigue, nothing sustained and dangerous should occur in terms of side effects, or the vaccine will be deemed unfit for purpose. To test this, extensive clinical trials are carried out on volunteers well before a vaccine is declared safe for use on the general population.


How do you make a vaccine?
There are clear stages of development when creating a vaccine. First, you need to determine how the vaccine will work. Second, you need to know that it is safe and what kind of dosage is effective. The third stage is working out how effective it is. After a successful Phase 3 trial, manufacturers of the vaccine submit an application to regulatory bodies like the European Commission and the US Food and Drug Administration, where the data is put under scrutiny to determine if the vaccine can be approved. If vaccines are licensed, there will still be continued monitoring of production and safety.

There are several different ways to develop a vaccine. You can use a weakened section of the virus, a dead piece of the virus, or a more modern technique which cleverly uses RNA, raw genetic material. The point is that you are essentially showing the immune system a blueprint of the virus so it can understand how the virus can be overcome, without the risk of contracting it.

The first two vaccines that we have seen in the news in November, and which are destined for rollout after final checks, are by Pfizer/BioNTech and Moderna. Both these vaccines are what is known as genetic-code vaccines. They introduce genetic material in the form of RNA, which instructs cells to produce the protein that is found on the outside of the virus. The body then detects the protein and develops an immune response accordingly. These vaccines have the advantage that they can be manufactured in bulk quickly, which of course is good news in a pandemic. More good news around these vaccines is their success rates in Phase 3 trials. Pfizer/BioNTech’s vaccine and Moderna’s vaccine are around 95 percent effective.

Cold comfort
However, there are significant challenges that remain. The Pfizer/BioNTech vaccine presents what has been described as ‘a logistical nightmare’, due to the conditions of its storage. It must be stored at ultra-cold temperatures to sustain it before use. At -70 degrees Celsius, for example, it can last for 10 days. Considering the scale of the pandemic, the need for this kind of ultra-cold storage in parallel is a huge problem. It would present an almost impossible requirement for trucks, cargo planes and clinics to access this kind of cold storage for the hundreds of millions of doses needed in the vaccination programme.

The answer had to come from further innovation. Pfizer has developed special deep-freeze ‘suitcases’ with formulated dry ice, which they say means the drugs can be shipped even without cold storage transports, such as refrigerated trucks. Each one can hold between 1,000 and 5,000 doses. Whilst this is, on the surface, a neat solution, it too presents more problems. For example, the suitcase can only be opened twice a day, for less than three minutes each time, for the cold storage to remain effective. Whilst the vaccine only has 10 days of ultra-cold storage, it can then be put in a normal refrigerator for a further five days before it degrades. That’s still just 15 days from distribution point to the jabs in arms, with a need to use up thousands of vaccines in that time. You can see the problem, as it means potentially organising a fast ‘conveyor belt’ of shots for thousands of people in a narrow time frame. The dry ice storage may also be problematic in terms of excessive cost and replacement, and because dry ice can be considered hazardous, although the manufacturers say it is at relatively safe levels. Finally, two shots are required for this vaccine to be effective, to be taken 21 days apart. Covering areas the size of the US or Europe, there are some real headaches coming for healthcare organisations tasked with delivering the vaccine and trying to vaccinate around 50-60 percent of their national population.
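To make the time pressure concrete, the figures above imply a minimum vaccination rate per suitcase. This is only back-of-envelope arithmetic using the article’s approximate parameters:

```python
# Rough arithmetic for the cold-chain window described above.
# Figures from the article: up to 5,000 doses per suitcase, and
# 10 days of ultra-cold storage plus 5 days refrigerated = 15 usable days.
doses_per_suitcase = 5000
usable_days = 10 + 5  # ultra-cold window + refrigerator window

# Minimum average jabs per day needed to use one full suitcase in time
jabs_per_day = doses_per_suitcase / usable_days
print(f"{jabs_per_day:.0f} jabs per day from a single suitcase")  # ~333
```

Sustaining over 300 injections a day from every distribution point, before any doses expire, is exactly the ‘conveyor belt’ problem the article describes.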

The promises of new solutions
The Moderna vaccine, hot on the heels of Pfizer/BioNTech’s, can by contrast be stored for a month in a standard refrigerator, making it an attractive alternative, especially for reaching rural areas and places that have limited supporting resources. The US company intends to see one billion doses available for use internationally in 2021, after approvals are signed off.

Testing is an ongoing process, but with overlapping development stages and trials, the whole process of creating these vaccines has been shortened by years in response to the critical requirement of finding solutions at speed. With every passing month, more potential working vaccines will appear in the news, and they may be more versatile and have different parameters of use. With this coronavirus pandemic, however, time is urgent as families and economies strain under the tremendous impacts of the virus, and so governments are ordering the first likely candidates for working vaccines in enormous numbers.

The big question everyone is asking is, of course, when do things get back to normal? This is not a straightforward question. Depending on who you ask and their levels of optimism, some scientists are saying some degree of normality may return as soon as spring 2021, but more realistically it will be winter 2021, or perhaps a long time beyond that. Vaccines are one solution in a range of measures, and face masks, careful hygiene and social distancing may be part of our routines for a long time yet. When you consider the scale of the pandemic and its aggressive spread, you can be sure it will not vanish overnight, and we may even be living with adjusted rules of behaviour in some ways, for years.

Staged rollouts
The rollout of vaccination will be staged in most European countries, with priority given to high-risk groups, including the elderly and those with high-risk occupations like healthcare workers and emergency responders. At the time of going to press the vaccination process may have already begun, with pledges across the EU and the rest of the world to begin the process at the end of 2020 or the beginning of 2021. Some countries may have slight variations regarding who is considered to be in the highest-risk categories. For example, in France the high-risk group will extend to chauffeurs, taxi drivers, retail staff, school staff and workers in construction and abattoirs.

A challenge that governments are all too aware of is whether the public will present themselves for vaccination in the first place. Onboarding the public for these necessary waves of vaccination may need strong public messaging and campaigns to persuade uptake, as misinformation and resistance around vaccination is rife. Resisting vaccination, with suspicions of everything from safety concerns to sinister political agendas, has become something of a movement, fuelled by popular conspiracies on the internet.

It is worth remembering, as we embark on the beginning of the real fight against Covid19, that pandemics are not new. We have been here before many times as a species. Consider that smallpox killed between 300 and 500 million people in the 20th century, and in the 1950s, 50 million cases of smallpox were recorded annually around the world. Smallpox was only declared eradicated as recently as 1980, by the World Health Organisation, via a massive vaccination programme. Many deadly viruses will likely never completely vanish; with these we simply learn protocols to contain outbreaks and live with them, or evolve and design treatments that lessen their impact. This may well be an insight into a future with Covid19. We have long been living in a world with deadly viruses like Ebola, HIV and rabies.

Whilst we certainly will have vaccination options, wiping out Covid19 completely is an enormous challenge, not to be underestimated. However, one thing has stood out in the way we face down this current virus. Our innovation, our global determination, the sheer weight of science behind the solutions that are now appearing: it shows that science always rises to the challenge when we are confronted with this level of adversity by nature. When the first viable vaccine’s results came through from Phase 3 trials to a room full of researchers, there were apparently tears of joy. We are at a juncture where we can turn the course of the pandemic and beat it down. Such moments are historic.


Spotting the genetic differences in barn owls

The number and size of black spots on barn owls varies significantly, while major differences have also been observed in their colouration. We spoke to Professor Alexandre Roulin and Professor Jérôme Goudet about their work investigating the genetic basis of this variability, research which could open up new avenues of investigation in several different fields.

A female barn owl typically has larger black spots on its feathers than males, which is associated with a number of qualities that increase resistance to stressful factors, helping to improve their overall survival rate. By contrast, the opposite is the case for male barn owls, which have much smaller spots, an example of what is called sexually antagonistic selection. “The key idea here is that the two sexes share and express the same trait, but the males are selected in one direction, and the females in the other,” explains Alexandre Roulin, Professor of Behavioural and Evolutionary Biology at the University of Lausanne. As the Principal Investigator of an SNF-funded research project, Professor Roulin aims to investigate the genetic basis of this variation in barn owls. “We aim to identify the genes that underlie the expression of these large spots. We think that this might be related to genes that have pleiotropic effects,” he continues.

Pleiotropic effects This means a gene that acts on several different traits. For example, a specific gene might be involved in the production of the black spots, while also being responsible for a number of other physiological traits. “So there is a link between the size of the spots and the physiology of the owl,” outlines Professor Roulin. The aim now in the project is to explore melanin-based colouration in barn owls. “There are two forms of melanin – eumelanin is black melanin and pheomelanin is red melanin. What’s interesting in the owl is that both of these two are expressed,” says Professor Jérôme Goudet, a specialist in population genetics also based at the University of Lausanne. “The barn owl varies in the degree of reddishness in colouration, which is related to pheomelanin. They vary also in the number and size of black spots, which is related to eumelanin.” These two traits have different functions, with colouration in barn owls varying between white and a sort of dark-reddish colour, which is thought to be associated with predator-prey interactions. For example, barn owls may use their plumage to stun prey at


Picture of a family of barn owls. You can clearly see the variation of colour between the white and reddish owls. Photograph by Professor Alexandre Roulin.

certain stages of the lunar cycle. “Moonlight is reflected by the white plumage, and the prey are scared by this reflection. They panic and they freeze for longer, so white-coloured birds have a hunting advantage when there is a full moon,” explains Professor Roulin. The size of the spots, however, is related more to sexual selection, which is why Professor Roulin believes it’s important to study the genomics in this area to build a fuller picture. “We have been able to find some genes associated with the reddishness, with natural selection, and some genes which are associated with the black spots, so sexual selection,” he says. Researchers already have data on the full genomes of over 100 birds across Europe, and are now using the Illumina and PacBio sequencing platforms to generate even higher-quality sequences. By using these sophisticated genomic tools, Professor Goudet hopes to arrive at a position where the research team has sequences from telomere to telomere for each chromosome. “Each genome has its own particularities. The genome of the barn owl is specific in that we can get the big chromosomes, but the small chromosomes are very difficult to get. This is one of the issues we are trying to tackle with these new sequencing methods,” he outlines. “With PacBio sequencing, we can get fragments of 15-20 kilobases sequenced almost perfectly, and we do this at high coverage so we can then match them.” The genome is assembled through a step-by-step process, first producing contigs, the short components of the assembly. These contigs are then mapped and chained together into a scaffold. “This is how we get to the chromosome-level assembly. We are also using other techniques to help us to fill any gaps that are still present,” says Professor Goudet. This genomic data can then be analysed, from which researchers hope to gain deeper insights. “We have data on the phenotype of individuals – the black spots, their number and size. By doing classical genome-wide association studies, we can then find which gene shows a strong signal of association with the size of the spots,” explains Professor Goudet. “There, we will be able to use the pedigree data that Alex has accumulated over the last 20 years or so.”

Using genomic tools, we are looking for the genes involved in variation in melanin-based colouration in the barn owl. This will help us understand why plumage colour is associated with physiology, behaviour and predator-prey interactions, but also how colouration could help barn owls to spread over the globe.

Project Funding

Swiss National Science Foundation. Alexandre Roulin (31003A_173178) and Jérôme Goudet (31003A_179358).

Project Partners

• Dr. Luis San Jose Garcia
• Prof. Jérôme Goudet

Contact Details

Professor Alexandre Roulin
Co-investigator
T: +41 79 686 08 64
E:
Professor Jérôme Goudet
Co-investigator
T: +41 (0)21 692 4242
E:
W: menuinst/people/group-leaders/profjerome-goudet.html
W: menuinst/people/group-leaders/profalexandre-roulin.html

Alexandre Roulin

Jérôme Goudet

Professor Alexandre Roulin holding a barn owl. Photograph by Séverine Rijken.

Pedigree data

By analysing the pedigree and the phenotype data, and drawing links between them, researchers hope to identify which genes are responsible for variation in both colour and the number of spots. Once a specific gene has been identified, Professor Roulin says he and his colleagues can then revisit the pedigree data and look to draw wider inferences. “We have blood samples from thousands of individuals, which is a valuable research resource; once we know the gene, we can calculate how strong the selection is. We can also look at data from other continents, to see what the picture is at a higher level,” he outlines. Alongside the barn owl, researchers have also analysed other avian genomes, and a common feature has been identified. “One thing that characterises avian genomes is a high synteny,” explains Professor Goudet. This means essentially that the order of the genes on chromosomes is maintained to a high degree in birds, and the improved quality of the new genome will help in placing the barn owl in the phylogeny. While the project itself has centred on the barn owl, this research holds implications beyond this specific species, which Professor Roulin intends to explore further in future. “The genes associated with these black spots are known to be important from a biomedical point of view. Melanin is highly conserved in all vertebrates, and we expect the genes are also important in humans,” he says. One major topic of interest is the melanocortin system. “This is a system with five melanocortin receptors. The melanin-stimulating hormones bind to these receptors – they are the same in all animals, including humans – and many diseases are associated with these hormones,” continues Professor Roulin. By combining ecological information on different organisms with the wider picture of population genetics, Professor Roulin and Professor Goudet hope in future to gain new insights into the role of these genes. At this stage, however, the immediate priority is to finish the assembly of the genome. “We want to get a much better, high-quality genome, then it will be easier to find out which genes underlie the plumage traits,” says Professor Roulin. Another set of data relates to the history of barn owl colonisation of Europe after the last ice age, a topic of great interest to Professor Goudet. “This is a very interesting project. We published a paper a couple of years ago in which we suggested that the barn owl re-colonised Europe in a ring-like manner, around the Mediterranean Sea. We are now looking at these scenarios in genomics,” he continues.
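The classical genome-wide association approach described in this article can be illustrated with a minimal sketch. This is a toy example, not the project’s pipeline or data: genotypes are coded 0/1/2 (copies of one allele), the phenotype is a made-up spot-size measurement, and the scan simply ranks variants by the strength of their correlation with the trait. Real GWAS software additionally handles relatedness, covariates and significance testing.

```python
# Minimal sketch of a genome-wide association scan (toy data, for
# illustration only; variant indices and measurements are invented).

def pearson_r(x, y):
    """Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    if vx == 0 or vy == 0:
        return 0.0
    return cov / (vx * vy) ** 0.5

def association_scan(genotypes, phenotype):
    """Return (variant_index, |r|) for the variant whose genotype
    correlates most strongly with the phenotype."""
    scores = [abs(pearson_r(g, phenotype)) for g in genotypes]
    best = max(range(len(scores)), key=scores.__getitem__)
    return best, scores[best]

# Toy data: 6 owls, 3 variants; variant 2 tracks spot size.
genos = [
    [0, 1, 2, 0, 1, 2],   # variant 0
    [2, 2, 0, 1, 0, 1],   # variant 1
    [0, 0, 1, 1, 2, 2],   # variant 2
]
spot_size = [10.1, 10.0, 12.2, 12.1, 14.3, 14.0]  # mm, invented
best, r = association_scan(genos, spot_size)
```

In practice each of millions of variants would be tested this way, with the strongest, multiple-testing-corrected signals pointing to candidate regions such as those of the melanocortin system.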

Alexandre Roulin is a full professor of biology at the University of Lausanne, Switzerland. He has been studying barn owls for two decades to answer evolutionary and ecological questions. His main scientific interests are the adaptive function of melanin-based coloration and the negotiation processes taking place in animal societies. Jérôme Goudet is an associate professor of Ecology and Evolution at the University of Lausanne, Switzerland. His research uses genomic data to understand the interplay of population structure, trait architecture, and selection. For this, he uses different approaches, from theory and the development of statistical tools to field observations.


Smarter, sustainable mining solutions

Reducing environmental impacts from mining is an important aim for Europe, as is pushing efficiencies to bolster supply, streamlining operations and reducing costs. Professor José Sanchidrián coordinates the SLIM project, which addresses these concerns with new technologies.

Mining, by nature, is destructive to the environment, with crude explosive power shattering rock to harvest ores for industry. Researchers working on the SLIM project focused on a range of ways to improve mining processes, including new innovations and data-driven techniques, all with the intention of bringing more precision and intelligence to the blasting and processing of the rock. The SLIM project’s full title describes the objective clearly: Sustainable low impact mining solution for the exploitation of small mineral deposits based on advanced rock blasting and environmental technologies. By adopting the methods offered by the project, mining operations will reap rewards not only in terms of improving sustainability but in terms of performance and margins as well. Simply put, it is a ‘win-win’ strategy. Good mining, Professor Sanchidrián would argue – a man who describes himself first and foremost as a miner – is about having significant control over the way you blast the rockface, excavate the materials and run the operation. Whilst the larger mines employ more advanced technologies to help run operations, smaller ones do not always have the capacity and time to implement them, relying on traditional techniques for blowing the rock apart and processing it. The SLIM project addresses this with effective off-the-shelf, easy-to-deploy technologies and sustainable solutions that not only lessen the impact on the environment but also drive large efficiencies.

Preparing explosives for testing.

Start with a bang

Coordinated by the School of Mines and Energy of the Technical University of Madrid, the SLIM project involves 13 partners from Austria, Denmark, Sweden, France and Spain. The solutions were deployed at mining sites (upstream) and validated at the processing plants (downstream) to ensure they were improving performance and working effectively. The solutions empowered mining operations with methodologies to benefit the whole end-to-end mining process, but a particularly important development by the project was making explosives that do less damage to the environment. “From the environmental point of view, what we have done with one of the partners is develop an explosive that is less polluting,” says Sanchidrián. “Explosives are made of ammonium nitrate, which is highly soluble in water. These explosives are used in bulk, so you bore into the rock and you pour the explosive into the hole. Normally, if you do that under the water table, some of the ammonium nitrate in the explosive composition dissolves into natural water and pollutes it with nitrates. It can be an important consideration for the local community, and there is more and more concern about environmental legislation on that, so the project has developed a composition that has proven to have less solubility by means of some chemistry, physics and technology. They have made this formulation dissolve less into the water, to dissolve less ammonium nitrate.” There can be a range of uncertainties with traditional blasting techniques: not knowing exactly how straight the drilling is, or not knowing exactly the rock geometry and its geotechnical characteristics, rendering engineers blind to the exact volumetric distribution of the explosive and what exactly its effect on the rock will be. The SLIM project created tools derived from an accessible pool of useful off-the-shelf technologies that already existed and put them together in a solution aimed specifically at smaller mining companies. By using these technologies before and after blasting, like photogrammetry, LiDAR and unmanned aerial vehicles, mine engineers and managers can better read the rock and develop intelligence-based planning. SLIM is also developing technologies of its own.

Automatic rock characterization and burden optimization.

Machine learning algorithms for fragment size measurement.

Sustainable Low Impact Mining solution for exploitation of small mineral deposits based on advanced rock blasting and environmental technologies

Project Objectives

Limiting the impacts on the rock

An aim was to blast with better geometric control, ensuring that precision blasting achieved the ideal fragmentation in the rock, on the premise that improvements in fragmentation affect downstream costs positively. “When you control the amount of explosive that you use, the density of explosives in your rock mass, you get a controlled fragmentation, which means you are able to optimise the size of the particles in your muckpile, and optimise downstream. For example, when you optimise fragmentation you reduce costs, like digging costs; you reduce the time for digging; you reduce the consumables of the machine, like the teeth on these excavators. This all leads to an operation working more efficiently. We have made these intermediate tools for mines, to have better control of the operation, and without too much effort.” With this approach, less impact on the environment is achieved in many ways, from generating less dust to minimising damage to the rock remaining after each blast. Indeed, rock damage control is a safety issue. An explosion puts tremendous pressure on the rock remaining in place post-blast. When this remaining rock is cracked, it may collapse and cause rock falls. “Working in the wrong place when excavating puts miners at risk, so you need wide spaces kept free, which makes the operation more expensive and less productive. In this case we have made a model to reduce the impact on the remaining rock, to help you soften the blast in the last row of holes, close to the remaining rock mass.”

Plotting with good data

The SLIM approach is very holistic, intending to manage mining better upstream to impact favourably downstream. As well as polluting less, and generally reducing risk, it has been found that a quarry or mine using the approach can increase production by up to 46%. The intention is to make mining operations more data-driven, to make each blast more efficient and effective by design. The SLIM project has concluded, but there is still exciting work to be done in further developing the algorithms for advanced machine learning, which can analyse muckpile properties based on aerial imaging, with the ability to design blasts based on the characterisation of the rock. “The next step is to allow the user to use this technology and give the user the solution automatically by some sort of AI – machine learning, deep learning techniques. That’s still for development though. What I think SLIM has achieved is a fundamental understanding of data. It’s very much about plotting graphs and having good data. The system can plot you those graphs, and then if you are a miner, mine engineer or mine manager, you will know how to use that information to your advantage.”
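To give a flavour of the fragmentation modelling involved, the link between blast design and mean fragment size is often estimated with the empirical Kuznetsov equation from the widely used Kuz-Ram model. The sketch below is this generic textbook relation with illustrative numbers; it is not the model developed within SLIM.

```python
# Kuznetsov equation (Kuz-Ram model): mean fragment size from blast
# design parameters. Generic empirical relation, illustrative values.

def kuznetsov_mean_size(A, powder_factor, charge_per_hole, rws=100.0):
    """Predicted mean fragment size x50 in cm.

    A               -- rock factor (~1 for very soft to ~13 for hard rock)
    powder_factor   -- kg of explosive per m^3 of blasted rock
    charge_per_hole -- explosive mass per blast hole, kg
    rws             -- relative weight strength of the explosive (ANFO = 100)
    """
    return (A * powder_factor ** -0.8 * charge_per_hole ** (1.0 / 6.0)
            * (115.0 / rws) ** (19.0 / 30.0))

# More explosive per cubic metre of rock gives finer fragmentation:
coarse = kuznetsov_mean_size(A=7.0, powder_factor=0.4, charge_per_hole=50.0)
fine = kuznetsov_mean_size(A=7.0, powder_factor=0.8, charge_per_hole=50.0)
```

Relations of this kind let an engineer explore, before drilling, how changing the powder factor shifts the expected muckpile size distribution and hence downstream digging and crushing costs.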

SLIM aims to develop cost-effective and sustainable selective low impact mining solutions for small mineral deposits, including those with chemically complex ore-forming phases. For this, a new generation of explosives and an advanced automatic blast design software will be applied, based on improved rock mass characterization and fragmentation models for optimum fragmentation, minimum rock damage and far-field vibrations.

Project Funding

The project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement no 730294.

Project Partners


Contact Details

José A. Sanchidrián, Ph.D.
Professor (C.U.), Department of Mining Engineering and Earth Sciences
ETSI Minas y Energía
Universidad Politécnica de Madrid
Ríos Rosas, 21
28003 Madrid, Spain
T: +34 910 676 493
E:
W:

Prof. José A. Sanchidrián, Ph.D.

Prof. José Sanchidrián has a Ph.D. in Mining Engineering from Universidad Politécnica de Madrid. He served 7 years in the Army Corps of Engineers and worked 10 years in the mining and explosives sector. He has authored more than 150 papers for scientific Journals and Conferences, and more than 100 research and technical reports. He has been involved in more than 40 research projects. José is a member of the International Society of Explosives Engineers and president of the International Committee for Rock Fragmentation by Blasting (Fragblast).

A small, difficult underground mine test case: Minera de Órgiva (Granada, Spain).



Planning for the new energy landscape

Renewable sources of energy like wind and photovoltaics produce a variable output, which poses new challenges in terms of planning and operating the European energy system. We spoke to Sandrine Charousset and Danny Pudjianto about the plan4res project’s work in developing new tools for supporting the integration of increased renewable energy within the system.

The European electrical system was designed for a world in which energy was mainly generated by traditional power plants, like those based on burning fossil fuels. However, with the EU setting ambitious objectives for reducing carbon emissions by 2030 and then 2050, many countries are now looking to decommission these plants and replace them with renewable sources of energy. “This means mainly photovoltaics (PV) and wind power, but also hydropower,” says Sandrine Charousset, research director at EDF and coordinator of the plan4res project. These renewable sources are intermittent by nature and so cannot provide a guaranteed supply, which poses a challenge in terms of planning and operating the energy system, a topic at the heart of plan4res. “In plan4res we are trying to develop and implement tools that will help to plan and manage the European energy system more effectively,” explains Charousset. “The objective is to have a system that can host more renewables and take advantage of the flexibilities that exist.” This research is centred around the question of how energy demand can be met while also increasing the use of renewable sources. While previously a system planner would typically use data on peak demand to identify how much capacity is needed in a system, the inclusion of more renewable sources of energy greatly increases complexity. “The number of operating conditions that need to be taken into account during planning increases exponentially,” says Danny Pudjianto, a researcher at Imperial College London, one of the partners in the project consortium. The growth of the electric vehicle market and the trend for people to engage in individual demand response are also important considerations. “People may adapt their energy load according to different incentives and tariffs, like using their washing machine at a point when energy is cheaper,” outlines Charousset. “There is also load curtailment, where people may choose not to consume energy at a specific time.”

Planning and operation

The tools currently available for planning and operating the energy system are not well-suited to this changing situation, an issue that the project is working to address by developing an end-to-end planning and operation tool built on a set of optimisation models. The project brings together partners with deep mathematical and technical expertise to deal with the complexity of the different problems around energy planning and operation. “A large amount of work has been done on algorithms and mathematical optimisation methods in the project. Our partners have deep expertise in optimisation methods,” says Charousset. “We are developing an entirely new and innovative modelling framework, endowed with state-of-the-art algorithms, that will be open-source. It will be available at the end of the project, and it will contain many different modules, among which are ones for stochastic optimisation, decomposition algorithms and large-scale mixed-integer programming, together with many ready-to-use modules for energy optimisation problems.”
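The kind of two-stage stochastic decision such optimisation modules address can be illustrated with a toy example: choose how much firm backup capacity to build before the wind outcome is known, trading investment cost against the expected cost of unserved demand. All numbers here are hypothetical, and a brute-force search stands in for the decomposition and mixed-integer methods a real planning tool would use.

```python
# Toy two-stage stochastic capacity decision (hypothetical numbers).
# Stage 1: build backup capacity; stage 2: observe wind, pay for any
# shortfall against demand.

WIND_SCENARIOS = [(0.3, 20.0), (0.4, 50.0), (0.3, 80.0)]  # (probability, MW)
DEMAND_MW = 100.0
INVEST_COST = 60.0      # cost per MW of backup capacity built
SHORTFALL_COST = 500.0  # cost per MW of demand left unserved

def expected_total_cost(backup_mw):
    """Investment cost plus probability-weighted shortfall cost."""
    cost = INVEST_COST * backup_mw
    for prob, wind_mw in WIND_SCENARIOS:
        shortfall = max(0.0, DEMAND_MW - wind_mw - backup_mw)
        cost += prob * SHORTFALL_COST * shortfall
    return cost

def best_backup(step=1.0, max_mw=120.0):
    """Brute-force search over candidate capacities."""
    candidates = [i * step for i in range(int(max_mw / step) + 1)]
    return min(candidates, key=expected_total_cost)

optimal = best_backup()
```

A realistic version of this problem spans thousands of scenarios, dozens of zones and multiple asset types (generation, interconnection, storage), which is exactly why decomposition algorithms and HPC resources become necessary.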



Synergistic Approach of Multi-Energy Models for an European Optimal Energy System Management Tool

Project Objectives

Plan4RES is a collaborative research and innovation project which aims at developing an end-to-end planning tool to successfully increase the share of renewable energy into the European Energy system without compromising on system reliability. The targeted platform will account for the Pan-European interconnected electricity system, potential synergies with other energy systems, emerging technologies and flexibility resources, providing a fully integrated modelling environment.

Project Funding

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement N. 773897

Project Partners

Multi-dimensional challenges being addressed by plan4res tools.

Contact Details

This is part of the goal of developing improved planning tools in line with wider objectives around reducing carbon emissions. An integrated representation of the European energy system, including sector coupling of electricity, heating & cooling, mobility and fuels, is an important step towards the development of improved planning tools, as certain forms of renewable energy may be generated in higher quantities in certain areas. “One of the difficulties is to identify the right balance between building new generation capacity in some places, extending the capacity of the interconnections between different zones, and increasing storage capacity in other zones,” says Charousset. “The solution is to find the right balance between generation, interconnection and storage in different regions. This is closely correlated with the existing energy mix, as well as physical factors, like prevailing wind patterns.” A set of tools has been developed in the project to help build a deeper picture in this respect. On the supply side, tools for planning energy generation have been developed, while Pudjianto and his colleagues have also been working on a transmission planning tool. “The challenge there is in the uncertainty involved in planning the pan-European transmission system. For example, there is uncertainty about how much wind energy will be generated in the North Sea,” he explains. While storage capacity can be expanded relatively quickly, building a transmission network is a costly, long-term investment, so the asset in question must be used as efficiently as possible. “Suboptimal capacity means it is a stranded asset or heavily constrained, which both cost money. And ultimately the customer will pay,” points out Pudjianto. “There are risks associated with making this kind of investment.” The next step, once the tools have been validated, would be to apply them in the practical sphere. The project consortium brings together both academic and commercial partners, and Charousset says EDF are keen to bring these tools to practical application. “One of our main objectives is to develop tools that we could use in our own operations,” she says. A variety of different tools have been developed in the project, and they will be used in several studies, while they will also be implemented in a related H2020 project called Open Entrance. “This project is less industrially-oriented. The objective is to build a platform for sharing data, results and open models,” continues Charousset. “The models that have been implemented in plan4res are currently being connected to the Open Entrance platform, where they can be used in some studies. This platform will be open access, so they can also be used by other people from outside the project.”

Project Coordinator, Sandrine Charousset
EDF Lab Paris-Saclay
7 boulevard Gaspard Monge
91120 Palaiseau, France
E:
W:
@plan4res

Sandrine Charousset

Sandrine Charousset holds an Engineering degree from Ecole Centrale Paris. She is the director of the “Gaspard Monge Program for Optimisation and operational research” and a project coordinator for H2020 projects within the low-carbon energy field at EDF. She leads research teams on statistics for renewable generation and energy prices, and on mathematical optimisation.

High performance computing

A further dimension of the project’s work involves the development of an IT platform, designed to provide access to data and high-performance computing (HPC) resources. Many of the problems around energy management and energy modelling cannot be solved on an office PC, so access to HPC resources can be highly beneficial, another topic that is being addressed in the project. “Our partners are implementing tools for dealing with workflows and parallelisation, and the use of different kinds of HPC resources,” continues Charousset.


Fuelling the future of biomass conversion

Using biomass in the production of transport fuels would help reduce dependence on fossil resources and contribute to the wider goal of de-carbonising the European economy. We spoke to Dr. Stella Bezergianni, Principal Investigator of the BioMates project, which is developing innovative biomass conversion technologies that could open up new possibilities in the production of hybrid fuels.

The European Union has set ambitious targets for de-carbonising the economy over the coming years, and changing the way that transport fuels are produced could make a significant contribution to this wider objective. The BioMates project is developing new biomass conversion technologies, with the aim of accelerating the market uptake of biomass in the transport sector. “We aim to produce a reliable refinery co-feed, from widely available feedstock of lignocellulosic material. This can then be co-processed in a refinery, together with fossil fractions, to produce hybrid fuels,” says Dr. Stella Bezergianni, the project’s Principal Investigator. These fuels will have partly a fossil and partly a biological origin, so it will be possible to take advantage of the currently under-utilised conversion capacity of existing refineries. “There is considerable unused capacity in refineries at the moment. If we could substitute this unused capacity with bio-based feedstocks that can also be processed at the refinery, then we will not have to make heavy investments in new systems,” continues Dr. Bezergianni.

BioMates project

Researchers in the project are now working to produce a reliable, sustainable refinery feedstock, starting from low-grade biomass in the form of straw and miscanthus, a type of grass rich in lignin. A technique called ablative fast pyrolysis (AFP) is an important preparatory step, which involves converting solid material into liquid form. “We are using a type of pyrolysis that maximises the conversion efficiency towards the desired liquid product,” explains Dr. Bezergianni. A further step involves mild hydrotreatment of the bio-oil, which Dr. Bezergianni says is an important step towards the eventual production of a high-quality, reliable feedstock. “Pyrolysis bio-oil contains water, oxygenates, acids, furans – compounds that are unsuitable for final transportation fuels. Hydrotreatment can be used to saturate unwanted olefinic bonds, and to remove oxygen, increasing calorific value, reducing acidity, and increasing oxidation stability,” she outlines. “However, the more hydrotreatment steps that you introduce to upgrade your bio-oil, the more costly the overall scheme and the higher the final cost of the fuel.” A single hydrotreatment step is being used in the project, primarily to stabilise the bio-oil, meaning reducing its acidity, as well as its water and oxygen content. A more stable bio-oil is less likely to harm downstream processes when introduced in a refinery, which is an important consideration in terms of its potential future application. “That’s why we focused on a single, mild hydrotreatment step,” says Dr. Bezergianni. A novel electrochemical H2 processing infrastructure has been developed in the project to effectively recycle the hydrogen used in this process, which Dr. Bezergianni says helps reduce the environmental impact of hydrogen production, while also leading to some other major benefits. “It can clean and convert off-gas hydrogen into clean hydrogen that can then be recirculated back into the process. It also compresses it electrochemically, which is a more cost-efficient and economic option than conventional mechanical hydrogen compression. This helps to significantly reduce the cost of this single hydrotreatment step.” Alongside the technical investigation, the project’s agenda also includes research into environmental, socio-economic, policy and health sustainability issues, which Dr. Bezergianni says is an important aspect of BioMates. “We are evaluating the technical, environmental and socio-economic sustainability issues constantly in the project,” she stresses. This is a highly innovative project which brings together not only academic partners, but also a major oil company and

The largest Miscanthus species grows up to 4 metres. © Fraunhofer UMSICHT / Volker Heil


BIOMATES – Reliable Bio-based Refinery Intermediates

Project Objectives

Different locations, one overall process: ablative fast pyrolysis (left) and mild hydrotreatment (right) at TRL 4. © CERTH / CPERI + Fraunhofer UMSICHT

SMEs, and while a lot of attention is focused on developing the technology, Dr. Bezergianni is very much aware that it must be reliable and cost-efficient if it is to be applied more widely. “We are evaluating the potential for pyrolysis oil to be introduced in a refinery, and we are very happy to have a significant industrial presence in the project,” she continues. The wider goal of this research is to produce a reliable, biomass-based intermediate, which is intended to be co-processed in a refinery together with fossil-based feedstocks. While the main priority in the project has been producing a reliable renewable refinery feedstock, researchers have also investigated the optimal entry point in a refinery for such an alternative feedstock. “Optimal here means not intruding into the conventional processes of the refinery, and not limiting the final targeted yield,” says Dr. Bezergianni. There is no single process in a refinery that produces a specific type of fuel; rather, there are many types of conversion units that process different types of fossil fractions. “Some of these fossil fractions are relatively light and/or less challenging, while others are heavier and/or more challenging,” explains Dr. Bezergianni. “There is not one single technology that is used to produce gasoline, for example.”

Transportation fuels

The production of transportation fuels is a complex process, and at this point it’s not possible to say with any confidence that the project’s bio-intermediates are well-suited to the production of any specific type of fuel. However, Dr. Bezergianni says it is possible to identify the percentage of each fuel type that would have a biological origin, depending on the type of refinery in which they are introduced and the blending ratio. “We can say that 15 percent of jet fuel will have a bio-origin, for example, and 55 percent of diesel,” she outlines. At this stage researchers are validating the technology in an industrially relevant environment at TRL (technology readiness level) 5, while Dr. Bezergianni says investigation continues into further potential improvements to the technology. “We’re looking at a more optimal hydrotreatment capacity design, to better combine the fuel properties of the produced bio-oil and its downstream upgrading. Certainly other technological opportunities have been identified which could be researched further,” she says. This research holds exciting potential in terms of reducing Europe’s dependence on imports of fossil fuels and securing the energy supply, while it is hoped the project’s work will also have wider benefits, for example in creating new jobs and boosting the competitiveness of the European economy. Co-processing is accepted and used in many refineries, and the project’s industrial partners are therefore keen to harness the potential of this technology. “We have partners in our consortium who are actively evaluating the current legislation and the specifications. They are very interested in the co-processing approach that this project is targeting,” stresses Dr. Bezergianni. BioMates will host a workshop at the 29th European Biomass Conference & Exhibition 2021. The aim is to gauge the views of key stakeholders on the technological and sustainability issues around the production, marketing and use of hybrid transportation fuels.
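Bio-origin percentages of this kind follow from a simple mass balance over the blend. The sketch below is an illustrative calculation with hypothetical co-feed ratios and yields, not BioMates results:

```python
# Mass-balance sketch: what fraction of a refinery product is
# bio-derived, given the co-feed ratio and how much of each feed
# reports to that product. All numbers are hypothetical.

def bio_share_of_product(co_feed_ratio, bio_yield, fossil_yield):
    """Fraction of a given product (e.g. diesel) that is bio-derived.

    co_feed_ratio -- bio-intermediate fraction of the total refinery feed
    bio_yield     -- fraction of the bio feed ending up in this product
    fossil_yield  -- fraction of the fossil feed ending up in this product
    """
    bio = co_feed_ratio * bio_yield
    fossil = (1.0 - co_feed_ratio) * fossil_yield
    return bio / (bio + fossil)

# e.g. a 10% co-feed whose molecules preferentially report to diesel:
share = bio_share_of_product(0.10, 0.60, 0.30)
```

The same arithmetic explains why the bio-share differs between products such as jet fuel and diesel: it depends on how strongly the bio-intermediate's molecules report to each fraction in the particular refinery configuration.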

Objectives: The BioMates project aspires to combine innovative 2nd-generation biomass conversion technologies for the cost-effective production of bio-based intermediates (BioMates) that can be further upgraded in existing oil refineries as renewable and reliable co-feedstocks. The resulting approach will allow minimisation of fossil energy requirements and therefore operating expense, minimisation of capital expense as it will partially rely on underlying refinery conversion capacity, and increased bio-content of final transportation fuels.

Project Funding

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 727463

Project Partners

• Centre for Research & Technology Hellas (CERTH) • Fraunhofer UMSICHT • University of Chemistry and Technology Prague (UCTP) • BP Europa SE • HyET Hydrogen • Ranido s.r.o. • Imperial College London • Institute for Energy and Environmental Research Heidelberg (ifeu) • Research Institutes of Sweden (RISE)

Contact Details

Project coordinator, Dr. Stella Bezergianni Centre for Research & Technology Hellas (CERTH) Chemical Process & Energy Resources Institute (CPERI) 6th km Harilaou-Thermi Rd GR57001 Thermi-Thessaloniki GREECE T: +30.2310.498315 F: +30.2310.498380 E: W: “This article reflects only the authors’ view; the European Commission and its responsible executive agency INEA are not responsible for any use that may be made of the information it contains.” Dr. Stella Bezergianni

Dr. Stella Bezergianni is a Research Director and Head of the hydroprocessing group of CPERI/CERTH. Her research activities involve production and evaluation of environmentally friendly fuels and biofuels, development of new technologies for 2nd and 3rd generation biofuels production, as well as optimization of hydroprocessing for biofuels production.

Straw-bales, drying in the late summer sun. © Volker Heil


New uses for demolition waste

Recycling and re-using the large quantities of concrete and demolition waste that are generated every year would bring both environmental and economic benefits. Researchers in the VEEP project are developing new technological solutions that could help increase the use of recycled materials in the construction sector, as Anna Paraboschi explains.

The building sector consumes large amounts of raw materials, with significant quantities of concrete and other materials used to construct new homes, offices and other buildings. Increasing the use of materials recovered from demolished buildings would help to reduce the sector’s energy consumption, a topic at the heart of the VEEP project. “We aim to develop and demonstrate a series of technological solutions for the massive retrofitting of the built environment,” says Anna Paraboschi, the coordinator of the project. One important part of this work involves developing novel pre-cast concrete elements (PCE), in which a high proportion of construction and demolition waste (C&DW) is embedded. “The objective is for C&DW to account for 75 percent by weight of the raw materials in our panels,” continues Paraboschi.

VEEP project

This is not a simple task, as C&DW materials need to be treated in certain ways before they can be used in construction. Two technologies have been developed in the project, called Advanced Drying Recovery (ADR) and Heating Air System (HAS),


Cost-Effective Recycling of Construction and Demolition Waste for Energy Efficient Building Retrofitting

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No: 723582

Contact Details

Anna Paraboschi Project Manager RINA Consulting S.p.A. Via Cecchi, 6 – 16129 Genova, ITALY T: +39 010 31961 E: W: Anna Paraboschi is a Project Manager for the RINA group; she is PMP® and Lean Six Sigma Management certified. She has over 10 years of experience in the development and management of R&D projects funded under the European Commission Research and Innovation Programmes, with a focus on energy efficiency and sustainability topics.


VEEP mock-up

which address the treatment of both the coarse and fine fractions of demolition debris. “The coarse fraction of debris, from 4-12 millimetres in size, has to be effectively sieved using ADR, and then it can be 100 percent recycled in concrete,” explains Paraboschi. The finest fraction of debris, below 4 millimetres, is treated using HAS. “The material goes into a hotter system, and the pollutants are then removed, in order to produce a fine fraction which is hardened into a cement paste,” outlines Paraboschi. “This can be very useful, as it can partially replace a conventional concrete formulation.” These two technologies are both extremely mobile, and can be moved to a demolition site, which opens up wider possibilities in terms of producing recycled concrete particles and improving resource efficiency. These recycled materials still need to meet rigorous standards before they can be used of course, and Paraboschi says the results from the project so far are positive. “We developed a novel concrete formulation in the project, and it performs well in terms of mechanical properties, thermal conductivity and durability,” she says.

Another dimension of the project is the development of new, green aerogels containing recycled C&DW materials, which hold rich potential in terms of insulation. “Silica from the C&DW is used in these green aerogels. This helps to reduce costs, which could encourage their wider adoption in the construction sector,” continues Paraboschi. This is one of the products which researchers hope to bring towards practical application, while Paraboschi and her colleagues are also looking to test the effectiveness of the PCEs at two demonstration sites in Spain and the Netherlands. The aim here is to test the thermal performance of the PCEs in very different climates, as well as to investigate their performance against several other criteria. “The panels developed during the project will go through acoustic and fire resistance testing, while we’ll also monitor them for any cracks,” says Paraboschi. This is essential if these panels are to eventually be applied in the market, which Paraboschi says is an important objective in the project. “There are some manufacturers involved in the project, and they are interested in developing products to add to their portfolio,” she explains.
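The size-based split between the two treatment technologies can be summarised in a short sketch; the function and labels are illustrative, not from the project.

```python
# Minimal sketch of the size-based routing described in the article:
# the coarse fraction (4-12 mm) goes to ADR sieving for direct reuse in
# concrete, while the fine fraction (below 4 mm) goes to HAS heat
# treatment. Names and the handling of oversize material are assumptions.

def route_debris(particle_mm):
    """Assign a crushed-debris particle to a treatment technology."""
    if 4 <= particle_mm <= 12:
        return "ADR"   # Advanced Drying Recovery: sieve, recycle in concrete
    if particle_mm < 4:
        return "HAS"   # Heating Air System: heat, remove pollutants,
                       # harden into cement paste
    return "oversize"  # would presumably need further crushing first

sizes = [0.5, 3.9, 4.0, 8.0, 12.0, 15.0]
print({s: route_debris(s) for s in sizes})
```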

VEEP aerogel

VEEP Project Consortium visiting partner Nuova Tesi System’s plant.


Understanding new institutions in consumer credit

Institutions like online marketplaces and high-cost credit markets are an important source of credit to consumers, yet taking out high-cost credit can have long-term financial consequences. We spoke to Professor Daniel Paravisini about his research into how the growth of these institutions is affecting the consumer credit market.

An increasing proportion of consumer credit is being provided by sources from outside the regular banking sector, for example online credit markets in the US, and high-cost lenders. As the Principal Investigator of a new ERC-funded project, Professor Daniel Paravisini is investigating how this trend is affecting the consumer credit market. “With online marketplaces people can borrow quite large amounts of money, up to £30,000, while high-cost lenders offer loans to people on the basis of very little information,” he says. The focus in the project is on the functions of these new sources of credit, the benefits they bring, and also the consequences to consumers of using high-cost credit. “In the UK once you’ve used high-cost credit, and this information is disclosed to lenders via your credit report, then you are essentially flagged by lenders as a potentially high-risk borrower, even if you repaid the loan,” outlines Professor Paravisini.

High-cost credit

Photo by Avery Evans

The people who use high-cost credit are commonly thought to be more likely to be in financial distress, yet it is not clear whether this is a consequence of taking a high-interest loan or whether their finances were already over-stretched beforehand. There can be a fine line between being accepted and rejected for a loan, a point that Professor Paravisini has been exploring in his research. “We know that below a certain cut-off on a credit score, you are more likely to be rejected when you apply for a loan,” he explains. By tracking two people with marginally different credit scores over time – one who took a loan and one who didn’t – Professor Paravisini hopes to shed new light on the importance of reputation to financial health. “Once you take out the loan, we look at what happens subsequently. When you renew the loan, what’s the likelihood of you repaying those loans?” he continues.

This research holds important implications for regulations around credit reports, an area where there are clear differences between some countries. While in the US for example the FICO credit score does not include the use of high-cost credit, this information is included in the UK. “In the UK if you take out a high-cost loan, there’s going to be a flag in your credit history,” explains Professor Paravisini. The US approach to an extent reflects a desire to remove the stigma attached to using high-cost credit, yet limiting the amount of information available on consumers in their credit rating could have wider effects. “The ability of regular banks to gauge your risk would potentially be diminished. This could mean that the costs of mortgages would go up – it could also mean that some people that would have had a mortgage approved in the past now don’t,” says Professor Paravisini. “The impact really depends on the extent of the reputational cost.”

The wider aim in this research is to build a deeper picture of the benefits and costs associated with these new institutions in consumer credit. In terms of high-cost credit, the focus is on characterising the reputational cost to the borrower, while with online marketplaces, Professor Paravisini is looking at how high- and low-risk borrowers can be distinguished. “We’re looking at the mechanisms these online marketplaces have to distinguish between high- and low-risk borrowers,” he continues. Online marketplaces typically serve a very different segment of the population to high-cost credit markets, yet they both serve the same purpose and are increasingly prominent features of the financial landscape, and Professor Paravisini aims to gain further insights into how they function. “We’re looking at the consequences, costs and benefits of these new institutions in consumer credit,” he says.
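The comparison Professor Paravisini describes, tracking applicants on either side of a credit-score cut-off, resembles a regression discontinuity design. A minimal sketch in Python (the cut-off value, bandwidth and toy data are all invented for illustration, not data from the project):

```python
# Sketch of the comparison described above: applicants just below a
# score cutoff are mostly rejected, those just above are approved, so
# comparing narrow bands on either side of the cutoff isolates the
# effect of taking out the loan. All numbers here are illustrative.

CUTOFF = 600      # hypothetical approval threshold
BANDWIDTH = 10    # how close to the cutoff an applicant must be

applicants = [
    # (credit_score, later_in_financial_distress)
    (595, False), (598, False), (597, True), (592, False),  # rejected
    (603, True), (605, True), (608, False), (601, True),    # approved
]

def distress_rate(group):
    """Share of a group later observed in financial distress."""
    return sum(1 for _, d in group if d) / len(group)

below = [a for a in applicants if CUTOFF - BANDWIDTH <= a[0] < CUTOFF]
above = [a for a in applicants if CUTOFF <= a[0] < CUTOFF + BANDWIDTH]

# Applicants near the cutoff are nearly identical except for the loan,
# so the gap in later distress is attributed to taking the loan.
gap = distress_rate(above) - distress_rate(below)
print(f"distress just below cutoff (no loan): {distress_rate(below):.2f}")
print(f"distress just above cutoff (loan): {distress_rate(above):.2f}")
print(f"estimated effect at the cutoff: {gap:+.2f}")
```

In the toy data the approved group fares worse, but as the article notes, the real research question is precisely whether such a gap reflects the loan itself or pre-existing differences, which is why only marginal applicants are compared.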


Institutions in Consumer Credit Funded under H2020-EU.1.1. / Grant agreement ID: 772200. Overall budget € 1 017 851 Project Coordinator, Professor Daniel Paravisini London School of Economics and Political Science Houghton Street London WC2A 2AE UK T: +44 (0)20 7107 5371 E: W: Finance/People/Faculty/Paravisini Daniel Paravisini is Professor of Finance at the London School of Economics and Political Science in the UK. He gained his PhD in Economics from MIT in the US, and has held academic positions in South America, the US and Europe. He is the co-editor of the Journal of Law, Economics and Organization.


The unintended effects of financial sanctions

Financial sanctions are a powerful foreign policy tool, restricting the activities of both states and individuals, but they can also have wider unintended effects. Financial sanctions are at risk of being over-used today, which poses significant challenges to global banks under pressure to comply fully with the rules, as Professor Grégoire Mallard explains.

The world’s banks play an important role in interpreting and implementing financial sanctions targeted against both states and individuals. If sanctions are used for narrow and consensual objectives, then this will typically command wide agreement and the compliance of financial institutions. “For example the US and other countries may ask a global bank to screen their records for the names of suspected or confirmed terrorists, checking against a list provided by the UN Security Council. They will then freeze the transaction and make sure that they cannot move money around,” explains Grégoire Mallard, Professor of Anthropology and Sociology at the Graduate Institute in Geneva. The situation becomes more complicated for global banks if financial sanctions are imposed by a single nation in line with their own objectives at the time, while other countries take a different position. “President Trump withdrew the US from the Joint Comprehensive Plan of Action (JCPOA) on Iran in 2018, and now the banks aren’t clear what to do. Should they follow European or US law?” asks Professor Mallard.

Digital photograph, 2016. Library of Congress Prints & Photographs Division

PROSANCT project

This topic is at the heart of the ERC-backed PROSANCT project, in which Professor Mallard and his colleagues are investigating the wider effects of financial sanctions, particularly those imposed unilaterally by the US. Much of this research is focused on sanctions against banks in sanctioned territories, like Iran, Venezuela and North Korea. “We are interested in the unintended effects of targeted financial sanctions, and their impact when they are complemented by other banking regulations. This could mean anti money-laundering regulations and transparency requirements, that push banks to reach quite a broad interpretation of what targeted sanctions mean,” says Professor Mallard. This research is built on analysis of sanctions, as well as interviews with bank compliance officers, diplomats and others. “How do compliance officers assess different categories of risk? Are they risk-averse, or are they willing to take the risk of punishment for processing a transaction for someone who may be linked to a sanctioned regime?” outlines Professor Mallard. “When private financial institutions become very risk-averse, it can have effects on the general population.”

Many banks in the US have decided to expel clients thought to have relations in Iran, for example, and have chosen to cut any ties. Over the last decade or so a number of banks have chosen to interpret financial sanctions in an extremely strict way. “They have started to de-risk, to the extent of prohibiting accounts for people with Iranian origins, for fear that they would still have ties with their communities in sanctioned territories,” says Professor Mallard. This can have wider effects in these sanctioned jurisdictions, for example in the supply of food and medicine, an issue that Professor Mallard is exploring in the project. “The UN have to work with the Syrian government to administer the delivery of food and medicine. It is very challenging for international organisations to do this in a conflict zone like Syria and maintain neutrality,” he outlines. “The situation is different with Iran. The Iranian economy is not in good shape, and financial sanctions are having an impact, but it’s not a conflict zone.”

The problem in supplying food and medicine to Iran emerges in arranging the finance. A European company may well be able to get a licence to export medicine to Iran, but Professor Mallard says it is difficult to then get a bank to deal with the transaction. “That’s where the banking exclusion may affect humanitarian supplies,” he explains. Regulatory bodies then have to reform the system to deal with these unintended consequences. “We essentially study these cycles of what we call transnational rule-making, which start with simple, targeted sanctions, such as to prohibit nuclear activities in Iran in the 2000s, or in the DPRK. We follow all the unintended consequences, and the new cycles of rule-making that have been created to deal with them,” continues Professor Mallard. “We aim to understand these recursive cycles of rule-making at the transnational level; we call it viral governance. We interview people involved in diplomacy, as well as people in the banking world, then use a technique called process tracing.”

Viral governance

This means trying to map all of the events recounted by interviewees that are thought to have been important in these different cycles of rule-making. Analysis of these cycles shows that they are related to each other, in what Professor Mallard describes as


PROSANCT
Bombs, Banks and Sanctions: A Sociology of the Transnational Legal Field of Nuclear Nonproliferation

Project Objectives

The project investigates the transnational field of sanctions from a sociological perspective by evaluating how sanctions are creating new rules for the banking sector. It assesses how the private sector is implementing policies to ensure financial transparency, which is, in turn, transforming the making of international law.

Project Funding

This project has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (Grant agreement No.716216 - PROSANCT).

Research Associates • Farzan Sabet • Erica Moret • Aurel Niederberger • Anna Hanson • Jin Sun

Contact Details

a viral process. “This is why we call this new mode of governance viral governance,” he explains. The wider picture here is a general shift away from the multi-lateral model associated with the Bretton Woods system established in 1944, towards a model based on hegemonic rule-making. “It’s basically one country, the US, that changes the laws and then uses multi-lateral organisations, multi-national corporations and global banks, to put pressure on all private actors to apply their laws,” says Professor Mallard. “That goes against the norm of national sovereignty, and the idea that foreign law does not have extra-territorial effects. It really changes the fabric of rule-making at the trans-national level.” The US market is central for all global banks, so they are under pressure to comply fully with US regulations in order to retain their licence to operate in the country. This is particularly the case for those European-American banks which were essentially bailed out by US cash in the aftermath of the 2008 financial crisis. “In exchange for this direct US help, they became more open to accepting US regulations, so they started acting as US banks everywhere,” explains Professor Mallard. This means complying with financial sanctions imposed by the US, even where it creates headaches for bank compliance officials. “Previously financial

sanctions were used on states like Iran and the DPRK, but now they’re increasingly used to threaten whoever disagrees with US foreign policy,” continues Professor Mallard. “The latest example is the declaration by the US that whoever helps the chief prosecutor of the International Criminal Court (ICC) in her investigation of alleged US war crimes in Afghanistan or Iraq, will be the target of US sanctions.” This is just one example of the way in which financial sanctions are being overused, believes Professor Mallard, which then creates problems for banks. If banks are not clear on which regulations they need to follow and which legal framework applies, then Professor Mallard says some may engage in deceptive practices. “This is what some Chinese banks are doing; they are in a sense continuing to participate in the oil trade between Iran and China. So they basically isolate some of their banks from the global system, which leads to a fragmentation of financial regulation,” he says. The project will make an important contribution to the debate in this area, with Professor Mallard working on a monograph and several articles in which he will further explore how the US has changed the regulation of financial markets. “We’re in a way moving towards a position where one legal system over-rules the others,” he warns.

Project Coordinator, Professor Grégoire Mallard Department of Anthropology and Sociology Director of Research The Graduate Institute GENEVA E: W: research-centres/global-governance-centre/ bombs-banks-and-sanctions W:

Professor Grégoire Mallard

Grégoire Mallard is Professor in the Department of Anthropology and Sociology and Director of Research at the Graduate Institute of International and Development Studies (Geneva). He is the author of Gift Exchange: The Transnational History of a Political Idea (Cambridge University Press, 2019) and Fallout: Nuclear Diplomacy in an Age of Global Fracture (University of Chicago Press, 2014).


“Let Them Eat Cake”: The Political Economy of Agrarian Republicanism

Contrary to popular belief, the phrase “Let them eat cake” was never actually uttered by Marie Antoinette: it was dreamed up by the Genevan political thinker Jean-Jacques Rousseau. We spoke to Professor Béla Kapossy about the eighteenth-century Swiss intellectual context in which Rousseau warned of a major European revolution caused by agricultural crisis and violent urban unrest.

Béla Kapossy, Professor of History at the University of Lausanne, has spent his career studying the intellectual history of eighteenth- and nineteenth-century Switzerland, encompassing French- and German-speaking cantons as well as wider international networks that included such figures as Edward Gibbon and the marquis de Mirabeau. These interests have led to a new FNS-funded project entitled “Enlightenment Agrarian Republics: From the Vaud to Poland and America,” which builds upon Kapossy’s previous work on the Economic Society of Bern. The Economic Society was a leading European center for the promotion of scientific agricultural practices, which contributed to significant increases in agricultural productivity during the eighteenth century. This “agricultural revolution” paved the way for the Industrial Revolution of the nineteenth century, by freeing up peasant labor to work in manufacturing. The “Enlightenment Agrarian Republics” project examines a different and complementary aspect of the Economic Society’s activities, dedicated to understanding the role of markets and governments in managing the transition to a balanced form of economic growth.

The Political Economy of Agriculture and Industry

The members of the Economic Society who came from the Pays de Vaud – the agrarian subject territory of the aristocratic republic of Bern – were prolific analysts of political economy, and their writings were translated and published throughout Europe as well as in America. “They saw that states like theirs couldn’t acquire the manufacturing industry that they desired unless they began by reforming the agricultural sector. Agricultural productivity was the ultimate limiting factor to industrialization, and this led into an array of complex questions relating to markets, prices, trade, taxation, legislation, finance, credit, land reform and what we would today call industrial policy,” Kapossy explains. If governments failed to strike the right balance, the consequences could be disastrous. Overheated and premature industrialization threatened to draw farmers off the land too rapidly, creating a toxic combination of urban unemployment and food shortages. “There was a paradox, in that food prices should in principle tend to be high, since everyone needs to eat, which should in turn stimulate investment in agriculture; but, at the same time, food prices can’t rise too high, because then the poor will starve, and also because high food prices will drive up labor costs and make exports uncompetitive, causing unemployment and more starvation.” This paradox lay at the heart of the Economic Society’s efforts to understand the role of the state in successfully unleashing and mastering the positive potential for exchange between town and country.

The ringleader of the Vaudois political economists, Elie Bertrand, was recruited in 1762 by the prominent Mniszech family of Poland to serve as tutor to their sons, then in their early twenties and destined to soon take up prominent roles in government. Kapossy decided that it would be worth studying more carefully the range of activities they engaged in during the subsequent years. “If you compare Bertrand to Adam Smith, for example, who at virtually the same moment became tutor to the young Duke of Buccleuch, accepting a kind of sinecure to accompany him on his Grand Tour, you can begin to see the differences. Bertrand organized a major research project around the Mniszechs, involving the leading lights of Swiss and French economic thought in new publishing ventures as well as designing a kind of ‘economic’ Grand Tour which intensively documented

and analyzed economic conditions across Europe. He wound up advising the Polish king and was invited to stay on in Poland, though he didn’t accept the offer in the end.” At the same time, Kapossy points out, there were also similarities. The Seven Years War had just ended, permitting a sudden opening up of travel and a flourishing of reform thinking and exchanges of ideas. Some of the Continental thinkers whom Smith famously encountered, including Voltaire and the Physiocrats, were connecting with the Mniszechs around the very same time.

“Our approach to the political and economic ideas of the Enlightenment is to examine how the grand theory informed, and was itself informed by, the experience of reformers who sought to address economic and social problems in various real-life contexts.”

Major changes were also occurring within Poland. In 1763 the Polish king died and an ambitious reformer was soon elected in his place. “Historians are very familiar with the philosophes like Voltaire and Diderot who cultivated relationships with so-called ‘enlightened despots’ like Catherine the Great of Russia and Frederick the Great of Prussia, or Joseph II of Austria. It is worth remembering that Poland belonged to the same reform networks,” Kapossy points out. Because Poland was a republic with an elective monarchy and a powerful diet, scholars must be carefully attuned to shifting coalitions of reformers and the domestic factions which opposed them. The powerful Mniszech clan, who were at the center of these struggles, resolved to send their sons to Western Europe to be trained by the leading minds of the Enlightenment, in order to prepare them to help guide Poland’s future reforms. They settled on Bern as their preferred destination, rather than the more fashionable Paris – a notable choice for Catholic Poles given that Bern was Protestant and Elie Bertrand was himself a minister. Their decision was influenced by the famous Swiss

natural law theorist Emer de Vattel, whose writings Professor Kapossy has previously edited. “Vattel spent the Seven Years War in Warsaw as advisor to the Polish crown, and he persuaded the Mniszechs that Bern, and particularly the Vaud and Lausanne, which was the cultural and intellectual capital, was where they would find the most vibrant and sophisticated intellectual scene.” Thus Bertrand, a friend of Vattel’s, was chosen as tutor, and the Mniszechs became unofficial Polish ambassadors to the Republic of Letters. As he was developing the idea of using Bertrand and the Mniszechs as a window onto

Lausanne, von Nordosten by Johann Jakob Biedermann (1783–1830) - Collection Gugelmann, Swiss National Library, Prints and Drawings Department. View of Lausanne and Lake Geneva in the background.


wider European reform networks and ideas, Kapossy invited Harvard Ph.D. student Graham Clure to join the project as a postdoctoral researcher. Dr. Clure was then in the final stages of his dissertation on Jean-Jacques Rousseau’s Considerations on the Government of Poland, the most famous of the many plans devised by Enlightenment thinkers for the economic and political reform of Poland. Clure had a natural interest in the project in part because his research on Polish reform discourses included figures such as the French Physiocrats and the abbé Gabriel Bonnot de Mably, who were themselves directly linked to the Economic Society of Bern. Moreover, Rousseau spent the period between 1762-65 living in Neuchâtel, on the border with Bern, where he had numerous contacts in the Economic Society. Although Rousseau and Bertrand had frosty personal relations and never collaborated, Rousseau influenced economic thinking in Bern and the Vaud, just as these ideas in turn constituted an important point of reference for one of the major texts that he composed during his time in Neuchâtel, a reform project for Corsica in which he began to outline the main themes that he would later develop in his Considerations on Poland. For Clure, the Enlightenment Agrarian Republics project provided an ideal setting in which to complete a book based on his dissertation, titled Rousseau’s Last Masterpiece: The Political Institutions of Poland. The project has also enabled him to pursue further research on reformers who were active in both Poland and America, such as Pierre-Samuel Du Pont de Nemours and Filippo Mazzei, important theorists of the French and American Revolutions who served at various times as advisors to both the Polish king and the future American president Thomas Jefferson.

In addition, the project brings together three doctoral students supervised by Professor Kapossy, Auguste Bertholet and Radosław Szymański of the University of Lausanne and Aris Della Fontana of the Scuola Normale Superiore of Pisa. Each had conducted previous master’s-level research that intersected with the themes of the project. Bertholet studied a co-founder of Physiocracy, the marquis de Mirabeau, through his extensive lifelong correspondence with his Vaudois friend, Frédéric de Sacconay. Kapossy discovered this previously unknown cache of letters in a private collection, and he and Bertholet are now editing them, both in digital form on the scholarly web project Lumières.Lausanne and as part of a collection of essays by leading Physiocracy scholars which will soon be published by Slatkine. Bertholet’s doctoral dissertation will provide a major study of both the Vaudois and international networks of Bertrand and the many authors with whom he collaborated, notably on a volume of essays called the Spirit of Legislation, which was translated into every

major European language during the 1760s. Szymański’s previous work on Enlightenment Polish reforms positions him to reconstruct Bertrand’s multi-year educational program with the Mniszechs. Traveling through and carefully observing many different European economies, the Mniszechs and Bertrand produced a vast number of manuscripts, now held in archives in Switzerland, France, Poland and Ukraine. In addition to composing detailed empirical analyses, they studied the range of different economic and political reform theories that they encountered on their travels, and Szymański has begun to publish important findings about their contributions to the Polish reception of Swiss natural law theory, German Cameralism, French Physiocracy, and more. Della Fontana studies the Italian reception of Vaudois political economy texts, such as the Spirit of Legislation, as part of a broader project on Venetian and Italian reform thought. Italians were among the most avid of Bertrand’s followers, and their efforts to apply his ideas in the context of

Bernardo Bellotto - View of Warsaw from the Royal Castle 1773.

1798 Berne Half Thaler. © Sincona AG

Far left: Front cover of Mémoires et Observations Recuellies par la Société Économique de Berne. Left: Passport to Poland of Pierre-Samuel Du Pont de Nemours, 1774, ID:20091029_4_002,Winterthur Manuscripts (Accession WMSS), Manuscripts and Archives Department, Hagley Museum and Library, Wilmington, DE 19807.



ENLIGHTENMENT AGRARIAN REPUBLICS
“Enlightenment Agrarian Republics: From the Vaud to Poland and America”

Project Objectives

The “Enlightenment Agrarian Republics” project studies eighteenth-century European and American reform theories from the perspective of countries with a predominantly agrarian economy. Their theoretical and practical contributions to Enlightenment debates on the relation between agriculture and manufacturing, town and country, were widely studied at the time and had a major influence on nineteenth-century theories of balanced growth.

Project Funding

Funded by the Swiss National Science Foundation (SNSF), grant number 100011_172846.

Project Partners

• Professor Iain McDaniel, University of Sussex, United Kingdom

E. I. du Pont Drawings of Powder Mills and Machinery, No. 047, Graining Mill, c. 1803-1830, ID: dupontdrawing_4700001, The Longwood Manuscripts (Accession LMSS), Manuscripts and Archives Department, Hagley Museum and Library, Wilmington, DE 19807.

the fading glory of the great Renaissance commercial republics provides a new way of thinking about the challenges of economic modernization in Italy.

Transnational Reform Networks
One of the original aims of the project was to build outwards from the Vaud as a way of gaining a better understanding of the concerns that animated European political and economic reform thought more generally. “The notion of ‘agrarian republics’ provides a loose shorthand for some of the main themes which unite our individual research projects,” Kapossy says. “Bern, Poland, the United States, Venice and Genoa were all republics, but there were still huge differences in their politics and their economies. By studying them together, we can identify separate nodes and linkages in a larger network. Initially, we started with the Polish-Swiss strands of Bertrand and Rousseau, then we added the Polish-American strand of Du Pont de Nemours, who, like Bertrand and Rousseau, was also very much part of a French and Swiss context. Now we’ve also included Italy, which connects to the others in a specific and interesting way. This provides a kind of common denominator, which allows us to begin by examining different reform languages in multiple overlapping contexts, proceeding to add in new contexts and pan out step by step.” The project itself is now entering its final year and, while the Covid-19 pandemic has disrupted the group’s activities – especially travel to foreign archives and libraries –

Professor Kapossy hopes to hold a conference next year in order to widen his team’s own transnational intellectual networks and establish new collaborations with junior and senior researchers in other countries, such as Portugal, Denmark, Sweden and Russia. There are multiple publications in the works, and Kapossy is already looking to the future and to subsequent projects which can harness digital tools to facilitate the study of economic reform thinking in a comparative context. One of the main results of the project is that it advances Kapossy’s longstanding interest in tracing the development of Enlightenment ideas into the nineteenth century, among successive generations of thinkers that included Du Pont de Nemours, Malthus, Hamilton, List, Tocqueville, Hegel and many more.

“Jean-Jacques Rousseau” by Jean-Antoine Houdon (1778).

Contact Details

Project Coordinator, Professor Béla Kapossy
Section d’Histoire
Université de Lausanne
Bâtiment Anthropole 5189
1015 Lausanne
T: +41 21 692 29 41
E:

Project Website
W:

Bertholet, Auguste, “The intellectual origins of Mirabeau”, History of European Ideas, 1–5, 2020.
Clure, Graham, “Rousseau, Diderot and the Spirit of Catherine the Great’s Reforms”, History of European Ideas, 41 (7): 883-908, 2015.
Szymanski, Radoslaw, “Vattel as an Intermediary Between the Economic Society of Berne and Poland”, in K. Stapelbroek & A. Trampus (eds.), The Legacy of Vattel’s Droit Des Gens, Palgrave Macmillan, 2019, pp. 29-52.
Kapossy, Béla et al. (eds.), Commerce and Peace in the Enlightenment, Cambridge University Press, 2017.
Kapossy, Béla et al. (eds.), Markets, Morals, Politics, Harvard University Press, 2018.

Professor Béla Kapossy

Béla Kapossy is Senior Professor of early-modern History at the University of Lausanne and currently Dean of the Collège des Humanités at EPFL. His work focuses mostly on the history of eighteenth- and nineteenth-century European and Swiss political and economic thought.


How should one live? A glimpse of agreement between Greece and China
The Greek and Chinese philosophical traditions both have roots in antiquity, and both try to work out how we should live. Overcoming disagreement on this subject, among others, using definitions is a key part of philosophy in both ancient civilisations. Richard King and Anders Sydskjør are looking at the use of definitions in the writings of Plato and Xunzi, an important contribution to the emerging field of Sino-Hellenic studies.

Ancient Greek philosophers developed the art of argument, one ancestor of what we call ‘logic’. This provides them with a framework for debate, teaching and science. Chinese philosophers argue, but mostly they are not interested in what argument is. “Although the main Chinese philosophical traditions give us many highly formalised arguments, they developed no rule-book. This has presented modern scholars with the problem of identifying how early Chinese arguments work,” explains Richard King, Professor of the History of Philosophy at the University of Berne. This topic is at the heart of an SNF project, in which King and his colleague Anders Sydskjør are comparing the two traditions. “One way of doing this is by choosing a class of linguistic elements common to both traditions. We have chosen definitions,” says Sydskjør.

Definition and disagreement
A clear definition, and thus a shared understanding of a term, provides a foundation on which philosophers can then debate, and thus refine their views. Clearly, some topics are hard to define. “For example, if you say, ‘let’s


talk about justice’, you will find that people disagree about what ‘justice’ means,” points out Sydskjør. So, definitions naturally go with disagreement. In the Western rhetorical tradition, definitions generally come either at the beginning of a discussion or at the end, yet even a casual observer of contemporary politics or academia will recognise that not everybody always follows the niceties of debate. “Much misunderstanding might be avoided if people were slightly keener to define their terms,” says Sydskjør. The way in which this activity was conducted differed between Greece and China, however, an issue that Sydskjør is exploring by analysing the use of definitions in the works of Plato and the Chinese philosopher Xunzi. Both addressed many topics in their work, including the Socratic question, ‘How should one live?’. “It is hard to evade the Socratic question, and ancient answers are clear and fresh, where contemporary philosophy all too often gets lost in technicalities,” he stresses. “One point about the Platonic Dialogues is that they induce you to philosophise. In them, we see people learning how to do philosophy. For Plato, philosophy is an activity, not a set of

doctrines which you learn and then practice.” Disagreement about terms is one beginning to this activity, and definition one of its crucial aims, an aim which often proves elusive. When attained, it provides knowledge. What is crucial for Plato is that definitions, like other statements, can be true. Thus, they can be used to ground arguments for action and belief. Much attention in the project is focused on Plato’s Charmides, a dialogue between Socrates and two relatives of Plato’s in which the meaning of a virtue, sophrosynê, ‘self-discipline’, is debated. One of these relatives is Critias, later part of an unsuccessful junta in Athens, and an important figure for Plato. “Both Critias and Charmides want political power, and self-discipline is notoriously hard for the power-hungry. And philosophy is intimately concerned with ruling, for Plato,” says Sydskjør. This is partly a legacy of the Sophists, a group who tried to influence statesmen in ancient Greece. “What the Sophists tried to tell people was, ‘we can make your life better’. That tended to mean, ‘we can make it possible for you to have great political power’,” he explains. “Plato is critical

EU Research

of the Sophists, and insists on the importance of knowledge, above all - on both a personal and political level - to the project of leading your life correctly.” Xunzi belonged to a group of scholars who travelled between courts advising rulers before the unification by Qin Shihuangdi of (part of) what is now called China in 221 BCE. So he is almost exclusively interested in the correct conduct of communal life. Many of his texts focus on the Socratic question, and ‘Live using knowledge!’ is a key part of his answer, but the context of that answer is very different from Plato’s. For Plato, definitions ground justifications for actions and opinions. The background is adversarial; make a claim or decide upon an action, and you risk being put on the spot to answer for it. For Xunzi, there are two problems to which definitions provide the answer. The first: how to produce smooth co-operation among people who specialise in different things? Shared definitions enable specialists to communicate with non-specialists, and allow both leaders and the led to know what the tasks are and when they have been carried out - a cooperative aim.

can become honourable members of society. In Xunzi’s view, earlier sage kings worked out how humans can live together, and we must follow them. “This involves a hierarchical society in which certain rites, religious rituals, play an important role. An individual’s understanding of how and why to fit into this hierarchical society, and the pleasure they take in these rites, are basically what Xunzi conceives of as making a good officer in this society,” outlines Sydskjør. Xunzi is addressing here the elite officers of large states, who were building careers in the official service. “These people had to be knowledgeable and disciplined to make sure that they conformed to the rites, the backbone of Xunzian society,” notes Sydskjør.

Sino-Hellenic studies
One contemporary backdrop to this research is the tension between China and the West. The globe is ever more interconnected by travel and trade, so Sydskjør believes it’s vital for philosophers to understand and compare different traditions in a scholarly way. “The important thing is to work towards a clear

“Live using knowledge!” is a key part of Xunzi’s answer to the Socratic question, but the context of that answer is very different from Plato’s.

The other problem has to do with Xunzi’s own central task – to convince decision-makers to accept his overall programme. Arguing for it against stiff opposition involves defining a special understanding of human nature, emotion, capacity, and knowledge. The final aim of his treatises is not definition, but the moulding of the reader, ‘massaged’ into accepting the views so clearly presented. His aim is persuasion more than justification. As a philosopher, Xunzi takes a certain view of how humans work, namely that we are born with desires which must be controlled so we

view of the relationship between these different traditions. Rational argument is a crucial meeting point – even if the styles of argument appear very different,” he says. The project contributes to this aim, with Anders Sydskjør writing a monograph on definitions and disagreement. “Many researchers are comparing Graeco-Roman and early Chinese culture. Deepening our understanding of how they argued, in addition to what they argued for, will be an important contribution to philosophy in the field of Sino-Hellenic studies,” he concludes.

DEFINITION AND DISAGREEMENT IN PLATO AND XUNZI
Definition and Disagreement in Plato and Xunzi

Project Objectives

The project focuses on the use of definitions by two thinkers, Plato and Xunzi. Plato is a central figure in the Greek philosophical tradition, while Xunzi made important contributions to Chinese philosophy. This research represents a contribution to the field of comparative ethics, yet it distinguishes itself from much previous investigation by adopting the angle that the use these authors made of definitions is best understood in the light of how they understand the nature of moral disagreement. The comparison between the two philosophers gains further traction by understanding moral disagreement in light of the Socratic question: how should we live?

Project Funding

This project is funded by the Swiss National Science Foundation.

Contact Details

Professor Richard King
University of Bern
Institute of Philosophy
Länggassstrasse 49a
3012 Bern
T: +41 78 216 80 80
E:
Anders Sydskjør
E:
W: projects/definition_and_disagreement_in_plato_and_xunzi/index_eng.html
W: about_us/staff/king/index_eng.html

Anders Sydskjør

Prof Richard King

Richard King is Professor of the History of Philosophy at the University of Berne. He has held positions at several institutions in Germany, the UK and Switzerland. He has written several monographs on Greek philosophers, and served as Vice-President of the International Society for Chinese Philosophy from 2016 to 2018.


US & EU relations: Change is Coming
How will President Biden’s policies and approach affect Europe? Will we now see closer collaborations in science, from climate change policies to re-joining the World Health Organisation, and how will the USA’s new leader align with European interests? By Richard Forsyth


As the European Parliament Liaison Office in Washington DC puts it: “the relationship between the EU and the US is one of the most important bilateral relationships in the world.” In terms of economies, military might, trade and politics, this relationship is a crucial mechanism in modern civilisation and a strong catalyst in shaping outcomes for our global future. Whilst the right-wing Republicans and the left-wing Democrats have always see-sawed in power, never has there been such an important election for America and indeed the whole world, and US voters knew it. Around 65% of eligible voters cast their ballot, which translates to more votes in the 2020 election than in any other in 100 years. It was a fight, but Biden was projected the winner when he built an uncatchable lead, despite Trump’s refusal to concede. The New York Times called the presidential election ‘an American cliff-hanger’. The policy differences that really stood out between the two presidential candidates were starkly opposed when it came to matters of science. Science is everything in the current era: ignoring it could be a catastrophe, while using it wisely to steer policy could save populations. The journal Nature conducted a survey in which around 86 percent of 579 scientists planning to vote in the US supported Biden, declaring the coronavirus and climate change their main issues. The Covid19 pandemic is escalating through winter, this is the last chance to slow climate impacts, and it is not an exaggeration to say that what we do in terms of environmental management on our planet in the next four years could make a huge impact on the next generations. With the pandemic and climate change as key issues today, Biden’s win


offers a little hope for common sense to be the guide and prevail.

Following simple rules for Covid19
Science should play the key role in tackling a pandemic. Making the population aware of how to behave to avoid transmission - with social distancing, mask wearing and hygiene - takes simple messages and strong, consistent leadership. Trump constantly and openly treated the virus as, at best, a nuisance and, at worst, nothing to fear, and was dismissive of promoting safety guidelines for Covid19. His administration held and welcomed rallies and gatherings without social distancing measures or mandatory mask wearing, creating confirmed ‘super spreader events’, all the while playing down the threat of the virus. The White House Office of Science and Technology even declared Donald Trump had ‘ended the Covid-19 pandemic’, calling it a major accomplishment, to the fury of many officials. Science became marginalised, mocked, ignored and used as a political tool by Trump personally, despite some of the efforts of those he tasked with fighting the pandemic. In contrast to his predecessor, Biden’s first use of his podium of power was to instruct Americans to wear masks as a ‘patriotic’ act, in an unambiguous message. It sets a tone that is strikingly different from the casual indifference of the Trump camp to these basic rules of care and of stopping person-to-person transmission. As if to confirm the administration’s approach was reckless, many White House staffers and the President himself caught Covid19. The error of Trump’s approach and leadership has been widely held in contempt by healthcare professionals and scientists on the frontline against this disease.

EU Research

Biden was Vice President of the Obama administration at the time of the Paris agreement, where Obama recognised the climate change threat, stating it as a matter of ‘national security’ because the rising sea levels and climate impacts would reshape America’s map and threaten many cities, populated areas and the economy. Importantly, Biden said in the run-up to the election that he ‘would listen to the scientists’. It is also speculated that re-joining the World Health Organisation is on the cards, after Trump announced he was pulling funding for the healthcare body with unsubstantiated accusations that it had somehow colluded with China against American interests. The disentanglement from the WHO has not technically happened yet but was set for July 2021, and Biden will likely seek as little disruption to continued involvement as possible. The WHO has always been the voice of healthcare for the world, and America’s involvement will be critical for it to function as well as it has as a monitor of global diseases and health concerns. Since the election, Pfizer and BioNTech have announced that they have developed a Covid19 vaccine with 90% effectiveness. When they broke the news of the achievement, they stated it was a ‘great day for science and humanity’. It is the first of many potential vaccines, some of which may be more effective and easier to transport (the initial vaccine needs to be stored and transported at a very cold minus 80 degrees Celsius). It offers genuine hope of thwarting the relentless spread of the disease and will feature in Biden’s response. What America does next with regards to Covid19 is critical. Biden

has the basis of a plan. Firstly, he has assembled a Covid19 task force of scientific experts to collaborate on and coordinate the response. Next, he will focus on increasing the supply and accessibility of PPE (Personal Protective Equipment), which is in short supply for healthcare workers. He intends to ramp up a national track and trace programme (with 100,000 people enrolled to make this possible), including doubling the number of drive-in test centres. He wants clear messaging for people on mask wearing and social distancing. A full national lockdown is also in the mix to cut the spread of the virus. A national lockdown of four weeks or more is one of the major ideas that Biden has floated after Trump insisted there would be no further lockdown, despite rising infections. Lockdowns are blunt tools for reducing the spread of the virus but seemingly the most effective way, beyond a vaccination programme, to put the brakes on a runaway infection rate. The greatest hope of all is, of course, the vaccine roll-out, which will be seen as the beginning of the end of the fight against Covid19. In its entirety, Biden’s approach mirrors that of many European countries. Whilst the pandemic is the most urgent global science problem the US is fighting, there is another one sitting behind it, with equally stark consequences if ignored.


Climate U-turn
When Joe Biden won the election in the US in November, it likely had a lot to do with his stance on climate change, which is the opposite of Trump’s position. America has real division around the subject of climate change, and conspiracy theories, climate change denial and all manner of different opinions on what is and isn’t true are rife. However, climate change is a scientific reality: it has been calculated that 97 percent of actively publishing climate scientists agree humans are causing global warming and climate change, a consensus shared by the leading scientific organisations in the world. For informed voters, it is therefore clearly a key voting criterion. In 2019, Trump made the shock announcement that he intended to pull America out of the historic Paris climate agreement, under which 196 countries pledged action to prevent worst-case climate change scenarios. Focusing on American interests in energy instead, Trump’s dismissal of the agreement as a ‘bad deal’ became a concerning challenge for other nations already invested in transformative infrastructure and feeling the impacts of climate change in their countries. Since the early days of his bid for America’s top job, Joe Biden declared that one of his first actions as President would be to reverse Trump’s decision to pull out of the Paris agreement and its ambitious global plan for climate action. Today, Biden, like most world leaders, is well aware of the projections, the data and the science, and comprehends the urgency for climate action and the impact an industrial giant like the US has on the world.
Biden, as a result, wants to go further than any other US leader in making the transformation to a carbon neutral America happen. Frustratingly, Trump’s withdrawal from the 2015 Paris climate agreement occurred on 4 November 2020, literally just after the


election. So, with Biden now elected, he has pledged to reverse this as a major priority, alongside tackling Covid19. It will mean the US will have to devise a new national emissions reduction pledge with a view to net zero carbon emissions by 2050, as part of a collaborative global aim to keep global warming under a 2 degrees Celsius rise. Biden’s plans are ambitious, with a proposed two trillion dollar investment in transforming millions of buildings to make them more energy efficient, a 400 billion dollar boost for renewable energy research over 10 years, a crackdown on corporate polluters and tax breaks for sustainable practices. He also wants a rethink on public transport with a switch to electric vehicles, aiming for 500,000 new electric vehicle-charging stations by the end of 2030. Whilst these are all enormous initiatives in scope and scale, a goal that stands out is for US electricity production to be totally carbon free by 2035 - a huge task in the space of 15 years. This is what Biden refers to as a ‘Clean Energy Revolution’. It reflects Obama’s plan to treat climate change as a way to create new jobs by growing and stimulating sustainable industries, viewing the huge changes not as an economic problem but as an economic opportunity. Europe’s green agenda and investments in sustainable research will play well into America’s interests during Biden’s four-year term. Innovations, research, technologies and new greener methods could now pique the interest of US industries, for potential collaborations and procurement. The implications of the election and Biden’s win mean a lot for the Paris agreement of 2015; it will make a huge difference at a crucial moment in time for choosing a pathway for our shared future climate.
It should be noted that there may well be logistical problems in pushing the new policies through and reversing the old ones in the Senate, where Republicans have a firm footing, but Biden has already stated that he is willing to use executive powers to keep up the pressure - perhaps the one and only tactic borrowed from his predecessor.

EU Research

The tech standoff
Whilst change is inevitable in many areas of policy around scientific pursuits, promoting increased collaboration between the EU and the US, there are also some clashing interests that will remain difficult to resolve amicably. One sore point in the relationship is the realm of digital technology. Europe has become more defensive about US dominance in digital, across social media, search, digital retail and cloud. Amazon, Facebook, Google - these giants of the tech industry have become the go-to spaces for Europeans online. It’s not just the dominance of big US tech firms that is irking European politicians; it’s also the fact they pay very little tax while making such huge sales in the European space. There is a push from Europe for digital taxes on these kinds of companies. It’s a sticking point which has led America under Trump to threaten tit-for-tat tariffs on other exports from Europe. With nearly 140 countries in talks led by the OECD (Organisation for Economic Cooperation and Development), there is a real effort to create global tax rules for digital times, and this could get ugly, whoever is running the US. There are also concerns that EU data is not safe with the US. For instance, the Court of Justice of the European Union concluded there was not adequate protection in light of US surveillance law.

A change in approach
Generally, however, the way the US engages with the rest of the world, particularly with science, could dramatically shift after the Trump era.

For one thing, a more open, less protectionist approach would be welcomed by industrial nations, not least for closer vaccine research collaborations. Robert-Jan Smits, former Director-General of the European Commission’s research and innovation directorate, is expecting more trust and openness from the Biden administration, with the hope of a ‘common pot of money for the best brains to work together’ and, in a best-case scenario, more re-engagement and communication with China. It’s been said that many research leaders and politicians in Europe are trying to work out what a Biden presidency means, but there is a general feeling that Biden is pro-Europe in outlook. Ursula von der Leyen, European Commission President, said the EU ‘stands ready to intensify cooperation with the new administration’ on their response to climate change and COVID-19. James Moran, Associate Senior Research Fellow at the Centre for European Policy Studies, was more to the point in saying scientific diplomacy had ‘been AWOL during the Trump administration’, but with ‘Biden’s longstanding belief in multilateralism and international cooperation, we can look forward to seeing Washington coming back to the table, and that can only be a good thing for Europe and the world at large’. These are very early days of the Biden era and, with Trump’s stubbornness frustrating a clean transition of power, a high level of scrutiny is on the Western world’s new leader. Will he rise to the promise? In the next four years we will find out.


Photo by ActionVance on Unsplash

A platform for growth in textiles
The textile industry has historically been an important part of the European manufacturing sector, but over recent years a large amount of production capacity has shifted away from the continent. The FBD_BModel project is developing a new interactive digital design platform that could help improve competitiveness and rejuvenate the European textile industry, as Dr Xianyi Zeng explains.

The textile and clothing industry has historically been an important part of the European economy, but over recent years a large amount of production capacity has shifted to countries with lower wage costs. An interactive, cloud-based digital design platform could rejuvenate the European textile industry and greatly enhance its competitiveness, a topic at the heart of the FBD_BModel project. “The wider aim is to optimise the resources in the textile supply chain,” says Dr Xianyi Zeng, the coordinator of the project. This work centres around building up a new digital technology platform to connect different parts of the supply chain, from fibre, yarn and fabric producers to garment manufacturers, for the development of a new, knowledge-based business model in the big data era. Renowned members of academia (Ecole Nationale Supérieure des Arts et Industries Textiles, University of Manchester, University of Borås, Deutsche Institute für Textil- und Faserforschung Denkendorf), representatives of the textile and


A Knowledge-based business model for small series fashion products by integrating customized innovative services in big data environment (Fashion Big Data Business Model)

Project Funding
The project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 761122.

Contact Details
Professor Xianyi Zeng, GEMTEX
2 allée Louise et Victor Champier
BP 30329
59056 ROUBAIX CEDEX 1 – FRANCE
T: +33 (0) 3 20 25 89 67
E:
W:
W:

Xianyi Zeng is a full professor at the ENSAIT Textile Engineering School, France, and Director of the GEMTEX National Laboratory. His main research interests include artificial intelligence, digital fashion, intelligent wearable systems, and supply chain management. He has published more than 130 papers in peer-reviewed international journals.


Digitalized 3D garments with rendered virtual fitting effects.

fashion industry (Bivolino, Beste, Azadora, Kuvera), technology and engineering providers (Fitizzy, Desap Entreprises and Grado Zero Espace) as well as a consulting firm (BeWarrant) are contributing to the ambitious goals of FBD_BModel. “We are developing a digital platform in the project. Small textile manufacturers already exist in Europe, but they are relatively isolated, they have no knowledge of different markets,” explains Dr Zeng.

The consumer can even be directly involved in the design process - one of the business-to-consumer scenarios being explored in the project - which Dr Zeng says represents a sharp departure from the conventional approach. A set of recommendation systems, taking account of fitting, hand feel, and wear comfort, has also been developed; these act effectively as virtual sales advisors. “The system takes account of a consumer’s sensory

Previously the consumer had to passively accept a new design; there was no interaction with the designer. Now, we have created a platform for the designer and the consumer to interact.

Digital platform
This represents a significant problem, which the FBD_BModel consortium is working to address through developing a digital platform that will enable different parties in the supply chain to collaborate, from fibre and yarn producers, fabric and garment manufacturers, to designers and production planners. “With a digital platform, partners can collaborate, for example in choosing the right suppliers,” outlines Dr Zeng. Researchers in the project are also developing intelligent data services, where databases from different parts of the supply chain are connected. “For example, brands and companies have data about the market, and this can be used to provide different data services,” continues Dr Zeng. “Also, a wizard to support managers’ decisions on supplier selection and dynamic planning of production orders through graphical simulations has been developed.”
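Decision support for supplier selection of this kind is often built on weighted multi-criteria scoring. The sketch below is purely illustrative and is not the project's actual wizard; the supplier names, criteria and weights are invented for the example.

```python
# Illustrative multi-criteria supplier scoring (hypothetical example;
# the FBD_BModel wizard itself is not published in this article).

def score_supplier(supplier, weights):
    """Weighted sum of normalised criteria (each in [0, 1], higher is better)."""
    return sum(weights[c] * supplier[c] for c in weights)

def rank_suppliers(suppliers, weights):
    """Return supplier names ordered from best to worst overall score."""
    return sorted(suppliers,
                  key=lambda name: score_supplier(suppliers[name], weights),
                  reverse=True)

# Hypothetical data: two candidate suppliers scored on three criteria.
suppliers = {
    "FabricCo":  {"cost": 0.8, "lead_time": 0.6, "quality": 0.9},
    "YarnWorks": {"cost": 0.9, "lead_time": 0.4, "quality": 0.7},
}
weights = {"cost": 0.3, "lead_time": 0.3, "quality": 0.4}

ranking = rank_suppliers(suppliers, weights)  # FabricCo scores 0.78, YarnWorks 0.67
```

A real planning wizard would add constraints such as capacity and delivery dates, but the same scoring idea usually sits at its core.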

preferences and their specific requirements. It then proposes the best-matching products,” outlines Dr Zeng. The technology platform and data services have been developed over the course of the project; now Dr Zeng and his colleagues are turning their attention to how they will be used. The platform and the data services will be applied to four different business cases using new business models. “One is about the personalised design of shirts for men. Our platform is being used to provide more services to consumers,” says Dr Zeng. The wider aim here is to enhance the competitiveness of the European textile industry, and so encourage its further growth and development. “While labour costs are relatively high, the advantage for the European economy is that there is the possibility to use the power of technology to improve the efficiency of traditional industry,” says Dr Zeng.
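One simple way to picture such a recommender: represent each product and the consumer's preference profile as points in the same sensory feature space, then propose the closest products. This is a minimal sketch, assuming Euclidean distance over invented features; it is not the project's actual system.

```python
# Hedged sketch of preference-based matching: consumer profile and
# products share a "sensory" feature space (values on a 0-1 scale);
# the best match minimises distance. Feature and product names are
# hypothetical, chosen only for illustration.
import math

def distance(profile, product):
    """Euclidean distance between a preference profile and a product."""
    return math.sqrt(sum((profile[f] - product[f]) ** 2 for f in profile))

def recommend(profile, catalogue, top_n=1):
    """Return the top_n product names closest to the consumer's preferences."""
    return sorted(catalogue,
                  key=lambda name: distance(profile, catalogue[name]))[:top_n]

catalogue = {
    "shirt_a": {"softness": 0.9, "stiffness": 0.2, "warmth": 0.5},
    "shirt_b": {"softness": 0.3, "stiffness": 0.8, "warmth": 0.4},
}
consumer = {"softness": 0.8, "stiffness": 0.3, "warmth": 0.5}

best = recommend(consumer, catalogue)  # shirt_a is the closer match
```

Production systems typically weight features by importance and fold in fit and size constraints, but the nearest-match principle stays the same.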

EU Research

Cash transfers and the fight for decent work A lot of attention is focused on issues around exploitative labour, with researchers exploring ways to aid the fight for decent work. The WorkFREE project aims to build a deeper picture of circumstances on the ground, investigating how the combination of cash transfers and participatory action research can help, as Dr Neil Howard explains.

The reality of work today for many people involves providing their labour in difficult, undignified and dangerous circumstances. While work like waste picking has been labelled as exploitative by some observers, many people nevertheless choose to do it, though their options are often severely limited. “People are agents, yet that agency is seriously circumscribed. Structural pressures place major limits on what people can and cannot choose,” explains Dr Neil Howard. As the Principal Investigator of the WorkFREE project, Dr Howard is now investigating whether providing cash transfers (or what some would call a ‘basic income’) to a community involved in waste-picking in Hyderabad, India will enhance their freedom and widen the opportunities available to them. “If we can understand freedom as being partly about the ability to say no to things, then the idea of ensuring that people have enough money in their pocket is that this should increase their ability to say no to the worst forms of work,” he says.

There’s an impressive tradition in action research of essentially reversing the order of doing things and saying; ‘Ok, we are here with you in solidarity, rather than to offer you any charity.’

Working choices The WorkFREE social experiment has two arms to it, only one of which centres around this intuitive idea that if people have more money, they’ll be able to make different choices in relation to their work. The second arm of the experiment involves participatory action research and more grass-roots support for collective action. “The hypothesis behind this is that traditional policy-making around phenomena like so-called modern slavery or exploitation remains very top-down, even though top-down approaches are documented to be ineffective,” explains Dr Howard. “There’s an impressive tradition in action research of essentially reversing the order of doing things and saying; ‘Ok, we are here with you in solidarity, rather than to offer you any charity’.” The difference is that whereas traditional policy-makers come with a pre-conceived idea of the right course of action, action research involves asking rather than telling. By asking people what they see as their problems and what they need to improve their wellbeing, Dr Howard hopes that WorkFREE’s approach to action research will both model a different way of understanding and engaging with exploitative forms of livelihood and lay the foundations for sustainable change within participant communities beyond the life of the project. “It’s about asking people questions such as; what skills are there here that we can draw upon? What support might you need to use them well?” he outlines. The cash transfers will be delivered over an 18-month period, while the action research will be undertaken over two full years, providing a basis for researchers to investigate a number of scientific questions. “Can we support people to experience more of ‘the power to say no’? How can we theoretically define concepts like exploitation and freedom together with people who are typically excluded from that kind of discussion?” asks Dr Howard.

This project is still at a relatively early stage, and the priority at the moment is establishing partnerships and building relationships with the participant communities. The research will be conducted in collaboration with several respected Indian institutions, including the Indian Network for Basic Income (INBI), the Initiative for What Works to Advance Women and Girls in the Economy (IWWAGE), and the Montfort Social Institute (MSI). Dr Howard hopes the project’s findings will advance academic debates around labour unfreedom and help inform future development policy. “Members of the research team in India are very well connected with the Indian policy establishment, while the Europe-based team have similar connections there,” he says.

WorkFREE
Slavery, Work and Freedom: What Can Cash Transfers Contribute to the Fight for Decent Work?
Dr Neil Howard, Prize Fellow, University of Bath
Beyond Trafficking and Slavery
T: +44 7456 373 272
E:
W: https://researchportal. workfree-erc-startinggrant-transfer-in

Dr Neil Howard is Prize Fellow at the University of Bath and PI of the ERC Starting Grant, WorkFREE. He researches exploitative work and efforts to prevent it.


Taking the temperature of social media

Figure: Circulation of opinions, showing antagonistic and constructive relations between groups.

Social media plays an important role today in shaping our views, and the analysis of online platforms can provide a better understanding of public opinion. We spoke to Dr Eckehard Olbrich about the work of the Odycceus project, which seeks ways to both tap into the information swirling around on social media and use it to help resolve conflicts.


Odycceus project

The major social media platforms and discussion forums can provide a snapshot of public opinion on all manner of topics, from climate change to geopolitics and everything in between. News stories often provoke debate among social media users, and this can become heated when people encounter differing world views or opinions they deem to be offensive, an issue central to the work of the Odycceus project, an EU-backed initiative which brings together eight academic partners from across Europe. “Under what conditions does debate become hostile? What possibilities are there to resolve that?” asks Dr Eckehard Olbrich, the project’s coordinator. The wider aim in the project is to develop methods to analyse the content of debate on social media platforms and take the temperature of public opinion, work which is built around a few main research pillars. “In one pillar of the project we are developing theoretical concepts such as projective game theory for understanding cultural differences,” says Dr Olbrich.

The other pillars in the project centre around modelling, text analysis and developing tools to help scientists, journalists and the general public deal with the huge amount of information and opinion that is available via social media and discussion forums. Researchers in the project are analysing a variety of different text corpora as part of this wider goal. “We’re using different techniques including topic modelling, word embeddings and semantic frame extraction to analyse these data. Our partners at Ca’ Foscari in Venice and at the University of Amsterdam are working with word embeddings,” says Dr Olbrich. “A Summer School was held last year as part of the project, in which participants developed media monitoring tools and investigated their potential applications, for example in monitoring the spread of hate speech and excitable speech.”

Researchers in the project are also employing these methods to study the use of antisemitic language. Researchers at the Ca’ Foscari University in Venice use diachronic word embeddings to analyse a corpus of French documents from the period between 1789 and 1914, observing the different dimensions of antisemitic discourse (economic, social, religious, racial, conspiratorial, ethical). “For instance, it can be used for analysing electoral manifestos, but we have also used it for the analysis of Twitter data,” continues Dr Olbrich. One group of researchers in the project analysed large volumes of Twitter data on the climate change debate. “We used our methods to understand what the different groups on Twitter – like the media, NGOs, climate-change sceptics, activists – discuss, and how they engage with each other,” outlines Dr Olbrich. “If you look at who re-tweets whom, then do a network representation of that and look at the clusters, then you can analyse how a debate evolves.” A tool called the twitter explorer has been developed during the project to represent different strands of opinion in a visual, easily accessible way, from which researchers can then look to build a deeper understanding of the dynamics of a debate.
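The retweet-network analysis Dr Olbrich describes can be sketched in simplified form: represent ‘who retweets whom’ as a graph and group users into clusters. Here plain connected components stand in for the community-detection methods a real analysis would use, and the tweet data is invented:

```python
# Simplified sketch of the retweet-network idea: an undirected graph of
# "who retweets whom", clustered into connected components. A real analysis
# would use community detection on a directed, weighted graph.
from collections import defaultdict

retweets = [  # (retweeter, original author) - invented examples
    ("activist1", "ngo_account"), ("activist2", "ngo_account"),
    ("sceptic1", "sceptic_blog"), ("sceptic2", "sceptic_blog"),
    ("activist2", "activist1"),
]

graph = defaultdict(set)
for a, b in retweets:
    graph[a].add(b)
    graph[b].add(a)

def clusters(graph):
    """Connected components via iterative depth-first search."""
    seen, out = set(), []
    for start in graph:
        if start in seen:
            continue
        comp, stack = set(), [start]
        while stack:
            node = stack.pop()
            if node in comp:
                continue
            comp.add(node)
            stack.extend(graph[node] - comp)
        seen |= comp
        out.append(comp)
    return out

for c in clusters(graph):
    print(sorted(c))  # two clusters: the activist/NGO group and the sceptic group
```

Even this toy version shows the principle behind the twitter explorer visualisations: clusters in the retweet graph correspond to distinct camps in a debate.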


While in the past people may have relied on newspapers, TV and radio for information about current affairs, nowadays people may be influenced by internet memes from a variety of different sources, another topic of interest to Dr Olbrich and his colleagues in the project. “One group in the project, in the digital methods initiative based at the University of Amsterdam, did a lot of research on 4chan, a website which generates a lot of new memes. 4chan is an image board, a platform for exchange,” he says. “A conspiracy theory called QAnon comes from 4chan, a lot of fringe communities are there. It’s the source of a lot of alt-right ideas and memes. We’re also looking at other platforms which have attracted less research attention.” The spread of these ideas and memes has been compared by some observers to the spread of a virus, an analogy of interest to some researchers in the systems science field. However, while many of us have grown used to limiting social contact to prevent the spread of Covid-19, limiting the spread of ideas runs the risk of stifling public debate. “In our models we study polarisation. These models are related more to attitude changes and less to information diffusion. We are currently looking at how to effectively combine the two,” he explains. The idea of complex contagion is important here. “This basically means that ideas are not spread simply through this social contact process. The extent to which an idea can be transmitted is governed by a more complex process,” continues Dr Olbrich. “The analogy of the spread of a virus is however still relevant to these complex contagion models, and our project partners in Amsterdam are exploring these ideas.”
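The notion of complex contagion can be sketched with a simple threshold model, in which a person adopts an idea only once a given fraction of their contacts has done so. The network and threshold below are illustrative assumptions, not Odycceus model parameters:

```python
# Minimal sketch of "complex contagion": unlike a virus, an idea is adopted
# only once a threshold fraction of a person's contacts already hold it.

def spread(neighbours, adopters, threshold=0.5, rounds=10):
    """Iterate threshold-based adoption until no new adopters appear."""
    adopters = set(adopters)
    for _ in range(rounds):
        new = {
            person for person, contacts in neighbours.items()
            if person not in adopters
            and sum(c in adopters for c in contacts) / len(contacts) >= threshold
        }
        if not new:
            break
        adopters |= new
    return adopters

# A small network: tightly-knit group A-B-C, with D and E attached in a chain.
neighbours = {
    "A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B", "D"],
    "D": ["C", "E"], "E": ["D"],
}
print(sorted(spread(neighbours, {"A", "B"})))  # two core seeds reach everyone
print(sorted(spread(neighbours, {"E"})))       # a peripheral seed stalls after D
```

The contrast between the two runs is the key point: where an infection needs only one contact, an idea in this model needs social reinforcement, so the same network can block or carry it depending on where it starts.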

Penelope platform A number of different tools and methods have been developed in the project which will be integrated into an open modular platform called Penelope, along with other components from outside Odycceus. The idea is that these components, essentially web services that can be accessed via APIs, can then be incorporated in different applications. “One component for instance is a causal frame detector; you have a text, and you try to identify causal arguments. We have applied it to the climate change debate, but it can also be applied in other contexts,” says Dr Olbrich. These interfaces can be used for different kinds of analysis, while Dr Olbrich and his colleagues are also working to develop an interface that can be used in a more intuitive way. Researchers are also developing several interfaces that use the Penelope components, including a climate change opinion observatory, 4CAT and the twitter explorer, which are designed to help citizens assess the overall picture with respect to a certain debate. While a lot of effort currently goes into filtering content in online debate, Dr Olbrich and his colleagues are thinking more about what has to be done to improve the quality of debate, which is the central idea behind the Opinion Facilitator – another tool developed by the project. “This is about essentially making the kind of representations that different sides of a debate use more transparent,” he says. The priority in the project is scientific investigation and technical development, yet Dr Olbrich is very much aware of the wider context in terms of understanding the opinions and content swirling around on social media. “At this stage, we are focused on getting the science right. But if we achieve what we want to, then in future we may engage with digital media organisations,” he says.
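As a toy illustration of what a causal frame detector does (the patterns and example text below are invented; the actual Penelope component is far more sophisticated):

```python
# Toy causal frame detection: scan text for cause-effect connectives and pull
# out the linked clauses. Patterns here are illustrative assumptions only.
import re

CAUSAL_PATTERNS = [
    r"(?P<cause>[^.,;]+) leads to (?P<effect>[^.,;]+)",
    r"(?P<effect>[^.,;]+) because of (?P<cause>[^.,;]+)",
]

def detect_causal_frames(text):
    """Return a list of {'cause': ..., 'effect': ...} dicts found in text."""
    frames = []
    for pattern in CAUSAL_PATTERNS:
        for m in re.finditer(pattern, text, flags=re.IGNORECASE):
            frames.append({"cause": m.group("cause").strip(),
                           "effect": m.group("effect").strip()})
    return frames

text = "Rising CO2 leads to warming oceans. Sea levels rise because of melting ice."
for frame in detect_causal_frames(text):
    print(frame["cause"], "->", frame["effect"])
```

Exposing such components as web services, as Penelope does, means the same detector can be reused across debates without each application reimplementing the analysis.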

We use our methods to understand what the different groups on Twitter – like the media, NGOs, climate-change sceptics, and activists – discuss, and how they engage with each other.

ODYCCEUS Opinion Dynamics and Cultural Conflict in European Spaces Project Objectives

The growth of social media is having far-reaching effects on how both individuals and communities communicate with each other and organise. Can the information and opinions swirling around on social media be tapped in order to provide deeper insights into the challenges facing modern society? The ODYCCEUS project combines theoretical, modelling and empirical research, together with the development of new tools and methods to tap into social media. This could offer a way to monitor diverging opinions and detect and address social problems before they become acute.

Project Funding

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 732942. (Overall budget €5,817,276.25)

Project Partners

• Max Planck Institute for Mathematics in the Sciences (Leipzig)
• Università Ca’ Foscari Venezia, Departments: Dipartimento di Management, Dipartimento di Studi Umanistici
• Chalmers University of Technology (Gothenburg), Departments: Energy and Environment
• Sorbonne Université (Paris), Laboratoire d’Informatique de Paris 6 (LIP6)
• Université de Paris, Departments: Géographie
• Vrije Universiteit Brussel, Artificial Intelligence Lab
• Universiteit van Amsterdam, Amsterdam Institute for Social Science Research (AISSR) and the Digital Methods Initiative
• Universität Leipzig, Institute of Sociology

Contact Details

Project Coordinator, Dr. Eckehard Olbrich
Max Planck Institute for Mathematics in the Sciences
Inselstrasse 22, D-04103 Leipzig, Germany
T: +49 341 9959 568
E:
W:

Eckehard Olbrich

Eckehard Olbrich is group leader at the Max Planck Institute for Mathematics in the Sciences (MPIMiS). He studied physics at the TU Dresden (Germany) and has a PhD in theoretical solid state physics. He is working on several aspects of complex systems theory, such as information decomposition, complex networks, game theory and mathematical modeling of social dynamics and communication with a focus on data analysis.


Metro Networks in the 5G Era

Overall network schematic showing access, metro and core regions.

New 5G networks will not only provide greater bandwidth but also solve problems with existing connectivity, providing additional functionality for a range of new vertical industries. We spoke to Professor Andrew Lord and Dr Daniel King about the work of the METRO-HAUL project in designing and building scalable infrastructure that serves the needs of these potential new applications.

The introduction of 5G telecommunications networks will provide not only greater bandwidth to consumers and businesses, but also bring additional connectivity and functionality to new vertical industries. While the provision of increased bandwidth to radio devices is an essential driver behind the ongoing development of 5G, these new, highly-specialised vertical industries may have other priorities. “They might need very low latency, low jitter, or the ability to perform compute and storage functions, which they might want to set up very quickly,” outlines Professor Andrew Lord. As the Principal Investigator of the METRO-HAUL project, Professor Lord is working to design and build a smart optical metro infrastructure to support different applications. “In METRO-HAUL, we’re looking at how you architect and build a cost-effective and energy-efficient metro network that can meet the challenging requirements for the upcoming vertical industries in the new 5G era,” he explains.

A metro network is just one part of the broader telecommunications infrastructure, which can be compared to the structure of a tree. The core network can be thought of as the trunk, and the metro network as the branches that spread out from it, while the access network is the leaves of the tree which interconnect with customers. “The core network is the big pipes that bring traffic to data centres and internet exchange points around the country. The metro network takes traffic from the access network into central locations such as regional data centres or moves it on to the core network,” explains Professor Lord. The nature of the traffic emanating from the access networks is expected to change significantly with the introduction of 5G, a topic central to the METRO-HAUL project. During the project, attention focused on architecting nodes to interconnect the metro network with the core and access networks. “The access metro edge node (AMEN) is the interface between the access and the metro networks, while the metro core edge node (MCEN) is the interface between the metro and the core networks,” continues Professor Lord. “Those AMENs and MCENs potentially have the IT resources necessary to host both the content and the function necessary to deliver 5G applications.”

METRO-HAUL project The project’s overall agenda also encompasses several other strands of research around the central topic of a metro network, with Professor Lord and his colleagues working to develop a variety of technologies, including optical transmission technologies and overall service orchestration software. Providing services such as augmented reality and holographic communications alongside residential and business internet on a single infrastructure is very difficult: another major topic of interest in the project. “From a networking perspective, companies would previously have simply built different networks to provide these different types of services. This model is not sustainable anymore, especially with exponential growth of bandwidth demand,” says Dr Daniel King, a researcher also working on the project. “One of the goals of METRO-HAUL is to develop a shared optical infrastructure, which is highly malleable, supporting varying traffic characteristics for vertical services.” The project’s work developing a Control, Orchestration and Management (COM)


system has been central to this objective. In a 5G network, services and applications run on top of the optical layer infrastructure, which is an essential consideration in terms of partitioning infrastructure resources, known as network slicing. “Network orchestration and IT control need to come together in order to assign and slice the resources to deliver services,” explains Dr King. This idea of network slicing has attracted much attention as a means of meeting the requirements of particular services. “You essentially partition the physical resource, the network, into multiple virtual slices and instantiate IT resources where required, to meet not just the needs of 5G-specific services, but this could serve other types of fixed network services as well,” says Dr King. “This infrastructure flexibility is an enabler for achieving some of the 5G KPIs, as well as for meeting the techno-economic requirements of future networks.” A flexible network capable of controlling resources when required, including IT and transport resources, leads to significant savings in infrastructure and operating costs, as it can be reconfigured for different purposes and to address varying traffic demands. The sliced aspect addresses the requirements of different use cases and applications; it also provides partitioning and isolation of customer and service traffic. Various options are available in terms of the hardware design of a metro transport network, a topic Professor Lord and his colleagues have explored in the project. “One option might be to go and buy big complex boxes from a specific manufacturer, but this often locks the network to one type of equipment. We wanted to embrace function separation and break the network into smaller components using commercial off-the-shelf components, generic servers and white box switches, and glue them together ourselves – so-called disaggregation,” he says. Equipment today is increasingly intelligent, which opens up further possibilities. “We’ve been trying to find ways to enable operators to include new, exciting technology, and not necessarily just from the incumbent vendor,” continues Professor Lord. “You do that by making sure that all of these little bits are intelligent in the ways that they communicate.”

Management of 5G network slicing as part of end-to-end network orchestration.

Recent developments in photonic switching, which can now be implemented using photonic integrated circuits to reduce costs and power consumption, provide a further example of the opportunities of disaggregation. While photonic switching is well suited to a core network, it is overly expensive and, Professor Lord says, vendors don’t have many different offerings for each customer. Somewhere between the two extremes of everything being supplied by a single vendor, and the networks being built entirely independently, there is a happy medium with the right amount of white box disaggregation. “In this case, we can be more agile, and we can use these innovative new technologies that are emerging,” outlines Professor Lord.

Intelligent systems have been deployed in the network that constantly monitor the network performance against different requirements for each unique type of service.
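The slicing idea can be illustrated with a minimal sketch: partitioning one link’s capacity into isolated virtual slices with per-service guarantees. The class, figures and service names below are invented for illustration, not drawn from the METRO-HAUL software:

```python
# Rough sketch of network slicing: reserve isolated shares of one physical
# link's bandwidth for different services, each with its own latency bound.

class Link:
    def __init__(self, capacity_gbps):
        self.capacity = capacity_gbps
        self.slices = {}

    def carve_slice(self, name, gbps, max_latency_ms):
        """Reserve an isolated share of the physical resource for one service."""
        if gbps > self.free():
            raise ValueError(f"not enough capacity for slice '{name}'")
        self.slices[name] = {"gbps": gbps, "max_latency_ms": max_latency_ms}

    def free(self):
        """Capacity not yet committed to any slice."""
        return self.capacity - sum(s["gbps"] for s in self.slices.values())

link = Link(capacity_gbps=100)
link.carve_slice("residential-broadband", gbps=60, max_latency_ms=50)
link.carve_slice("remote-surgery", gbps=5, max_latency_ms=1)  # low rate, strict latency
link.carve_slice("augmented-reality", gbps=25, max_latency_ms=10)
print(link.free())  # 10 Gb/s left for future slices
```

The point of the sketch is the isolation: the remote-surgery slice is small in bandwidth terms but carries a latency guarantee the shared best-effort traffic never could.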

It is about delivering one of the cornerstones of software defined networking (SDN), namely the separation of control and forwarding planes, and enabling a multivendor capability. Another very important part of the project centred around network monitoring, an area which Professor Lord says has developed dramatically over the last 20 years or so. “In the past, we just didn’t have the capability to monitor the entire network performance in any great detail, but now it seems you can measure just about everything,” he says. While this enhanced capability has much promise, the data gathered is vast and must be dealt with appropriately to make informed decisions automatically. “Part of the project was about building an architecture able to take all this data and handle it in the right way

using distributed machine intelligence,” says Professor Lord. “You don’t necessarily want all that performance data going to a central point and overwhelming human operators.” The architecture has been designed to gather information and process it at the most appropriate point in the network, and then make decisions about network operation. Data can be gathered from different components, and knowledge of which component originated the data may be relevant for decisions on what actions to take. “It’s an event-driven architecture, in contrast to the traditional network infrastructure, which is much more reactive,” explains Dr King. It is not just about collecting vast amounts of information at different layers and points in time, but also analysing it and then making the right decisions based on conditions. “We’ve got the capability to collect the data, store and analyze it. We use machine learning techniques to improve network control – sometimes you need an algorithm that’s good at spotting behaviour that may lead to failure, but at other times you may want to instantiate a network function in a new part of the network to improve traffic efficiency,” continues Dr King. This type of intelligent monitoring allows the metro network to autonomously react to performance degradations, to act proactively before faults occur, and to automatically reconfigure the network to accommodate new services with special requirements as demand changes. This may depend on the types of services for which a network is being used. Remote surgery, where a 5G network is used to enable doctors to perform operations remotely with robots, is one service that has generated a lot of interest. “With remote surgery, we can have a primary and a back-up path and IT nodes that host medical applications. One problem, however, is minimising the delay variation between them. A system has


METRO-HAUL: METRO High bandwidth, 5G Application-aware optical network, with edge storage, compUte and low Latency

Project Objectives

The aim of Metro-Haul is to design a smart optical metro infrastructure to support heterogeneous 5G access networks, addressing future capacity increases and characteristics like latency and jitter. This infrastructure supports a wide variety of services and use cases, with special emphasis on services from vertical industries outside the traditional ICT sector.

Project Funding

Funded under H2020-ICT-2016-2 RIA (Research and Innovative Action), Grant Agreement Number: 761727.

Project Partners

• 21 Project Partners

Contact Details

Andrew Lord
Senior Manager, Optical Networks Research
British Telecommunications Public Limited Company
Orion 5, Adastral Park, Martlesham Heath, Ipswich, IP5 3RE, United Kingdom
E:
W:

Daniel King

Andrew Lord

Andrew Lord heads BT’s optical core and access research including quantum communications. He has worked on a wide range of optical network systems and technologies, including long haul subsea and terrestrial DWDM networks. He has published over 100 research papers and holds a BA in Physics from Oxford University.

to be capable of seeing the delay variation between a primary and a back-up path and the IT resources, and if a particular bound is exceeded, then a new path or a new back-up path has to be created,” says Dr King. It is simply not possible for humans to perform this kind of real-time monitoring and control, Dr King notes. “Intelligent systems have been deployed in the network that constantly monitor the network performance against different requirements for each unique type of service,” he explains.
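The delay-variation check Dr King describes can be sketched in a few lines; the bound and delay samples below are invented for illustration, not the project’s monitoring code:

```python
# Illustrative check: monitor the delay difference between a primary and a
# back-up path and flag when it exceeds the service's bound, the trigger for
# computing a new back-up path.
from statistics import mean

def delay_variation(primary_ms, backup_ms):
    """Difference between the average one-way delays of the two paths."""
    return abs(mean(primary_ms) - mean(backup_ms))

def needs_new_backup(primary_ms, backup_ms, bound_ms):
    """True when the paths have drifted apart beyond the service's bound."""
    return delay_variation(primary_ms, backup_ms) > bound_ms

primary = [2.1, 2.0, 2.2]  # recent delay samples, ms
backup = [4.9, 5.1, 5.0]
print(needs_new_backup(primary, backup, bound_ms=2.0))  # True: re-path required
```

In the deployed system this decision runs continuously and automatically per service; the sketch only shows the shape of the per-sample test.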

In METRO-HAUL, we looked at how you architect and build a cost-effective and energy-efficient metro network that can meet the challenging requirements for the upcoming vertical industries in the new 5G era.

Energy efficiency The overall energy efficiency of this infrastructure is also an important consideration. Historically, network operators have responded to demand for increased capacity by investing more money and installing new equipment in their networks. Professor Lord says this kind of approach will eventually prove to be unviable. “That won’t work in the future, so we have to get canny. While some problems can be solved by throwing cash at them, that’s going to break eventually,” he acknowledges. This traditional strategy leads to higher operational costs, eventually reaching a crunch point when work is needed to upgrade the facilities hosting the network equipment in order to accommodate the increased equipment footprint and power consumption. A new, intelligent layer is required that effectively brings everything together, end-to-end, and enables more effective communication, reducing the need for additional network equipment for specific services. “This layer can orchestrate different sectors, such as transmission and networking, and help them communicate,” explains Professor Lord. “This may not necessarily cut overall energy consumption, but at least it stops it growing exponentially.”

A large number of research papers have been published over the course of the project on a variety of topics, including the physical layer, control plane, and new monitoring and control architectures. A lot of progress has been made on various technical issues, while Professor Lord is also thinking about the bigger picture. “We’ve taken all of the innovations, costed them and put them together, and we’ve asked: what difference can it make? How can we quantify the difference METRO-HAUL brings about? The only way you can really do that, in the end, is techno-economically,” he explains. This is part of the progress towards more intelligent networks, which Professor Lord believes will make a tangible difference to users. “In the future intelligent network, we want network resources to be available and shared for all the different applications,” he concludes.

Daniel King is a consultant at Old Dog Consulting and Senior Researcher Associate at Lancaster University. He worked previously in early-stage roles for leading technology companies including Bell Labs, Cisco Systems, Redback Networks, Movaz Networks, and was a co-founder of Aria Networks. He is an active leader, researcher and contributor to open Internet and optical standards.



Harnessing the power of earth observations

The Copernicus Sentinel-6 mission comprises two identical satellites launched five years apart. It not only serves Copernicus, but also the international climate community. © ESA/ATG Medialab


Satellites continuously orbit the earth and gather huge amounts of data, which can be highly relevant for decisions in areas like flood risk assessment and food security. We spoke to Dr Guido Vingione and Maria Gabriella Scarpino about the EOPEN project’s work in developing an open platform to enable non-expert users to make full use of satellite data.

The Copernicus programme systematically monitors the globe through a constellation of satellites named Sentinels, which continuously acquire imagery and data on our planet. This data is enormously relevant in areas like flood risk assessment, food security and climate change mitigation, yet it must be presented in an accessible way if it is to meet the needs of end-users, a topic central to the EOPEN project. “EOPEN is essentially a platform targeted at non-expert users who want to easily access this big volume of Copernicus data,” says Dr Guido Vingione, the project’s coordinator. Three use cases have been identified, where satellite data could help inform decision-making on the ground. “The first is flood risk assessment and prevention in an area under the Eastern Alps Water District Authority, a partner in the project, particularly focussed on Vicenza, in the Veneto region. When there is a heightened flood risk, satellite data can provide mapping and information,” outlines Dr Vingione. “The second use case is

about demonstrating that satellite data can be useful to entities in South Korea managing food resources and food security.” A third use case centres around using both non-earth observation datasets and (satellite) earth observation to mitigate the impact of climate change in Finland and adapt to its effects, which differs in nature from the first two use cases. With the climate change use case, the emphasis is more on identifying longer-term patterns. “With climate change, it’s a matter of processing time series, to produce global maps. It’s different in terms of volumes and complexity, it’s about processing a huge volume of data,” explains Dr Vingione. The focus in this specific use case is on monitoring the extent of climate change in the Arctic, as well as assessing its likely impact on regional infrastructure. “The Finnish Meteorological Institute aim to support the national transport agency in road maintenance operations and future planning, as well as to support Finnish Lapland communities. Reindeer herders are on the frontline of climate change,”

says Maria Gabriella Scarpino, a researcher at Serco who is also closely working on the project. “The aim with the EOPEN platform is, among others, to provide users, stakeholders, with the tools they need to process data of various typologies. In this use case, that’s weather forecast, climate projections, observation of snow covers and time series of temperature and precipitation, to support particularly reindeer herders and reindeer researchers.”
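The simplest kind of time-series processing involved here can be sketched as a linear trend fit over annual temperature data (ordinary least squares in pure Python; the data points are invented, not Finnish observations):

```python
# Toy example of climate time-series processing: fit a linear trend to
# annual mean temperatures with ordinary least squares.

def linear_trend(years, values):
    """Return the slope (units per year) of the least-squares line."""
    n = len(years)
    mx = sum(years) / n
    my = sum(values) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(years, values))
    sxx = sum((x - mx) ** 2 for x in years)
    return sxy / sxx

years = [2000, 2005, 2010, 2015, 2020]
temps = [-1.2, -0.9, -0.7, -0.3, -0.1]  # invented annual means, °C
slope = linear_trend(years, temps)
print(f"{slope * 10:.2f} °C per decade")
```

Real analyses on the platform operate on satellite-derived time series of snow cover, temperature and precipitation at scale, but the underlying idea of extracting longer-term patterns from yearly observations is the same.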

Earth Observation data The end-users in the different use cases may not necessarily be experts in interpreting satellite data, yet nevertheless it can provide important insights and help them work more effectively. The platform is targeted at those staff members with a degree of authority, people who are called upon to make decisions based on the information available to them. “In the case of flood risk assessment, it is the civil protection authority in Italy who need to know the area which is likely to be affected by flooding. That


© Contains modified Copernicus Sentinel data (2020), processed by ESA, CC BY-SA 3.0 IGO.

SAR (Sentinel 1) and DEM data fusion, based on Deep Convolutional Neural Network (AI/DCNN)

EOPEN Team

user has to decide how to deploy resources over the affected area,” says Dr Vingione. A satellite image of the extent of a flood holds clear relevance in these terms, providing a snapshot of the situation at a particular point in time, while a satellite also provides images of the same area over extended periods, allowing researchers to assess the extent of any changes. “The Sentinel satellites have a reconnaissance cycle (repeat cycle) of between 7-10 days, depending on the mission. So, they pass over the same area, at the same local time, every 7-10 days,” says Dr Vingione. The EOPEN platform itself has been designed with the needs of these users firmly in mind, with researchers developing a set of software and applications which can be run on different platforms, thus tackling interoperability. EOPEN is an open platform, agnostic of both the programming language and the platform on which processes are running (a federated platform execution framework), and is capable of integrating algorithms from users, consumers and third parties, as well as of integrating more data from other sources. “We have a high computing capability, with the ability to process huge volumes of data, and to deal with different typologies of data,” explains Dr Vingione. The satellite data is currently mainly acquired from Sentinel-1 and Sentinel-2, but the EOPEN platform is agnostic of the data, and Dr Vingione says it is


possible to include data from other sources. “If there is an application which requires Sentinel-3 data, then we can inject that into our working environment, and we can build algorithms to deal with that information,” he outlines. This satellite imagery is combined with data from other sources in the platform, including weather forecasts and tweets from social media users, which can help to build a deeper picture of the situation on the ground. A social media user may send a tweet about the weather in their region, for example, yet it is not easy to combine this with information from satellite images. “Satellite data is well-structured, whereas a tweet may be classified as non-structured data. There may be text accompanying an image, but the text may not be relevant to the image, so it is not structured data,” explains Dr Vingione. Researchers are applying big data analytics methods, in particular machine learning techniques, to detect the relevant (anonymized) tweets as well as filter out the unwanted ones. “We try to gather information from these tweets and deliver it to the consumers of the platform,” says Dr Vingione. “This process is automatically managed through an algorithm based on machine learning.”

A typical EOPEN user can even build a new application, re-using EOPEN data and available algorithms, and complement it with other sources of information, like for example data of restricted usage.

Decision-making
The wider aim here is to support a prompt reaction to a potential flood event. Shortly after the weather forecast is issued, layers showing maximum expected precipitation amounts in each municipality at various times are available in the platform, with a frequency of update which can be set to 1-4 times a day. In addition, a warning email is sent to relevant people when the precipitation is expected to exceed given thresholds in the following 48 hours. If an event occurs, geo-localized tweets can complement the information derived from satellite-based maps. Also, when an event occurs, the Water District Authority uses the platform resources to generate an advanced early-warning flood forecast product which would take tens of hours to produce on its own systems. This information can help staff involved in flood risk management


to make more effective decisions. “We can provide an early assessment, which is extremely important for the authority managing the risk,” says Dr Vingione. A different use case may have different requirements; for example, the food security use case requires big data management, distributed processing, low dependence on ground truth information, and highly generalised machine learning modelling. In these cases, solutions can be provided which encompass high performance computing, semi-supervised machine learning, model generalisation and interoperability, in line with the EC call which funded the EOPEN platform. “A typical EOPEN user can even build a new application, re-using EOPEN data and available algorithms, and complement it with other sources of information, like for example proprietary data (e.g. field data),” says Dr Vingione. A lot of progress has been made over the course of the project in terms of developing new tools and software, and with the funding term nearing its end, researchers are now looking to bring their findings to wider attention. Several events have been organized to collect feedback from use-case stakeholders, at which the wider potential of the platform will be explored. “An event targeting both stakeholders and policy makers is being organized by the EC Project Officer. It will involve all projects funded under

the same call, to see how these platforms can be used to address current social problems and challenges,” says Scarpino. There is also the possibility of commercial exploitation, with Dr Vingione looking to provide the EOPEN platform to other parties who may benefit from the ability to combine different sources of data. “Our intention is to start commercialising services based on EOPEN capabilities,” he outlines. “There is also the possibility to use the platform to build other applications, so this means EOPEN not as a provider of information, but as a working environment.” The EOPEN platform is able to provide a context or working environment, allowing the user to build a specific application. “So, if the user has specialised knowledge, they can derive all the applications they want, based on the satellite data that is available through EOPEN,” says Dr Vingione. The project consortium partners are currently exploring possible business models to sell the EOPEN services more widely, and Dr Vingione believes this versatility is an important attribute. “We’re developing a local platform that allows users to inject more detail if they choose to,” he explains. “We are also agnostic of the infrastructure hosting EOPEN; we are not constrained by a specific cloud provider. We want to be interoperable with other platforms, so our software and applications can be run on different platforms.”
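The tweet-filtering step Dr Vingione describes can be illustrated in miniature. The sketch below is a toy bag-of-words Naive Bayes classifier in Python; it is not EOPEN’s actual algorithm, and every example tweet and label is invented purely for illustration.

```python
# Toy tweet-relevance filter: bag-of-words Naive Bayes with Laplace smoothing.
# All training examples below are invented; a real system would be trained on
# large volumes of hand-labelled, anonymized tweets.
import math
from collections import Counter, defaultdict

def tokenize(text: str) -> list[str]:
    return text.lower().split()

class NaiveBayes:
    def __init__(self):
        self.word_counts = defaultdict(Counter)  # label -> word -> count
        self.label_counts = Counter()            # label -> number of samples

    def train(self, samples):
        for text, label in samples:
            self.label_counts[label] += 1
            self.word_counts[label].update(tokenize(text))

    def predict(self, text: str) -> str:
        vocab = {w for counts in self.word_counts.values() for w in counts}
        total_samples = sum(self.label_counts.values())

        def log_prob(label):
            total_words = sum(self.word_counts[label].values())
            score = math.log(self.label_counts[label] / total_samples)
            for w in tokenize(text):
                count = self.word_counts[label][w] + 1  # Laplace smoothing
                score += math.log(count / (total_words + len(vocab)))
            return score

        return max(self.label_counts, key=log_prob)

# Invented training data: flood-relevant vs. irrelevant tweets.
training = [
    ("heavy rain flooding the river bank", "relevant"),
    ("roads under water after the storm", "relevant"),
    ("flood warning issued for our municipality", "relevant"),
    ("great pizza in town tonight", "irrelevant"),
    ("watching the football match", "irrelevant"),
    ("new phone arrived today", "irrelevant"),
]
clf = NaiveBayes()
clf.train(training)
print(clf.predict("river water rising fast after heavy rain"))  # relevant
```

Even at this scale, the classifier picks up that flood-related vocabulary marks a tweet as relevant; the production version would add geo-localization and image handling on top.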

EOPEN Open interoperable platform for unified access and analysis of Earth Observation data

Project Objectives

The objective of EOPEN is to ensure scalability of the data standardisation, fusion and exchange methods, also combining non-EO data and metadata annotation. EOPEN combines mature ICT solutions and scalable processing techniques, building on top of existing European High Performance Computing (HPC) infrastructure. The project’s team consists of a large service-industry company, specialised SMEs, public research centres and a university.

Project Funding

This project has received funding from the European Union’s Horizon 2020 research and innovation program under grant agreement 776019 (EU contribution: 1.999.500 Euros).

Project Partners

Contact Details

Project Coordinator, Dr Guido Vingione Serco Italia S.p.A. Via Sciadonna, 24 00044 Frascati Rome, Italy T: +39 06 9835 4408 E: W:

Dr Guido Vingione

Dr Guido Vingione is the EMEA Space Business Development Director for Serco. He gained his PhD in Space Science and Technologies from the University of Naples and joined Serco in 2006, before taking up his current role in 2018. He has authored a number of publications in reviewed journals and has participated in several EC-funded projects.

Application-Service full life cycle support.


The Weight of the World on our Shoulders By the end of 2020, the combined weight of all man-made objects will likely exceed the weight of all living things on Earth. That’s a heavy burden for a world under ecological threat. We are literally outweighing nature with our manufacturing and, far from trying to reduce this tsunami of artificial material, we look set to let it grow exponentially. By Richard Forsyth


The weight of all the ‘stuff’ that humans make, from concrete and steel buildings, to bricks and plastic, roads and the vehicles on them, and all the machinery and products you can imagine, is estimated to be about one teratonne. A teratonne is an exotic-sounding measurement that few will ever have heard of, precisely because its sheer size means it is rarely used in a sentence. It is equivalent to 1,000,000,000,000 tonnes. Despite economies slowing down with the coronavirus pandemic, the end of 2020 sees a milestone that is almost impossible to fathom: we’ve built more things than there is biomass on the Earth. We are churning out stuff at a rate of around 30 gigatonnes (30,000,000,000 tonnes) each year. To put this in some kind of perspective we can imagine, for every human in the world, more than their body weight in stuff is being produced each week.
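The “body weight per week” comparison checks out with simple arithmetic. A quick sketch (the world-population figure is our assumption, not stated in the article):

```python
# Sanity check of the per-person figure quoted above.
ANNUAL_OUTPUT_TONNES = 30e9   # ~30 gigatonnes of new material per year
POPULATION = 7.8e9            # approximate world population in 2020 (assumption)
WEEKS_PER_YEAR = 52

kg_per_person_per_week = ANNUAL_OUTPUT_TONNES * 1000 / POPULATION / WEEKS_PER_YEAR
print(f"{kg_per_person_per_week:.0f} kg per person per week")  # ≈ 74 kg
```

Around 74 kg of new material per person per week, which is indeed more than the average adult body weight.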

Photo by Miltiadis Fragkidis

Our place and responsibility


These almost incalculable figures have been produced by a research team at the Weizmann Institute of Science in Israel and published in the journal Nature. “It’s a gradual process that is happening on the face of the Earth. We have really transformed the situation from human impacts and human production being tiny, to human artefacts and mass being equal to, and very soon much more than, all the living things on Earth,” stated Professor Ron Milo from the Department of Plant and Environmental Sciences at the Weizmann Institute, who led the research group. “I think that just realising the fact that we are at this point, where humanity is such a large effector and such a large force in shaping the face of the Earth… already tells us something about our place and our responsibility.”

Milo is keen to show that all our decisions have an impact, from the size of the houses we build, to the way we consume, to whether we choose to recycle products and waste; it all adds up in the tally of artificial mass and the impact we will have on nature as a whole. While the biomass is relatively constant, the ‘anthropogenic mass’, the man-made stuff, has been approximately doubling every 20 years, rapidly increasing to the point that, at the turn of 2021, it will amount to more than all living things. It’s a runaway train that’s not in any way scheduled to stop, either. Projecting just to 2040, we can say that the weight of all our ‘stuff’ will reach two teratonnes, which equates to more than double the mass of all living things, and so it will go on. The sheer magnitude of the things we make means we will leave a layer of our materials in the rocks millions of years into the future, like fossil remnants. At the beginning of the 20th century the anthropogenic mass was only 3 percent of total biomass, so in just over 100 years we have gone from this low proportion to parity.
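The doubling trend quoted above maps onto a simple exponential model. A minimal sketch, assuming mass is anchored at one teratonne in 2020 with a constant 20-year doubling time:

```python
# Exponential projection of anthropogenic mass, following the
# "doubling every 20 years" trend quoted in the article.
def anthropogenic_mass_teratonnes(year: int, base_year: int = 2020,
                                  base_mass: float = 1.0,
                                  doubling_years: float = 20.0) -> float:
    """Mass in teratonnes, assuming constant exponential growth."""
    return base_mass * 2 ** ((year - base_year) / doubling_years)

print(anthropogenic_mass_teratonnes(2040))  # 2.0 teratonnes, matching the article
print(anthropogenic_mass_teratonnes(2060))  # 4.0 teratonnes if the trend holds
```

The same formula run backwards gives roughly half a teratonne in 2000, consistent with the rapid post-war acceleration the article goes on to describe.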
From the research there was a distinct surge of building from around the 1950s, post-World War II, when building materials like concrete became widely available and accessible. Post-World War II saw economic expansion that was labelled

Photo by Denys Nevozhai


Projecting just to 2040 we can say that the weight of all our ‘stuff’ will reach two teratonnes (2,000,000,000,000 tonnes), which equates to more than double the mass of all living things and so it will go on.

the post-war economic boom, otherwise known as the Golden Age of Capitalism. Indeed, the US, Russia, Western Europe and East Asian countries experienced a sustained period of growth and high employment. This boom was broadly a worldwide phenomenon, ending in the mid-1970s. From this time marker of accelerated building, of homes and of infrastructure, the result is an illustration of how unbridled economic growth means what we create gradually, inevitably, dominates nature. Another significant contributing factor is population growth. Consider that during the 20th century the total world population rose from 1.6 billion in 1900 to more than 6 billion in 2000. In reality, the things we produce, manufacture and build have outmatched even this pace of human population growth, as each person directly or indirectly accounts for the generation of so much material and consumption over a lifetime, and adds to the need for infrastructure. Waste that cannot be broken down or is not biodegradable also lingers, adding to the total weight. It should be noted that the precise timing of the crossover depends to a degree on exact definitions of life or biomass, which may lead to a range of estimates, something the research authors are aware of. For example, they used dry-weight estimates (so not including water), but even with wet mass factored into the calculations, the crossover point would still shift by only around ten years.

One in a line of tipping points
The grim tipping point is a stark reminder of our role in accelerated climate change and the ongoing threat to, and destruction of, biodiversity and life-sustaining ecosystems. Consider that in 2020, the same year we matched nature’s mass with manufactured mass, destruction of the world’s largest rainforest rose 9.5 percent, and a UN report states that all 20 of the Aichi biodiversity targets set in 2010,

from preventing pollution to protecting coral reefs, have gone unmet by the international community for 2020. Indeed, one of the findings from this latest study is the scale of our impact on the Earth’s biomass as a whole. It was found that since the first agricultural revolution, humankind has halved the world’s biomass, from around two teratonnes to the current figure of one teratonne. Agriculture, encompassing the patchwork of fields and crops that are the main feature of so many landscapes around the world, combined with the relentless and increasing devastation of deforestation, means wild nature has no place left to succeed. It’s easy to see how the scales of balance tip against wilderness, and will continue to do so until it has mostly vanished.

There are efforts to reserve areas for replanting and massive greening projects, but will this win the battle for space in a competitive, resource-hungry world? That remains to be seen. Without a workable alternative model to consumerism and growth, realisations of our unrelenting impact are grim. For all our grandstanding ecological goals and aims around reducing pollution, even with a pandemic, the numbers don’t lie. The aim of human nature, today at least, is continued industrial growth, building and creating, ultimately at best encroaching on, and at worst destroying, the nature we have left. As a concept, the latest research by Milo and his team illustrates very clearly how much we have impacted our world with our model of consumption and building. The research concludes: “This study joins recent efforts to quantify and evaluate the scale and impact of human activities on our planet. The impacts of these activities have been so abrupt and considerable that it has been proposed that the current geological epoch be renamed the Anthropocene.” It appears future generations will be inhabiting, in a literal sense, an artificial world, unless we can drastically reduce our rate of manufacturing, building and consumption.

Photo by Tanvi Sharma


Participatory Imagin(eer)ing. Merge of the visual outcomes of different participatory workshops: Susanne Käser, research team.

Visualising the future of our cities

Cities are home to people, organizations, and businesses with different interests and development priorities. We spoke to anthropologist Dr Aylin Tschoepe and MA image researcher Susanne Käser about their interdisciplinary work exploring how participation in urban planning processes can be enabled through multi-authored images as a way of communicating future urban visions.

Many urban development plans are illustrated through the use of highly realistic images to show how districts would be transformed under certain proposals. Professional image-makers produce such visualizations for architectural offices, urban developers, and administrators, to show how development plans would affect an urban landscape. However, forms of representation like architectural renderings must meet specific guidelines and comply with disciplinary conventions, and this can make them difficult to interpret for many external actors affected by the plans. This reduces the space for emancipatory participation processes and can leave some groups at risk of being excluded, an issue central to the project team at the Critical Icono-Ethnography Lab. Dr Tschoepe and Mrs Käser are working on a project investigating the use of visual representations in urban planning processes. “Our intention in the project is to foster social negotiation, to better understand how participatory processes can take place, and to investigate what kind of images have the capacity to enable participation,” say the researchers.


Participatory Imagin(eer)ing. Workshop May 18, 2019 in collaboration with CVP Basel-Stadt, Gewerbeverband Basel-Stadt, Mittelstands-Vereinigung Basel and Zukunft.Klybeck.

Urban planning processes
The City of Basel is being used as a case study in the research. There is a legal framework to enable participation in urban planning processes, yet the understanding of what this means varies greatly between the different actors, from urban planners, to government bodies, to local residents. “Participation is interpreted differently in urban planning processes in Basel and ranges from very controlled to emancipatory forms of participation,” says Dr Tschoepe. The study of these forms of participation is therefore an important basis for further research, with the project focusing on two main considerations. The first of these is how an urban

centre and its possible future development is understood and negotiated through the creation of images, while, secondly, researchers are also looking at how those images are then used subsequently: “We’re investigating the alliances and narratives that emerge when different actors collaborate in Imagin(eer)ing Processes. In doing so, we question the hierarchizing of different types of knowledge.” Dr Tschoepe and Mrs Käser consider imagining and imagineering as interwoven, hybrid practices that combine forms of expertise and knowledge that lie beyond normative assumptions: “Following this logic, we design and facilitate workshops with various participants and take into account their previous knowledge, experiences, and everyday practices in order to make these workshops more inclusive. From an anthropological and design perspective, we use our expertise to promote communication between diverse participants through the image, in its capacity as a carrier of imagin(eer)ed realities.” An image process called participatory imagin(eer)ing is central to helping different actors to engage with each other and shape the future of a city in a more inclusive, active way. The research project itself is interdisciplinary in



scope, with the team developing critical ‘iconoethnography’ as a methodology based on cultural anthropology and image design. Through this approach, the researchers are taking a fresh look at how the different parties involved in urban planning can engage with each other.

Project Objectives

In the interdisciplinary research project “Visual Communication in participatory urban planning processes” the research team investigates the relationship between social negotiations of different actors and the resulting visions of the future in the form of images.

Participatory workshops
One part of the project involves bringing together groups with different ideas about urban planning in participatory workshops, to both take part in the design of these images of urban futures and also debate issues arising from them. “Different parties come together in these workshops so that they can negotiate over their co-created images,” says Mrs Käser. The workshops are conducted as open interviews, where images are shared with the aim of identifying issues and questions relevant to the different actors. “In these interview settings, we aim to get to the core of the qualities that images must have in order to be participatory,” says Mrs Käser. “It’s not so much about creating an alternative planning document as getting to a point where people have a better understanding of other participants’ concerns when they leave the workshop and begin to think toward common futures,” explains Dr Tschoepe. Instead of a top-down approach, urban planning processes should be based on compromise to become more inclusive and allow city residents to participate and identify issues important to them. The team believes these participatory image processes can help open avenues forward in this respect. “The workshops are about a shared understanding of what is important for different stakeholders: the urban commons that we negotiate over on an everyday basis, whereby the images reflect conflict and compromise. A participatory image is always about including and listening to multiple voices and being open to different interpretations,” continues Dr Tschoepe. The images themselves are developed through multi-authorship, representing different viewpoints about the future of a space, from relatively minor shifts in a community to more radical ideas about the urban landscape.
The researchers believe that if different, competing actors engage with each other in a meaningful way, it will encourage them to take an active role in shaping and envisioning the future of their city.

Possible further applications
A visual communication method can help different actors to communicate effectively, appreciate other points of view, and identify what they value in city life. The project itself has been conducted in Basel, but the methods that have been developed could be applied in different settings and locales. “Our methods could also be

Project Funding

The project is funded by the Swiss National Science Foundation (SNF 100013_176459).

Project Partners

Participatory Imagin(eer)ing. Visualization of verbal narrations: Silvia Balzan, research team

adapted to other spatial contexts, such as smaller municipalities or the countryside,” continues Dr Tschoepe. Further workshops will be conducted, as the research team aims to address the challenges that have arisen over the course of the project, and also improve the processes that have been developed. “Currently we’re also working on a digital version of a collaborative space, where image-making processes enable participants to engage with each other virtually. We strive to understand how to navigate between virtual and embodied space,” she outlines. Social activities and interactions may look very different in the aftermath of the Covid-19 pandemic, another issue relevant to the project’s research. “What kinds of new socialities and participation will arise? How will they change the process of making the images?” asks Dr Tschoepe. This research is not just of academic interest, but also holds wider relevance to organisations such as planning departments, cultural institutions, local activist groups, or NGOs that thrive on participatory processes. The hope is to influence policy-making, and inspire architects and urban planners among others. “They could further adapt what we have developed, then shape it according to their own processes, and it could be applied in a variety of settings,” outlines Susanne Käser. The next step could be to test the project’s approach in other locations and put these findings on a firmer footing, which would be an important step towards wider application. One important issue is to find the ideal scale at which to develop this icono-ethnographic approach, which will depend on local circumstances. “We are interested in expanding and testing this approach in other physical and virtual spaces as well,” continues Dr Tschoepe. “Ideally, our efforts will inspire and be useful to a growing community engaged in emancipatory urban practices who can take over, change, and invent further inclusive design and social interaction processes.”

• FHNW Academy of Art and Design, Institute of Visual Communication, Basel, Switzerland schools/academy-of-art-and-design/institutes/institute-of-visual-communication
• University of Basel, Institute of Cultural Anthropology and European Ethnology, Basel, Switzerland https://kulturwissenschaft.philhist.unibas.ch/de/home/

Research Team

MA Silvia Balzan (Academy of Art and Design Basel) MA Susanne Käser (Academy of Art and Design Basel) Dr. Aylin Yildirim Tschoepe (University of Basel)

Project Management

Professor Michael Renner (Academy of Art and Design Basel) Professor Ina Dietzsch (Philipps University of Marburg)

Contact Details

Susanne Käser Fachhochschule Nordwestschweiz Hochschule für Gestaltung und Kunst Institut Visuelle Kommunikation Freilager-Platz 1 4002 Basel E: W: Dr. Aylin Tschoepe Seminar für Kulturwissenschaft und Europäische Ethnologie Universität Basel Rheinsprung 9/11 CH-4051 Basel E: W:


Understanding low-end innovators
Low-end innovations can prove highly profitable for businesses, yet many decision-makers remain biased towards high-end innovation. We spoke to Professor Sebastian Gurtner about his research into the characteristics of low-end innovators and how organisations can effectively support the development of new low-end products.

The emergence of

a new product or technology often generates great interest, yet low-end innovations targeted at consumers with a low ability to pay can also boost profits and improve a company’s commercial prospects. While many commercial success stories are linked to high-end innovation and the development of new products targeted at consumers willing to pay above the average price, low-end innovation can also be profitable, says Professor Sebastian Gurtner, Head of the Institute for Innovation and Strategic Entrepreneurship at Bern University of Applied Sciences. “Some companies have made a lot of money with low-end innovations,” he explains. However, evidence shows there is still a bias towards high-end innovation projects among decision-makers. “We see that 8 out of 10 people would invest in high-end innovation over low-end innovation, even where there is the same profit and risk forecast for both scenarios,” outlines Professor Gurtner.

LOW END INNOVATION Low-end Innovation: An Investigation of Key Individuals and Organizational Ecosystems Funded by the Swiss National Science Foundation (SNSF) Professor Sebastian Gurtner Bern University of Applied Sciences Business School Institut Innovation and Entrepreneurship Brückenstrasse 73 3005 Bern T: +41 31 848 34 27 E: W: en/research/researchprojects/2018-657-112-149/

Low-end innovation
This topic is at the heart of Professor Gurtner’s research in a new SNSF-funded project in which he is exploring two larger themes, the first of which is to understand the low-end innovator as a person. Within the project researchers have reviewed the literature and conducted interviews with people from industry, particularly in the energy, transportation and healthcare sectors, aiming to identify specific individual characteristics common to low-end

innovators. “We are trying to find out how low-end innovators work,” explains Professor Gurtner. Typically, low-end innovators are motivated by the goal of addressing societal challenges, while others are more financially driven. “We see some innovators that seek opportunities, regardless of whether they are low-end or high-end,” continues Professor Gurtner. “We also see innovators that focus only on low-end innovation. They are typically much more demand-focused, or focused on problem-solving.” A deeper understanding of the motivations of key personnel could help guide a company’s commercial strategy, an area of great interest to Professor Gurtner. Part of the project involves developing a scale to assess whether an individual is more likely to focus on high-end or low-end innovation. “This could be an instrument to help companies make decisions about new products,” outlines Professor Gurtner. The second major theme in the project involves investigating the influence of the surrounding ecosystem or environment of an innovator. “We are looking at innovators at all different levels. Most low-end innovators are entrepreneurs, working in small companies and start-ups. But we also have case studies where we see low-end innovation happening in larger companies,” says Professor Gurtner.

We see some innovators that seek opportunities, regardless of whether they are low-end or high-end. We also see innovators that focus only on low-end innovation. They are typically much more demand-focused.

The right kind of working environment can encourage development and innovation, yet this does not mean that the working environment has to be completely harmonious. This is another topic of interest to Professor Gurtner. “Sometimes creativity and innovation arise out of conflict,” he says. There is a lot still to learn about how the relationship between an individual and the organisation affects the capacity for innovation, and Professor Gurtner also plans to build further on the project’s findings so far. “We’re interested in measuring the societal impact of innovation. Our hypothesis is that those low-end innovations actually have a greater positive societal impact than high-end innovations, while having the same economic impact. This would be a quantitative, larger study,” he continues.

Professor Sebastian Gurtner is the Head of the Institute for Innovation and Strategic Entrepreneurship at Bern University of Applied Sciences. He previously held research positions at the Technical University of Dresden. His research interests include innovation, entrepreneurship and strategic management.



Sensitive spectrometers for material analysis Electron paramagnetic resonance (EPR) is an important tool in science, enabling researchers to gain deeper insights into the structure and function of different materials, yet classical methods have relatively low levels of sensitivity. The PETER project is developing a new, more sensitive method to analyse species and materials at a microscopic level, as Professor Tomáš Šikola explains.

A plasmon can be thought of as a type of quasiparticle related to the oscillations of free electrons in a metallic material, where light causes the oscillations of those electrons. When electrons oscillate inside a metal, the associated radiation is highly concentrated in the immediate vicinity of the metallic particles. “There is a very high electromagnetic field in this tiny area,” explains Tomáš Šikola, a Professor of Applied Physics at Brno University of Technology (BUT) and leader of the research group “Fabrication and Characterisation of Nanostructures” at CEITEC BUT in the Czech city of Brno. This means it is in principle possible to get more information about a material’s properties by enhancing the local magnetic field component of this radiation, a topic at the heart of the PETER project, in which Professor Šikola and his colleagues are developing a new micro-spectrometer instrument. “The magnetic near-field component is enhanced by plasmons, then we can get information just from small areas in the vicinity of those particles,” he says.

Plasmonic antennas
This capability is built on the use of resonant microstructures, or plasmonic antennas, which help to intensify the magnetic component of terahertz electromagnetic radiation in the vicinity of metallic particles and so significantly improve sensitivity. The electromagnetic field is enhanced when the dimensions of these plasmonic antennas are closely related to the wavelength of the electromagnetic radiation. “We fabricate these microstructures by either electron beam lithography, or optical lithography,” outlines Professor Šikola. It’s important here to tune the microstructures to a high level of precision, as Professor Šikola says there is a relationship between the resonant wavelengths and the dimensions of the structures. “We can simulate those

Prototype of the developed instrument for Plasmon Enhanced THz EPR spectroscopy and microscopy installed in Stuttgart (Bottom: a view of THz quasi-optics used in the instrument)

in the project has primarily centred around the development of plasmonic antennas and the AFM system, Professor Šikola is very much aware of the wider potential of a more sensitive micro-spectrometer with improved spatial resolution. “It could be used for example to analyse batteries, for understanding the function of catalytic centres, while there are also potential applications in quantum computing,” he says. Researchers are working to develop the instrument further to a point where it can eventually be commercialised. “We would like to offer our prototype to the Electron Paramagnetic Resonance (EPR) community. Our prototype is currently ready in Stuttgart,” says Professor Šikola.

The electromagnetic field is enhanced by plasmons, then we can get information just from small areas in the vicinity of those particles. structures using numerical methods, and we can establish relatively clear relationships,” he continues. “Once we come to fabricate the microstructures, we can be confident that the resonance is as we expected.” These resonant microstructures – plasmonic antennas – are then fabricated on planar substrates for EPR spectroscopy applications. “If we manage to fabricate one antenna on a probing tip of a specially developed atomic force microscope (AFM), then we can use the EPR methods for microscopic investigation of analytes with a spatial resolution deep below the diffraction limit (≈ 1μm). And this would be a real breakthrough in this EPR field” Tomáš Šikola adds. The wider aim in this research is to develop a new, more sensitive method of analysing paramagnetic species and materials, with the project bringing together four partners from across Europe (BUT, Univ. of Stuttgart, IC nanoGUNE, and Thomas Keating Ltd.) with different areas of expertise. While CEITECs role

A number of tests still need to be conducted before the prototype can be made available more widely, but the idea is very much on the agenda, with researchers aiming to bring it to a technology readiness level (TRL) of 5 or 6. This would represent an important step towards producing an off-the-shelf machine available to the EPR community. “This is a major driving force in the project,” stresses Professor Šikola.
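As a first-order illustration of the tuning relationship between antenna dimensions and resonant wavelength described in this article, the standard half-wave antenna picture (a textbook rule of thumb, not the project’s own numerical model) gives, for a rod-like antenna of length L:

```latex
\lambda_{\mathrm{res}} \approx \frac{2\, n_{\mathrm{eff}}\, L}{m}, \qquad m = 1, 2, 3, \dots
```

where n_eff is the effective refractive index of the antenna’s surroundings and m is the resonance order. The project’s numerical simulations establish the precise relationship for each fabricated geometry.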


Plasmon Enhanced Terahertz Electron Paramagnetic Resonance Professor Tomáš Šikola CEITEC - Central European Institute of Technology Fabrication and Characterisation of Nanostructures T: +420 54114 2707 E: W: Tomáš Šikola is Professor of Applied Physics at the Brno University of Technology (BUT, Czech Republic). He is director of the Institute of Physical Engineering (BUT, Czech Republic), as well as coordinator of the CEITEC ‘Advanced Nanotechnologies and Microtechnologies’ research programme and leader of the ‘Fabrication and Characterisation of Nanostructures’ research group.


Simulations for complex engineering problems

Simulations can play an important role in civil engineering projects, enabling staff to assess how different wind conditions will affect a structure and adapt the design accordingly. We spoke to Professor Fabio Nobile and Professor Riccardo Rossi about the work of the ExaQUte project in constructing a framework to help address complex engineering problems.

The local wind patterns are an important consideration in the design and construction of civil engineering structures, and sophisticated simulation techniques are used to ensure that buildings, bridges and other structures are resistant to different conditions. It is not possible to simulate all possible wind scenarios, so the aim instead is often to simulate a sufficient number to be broadly representative of the conditions in which the structure will function, which raises several questions. “How many of those simulations are required? How accurate should they be? Can we improve the design of the structure by selecting certain types of scenarios?” asks Fabio Nobile, Associate Professor of Scientific Computing and Uncertainty Quantification at EPFL. These questions are central to Professor Nobile’s work in the ExaQUte project, an EU-backed initiative which brings together researchers from across Europe. “We are addressing these questions in the project in probabilistic terms, with a type of Monte Carlo simulation,” he explains.

Exascale systems

This research is built on leveraging the power of the next generation of Exascale supercomputers, systems of interconnected central processing units (CPUs) that can perform up to 10¹⁸ floating point operations per second. The aim is to utilise the power of these systems, in particular their potential for parallelism. “A simulation can take hours and use a lot of resources. We don’t want to waste these resources, we want to take advantage of them, so we have to build some maths to back up what we are doing,” says Riccardo Rossi, Research Professor at CIMNE and the project’s Principal Investigator. One of the challenges in exploiting the potential of these Exascale systems is that great care has to be taken with respect to the amount of information that needs to be transferred between the processors. “If there is a constant need to exchange information between CPUs, this has a negative impact on performance,” continues Professor Rossi. “The communication time between those processors effectively reduces the power of the simulation.”

The challenge here is in orchestrating the simulations effectively and using the available resources efficiently, with the aim of ‘quantifying the uncertainties’ in the outputs of interest. Some simulations may take a longer time than others, which is another important consideration in terms of the project’s overall agenda. “The more resources we have - so the larger the computer - the more difficult it is to orchestrate the simulations. This is where the maths plays a very important role, to devise a strategy to do those simulations in parallel,” explains Professor Rossi. Multiple simulations can be run in parallel if there is no need to exchange information between processors, while others require interactions that need to be carefully scheduled.

Views of flow and pressure field around the CAARC (Commonwealth Advisory Aeronautical Council) tall building model.

EU Research

A second major topic on the project’s agenda centres around mixing simulations of varying levels of accuracy and with different costs. Running simulations for civil engineering structures can be extremely costly, so Professor Nobile is investigating the balance between the accuracy of the simulations and the number of scenarios that are represented, using a Multi-Level Monte Carlo (MLMC) approach. “Maybe we could run many simulations, but at a lower level of accuracy. Or maybe we could run just a few simulations with a higher level of accuracy, but representing a smaller number of scenarios,” he outlines. Maths plays a central role here in determining the optimal allocation of the number of scenarios to be run at each accuracy level, taking into account the available resources and the need to orchestrate the ensemble of simulations. “We try to find the optimal allocation by building a body of knowledge. We run several scenarios, and based on this we then adapt our approach and reach decisions on the time-span of the simulations and the level of accuracy,” continues Professor Nobile.
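The multilevel idea can be illustrated with a toy sketch (not the project’s solver: the “simulation” here is just a number plus a level-dependent error that shrinks as accuracy grows). Many cheap, low-accuracy samples are combined with a few expensive correction terms between adjacent levels:

```python
import random

def simulate(theta, level, z):
    # Toy stand-in for a wind simulation: the true quantity of interest
    # theta plus a discretisation "error" z * 4**-level that shrinks
    # as the accuracy level grows. z is the shared random input.
    return theta + z * 4.0 ** (-level)

def mlmc_estimate(theta, samples_per_level, seed=0):
    # Multi-Level Monte Carlo: E[Q_L] = E[Q_0] + sum over levels of
    # the correction terms E[Q_l - Q_(l-1)], with most samples spent
    # on the cheap coarse level and few on the accurate ones.
    rng = random.Random(seed)
    estimate = 0.0
    for level, n in enumerate(samples_per_level):
        total = 0.0
        for _ in range(n):
            z = rng.gauss(0.0, 1.0)  # same input drives both levels,
            if level == 0:           # so the correction has small variance
                total += simulate(theta, 0, z)
            else:
                total += simulate(theta, level, z) - simulate(theta, level - 1, z)
        estimate += total / n
    return estimate

# 1000 cheap samples, 100 medium, 10 expensive corrections.
result = mlmc_estimate(5.0, [1000, 100, 10])
```

Because the correction terms between levels have small variance, far fewer samples are needed at the expensive levels, which is where the cost savings of MLMC come from.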

Optimising design

The wider goal of the ExaQUte project is to help optimise the design of a building, which is intrinsically an iterative process. A simulation is run on a building for different wind conditions and the results are then collected. “We start from certain scenarios and we assess the dynamic loads on the building. Then we change the different parameters a little bit, and we run more scenarios until we reach an optimal design that is robust with respect to the prevailing conditions,” says Professor Nobile. The Multi-Level Monte Carlo approach is key here to reducing the overall computational cost. This represents an alternative to traditional experimental approaches to simulating the effects of wind on a building, which are known to have some significant shortcomings inherent to the experimental setup. “We believe that some of those shortcomings could be addressed through the use of computers. However, computational wind engineering is a relatively novel field, as it’s only relatively recently that we have developed the required computational power, tools and knowledge,” outlines Professor Rossi.

A significant degree of progress has been made over the course of the project, with mathematicians and engineers from the different partners benefiting from the opportunity to share knowledge, ideas and expertise. A number of new tools and methods have been developed in the project, with all of the developments open-source, while Professor Rossi says studies into the possible commercial use of some of the technologies are also planned. “This is about assessing the realistic potential applications of what we are doing outside academic settings,” he says. The primary focus of the project is wind engineering, yet some of these tools and methods could eventually be applied in other areas of technology. “There are the same kinds of problems in other fields. So, once we’ve learned how to manage the extreme level of parallelism needed to tackle otherwise unfeasible robust optimization problems, then we can definitely look to move into other fields,” stresses Professor Rossi. There are still many open questions in this area, however, and Professor Nobile says there is great scope for further research, which will form an important part of his agenda in future. “We have learned a lot in this project, yet it’s also raised new questions,” he outlines. “We will definitely continue our research in this field. For example, we are trying to develop a sound algorithm to robustly optimise the design of a building in unsteady flow conditions, and there is a lot of margin for improvement there.”
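The scheduling point made earlier in this article (independent scenarios parallelise freely; only communicating ones need careful orchestration) can be sketched with Python’s standard library. This is a toy illustration, not the project’s Exascale runtime, and the “simulation” is just the dynamic-pressure formula:

```python
from concurrent.futures import ThreadPoolExecutor

def simulate(scenario):
    # Toy stand-in for one wind simulation: returns the dynamic
    # pressure q = 0.5 * rho * v**2 (in Pa) for the given inflow speed.
    rho = 1.225  # air density at sea level, kg/m^3
    return 0.5 * rho * scenario["wind_speed"] ** 2

# Independent scenarios exchange no data between them, so they can
# all run concurrently without any coordination.
scenarios = [{"wind_speed": v} for v in (10.0, 20.0, 30.0)]

with ThreadPoolExecutor() as pool:
    loads = list(pool.map(simulate, scenarios))
# A real Exascale run would use processes or MPI rather than threads,
# and would explicitly schedule the simulations that must communicate.
```

The point of the sketch is the structure, not the physics: the map over scenarios has no inter-task communication, which is exactly the case that scales well on many processors.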

ExaQUte EXAscale Quantification of Uncertainties for Technology and Science Simulation Project Objectives

The ExaQUte project aims to exploit next-generation HPC capabilities to tackle Uncertainty Quantification (UQ) and Optimization Under Uncertainties (OUU) problems. These technologies will be demonstrated in application to wind engineering, with the ultimate objective of improving the design procedure for buildings with respect to wind loads.

Project Funding €3,124,255

Project Partners

Contact Details

Project Co-investigator Professor Fabio Nobile EPFL - SB - MATH - CSQI MA B2 444 (Batiment MA) Station 8 CH-1015 Lausanne Switzerland T: +41 21 69 34244 E: W: Professor Fabio Nobile Professor Riccardo Rossi

Dr Riccardo Rossi, Civil Engineer, is an Associate Professor at the UPC. He is active in the field of CFD and Multiphysics. He is the head of the “Kratos Multiphysics” group at CIMNE where he serves as a Full Research Professor, and coordinator of the EU projects ExaQUte and EdgeTwins.


Morphing rotor blades for more sustainable helicopters

Dr Benjamin Woods, Principal Investigator of the SABRE (Shape Adaptive Blades for Rotorcraft Efficiency) project, explains the complexities and significant benefits of developing a new approach to helicopter design, which uses blade morphing technologies to optimise performance and reduce fuel burn, CO2 and NOx emissions by 5-10%.

Helicopter rotor blade design has progressed significantly over the last 50 years, with major advancements having been made in materials, manufacturing methods, and aerodynamic sophistication, leading to improved performance. However, engineers are approaching the limits of what can be achieved with traditional designs, as even the most modern designs are stuck with a single, fixed aerodynamic shape which is expected to perform well in the widely varying operating conditions the blades will see. This is where the teams working on SABRE come in. Their goal is to create a rotor blade that can continuously adapt its shape to the changing operating environment of the helicopter. Considering that rotor blades spin at hundreds of rotations per minute, generating huge forces and strains on the parts in the process, this is no mean feat. But adapting to these rapidly changing aerodynamic conditions allows the rotor to always be at peak efficiency. The same morphing geometries can be used to adapt to slower changes in conditions, such as changes in air temperature or the amount of payload carried. As Woods puts it: “The industry has spent decades trying to ‘eke’ out performance with rigid blades and we are way into the diminishing returns portion of that curve, so now if you have a blade design that is one per cent better in terms of fuel burn, through airfoil shape or some fancy new material, that is considered quite an achievement with current technology. What we are proposing is a step change of between five and ten percent reduction in fuel burn, which would be fantastic for the industry. Once you introduce the option to have the blade shape change continuously at the same rate that the operating conditions change, you have a lot more potential for efficiency, and the main focus for us is to reduce the fuel burn required for the helicopter to fly.”

When airflow is a drag

Current blade geometries are prone to an inevitable flaw in efficiency. The problem with the helicopter, particularly when flying forward, is that the part of a blade sweeping into the wind, known as the ‘advancing’ side, will see very high velocities, since the forward flight speed and rotation velocity add together. However, on the other side of the rotor (the so-called ‘retreating’ side) the blade is moving away from the oncoming wind, and so the rotation velocity subtracts from the forward flight speed – leading to much lower air velocities. These differences in velocity greatly affect the amount of thrust the rotor generates, and significantly increase the amount of drag that must be overcome by burning more fuel to make more power.

FishBAC morphing airfoil.

Current helicopter designs are able to change the overall pitch of the blade as it spins around the helicopter in order to partially balance out the impact of these changing velocities, but since this moves the entire blade, it is not able to account for desired changes along the length of the blade – leading to increased fuel burn. If instead the blade geometry could be continuously changed and optimised – both along its length and as it spins around – then the need to compromise on aerodynamic performance would be removed, with the rotor always having the best possible shape for the exact combination of conditions it is experiencing at that moment in time. Furthermore, these morphing blades would be automatically controlled by computers, responding to measurements of the exact current conditions from pressure and strain sensors, to ensure that the reductions in fuel burn don’t make the pilot’s job any more difficult.

Six ways to change a blade

SABRE’s research efforts derive from several European teams working in collaboration, including the University of Bristol, Centro Italiano Ricerche Aerospaziali, the German Aerospace Center, Delft University of Technology, the Technical University of Munich and Swansea University. Between them, six promising morphing blade concepts are being investigated: Fish Bone Active Camber (FishBAC), Translation Induced Camber (TRIC), Inertially Driven Twist, Shape Memory Alloy Driven Twist, Chord Extension, and Active Tendons.

There are many key design parameters that should be considered when creating rotor blades. For example, you should ask ‘how much curvature is there in the airfoil?’, ‘how does the length of the airfoils vary over the blade?’, ‘how much twist is there in the blade?’. These are the types of features that engineers traditionally play with to optimise rotorcraft performance,

The SABRE project vision
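The advancing/retreating velocity asymmetry described above follows from simple kinematics. A minimal sketch (a first-order textbook model, not SABRE’s analysis; the rotor numbers are purely illustrative):

```python
import math

def section_airspeed(rpm, radius_m, flight_speed_ms, azimuth_deg):
    # In-plane airspeed seen by a blade section: the rotational speed
    # omega*r plus the component of flight speed along the section's
    # direction of travel. This peaks at 90 degrees azimuth (the
    # advancing side) and is lowest at 270 degrees (the retreating side).
    omega = rpm * 2.0 * math.pi / 60.0  # rotor speed in rad/s
    return omega * radius_m + flight_speed_ms * math.sin(math.radians(azimuth_deg))

# Illustrative rotor: 400 rpm, blade section at 5 m radius, 60 m/s forward flight.
advancing = section_airspeed(400, 5.0, 60.0, 90.0)   # ~269 m/s
retreating = section_airspeed(400, 5.0, 60.0, 270.0)  # ~149 m/s
```

Since aerodynamic loads scale with the square of airspeed, even this modest forward speed produces a large thrust and drag imbalance between the two sides of the rotor, which is the imbalance the morphing concepts aim to correct continuously.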


SABRE Shape Adaptive Blades for Rotorcraft Efficiency Project Objectives

Reducing energy consumption and environmental impact of aviation: reducing helicopter fuel burn and emissions by 5–10%.

Improving European industrial competitiveness and reinforcing employment: the results of this project will generate substantial recurring savings in direct operating costs, offsetting the initial cost of the added technology over the life of the airframe, thus keeping Europe competitive and securing employment within the industry.

Knowledge transfer of morphing technologies: the methods used in SABRE could have positive implications for fixed-wing aircraft, and through active blade control are also potentially adaptable to large-scale wind turbines, making them more cost efficient.

Delivering international collaboration for innovation: fostering new partnerships which will help integrate new knowledge into industry after the project ends.

with compromises being required due to the wide range of operating conditions. “Those are the exact same things that we would love to be able to change – but in real time – to allow us to fully respond to the different operating conditions without the need to compromise. We are looking at exactly those factors,” explained Woods. “We have two different concepts that can actively change how much twist the blade has. We are also investigating two different ways of actively changing the airfoil curvature (known as camber). We have a concept which can increase the length of the airfoil in a smoothly varying way, and we have an active tendon concept which uses tensioned cables, not to change the shape of the blade, but to alter its dynamic response. A harsh reality of helicopters is that the huge variations in force during rotation excite, or vibrate, the long skinny blade structures. When you are thinking about changing the blades as much as we are with these morphing devices, it is important that we have a way to mitigate any negative impacts on the dynamic response of the blades, and our active tendon concept gives us that ability.”

A revolution in efficiency

The technologies being developed in SABRE have the attractive mix of making economic sense whilst also representing a step in the right direction for sustainability. “Efficiency matters for the operators,” said Woods. “My personal motivation is sustainability, but even if the operator of the helicopter doesn’t care at all about sustainability (which thankfully is becoming less common), they would definitely still benefit from the fuel that they don’t have to buy because we made the helicopters more efficient. So, increasing efficiencies is the best way forward for everyone using helicopters.”

The implications for the environment are major if the technology is rolled out. Reducing emissions from aviation is a goal of the EU in line with its 2030 climate objectives, so finding technologies that slash fuel use is a vital part of the solution. During the global Covid19 pandemic, aviation as a sector has suffered greatly, and when the dust settles, surviving operators will be keen to innovate their aircraft toward higher efficiencies. This brings us to a major point around knowledge transfer of SABRE’s research. Whilst the focus of the project has very much been on helicopters, the concepts that have been developed could be adapted for fixed-wing aircraft and wind turbines. Considering these are two sector giants, the real savings and implications for reducing fuel use, pollution and saving money are very exciting indeed. “The physics of helicopters makes it super challenging to find efficiencies. While wind turbines and fixed-wing aircraft are by no means ‘easy’ either, if we can handle the physics of making morphing work in a tiny helicopter blade spinning around at hundreds of rpm with huge centrifugal forces trying to rip them apart, there is a good chance we can make it work for wind turbines and commercial airliners,” says Woods. The partners involved are now preparing to test demonstrators, turning the analysis into hardware and lining up these flexible and adaptive structures for wind tunnel and whirl tower testing. Helicopters may soon benefit from these research efforts through improved performance, economic savings and the reduced environmental impact of burning less fuel in flight.

Project Funding

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 723491

Project Partners

Contact Details

Benjamin King Sutton Woods Department of Aerospace Engineering Queen’s Building University of Bristol University Walk BS8 1TR United Kingdom T: +44 (0) 117 33 15366 E: W: W:

Dr Benjamin King Sutton Woods

Dr Benjamin K.S. Woods started in the Department of Aerospace Engineering at Bristol in 2015 and is now a Senior Lecturer in Aerospace Structures. He is currently an EPSRC Early Career Fellow and is leading the Horizon 2020 project SABRE. He holds 7 US patents, has authored 28 journal publications, two book chapters, and 58 conference papers.


Growing graphene from the bottom up

Graphene and other 2-dimensional materials could be used across a wide range of technological applications, yet they remain difficult to produce on large scales with sufficient quality. Researchers in the LMCat project are developing a new method of producing 2-D materials using liquid metal catalysts, as Dr. Irene Groot explains.

Photograph of the LMCat reactor.

A lot of attention in research is centred on the development and utilisation of 2-dimensional materials like graphene, which hold rich potential across a wide range of technological applications. However, fabricating these 2-D materials on a large scale with sufficient quality remains a challenge. “When we prepare these materials in an academic setting, we see that there are always some defects. It’s very difficult to upscale this process without producing those defects,” explains Irene Groot, Associate Professor in the Institute of Chemistry at Leiden University. As the Principal Investigator of the LMCat project, Dr. Groot is developing a new approach to the synthesis of 2-D materials, which uses liquid metal catalysts. “The project is about developing a methodology to produce these 2-D materials. We would also like to learn more about the physics and chemistry behind the growth of these materials,” she outlines. The main material of interest here is graphene, which has attracted a lot of interest from the commercial sector due to its interesting electrical properties. However, the project’s research is not limited solely to graphene, with Dr. Groot and her colleagues aiming to identify general principles which could also be applied in the production of other materials. “Some other 2-D materials can also be grown on copper. One of them is hexagonal boron nitride,” she outlines. “In principle, with the reactor and measurement techniques that we have developed, we could use more or less any catalyst material.”

3-D drawing of the LMCat reactor

Chemical vapour deposition

Researchers are using a technique called chemical vapour deposition (CVD) to effectively build these 2-D materials from the bottom up, using a reactor that has been developed as part of the project. A highly pure piece of copper is melted into a liquid, which is then used as a catalyst and provides the foundation on which to grow the material. “That is the substrate on which the graphene, or another 2-D material, is going to grow. On liquid copper, fewer defects are expected to form,” says Dr. Groot. Certain gases are then added, with Dr. Groot and her colleagues using methane, which contains both carbon and hydrogen atoms. “We need a gas that contains carbon atoms, that will then eventually form graphene. This gas will then decompose on the catalyst surface,” she continues. “Methane contains carbon atoms, but also hydrogen atoms. We don’t want this hydrogen, as graphene is formed of an atom-thick layer of carbon atoms.”

The bonds between hydrogen and carbon are broken at the copper surface, and the hydrogen atoms disperse while the carbon atoms remain, where they form a layer of graphene. Methods have been developed to then visualise the growth of the graphene, from which researchers can look to gain deeper insights into the process. “We can visualise the growth of the material, and we can see where there are imperfections,” explains Dr. Groot. With CVD, huge numbers of methane molecules hit the catalyst surface simultaneously, which can lead to imperfections. “Graphene does not start growing from a single carbon atom, but rather it starts growing on the catalyst surface at multiple places in little graphene sheets,” says Dr. Groot. “Ultimately what people want is one big sheet of what we call single crystalline graphene. So we want it to be a single sheet that does not have what we call domain boundaries.”

This is a significant technical challenge, as when two sheets of graphene come together there are typically imperfections of some kind; for example, they might be slightly rotated with respect to each other. By modifying the amount of gas that is added to the system, and the ratios in which it is added, Dr. Groot hopes to improve the growth process. “We want to etch away imperfections and ensure that the graphene grows in the optimal way,” she says. Graphene itself is just 1 atom thick, and if additional layers are added on top then the material loses its interesting electronic properties, which is an important consideration in the project. “It is quite difficult to grow graphene so that it is just 1 atom thick,” acknowledges Dr. Groot. “We are modifying the ratio between hydrogen and the carbon-containing gas to try and achieve this, while we are also looking at the importance of pressure.”

A lot of progress has been made in this respect, with researchers demonstrating that the growth of graphene can be controlled and tuned by manipulating these gas ratios and the pressure level. The development of the reactor itself represents an important step forward, while Dr. Groot is also keen to highlight the project’s achievements in developing new measurement techniques. “With these measurement techniques we are able to visualise the growth of the graphene while it actually happens, in real time,” she continues. This opens up the possibility of effectively directing the growth of graphene.
“If we see that the graphene is not growing in the way that we want it to, then we can change certain parameters, such as the ratios of these gases and the pressure,” explains Dr. Groot. “For example, if we have more than one sheet growing simultaneously, then we can look to manipulate them so that they align well.”
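In net terms, the surface chemistry described above amounts to the standard methane CVD reaction (a textbook summary, not the project’s detailed kinetics):

```latex
\mathrm{CH_4} \;\longrightarrow\; \mathrm{C_{(graphene)}} \;+\; 2\,\mathrm{H_2}
```

with the C–H bonds broken catalytically at the liquid copper surface, and the hydrogen leaving the system as gas while the carbon atoms assemble into the graphene lattice.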

© Sensu Productions


Removing graphene

The graphene is grown at very high temperatures inside the reactor, typically in excess of 1000 °C, so it’s essential to cool down the system before removing the material. This is where a problem arises: while the copper shrinks when it is cooled down, the graphene actually expands. “The graphene starts getting bigger while the




Synthesis of 2D-Materials on Liquid Metal Catalysts

Project Objectives

The main objective of the LMCat project is to control and tune the growth of graphene on liquid copper using real-time in situ observations via optical microscopy, X-ray-based techniques and Raman spectroscopy. By obtaining a fundamental understanding of the underlying growth processes, perfect, defect-free graphene can be grown.

Project Funding

LMCat: H2020 FET Open project number 736299 / DirectSepa: H2020 FET ProAct project number 951943.

Project Partners

Technical University of Munich: Mie Andersen, Postdoc • Hendrik Heenen, Postdoc • Santiago Cingolani, PhD student • Karsten Reuter, Principal Investigator.
Leiden Probe Microscopy: Gertjan van Baarle, Principal Investigator • Arthur Sjardin, Technician • Marc de Voogd, Application scientist.
ESRF: Valentina Belova, Postdoc • Oleg Konovalov, Principal Investigator • Francesco La Porta, PhD student.
University of Patras: Marinos Dimitropoulos, PhD student • Costas Galiotis, Principal Investigator • Anastasios Manikas, Postdoc • Christos Tsakonas, PhD student.
Leiden University: Irene Groot, Principal Investigator • Mahesh Prabhu, Postdoc • Mehdi Saedi, Postdoc.
Alternative Energies Commission: Maciej Jankowski, Postdoc • Gilles Renaud, Principal Investigator.

Contact Details

Project Coordinator, Dr. Irene Groot PO Box 9502 2300 RA Leiden The Netherlands T: +31 (0)71 527 7361 E: W: Dr. Irene Groot

Irene Groot obtained her PhD degree at Leiden University investigating the dissociation of hydrogen on metals, both experimentally and theoretically. After two postdocs she started her own group at the Leiden Institute of Physics. Currently, Irene Groot is associate professor (tenured) at the Leiden Institute of Chemistry. She investigates the structure-activity relationship of catalysts under industrial conditions focusing on sustainable energy and materials production.

copper gets smaller, and that gives rise to wrinkles in the graphene. So it is already less well oriented on the copper surface,” explains Dr. Groot. Once the system has cooled down to room temperature, the next step is to transfer the graphene and remove the copper catalyst. “Then we add a protective layer to the graphene, and from the bottom, start etching away the copper. So we add acid to the system to etch away the copper, and effectively destroy the copper in that way,” continues Dr. Groot. “The protective layer also needs to be removed eventually, in order to end up with just the graphene.”

This research is still at a relatively early stage, with Dr. Groot and her colleagues producing a circular sample of graphene with a diameter of around 1 centimetre. However, funding has been granted for a further project in which researchers will aim to build on the progress achieved in LMCat and move the technology forward. “In this follow-up project, we will look at whether it is possible to remove graphene from the copper while it is still in the liquid phase,” outlines Dr. Groot. This would be a continuous process, with graphene grown on one side of the copper pool, and then pulled off on the other. “In that way, you could get rid of all these difficult transfer steps. That’s the ultimate dream,” says Dr. Groot. “If we can get this to work, then in theory we could grow very large graphene sheets.”

There are still many hurdles to overcome before this can be realised, but this project represents an important step in this respect. One important aim will be to enlarge the reactor so that there is a bigger copper pool available. “It’s not clear whether a scaled-up reactor will still work in the same way. That is something we will need to investigate,” says Dr. Groot. The primary interest for Dr. Groot in this research is to learn more about the physics and chemistry behind the growth of graphene and other 2-D materials, yet she is very much aware of the wider interest in this work, from both the commercial and academic sectors. “We are working together with a small company who are looking into a business model for this reactor, to see whether other people, at other labs, might be interested in purchasing it,” she continues.

Optical microscopy image obtained during graphene growth on liquid copper. The light hexagons are graphene sheets, the dark background is liquid copper.

Molecular dynamics simulation of a hexagonal graphene sheet (silver) on a pool of liquid copper (gold).


