EU Research Summer 2017
Hope on the Horizon
Healthcare: eHealth, Diagnostics, Biomedicine and Drug Development - a look at the systems of the future
The effect of Migration on Innovation - The Good, the Bad and the Facts
Exclusive interview with Pascal Garel, Chief Executive of HOPE
Disseminating the latest research under FP7 and Horizon 2020 Follow EU Research on www.twitter.com/EU_RESEARCH
Editor’s Note

As a seasoned editor and journalist, Richard Forsyth has been reporting on numerous aspects of European scientific research for over 10 years. He has written for many titles including ERCIM’s publication, CSP Today, Sustainable Development magazine and eStrategies magazine, and remains a prolific contributor to the UK business press. He also works in Public Relations, helping businesses communicate their services effectively to industry and consumers.

In 1900, the global population stood at 1.6 billion. In 2017 it stands at 7.5 billion, and by 2100 it is expected to rise to around 11 billion. We are experiencing healthcare problems on a pandemic scale, such as the rise of chronic diseases and widespread antibiotic resistance, while resources are strained and budgets get tighter. Healthcare provision has no option but to adapt. Scientific research can be key to finding solutions to many of these problems, and in this issue we look at a number of groundbreaking projects in healthcare.

One opportunity we have been hearing about for many years is eHealth, which can mean many different things. Telemedicine, for example, can offer remote diagnosis via information and communications technology (ICT); ePrescription software can cut process time; and knowledge management systems and electronic health records give us trackable health data. There are many more examples of eHealth solutions, both established and new.

One of my favourite examples of the usefulness of ICT in healthcare is in the related field of mHealth. With almost two thirds of the world’s population owning mobile phones, it makes sense that health advice can be delivered straight to your mobile device. I have seen great projects. One, for example, gave advice on delivering babies safely by sending diagrams of delivery procedures to people in remote African villages, displaying guidance on mobile phone screens. This is simple, effective and brilliant, and although not perfect, it can make a difference.

Think of a future with algorithm-driven virtual doctors, home diagnostic kits and medicine delivery services mirroring Amazon. While some healthcare cultures may need to adapt too, change will be necessary at some juncture, that’s a given. We are now in an era where we need to self-serve and rely on our ability to learn online to help ourselves, where we can. Personal, consultative healthcare in doctors’ surgeries will transform into a more vendor-like remote service provision, I suspect. In all of this, ongoing research into new ways of coping with the strain on health provision is essential. Hope you enjoy the issue.
Richard Forsyth Editor
Contents

32 The European Institute for Innovation through Health Data (i-HD)
Analysis of electronic health records and efficient information sharing can lead to improved treatment, yet it is important to take account of privacy and security concerns, says Professor Dipak Kalra
4 Research News
EU Research takes a look at the latest news in scientific research, highlighting new discoveries, major breakthroughs and new areas of investigation
10 CODEMISUSED Building understanding of levels of codeine misuse. Dr Marie Claire Van Hout and Margaret Walsh discuss the issue and its importance to the general public and to the medical and public sectors
12 PEP-PRO-RNA Drug development is a complex task. Professor Tom Grossman tells us about the PEP-PRO-RNA project’s work in identifying principles for the design of peptide-based drugs
14 Phosphoprocessors The Phosphoprocessors project aims to learn more about the molecular basis of protein phosphorylation. This work could have important implications for the design of circuits for synthetic biology, as Professor Mart Loog explains
17 CAUSALPATH Understanding the relationship between cause and effect in a complex system is central to effective intervention. Ioannis Tsamardinos tells us about CAUSALPATH’s work in developing new causal analysis algorithms
20 Terpenecat Terpenes are the largest class of chemicals produced in nature. Now researchers aim to develop selective catalysts for terpene cyclizations, as Professor Konrad Tiefenbacher explains
The 3DinvitroNPC project has developed a 3-dimensional method of assessing the viability and functionality of nanoparticles for medical applications, as Dr Anna Laromaine explains
22 RobustNet Researchers in the RobustNet project are using C. elegans as a model system to investigate the underlying factors behind the robustness of complex biological systems, and also model infections by emerging pathogens, as Dr Michalis Barkoulas explains
23 Beta3_LVH The Beta3_LVH project aims to assess the efficacy of mirabegron in preventing heart failure in patients at risk of developing the disease, as Dr Nancy Van Overstraeten explains
24 Autonomous CLL BCRs Changes in cell signalling are thought to be an important factor in the development of chronic lymphocytic leukemia. Professor Hassan Jumaa tells us about his research into the underlying mechanisms behind these changes
26 HOPE on the Horizon Exclusive interview with Pascal Garel, Chief Executive of HOPE the European Association of Hospitals
30 VALUeHEALTH Healthcare services are typically organised on a national level, yet there are cases where patients require care while abroad, raising important questions in terms of interoperability and the business case, says Professor Dipak Kalra
There is an urgent need for new integrated care models to help provide care for people suffering from more than one chronic disease, says Professor Maureen Rutten-van Mölken
36 Ada 2020 Researchers are looking to harness the power of technology to help diagnose disease. Dr Martin Hirsch tells us about the Ada 2020 project’s work in developing a decision support tool
Daniel Beltran-Alcrudo tells us about the LinkTADS project’s work in coordinating research between the EU and China, helping lay the foundations for continued investigation into animal diseases
43 E-motion Supplies of phosphate are not limitless, leading researchers to look at methods of recovering it, an issue central to the E-motion project, as Dr Louis Desmet explains
We spoke to Dr Jaime L. Toney and Dr Antonio Garcia-Alix about their research into using algal lipids to extend human instrumental records further back in time
47 VariKin The family is a universal part of the human experience, yet societies differ widely in whom they class as part of the family. The VariKin project is investigating the roots of this diversity, as Professor Fiona Jordan explains
50 The effect of Migration on Innovation – an in-depth look at the positive and negative impacts migration can have on research innovation
54 RelRepDist Mathematicians have long used representation theory to study linear symmetries and algebraic structures. Now Dr Dmitry Gourevitch and his colleagues are developing new theoretical tools to approach the subject
57 AROMA-CFD The rapid development of numerical methods over the last ten years holds important implications for modelling systems in engineering and applied sciences, says Professor Gianluigi Rozza of the AROMA-CFD project
58 RespiceSME The RespiceSME project is proposing new approaches to boost the effectiveness of European photonics SMEs. Project Manager, Samantha Michaux, talked to EU Research about the aims, methods and overcoming the challenges
Dr Julien Scheibert tells us about the Cascade project’s work in developing a comprehensive picture of friction at all scales, work which holds important implications for both science and industry
62 Comgransol Granular materials play an important role in many industrial and natural processes. Dr Georgios Theocharis aims to both investigate the fundamental behaviour of these materials and design new acoustic devices
64 Drinking Water from Seawater
Materials science: researchers at the University of Manchester have developed a new type of graphene-based membrane that could quickly filter salts from water
68 AVA Antimatter The new ELENA ring at CERN promises to open up new avenues in antimatter research. We spoke to Professor Carsten Welsch about the AVA project’s research into fundamental questions
70 Ariadne Neutrinos enable researchers to probe physics beyond the standard model. Dr Kostas Mavrokoridis tells us about the Ariadne project’s work in developing a next generation neutrino detector
72 Champagne Researchers in the Champagne project aim to understand the underlying basis of high temperature superconductivity, as Dr Catherine Pépin explains
74 MenWomenCare New structures developed during the interwar period affected traditional perceptions of gender roles in the provision of care, as Dr Jessica Meyer of the MenWomenCare project explains
We spoke to Dr Alina Badescu about the CosNed project’s work in helping to lay the foundations of a new technique to detect cosmic neutrinos in natural salt mines
76 Perspectival Realism Scientific progress is based to a large degree on the search for truth, yet the history of science is also marked by significant conceptual revolutions, as Professor Michela Massimi explains
EDITORIAL
Managing Editor Richard Forsyth firstname.lastname@example.org
Deputy Editor Patrick Truss email@example.com
Deputy Editor Richard Davey firstname.lastname@example.org
Science Writer Holly Cave www.hollycave.co.uk
Acquisitions Editor Elizabeth Sparks email@example.com

PRODUCTION
Production Manager Jenny O’Neill firstname.lastname@example.org
Production Assistant Tim Smith email@example.com
Art Director Daniel Hall firstname.lastname@example.org
Design Manager David Patten email@example.com
Illustrator Martin Carr firstname.lastname@example.org

PUBLISHING
Managing Director Edward Taberner email@example.com
Scientific Director Dr Peter Taberner firstname.lastname@example.org
Office Manager Janis Beazley email@example.com
Finance Manager Adrian Hawthorne firstname.lastname@example.org
Account Manager Jane Tareen email@example.com
EU Research Blazon Publishing and Media Ltd 131 Lydney Road, Bristol, BS10 5JR, United Kingdom T: +44 (0)207 193 9820 F: +44 (0)117 9244 022 E: firstname.lastname@example.org www.euresearcher.com © Blazon Publishing June 2010
Cert no. TT-COC-2200
The EU Research team take a look at current events in the scientific news
Google maps air pollution

Air pollution affects all of us in our everyday lives to varying degrees, as we inhale toxins and particles that could adversely affect our health. Statistics underline the scale of the issue, with air pollution estimated to have been responsible for around 5.5 million premature deaths in 2015. The problem is particularly acute in cities, where around 80 percent of inhabitants are breathing air that is unsafe for humans. It’s hard to know which areas are worst affected though, whether you’re a local resident or a visitor. Global tech giant Google is now taking steps to provide more information to the citizens of the Californian city of Oakland, releasing a map showing air quality information in fine detail, enabling people to identify the worst affected areas and avoid them where possible. The information was collected using sensors on top of Google’s Street View cars, with readings taken every 30 metres or so, from which researchers have been able to build a detailed picture of air pollution levels across the city. This approach allowed researchers to achieve significantly greater spatial precision than previous techniques, which relied more on monitoring air quality from a single central point. This represents a significant step forward. “Air pollution varies very finely in space, and we can’t capture that variation with other existing measurement techniques,” said Joshua Apte, an Assistant Professor at the Cockrell School of Engineering and the lead author of a paper describing the results. “Using our approach and analysis techniques, we can now visualise air pollution with incredible detail. This kind of information could transform our understanding of the sources and impacts of air pollution.”
Research shows US vulnerable to ozone depletion

The population of central US states could be exposed to harmful levels of UV radiation during the summer months, a new study reveals. Research from Harvard University shows that the protective stratospheric ozone layer is liable to erosion during the summer, as specific chemical reactions cause ozone depletion, leaving people at greater risk from the effects of UV radiation. Researchers know that the central US states have some particular characteristics that leave them vulnerable to ozone depletion during the summer. The combination of the northerly flow of warm, moist air from the Gulf of Mexico, together with heating and convergence over the Great Plains, often leads to the injection of water vapour into the stratosphere. This means that the stratosphere over a number of US states, including Kansas, Oklahoma and Nebraska, is at higher risk of experiencing certain chemical reactions that lead to ozone depletion. Stratospheric ozone concentrations are also vulnerable to temperature changes caused by storm systems over the Great Plains. With these storms expected to increase in both frequency and intensity, researchers are looking to understand how they are likely to affect the earth’s climate. “Thunderstorms that hydrate the stratosphere can have significant local and regional impacts on Earth’s radiation budget and climate,” said Cameron R. Homeyer of the University of Oklahoma, a co-investigator on the paper. “This work demonstrates our increasing knowledge of such storms, using ground-based and airborne observations, and evaluates their potential for depleting stratospheric ozone now and in the future. The results strongly motivate the need for increased meteorological and chemical observations of such storms.”
Inside the physiological secrets of sherpas
The vast majority of visitors to the Himalayas rely on local sherpas to guide them through the area and help minimise risk. Along with their local knowledge and affinity for the mountains, sherpas also possess certain physiological attributes which mean they can cope with the low-oxygen environment at high altitudes better than visitors to the area. A new study sheds further light on the qualities which set sherpas apart and enable them to adapt so well to the high mountains. At high altitudes there are low levels of oxygen in the atmosphere – most visitors to the Himalayas need time to adapt and acclimatise to this, and they may also require additional supplies of oxygen. The sherpas, by contrast, cope far better, and can spend extended periods of time at high altitudes without suffering adverse health effects. The Xtreme Everest 2 expedition set out to investigate
these differences, following two groups as they climbed to Everest base camp at 5,300 metres – a group of lowlanders and a group of sherpas. Researchers from the University of Cambridge gathered data before the trip from the two groups, to provide a baseline measurement, then also took further muscle samples during the trip. The biochemical tests showed that sherpas were able to produce energy extremely efficiently, even when oxygen supplies were limited, while they also had lower levels of fat oxidation. “This shows that it’s not how much oxygen you’ve got, but what you do with it that counts,” says Professor Murray of the University of Cambridge, the senior author on the study. “Sherpas are really extraordinary performers, especially on the high Himalayan peaks. So there’s something really unusual about their physiology.”
Piquing our curiosity

If there’s one characteristic that all scientists share then it’s curiosity, the desire to learn more about how the world works and gain new insights into fundamental laws. But is curiosity an innate characteristic, or can we all become more curious? The leading science and technology company Merck has devoted a lot of energy to investigating this question. The company’s curiosity initiative measured and described curiosity in several countries, including the US, Germany and China, and now they’re looking to continue their research. “Curiosity is our driving force and we now seek to empirically demonstrate that anyone can increase their own level of curiosity,” said Stefan Oschmann, CEO of Merck and Chairman of the company’s Executive Board. Scientific curiosity and the spirit of investigation are crucial to addressing major contemporary challenges, believes Oschmann. “We are convinced that along with optimism and confidence, curiosity can help find solutions to many of the greatest challenges facing mankind,” he stated. An experiment has been set up, in partnership with teams from Porsche Consulting and the Weizmann Institute of Science, to test certain practices designed to enhance curiosity and encourage investigation, which many businesspeople believe is central to commercial success. “We’re living in a time of revolutionary business where curiosity is not a choice, but crucial to business success. Merck has given us a path forward, we are curious to participate and of course to learn the results of this experiment,” said Claus Lintz, a partner at Porsche Consulting.

A good nose for the flu

Research published recently in Science Immunology sheds new light on how the body responds to an influenza infection. Studies on mice show that following an infection, the nose recruits immune cells with particularly long memories, which are then well placed to watch for the virus and guard against its possible recurrence. This type of cell, known as tissue-resident memory T cells, has now been found in the nose, which could prove relevant to efforts to improve flu treatment. Nasal spray vaccines could potentially be designed to increase the number of these T cells in the nose, offering an effective way of protecting against influenza. These T cells are present in specific tissues, and typically provide reconnaissance over the tissue. “They’re basically sitting there waiting in case you get infected with that pathogen again,” said Lynda Wakim, an immunologist at the University of Melbourne. If a pathogen returns, the T cells can then rapidly kill infected cells. The T cells in the nose had longer memories than those in the lung, which could suggest that research into the nose should be a higher priority in terms of combatting influenza and protecting against other viruses and bacteria.
New questions on mouse models

The use of mice as model systems is a long-established method of scientific investigation, yet new research suggests they may not be as effective models for studying immune responses to disease as previously thought, findings which hold significant wider implications.

Researchers from the University of Bristol and the London School of Hygiene and Tropical Medicine studied the immune systems of wild mice, and compared them with those of mice bred in captivity. This area of research had been relatively neglected. “It’s remarkable that despite the enormous number of studies of laboratory mice, ours is the first in-depth study of wild mice immune systems,” said Professor Mark Viney from Bristol University’s School of Biological Sciences.

A number of major differences were identified between the two groups. In particular, the wild mice had highly activated immune systems, which could be due to their regular exposure to new infections; by contrast, laboratory-bred mice had slower immune systems.

The project’s findings could lead researchers to re-examine the ways in which mice are used in laboratory studies, work which has historically underpinned the development of many different vaccinations and immune-based therapies.

Nevertheless, despite these findings, Professor Viney believes that mouse models will continue to be an important tool in research. “These results point to us having to be much more cautious in extrapolating from the lab to the wild, but laboratory mouse models will continue to be hugely important in biological and biomedical research,” he said.
Investigating the relationship between violence and autism

Many of the people in prison are among society’s most vulnerable, so it may come as no surprise that a significant proportion are thought to have some form of mental illness or a condition such as autism. However, the nature of the relationship between autism and violent crime is still not fully understood, a topic which has been addressed in a recent study.

The study, led by researchers from the University of Bristol together with a team from the Karolinska Institute in Sweden, looked at data on over 290,000 people in Stockholm county in Sweden, of whom 5,739 had been diagnosed with autism. The researchers tracked the number of violent crime convictions among this group, from which new insights could be drawn. “We know that some people with an autism diagnosis have challenging behaviour and may come into contact with the criminal justice system. However, whether having autism increases the risk of violence or not has not previously been clear,” explained Dr Ragini Heeramun, Consultant Forensic Psychiatrist at the Avon & Wiltshire Partnership NHS Mental Health Trust in Bristol. The study found that while on the surface there did seem to be a higher risk of violent offending among people who had been diagnosed with autism, the picture became more complex when co-occurring conditions like attention deficit hyperactivity disorder (ADHD) and other conduct disorders were taken into account.

Researchers found that the presence of these co-occurring conditions alongside autism, along with other factors like drug and alcohol misuse, were the most important indicators of the risk of violent behaviour, rather than autism in itself. “Our findings, from the largest study to date, show that at the population level, autism in itself doesn’t seem to be associated with convictions for violent crimes. However, other conditions, such as ADHD, which can co-occur with autism, may increase such risks,” said Dr Heeramun.
Future vision of particle colliders

There’s plenty of life left yet in the Large Hadron Collider, but scientists at CERN always keep an eye on the future, and they’re already looking towards the next generation of particle colliders. The LHC itself took nearly three decades to build, so given that kind of timescale, plans need to be put in place now for its replacement. The Future Circular Collider will be three times the size of the LHC when completed, giving scientists the opportunity to investigate even more powerful collisions. Developing it will be a complex task however, so hundreds of scientists met in Germany at the end of May to discuss plans for the accelerator. The FCC’s circuit will extend to somewhere between 50 and 62 miles, significantly longer than the LHC’s 17-mile circuit, while it will also have stronger magnets to enable collisions at up to 100 teraelectronvolts. That could allow researchers to identify particles even heavier than the Higgs boson, which was discovered at CERN in 2012. The LHC itself will be upgraded in the mid-2020s, and it will continue to play a central role in physics research, yet further infrastructure development is essential to scientific discovery, and the next generation of machines could lead to new breakthroughs.
Bloodhound on the supersonic scent
The current official land speed record stands at 763mph, set back in 1997; now, twenty years on, a team of engineers is gearing up for a new attempt on the landmark. The Bloodhound SSC, touted as the world’s most advanced straight-line racing car, will be driven for the first time at Newquay airport on 26 October, with the development team looking to gain some further key data on its performance. The airport’s runway is not long enough to allow the Bloodhound to fully utilise its thrust, so the team plan to evaluate the various systems in the car before taking it out to South Africa next year for the attempt at the land speed record. The initial aim will be to raise the mark to 800mph, before going on to 1,000mph, although the test in Cornwall will not reach anything like those speeds, instead reaching around 200mph. Still, the Newquay trials will be a great opportunity to learn more about the car and showcase it to the public, while also providing an important landmark in terms of development. “It will be a big emotional moment for the team,” said chief engineer Mark Chapman. “We’ve gone from a computer design to an actual thing that will move down the runway. It will be a huge validation for the people who’ve stood by us all these years: it is happening.” “Newquay will demonstrate that the cockpit talks to the rest of the car, and the rest of the car talks to the cockpit – and the whole thing then talks to the outside.” Image credit © Flock London
This artist’s concept shows planet KELT-9b orbiting its host star, KELT-9. It is the hottest gas giant planet discovered so far. ©NASA/JPL-Caltech
Scalding KELT sets new records

An exoplanet orbiting a star around 650 light years away, KELT-9b has some remarkable characteristics, not least its temperature. The planet is extremely close to its star, and the nature of its orbit means that one side continuously faces it, with intense radiation leading to surface temperatures in excess of 4,300°C, making it the hottest giant planet ever found. The planet itself was observed using two robotic telescopes, one based in Arizona and the other just north of Cape Town. The Kilodegree Extremely Little Telescopes (KELT) were made using off-the-shelf components, costing only a fraction of more conventional observatories; observations from these telescopes showed a regular dimming of the starlight reaching Earth, indicative of the presence of a planet. The planet’s unusual characteristics meant researchers were particularly careful in checking and describing it, as it was outside conventional expectations of what a planet would look like. “It’s like a star-planet hybrid,” says Drake Deming, a planetary scientist at the University of Maryland. “A kind of object we’ve never seen before.” Now that more is known about KELT-9b, some observers have suggested this could lead to further discoveries, and to a wider population of scalding hot gas giants.

Smart ships to navigate the high seas

The emergence of self-driving cars has been well documented, but they’re not the only form of transport set to be automated. Driverless trains are used on the Docklands Light Railway (DLR) in London, and there are plans to automate other routes to cope with rising demand; now plans have emerged for autonomous ships. There is a clear commercial rationale for the development of autonomous ships. The US Coast Guard estimates that human error is the root cause of 96 percent of marine casualties, while there is a marked shortage of skilled personnel ready to work at sea, and those who do work there face the threat of modern piracy. Given this backdrop it’s perhaps unsurprising that many companies have been looking into the potential of autonomous ships, and a number of Japanese companies have now outlined plans to build 250 self-navigating vessels. The shipping firms Mitsui OSK Lines and Nippon Yusen are collaborating with shipbuilders, aiming to bring ‘smart ships’ into service by 2025. These smart ships will work differently from conventional vessels, using artificial intelligence to plot the safest, shortest and most fuel-efficient routes, potentially offering a far more efficient means of transporting goods to their destination.

Gamers to get sneak preview at E3 event

The annual Electronic Entertainment Expo (E3) event has historically been an opportunity for video games developers to showcase their work and network with other industry professionals. The 2017 E3 event in Los Angeles will be different though, with 15,000 video games fans from around the world heading to the Californian city for a sneak preview of what gamers can expect to see in the year ahead. This represents a new approach for the E3 conference, as the video games industry seeks to adapt to changing market conditions. In recent years some studios have broadcast showcases to fans online, helping to heighten interest in their products, and now E3 is changing as well. “E3 originally was a retail conference, about connecting buyers with the publishers,” said Piers Harding-Rolls, of the consultancy IHS Markit. “The industry has changed significantly since then so E3 has to move with the times. It’s a process to make it much more publicly available and it’s a good move – it keeps it relevant.”
Roving across Mars

The preparations for the first human mission to Mars are well underway, with plans to send astronauts to the planet by 2030, but how will people move around once they get there? NASA has unveiled a Mars rover concept, with space to carry four astronauts across the red planet. Mars is not expected to have particularly smooth roads, so the vehicle will have to be robust enough to deal with the terrain. The rover has six wheels to help it get over even large rocks, while other design features will help it maintain mobility. The vehicle is equipped with a 700-volt battery and runs off solar power. At over 3 metres in height and 8 metres in length, the rover is quite a large vehicle, designed to protect astronauts and give them a safe environment from which to explore Mars. The vehicle is still at an early stage of development, yet it demonstrates NASA’s determination to look ahead and anticipate future challenges, however far-fetched they might seem. While there’s still much to be learned about Mars and its environment, some elements of the concept may be incorporated in the eventual design, bringing the prospect of human habitation on Mars a step closer.

Orchestrating musical movement

It might be an otherwise imperceptible nod of the head, or a raised eyebrow, but body language is a crucial element of communication between performers, ensuring everybody in an orchestra keeps in time. The movements and gestures that are commonly seen within an orchestra can help aid communication, all without saying a word. Now researchers at McMaster University in Canada have moved a step closer to unravelling the mysteries of musical communication. The team used sophisticated technology to monitor and examine the movements of performers in two professional string quartets, and found that they were able to draw links between the body movements of one musician and the actions of another. Many performers like to express themselves on stage of course, and researchers found that the extent of body swaying was linked to the group’s perceptions of how well they were performing, something to bear in mind if you ever see a static string quartet during a performance. This research could also lead to new insights into the importance of body language in other forms of social interaction. “Although we are often not consciously aware of it, non-verbal communication between people is common in many situations and influences who we like and who we don’t like,” explained Dan Bosnyak, a research director at McMaster’s LIVElab. “The methodology developed in this study could be useful for understanding many different types of group behaviour, such as understanding communication problems in autistic children, or determining the best crowd control procedures for an emergency evacuation.”
Raising awareness of codeine risks Codeine is widely used to treat pain, yet there is growing concern over levels of misuse and dependence. Dr Marie Claire Van Hout and Margaret Walsh tell us about the CODEMISUSED project’s work in building understanding of levels of misuse and raising awareness of the issue amongst pharmacists, medical practitioners, the general public and policy-makers A treatment for
mild and moderate pain, codeine is the most commonly consumed opiate in the world, with global demand having risen by approximately 27 percent over the last two decades. However, concerns have been raised about levels of misuse, and this is the focus of the CODEMISUSED project, a collaborative study bringing together pharmacies and scientific researchers in Ireland, the UK and South Africa. “The wider goal of the project is to raise awareness of the issue of codeine misuse amongst both the general public and policy-makers,” says Dr Marie Claire Van Hout, the project’s Principal Investigator. While codeine-containing products are advised for mild to moderate pain relief over the short term, they can be habit-forming, so users are advised to be cautious. “Codeine use may be habit-forming, even at regular doses,” stresses Dr Van Hout.
Misuse and dependence However, many people are not fully aware of how addictive codeine can be, and while they may start by using it as a treatment for pain, this can eventually slip into misuse and dependence. The line between therapeutic and non-therapeutic use of codeine is not always easy to draw, though. “Somebody who starts using codeine will initially go and buy it over the counter (OTC), and use it to relieve short-term pain. They become misusers at the point when they’re not following the advice on correct and safe usage,” says Margaret Walsh, Project Manager. This misuse can take several different forms. “It could be somebody who uses codeine-containing products for too
long, it could be somebody who uses excessive doses, or it could be somebody who takes the correct dose but uses it longer than is advised,” says Dr Van Hout. There are cases where people use codeine-containing products to manage anxiety or stress, while there are also problem users who take them to become intoxicated. The diversity of the population which uses codeine-containing products is one of the factors which makes it difficult to get accurate figures on levels of misuse. “CODEMISUSED was really designed to try and understand levels of misuse a bit better, and also to inform national policies and the regulatory authorities,” outlines Dr Van Hout. Policy on OTC access to codeine-containing
“The wider goal of the project is to raise awareness of the issue of codeine misuse amongst both the general public and policy-makers”
products varies across the EU and other regions of the world. They can now also be purchased online, and this is another factor contributing to increased concern around potential misuse. “Global awareness about issues around habit-forming use of codeine and the potential for misuse is rising,” says Dr Van Hout. The project aims to help further raise awareness of the issue by quantifying the extent of both therapeutic and non-therapeutic use, misuse and dependence in three different regulatory regimes, namely Ireland, the UK and South Africa. The project is organised into twelve work packages, with data being gathered from several different sources. “We reviewed the existing literature, and surveyed pharmacists, prescribers and customers in each country to get their perspectives. We also worked with addiction treatment providers and interviewed individuals in treatment,” says Dr Van Hout. “There was also an internet monitoring exercise, with monitoring conducted on drug fora, online pharmacies and social media, and we consulted with experts in each country around innovations and best practice.” This data gave researchers a firm foundation from which to delve deeper into the issues surrounding codeine use. From interviews with individuals in treatment, researchers gained new insights into the underlying factors leading to misuse and dependence. “A lot of the people we interviewed who ended up misusing codeine and getting treatment said that, within a day or two of starting to take it, they recognised that they liked the effect,” explains Dr Van Hout. Once an individual has got into the habit of taking a codeine-containing product, it’s very difficult to break, particularly if it’s associated with pleasurable effects. “A painkiller makes you feel good, it makes you relax and sleep well, so why wouldn’t you take another?” points out Dr Van Hout. “A lot of individuals would take them first thing in the morning, before anything else, so the habit becomes part of the daily routine.”
There is a growing recognition of these risks, and a number of countries have taken measures to limit the potential for codeine misuse, including restricting advertising and removing codeine-containing products from self-selection. However, while there is an abundance of anecdotal evidence that the misuse of codeine products is a significant issue, Dr Van Hout says clear, rigorous data are essential to effective policy-making. “National regulators will say: ‘So, how many people are misusing in each country?’ If you can’t answer that, then they have no indication of the scale of the issue,” she points out. The project will make an important contribution in these terms, providing a clear evidence base on the nature and extent of codeine misuse. “We’re publicising our research in each country and bringing it to the attention of policy-makers,” continues Dr Van Hout.
Information availability A clear finding that has emerged from the project is that many of the individuals who have misused codeine-containing products or become dependent believe not enough information is available on the risks. The deregulation of OTC codeine-containing medicines in some countries has led to increased choice, and the ease with which they can be purchased may give users a false perception of the risks associated with consuming codeine over the longer term. “More people are using codeine-containing products nowadays, perhaps because they’re more readily accessible,” says Walsh. With greater levels of use, there is a corresponding need to heighten awareness among the wider public of how codeine-containing
products can be used safely. “We aim to help educate the general public about the general effects of over-use,” continues Walsh. The project also aims to work with pharmacists and manufacturers to establish realistic guidelines that can help limit the potential for misuse. One major issue is that in many countries pharmacies aren’t centralised, so it’s difficult to monitor levels of use. “People can go from one pharmacy to the next and stock up. Pharmacists are aware of this, so they would maybe sell a product on a one-off basis and advise the customer, but they know that that customer can then go to the next pharmacy and do it again,” says Dr Van Hout. The ultimate goal for the project is to develop innovations that support customers and empower pharmacies. “This is about empowering pharmacists, so they can see what’s going on with levels of codeine use,” says Walsh. “The challenge for policy-makers and practitioners is how to ensure the availability of codeine-containing products for therapeutic use but minimise the risk of misuse,” says Walsh. “CODEMISUSED identified and documented some possible opportunities for innovations across the EU and beyond regarding Manufacturing (e.g., package sizes, labelling, tamper-proof); Product Information and Public Education; Training of Pharmacy Staff for Responsible Prescribing; Monitoring and Surveillance; Dispensing, Screening and Brief Interventions in Community Pharmacies; Safety in the Workplace and on the Road; Internet Supply of Codeine and Technological Support; Treatment of Codeine Dependence; Learning Resources and Training for Health Professionals.”
At a glance Full Project Title Over the Counter Codeine Use, Misuse and Dependence (CODEMISUSED) Project Objectives The CODEMISUSED project aims to carry out a national and international collaborative study to estimate levels of therapeutic and non-therapeutic codeine use, misuse and dependence in partner countries from a variety of sources and perspectives. Project Funding Funded by the European Community’s Seventh Framework Programme FP7/2007-2013 under grant agreement no 611736. Project Partners • Waterford Institute of Technology, Ireland https://www.wit.ie/ • King’s College London, UK http://www.kcl.ac.uk/index.aspx • South African Medical Research Council, South Africa http://www.mrc.ac.za/ • CARA Pharmacy Group, Ireland https://www.carapharmacy.com/ • Weldricks Pharmacy Ltd., UK https://www.weldricks.co.uk/ • The Local Choice Pharmacy Group, South Africa http://thelocalchoice.co.za/ Contact Details CODEMISUSED Project Manager, Margaret Walsh Waterford Institute of Technology Cork Road, Waterford City, Ireland T: +353 051 845 548 E: email@example.com W: http://www.codemisused.org/
Margaret Walsh Dr Marie Claire Van Hout
Margaret Walsh is Project Manager of the CODEMISUSED project. She is responsible for managing this portfolio of research with a value of €2.04m and a consortium of 28 researchers across Ireland, the UK and South Africa. She holds a Master of Business Degree and a Bachelor of Arts Degree. Dr Marie Claire Van Hout PhD is the Principal Investigator of the CODEMISUSED project. She has over 15 years’ experience in the field of substance misuse, participatory health, pharmaco- and addicto-vigilance.
Combining the best of small molecules and biologics Drug development is an increasingly complex challenge, as researchers seek to modulate novel biological targets. Professor Tom Grossmann tells us about the PEP-PRO-RNA project’s work in identifying principles for the design of peptide-based drugs, laying the foundations for the future development of novel macrocyclic compounds targeting biological processes The majority of approved drugs in the past were small molecules addressing defined pockets on a protein target. However, over recent decades, the pace of small molecule-based drug discovery has slowed, as researchers have faced difficulties in identifying novel biological targets that expose appropriate binding pockets and can be modulated with small molecular scaffolds. Since biologics such as antibodies exhibit improved surface recognition properties, they have been exploited successfully to address such targets. However, biologics are very poor cell penetrators, which restricts their use to extracellular targets. For these reasons, it has proven extremely challenging to address intracellular biomolecules which lack defined binding pockets. This holds particularly true for intracellular protein-RNA interactions (PRI) and protein-protein interactions (PPI). “If you want to modulate a PPI inside a cell, then you hope that there is a small pocket in the binding interface. Otherwise there are basically no suitable molecular scaffolds with sufficient cell permeability,” outlines Professor Tom Grossmann. Based at VU University Amsterdam, Professor Grossmann is the Principal Investigator of the PEP-PRO-RNA project, an EC-backed initiative developing a new approach, using peptide binding epitopes.
“If you start with a peptide originating from the protein of a PPI, there is a good chance of getting a high-affinity binder for that site,” he explains. “A key challenge is to make sure that these peptides, or related molecules, actually go into the cell. We aim to develop a technology that combines high affinity for the biological target with good cellular uptake.” A number of peptides are known to be able to penetrate cells, yet the precise mechanisms behind this are not fully understood, meaning there is not a solid foundation for continued development. It is known that linear and flexible peptides are particularly poor cell penetrators and that reduced flexibility can support uptake, says Professor Grossmann. In addition, such constrained peptides also show an increased tendency to bind their target with high affinity. This is because the flexibility of a peptide decreases upon binding, leading to an entropic penalty and reduced overall affinity. “If a system gains order upon binding, that costs a certain form of energy, so we pay a penalty for binding a flexible peptide. The idea is to pre-organise the peptide in solution, in the same state as it wants to be bound,” continues Professor Grossmann. “In this process, we focus
on the identification of basic design principles. We want to provide tools that enable scientists to identify and optimize structures to address a given problem.”
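The entropic argument can be written out with standard textbook thermodynamics. This is a general sketch of the reasoning, not project-specific data:

```latex
% Binding free energy and dissociation constant (standard relations):
\Delta G_{\mathrm{bind}} = \Delta H - T\,\Delta S , \qquad
K_d = \exp\!\left( \Delta G_{\mathrm{bind}} / RT \right)
% A flexible peptide loses conformational entropy on binding
% ($\Delta S_{\mathrm{conf}} < 0$), which makes $\Delta G_{\mathrm{bind}}$
% less negative and $K_d$ larger, i.e. binding is weaker. Pre-organising
% the peptide, for example by macrocyclisation, reduces this entropic
% penalty and so tightens binding.
```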
Peptide organisation The first step in this work is to identify suitable peptide-binding epitopes. “We have defined some clear parameters for their selection. In terms of size, we don’t want a huge peptide, but also there’s no point in producing something very tiny that would not provide enough energy when binding the target. Taking this into consideration, we searched for potential starting points,” says Professor Grossmann. The researchers use a computational approach to analyse available structural data and identify potential starting points for the development of agents to inhibit PPI or PRI. The next step in development involves macrocyclisation, meaning the formation of a large molecular ring system, where researchers aim to improve the affinity for the biological target. Professor Grossmann says there are two key parameters that need to be considered in this step. “One is – where do you choose the two attachment points for cyclization? It could be head-to-tail, or anywhere within the peptide sequence,”
he outlines. The second parameter relates to how the cycle is closed; a particular molecular architecture has to be chosen to close the cycle and introduce conformational constraint. “This is called the cross-link architecture – it’s an important parameter, influencing both affinity for the target and cell permeability,” continues Professor Grossmann.
“We want to develop rational, general design principles to stabilise peptide epitopes, and to equip them with bioavailability”
Introducing a high level of bioavailability into the compound is another key element of the project’s research. This covers issues like cell permeability and ensuring the compound is not degraded by enzymes. “We have to take care that the bioactive structure is stabilized, conveying good affinity for the target and at the same time allowing enough flexibility to enable the molecule to pass the cell membrane,” outlines Professor Grossmann. “There are no general principles to balance these properties; we simply don’t know enough to rationalize this process. There have been some successful examples, but it is not fully understood why they’ve worked out. That is what we want to shed some light on.”
Therapeutic potential The project’s research holds important implications for the future of drug design, yet the more immediate focus is on PPI and PRI, both of which involve biological targets of great therapeutic interest. PPIs in particular contribute to virtually all aspects of cellular organization and function, and inhibiting them can enable the manipulation of specific biological processes which are currently undruggable. “Inhibiting PPIs is considered a promising strategy towards next-generation therapeutics,” says Professor Grossmann. The principles identified in the project will provide macrocyclic PPI and PRI inhibitors.
This work will help to pave the way for a more general application of peptidomimetics, laying the foundations for more efficient drug discovery in future. “We want to develop rational and general design principles to stabilise peptide epitopes, and to equip them with bioavailability. These are the major goals of the project,” says Professor Grossmann.
Conformational states of cyclic peptides
The Grossmann lab
At a glance Full Project Title Peptide-derived bioavailable macrocycles as inhibitors of protein-RNA and protein-protein interactions (PEP-PRO-RNA) Project Objectives In general, research in the Grossmann lab aims at the design of novel peptidomimetics and biocompatible reactions. The PEP-PRO-RNA project in particular uses design principles derived from natural peptide epitopes to develop molecules that allow the modulation of biological processes, enabling the testing of novel therapeutic strategies. Project Funding Funded under: H2020-EU.1.1. - EXCELLENT SCIENCE - European Research Council (ERC) Project Collaborators • Dr. C. Ottmann, Department of Biomedical Engineering, Technical University Eindhoven, The Netherlands • Prof. H. Waldmann, Max Planck Institute of Molecular Physiology, Dortmund, Germany • Dr. C. Rademacher, Max Planck Institute of Colloids and Interfaces, Potsdam, Germany Contact Details Professor Tom N. Grossmann VU University Amsterdam De Boelelaan 1108 1081 HZ Amsterdam, The Netherlands T: +31 20 59 88339 E: firstname.lastname@example.org W: http://www.grossmannlab.com P.M. Cromm, S. Schaubach, J. Spiegel, A. Fürstner, T.N. Grossmann, H. Waldmann, ‘Orthogonal ring-closing alkyne and olefin metathesis for the synthesis of small GTPase-targeting bicyclic peptides’, Nature Commun. 2016, 7, 11300. M. Pelay-Gimeno, A. Glas, O. Koch, T.N. Grossmann, ‘Structure-based design of inhibitors of protein-protein interactions: Mimicking peptide binding epitopes’, Angew. Chem. 2015, 127, 9022–9054; Angew. Chem. Int. Ed. 2015, 54, 8896–8927.
Professor Tom N. Grossmann
Professor Tom N. Grossmann studied chemistry at the Humboldt University Berlin, including undergraduate research at the University of California, Berkeley. In 2008, he received his PhD from the Humboldt University Berlin. After postdoctoral research at Harvard University, he became a group leader at the Technical University and the Chemical Genomics Centre in Dortmund. Since 2016, he has been a full professor at VU University Amsterdam.
Protein phosphorylation plays an important role in many cellular processes, and researchers in the Phosphoprocessors project now aim to learn more about the molecular basis of this process. This work could have important implications for the design of circuits for synthetic biology, as Professor Mart Loog explains
Analysing the basis of protein phosphorylation Protein phosphorylation by
protein kinases plays a central role in many cellular processes. The addition of a negatively charged phosphoryl group can change the function of a protein, which may then lead on to further changes. “If you add another phosphate group to the protein, you can activate the protein or de-activate it, or localise it to other parts of the cell,” explains Professor Mart Loog, the Principal Investigator of the Phosphoprocessors project. Researchers in the project now aim to learn more about multisite phosphorylation, looking in particular at three main objectives. “One is related to the regulation of cell division, and regulation of the temporal order of individual molecular events in the cell cycle and cell division,” outlines Professor Loog. “We’re looking at how the master regulators of this process, cyclin-dependent kinases (CDK), are able to temporally resolve these triggers, switches and other molecular events.” A second key objective in the project is to study the phosphorylation of proteins in kinetochores, a type of large protein structure. The main topic of interest here is the regulation of protein function through phosphorylation; the enzymes which catalyse this are protein kinases. “We are focusing on CDKs, as master regulators of the cell cycle. So far, we have studied this process with respect to individual substrate proteins, but now we are going to study it in the context of larger protein structures. We expect that we will find different rules and dynamics, as this is a new area of research,” says Professor Loog. The third objective is related to synthetic biology and synthetic circuit design. “We aim to apply the knowledge that we have gained from studying multi-site phosphorylation circuits in the cell
cycle – we aim to apply the same rules in designing regulatory circuits, in synthetic cellular systems,” continues Professor Loog.
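To see why multiple phosphorylation sites can behave as a decisive switch, consider a toy model (our illustrative sketch, not the project's code) in which a protein is 'on' only once at least a threshold number of its sites are phosphorylated, with each site phosphorylated independently at a probability set by kinase activity:

```python
# Toy model of a multisite phosphorylation switch: the probability that
# a protein is 'on' when its output requires a minimum number of
# phosphorylated sites. Each site is assumed to be phosphorylated
# independently with probability equal to the kinase activity (0..1).
from math import comb

def switch_output(kinase_activity: float, n_sites: int, threshold: int) -> float:
    """Probability that at least `threshold` of `n_sites` sites are
    phosphorylated (binomial tail)."""
    p = kinase_activity
    return sum(comb(n_sites, k) * p**k * (1 - p)**(n_sites - k)
               for k in range(threshold, n_sites + 1))

# A 6-site protein needing 5 phosphorylations responds far more sharply
# to kinase input than a single-site protein: low input gives almost no
# output, high input gives strong output.
for activity in (0.2, 0.5, 0.8):
    print(f"input={activity:.1f}  "
          f"1-site={switch_output(activity, 1, 1):.3f}  "
          f"6-site/5-needed={switch_output(activity, 6, 5):.3f}")
```

With several sites and a high threshold, the response is much more switch-like than for a single site, which is one reason multisite phosphorylation lends itself to building all-or-nothing cell-cycle triggers.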
Multi-site phosphorylation This work builds on continued research into the phosphorylation process. Within the project, researchers are looking in particular at multisite phosphorylation, in which the phosphoryl groups are attached at different sites in the proteins. “These sites can be distributed within the peptide chain in different patterns and clusters, with different distances between them, and different amino acids around them. These are the parameters that define the phosphorylation process – the kinetics of it, the dynamics of it, and basically, how its inputs and outputs are controlled,” explains Professor Loog. The aim in the project is to crack the ‘multisite phosphorylation code’ of CDK, while
Professor Loog and his colleagues are also looking to apply their knowledge to design circuits for synthetic biology. “We can apply certain concentrations of kinase input, and depending on the information we encode in the amino-acid sequence, in terms of distances and patterns,” he says. “Depending on the design, and the ability of the kinase to read this kind of barcode, the protein will react in different ways.” The key issue here is to ensure that the circuit is sufficiently specific and well adapted to the environment in which it is to be introduced. A synthetic circuit put into a cell has to fulfil two main criteria. “One is that your own circuit or signalling system doesn’t affect the normal physiology of the cell,” outlines Professor Loog. Then it’s also important to ensure that cellular kinases won’t shoot their signals into the artificial system, that it’s almost insulated and protected from other similar signals in the cellular environment. “It’s really difficult to gain this specificity, because there are cross-specificities and similar, closely related kinases and substrates,” continues Professor Loog. “They compete with each other, and it’s very difficult to get rid of the noise of the other signals; it’s one of the biggest challenges in synthetic circuit design. With multi-step and linear coding, we can encode unlimited combinations into these proteins, so we can amplify the specificity and selectivity of signals at every step.” Researchers are working primarily with disordered proteins, which is why it is possible to encode patterns, networks and phosphorylation sites in a linear way, without the complexity of the 3D structure of proteins. The biochemical
rules on how to encode the signal have been identified; Professor Loog and his colleagues are building on this knowledge to design circuits for synthetic biology and develop a toolbox of synthetic parts. “The phosphorylation barcodes act like signal processors. Right now we are in the process of creating logic gates and different computational elements, which would be an entirely unique technology,” he explains. This work reflects a wider change in biology, away from it being a purely descriptive science towards also becoming an engineering discipline. “We used to focus primarily on basic science, studying the cell cycle. Now, in this project, we are applying this knowledge to engineer switches that are not found in nature, that can then mediate signals in cells and perform other biological functions,” continues Professor Loog. This work holds important implications beyond the research sphere. Virtually every signalling pathway or disease state is related to a phosphorylation process of some kind, underlining the wider relevance of Professor Loog’s research. “We’ve been inspired by the signalling networks that have been created through evolution. We have gained very specialised understanding of these processes, and now we can apply this knowledge for designing synthetic circuits. We understand it in greater depth now, so we have very predictable models and predictable switches,” he says. Researchers are working with yeast, which is the simplest eukaryotic system. “We are designing this system using yeast, because it’s the best option for doing very clean and simple experiments, for establishing a proof-of-concept,” says Professor Loog. “Once we’ve got our systems working as we’ve predicted, we’ll then go into stem cells and try to re-wire the stem cell differentiation pathways to see if we can create stem cells which are entirely programmable by our inputs.”
“We’ve been inspired by the signalling networks that have been created through evolution. We have gained very specialised understanding of these processes, and now we can apply this knowledge for designing synthetic circuits”
Synthetic cells The next step could be to apply the technology in 3D tissue engineering, while Professor Loog is keen to explore further potential applications of the research. One major area of interest is sustainable bioprocessing and cell factories, using yeast in bioreactors. “With these systems, we can control the regulation of metabolic enzymes and metabolic pathways. This is very practical research,” explains Professor Loog. A longer-term goal within the synthetic biology field is the development of synthetic organs; while this remains a long way off, Professor Loog says the project’s research could have a practical impact, for example in cell factories. “Cell factories are being developed to synthesise certain chemicals and drugs,” he says. “The biggest bottleneck there is in regulating the metabolic pathways, so that the carbon flux goes to the compound, which then results in higher yields. This is where we are going to apply our research, in yeast in the first instance.”
“We used to focus primarily on basic science, studying the cell cycle. Now, in this project, we are applying this knowledge to engineer switches that are not found in nature, that can then mediate signals in cells and perform other biological functions”
There is a huge market for these kinds of tools and synthetic circuits, as researchers seek to develop more sustainable methods of producing specific chemicals and pharmaceutical products. Over the next three years, researchers will be publishing results, and patenting and finalising the toolbox, while Professor Loog is also keen to lay the foundations for continued synthetic biology research. “We have established the Estonian Centre for Synthetic Biology, where we collaborate with many different research groups. We are introducing this industrial vision, looking towards cell factories and sustainable bio-processing,” he outlines. There are plans to develop the Centre further, helping to establish it as a centre of excellence in synthetic biology, which could help translate research into practical applications. “We plan to create a pilot plant, a bioreactor facility for cell factories, and to also build a new centre for mammalian cell factories and microbial cell factories,” says Professor Loog.
At a glance Full Project Title Biological signal processing via multisite phosphorylation networks (Phosphoprocessors) Project Objectives Multisite phosphorylation of proteins is a powerful signal processing mechanism which plays crucial roles in cell division and differentiation, as well as in disease. The goal of the Phosphoprocessors project is to elucidate the molecular basis of this important mechanism. Project Funding ERC-CoG-2014 - ERC Consolidator Grant Contact Details Project Coordinator, Mart Loog Professor of Molecular Systems Biology Institute of Technology University of Tartu Nooruse 1, Tartu 50411, Estonia T: +372 517 5698 E: Mart.Loog@ut.ee W: www.looglab.com Kõivomägi M, Ord M, Iofik A, Valk E, Venta R, Faustova I, Kivi R, Balog ER, Rubin SM, Loog M. (2013) Multisite phosphorylation networks as signal processors for Cdk1. Nature Struct Mol Biol. 20(12), 1415-24. Kõivomägi, M., Valk, E., Venta, R., Iofik, A., Lepiku, M., Balog, E.R.M., Rubin, S.M., Morgan, D.O., and Loog, M. (2011). Cascades of multisite phosphorylation control Sic1 destruction at the onset of S phase. Nature, Oct 12. doi: 10.1038.
Professor Mart Loog
This is a very active area of research, with scientists seeking to develop cells, control their functions, and harness their properties for specific biological applications. There is still a great deal to learn, yet Professor Loog believes that humans will eventually learn how to use synthetic cells, and the Estonian Centre for Synthetic Biology will play a prominent role in these terms. “We are in a very good position to be leaders in this field and to create an effective toolbox, which could have a wide range of biological applications,” he says.
Mart Loog is professor of molecular systems biology and head of the Estonian Centre for Synthetic Biology (ECSB). Mart received a PhD in medicinal biochemistry from Uppsala University, Sweden in 2002, followed by postdoctoral training at the University of California, San Francisco. In 2006 Mart established his laboratory at the newly established Institute of Technology. He has received several international fellowships and awards, including The Wellcome Trust Senior International Fellowship and a startup research grant from the European Molecular Biology Organization (EMBO) and Howard Hughes Medical Institute (HHMI). In 2012 he received the Estonian National Science Prize in chemistry and molecular biology. In 2015 he was awarded a European Research Council (ERC) Consolidator Grant and became principal coordinator of the H2020 ERA Chair project SynBioTEC to establish the multidisciplinary Centre of Synthetic Biology.
The path to understanding causal relations Understanding the causal relationships between the different elements of a complex system is central to effective intervention. We spoke to Dr Ioannis Tsamardinos about the CAUSALPATH project’s work in developing new causal analysis algorithms, which could help researchers learn more about biological pathways and the human immune system The relationship between
cause and effect is central to human reasoning capacity, yet analysing and understanding it is highly challenging when it comes to complex systems with multiple interacting components. Researchers in the CAUSALPATH project aim to develop new algorithms for causal analysis and causal discovery, particularly with respect to molecular biological data, which could lead to new insights. “We want to develop new methods, new algorithms that are applicable to a family of problems and application domains, but we also want to apply them for the discovery of new knowledge in biology, specifically in the human immune system. We hope to discover new biological pathways and refine existing knowledge on biological pathways,” outlines Ioannis Tsamardinos, the project’s Principal Investigator. These methods are designed to integrate
information and data from various sources. “These methods are able to learn causality or refine what we know about causal relationships – among molecules for example, or other measured quantities – that come from various different data sets,” explains Tsamardinos. This could be not only different types of ‘omics’ data, such as proteomics or genomics, but also data of the same type generated under different experimental conditions. For example, a biologist may perform a study of the human immune system under certain conditions, then another biologist may make measurements on the same system from a different perspective. “These studies have different statistical distributions; however, the common factor is that they are looking at the same system,” points out Tsamardinos. Researchers aim to develop algorithms to piece together this information and
identify unifying causal mechanisms, looking particularly at single-cell data, measured mostly through mass cytometers. “Mass cytometers are a relatively new type of biotechnology. These machines can measure the concentrations of proteins in thousands of cells per second. So, they generate very detailed measurements that lend themselves to applying causal discovery and causal analysis methods,” says Tsamardinos.
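A core statistical primitive behind causal discovery of this kind is the conditional-independence test. The sketch below is our own illustration (not the project's algorithms): for a simulated causal chain X → Y → Z, X and Z are clearly correlated, yet become nearly independent once the intermediate variable Y is conditioned on, which is exactly the kind of signature such methods exploit.

```python
# Illustration of conditional independence in a causal chain X -> Y -> Z,
# tested via first-order partial correlation on simulated data.
import random

def corr(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

def partial_corr(a, b, given):
    """Correlation of a and b after removing the linear effect of `given`."""
    rab, rag, rbg = corr(a, b), corr(a, given), corr(b, given)
    return (rab - rag * rbg) / (((1 - rag**2) * (1 - rbg**2)) ** 0.5)

random.seed(1)
X = [random.gauss(0, 1) for _ in range(5000)]
Y = [x + random.gauss(0, 0.5) for x in X]   # X causes Y
Z = [y + random.gauss(0, 0.5) for y in Y]   # Y causes Z

print(f"corr(X, Z)             = {corr(X, Z):.2f}")              # clearly non-zero
print(f"partial corr(X, Z | Y) = {partial_corr(X, Z, Y):.2f}")   # near zero
```

Constraint-based algorithms chain many such tests together to narrow down which causal structures are consistent with the data.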
Causal discovery methods
The methods themselves build on relatively recent advances in the causal discovery field, introduced by Tsamardinos and colleagues, in which causal discovery problems are converted to mathematical logic. This technique has enabled researchers to solve ever more complex problems. Now, however, Tsamardinos and his colleagues are also exploring a new approach to learning causal models, which they think may be better suited to molecular data in the microworld. “The current approach uses graphical probabilistic models of causality. Now we’re also developing algorithms that are based on ordinary differential equation models, like the ones employed to express laws in physics,” he explains. Evidence suggests that ordinary differential equations could be more effective as a method of analysing causal relationships in the molecular biology domain, and researchers now hope to improve these methods further.

Causal models of the microworld: Differential Equation learning

Tsamardinos and his colleagues are collaborating with a team at the Karolinska Institute, using data gathered from the mass cytometry facility there. “This is something new for us. We have designed and planned an experiment, and now we’re going to produce data specifically to answer specific biological questions, and in a way that is well-adapted to our methods. And if we do find something novel, then we will have the opportunity to perform targeted biological validation experiments to either verify or disprove the discovered relationships,” he says. “The measurements have been taken, the samples have been chemically processed, and they’re now ready to go into the machine.” Researchers hope to gain new insights into biological cellular pathways from this work, which could help inform drug design.

Single cell web tool: SCENERY

A deeper understanding of causal relations at the single-cell level will help ensure drugs can be more precisely targeted. “This is why causality is so important, particularly in biology and medicine,” stresses Tsamardinos. While there is a strong focus in the project on bioinformatics, biology and the discovery of new knowledge, Tsamardinos says this is not the only domain where these new algorithms could play an important role. “Think about economics for example – there may be a correlation over time between interest rates and unemployment,
or another variable. The response of a central bank in this situation might be to reduce interest rates with the goal of reducing unemployment – but this is only going to happen if there is a causal relationship between the two,” he continues. “This is what we’re trying to achieve with this project.” The algorithms are quite general in scope, and Tsamardinos is looking to identify other areas in which they could be applied. A spin-off company has been established, called Gnosis Data Analysis, and a contract has been arranged to causally analyse data from a major US insurance company. “We’re very excited about this project, as we’ll have the opportunity to analyse large volumes of business data,” says Tsamardinos. Researchers also plan to make the tools and software developed in the project more widely available. “One tool we have developed in the project is called SCENERY – it is a tool for causally analysing and visualising data, meant for the non-expert data analyst. It’s a tool for network reconstruction from single-cell data, and particularly mass cytometry or flow cytometry data,” explains Tsamardinos. “The idea is that we can allow users who may not be technological experts, like biologists or medical specialists for example, to use sophisticated, web-based methods to automatically analyse their single-cell data and visualise their results.”
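The simplest form of network reconstruction from such data can be sketched in a few lines. SCENERY itself offers causal and association-based methods well beyond this; the toy sketch below, with simulated proteins, only illustrates the basic idea of connecting measured quantities whose pairwise association exceeds a threshold. Note that plain correlation also links indirectly related proteins (p0 and p2 below), which is precisely why causal methods are needed to prune such edges.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy cytometry matrix: rows are cells, columns are measured proteins.
# Proteins p0 -> p1 -> p2 form a pathway; p3 is unrelated.
n_cells = 2000
p0 = rng.normal(size=n_cells)
p1 = 0.9 * p0 + rng.normal(scale=0.4, size=n_cells)
p2 = 0.9 * p1 + rng.normal(scale=0.4, size=n_cells)
p3 = rng.normal(size=n_cells)
data = np.column_stack([p0, p1, p2, p3])
proteins = ["p0", "p1", "p2", "p3"]

# Reconstruct an association network: connect protein pairs whose
# absolute pairwise correlation exceeds a threshold.
corr = np.corrcoef(data, rowvar=False)
threshold = 0.5
edges = [(proteins[i], proteins[j])
         for i in range(len(proteins))
         for j in range(i + 1, len(proteins))
         if abs(corr[i, j]) > threshold]
print(edges)  # p3 stays isolated; p0-p2 appears despite being indirect
```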
Data analytics
The use of big data is a prominent issue across several different sectors at the moment, with both commercial companies and public institutions seeking to gain new insights from the data sets available to them. The group is working with the Norwegian University of Science and Technology on a project aiming to discover biomarkers for the early diagnosis of lung cancer, while Tsamardinos says one of the wider goals is to eventually bring effective commercial systems and software to the market.

Automatic predictive analytics: JAD Bio

“The main tool that we hope to release, called Just Add Data Bio, or simply JAD Bio, is designed to perform automatic predictive analytics for users, without requiring any knowledge of data analysis, statistics or mathematics,” he outlines. “The user uploads their data and dictates the goal of the analysis, and the tool then automatically figures out the best way to analyse the data. It finds the best predictive model, visualises it, and generates an estimated predictive accuracy with confidence intervals. It gives users a wealth of information on which they can base their decisions.” The first research application of JAD Bio has just been published in Scientific Reports: automatically creating a model to separate periplasmic and cytoplasmic proteins based on their mature amino-acid sequence. More publications using the tool are set to follow, even before its official release. In addition to algorithms and commercial tools for automated data analytics, the group also performs research in other directions. One important area of investigation is algorithms for what is called feature selection. “This is closely tied to causality, because a good causal model first starts with good feature selection,” stresses Tsamardinos. In terms of biological data, these features would be
molecular measurements which are collectively predictive of a specific disease status or another outcome. “We invest heavily in feature selection. We’re designing algorithms that can scale up to really big data,” continues Tsamardinos. “We’re talking about data comprising measurements from millions of people, with millions of quantities or features measured on each, like a very large Excel file. Such data arise in business databases, and we also expect them to be gathered through the Precision Medicine Initiative in the US. This initiative will provide a huge data set for researchers to analyse, and effective algorithms could be used to identify biomarkers.” All of the above methods, however, concern analysing and getting the most out of a single study or source of data. Tsamardinos and his group are now investigating a more integrative and holistic approach to data analytics, harnessing the power of big data and public data repositories. “We’ve been downloading public data-sets for more than a year now, and we preprocess them in the same way, so that they’re comparable. Then we develop methods that reveal the relationships between these data-sets,” he explains. “We have data on different types of diseases and can look at the relationships between them. We are looking at how to visualise and compute these relationships. Eventually, we will create a map of the relationships between all studies in biology that reside in public repositories. There are tens of thousands of them.”
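A minimal sketch of scalable feature selection can make the idea concrete. The project's own algorithms are causally informed and far more elaborate; the toy example below only illustrates a univariate filter-style selector, which scores each feature independently against the outcome and therefore scales linearly with the number of columns (the feature indices and coefficients are arbitrary).

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "wide" dataset: many measured quantities (features) per person,
# only a few of which are genuinely predictive of the outcome.
n_samples, n_features = 1000, 500
X = rng.normal(size=(n_samples, n_features))
# The outcome depends only on features 3 and 42 (arbitrary indices).
y = 2.0 * X[:, 3] - 1.5 * X[:, 42] + rng.normal(scale=0.5, size=n_samples)

def select_top_k(X, y, k):
    """Univariate screening: rank features by |correlation| with y, keep top k.

    One pass over the columns, so the cost grows linearly with the
    number of features, which is what makes filter methods attractive
    for very wide data.
    """
    yc = y - y.mean()
    Xc = X - X.mean(axis=0)
    scores = np.abs(Xc.T @ yc) / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc))
    return np.argsort(scores)[::-1][:k]

selected = select_top_k(X, y, k=2)
print(sorted(selected.tolist()))  # prints [3, 42]
```

Univariate filters miss features that are only jointly predictive, which is one reason the group's research goes well beyond this baseline.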
At a glance

Full Project Title
Next Generation Causal Analysis: Inspired by the Induction of Biological Pathways from Cytometry Data (CAUSALPATH)

Project Objectives
The goal of the project is to advance the methods, algorithms and theory of inducing causal models from a set of possibly heterogeneous datasets, and to bridge the gap between the theory and practice of causal discovery. It places primary emphasis on mass cytometry and single-cell data as the application domain, with the intent to induce biological signal pathways de novo. It also explores methods for massive integrative analysis of biological data, feature selection for Big Data, and other novel directions for emergent bioinformatics needs.

Project Funding
The research leading to these results has received funding from the European Research Council under the European Union’s Seventh Framework Programme (FP/2007-2013) / ERC Grant Agreement n. 617393.

Project Partners
• Karolinska Institutet

Contact Details
Project Coordinator, Dr Ioannis Tsamardinos
University of Crete
Department of Computer Science
Voutes Campus
GR-70013 Heraklion, Crete, Greece
T: +30 2810 393575
E: email@example.com
W: www.mensxmachina.org
W: http://cordis.europa.eu/project/rcn/191274_en.html
Dr Ioannis Tsamardinos
Dr Ioannis Tsamardinos is an Associate Professor at the Computer Science Department of the University of Crete and co-founder of Gnosis Data Analysis PC, a university start-up. He gained his Ph.D. from the Intelligent Systems Program at the University of Pittsburgh in 2001 and held further academic positions in the US before returning to Greece in 2006.
Integrative data analytics: Study similarity maps
Drawing inspiration from nature in catalyst development
Terpenes are the largest class of chemicals produced in nature, yet chemists across the globe have not been able to mimic the efficient cyclization processes found in nature with man-made catalysts. We spoke to Professor Konrad Tiefenbacher about the Terpenecat project’s work in developing selective catalysts for terpene cyclizations.

The
largest class of chemical compounds produced in nature, terpenes are commonly used in medicine, including in anti-cancer and anti-malarial drugs. While nature is able to produce these compounds efficiently, organic chemists have not yet been able to develop a similarly effective method, an issue that lies at the core of the Terpenecat project’s research. “The aim of the project is to develop selective catalysts for terpene cyclizations,” says Professor Konrad Tiefenbacher, the project’s Principal Investigator. Terpene cyclizations are among the most complex reactions performed in nature; researchers now aim to develop selective catalysts for this purpose, which could open up the possibility of producing an almost limitless number of cyclized structures.

Bridging the gap between supramolecular chemistry and current synthetic challenges: Developing artificial catalysts for the tail-to-head terpene cyclization (TERPENECAT)
Project ID: 714620
Funded under: H2020-EU.1.1. - EXCELLENT SCIENCE - European Research Council (ERC)
ERC-2016-STG - ERC Starting Grant
EU contribution: EUR 1 500 000
Professor Konrad Tiefenbacher
University of Basel
St. Johanns-Ring 19, CH-4056 Basel, Switzerland
T: 0041 612 075 609
E: firstname.lastname@example.org
W: https://nanocat.chemie.unibas.ch/en/research/
W: http://cordis.europa.eu/project/rcn/206316_en.html

Konrad received his basic chemical education at the Technical University of Vienna and the University of Texas at Austin. After finishing his diploma thesis in the lab of Prof. Fröhlich, he pursued his interest in the total synthesis of biologically active natural products during a Ph.D. in the lab of Prof. Mulzer at the University of Vienna. He then moved to Prof. Rebek’s lab at The Scripps Research Institute in La Jolla to learn about molecular recognition and self-assembly. In December 2011 he started his independent career as a Juniorprofessor (W1 position) at the Technical University of Munich. In June 2016 he was appointed to a dual tenure-track assistant professorship at the University of Basel and the ETH Zürich.
“If you imagine folding a cable, you could also fold these linear terpenes in nearly limitless different kinds of ways,” explains Professor Tiefenbacher. “So we could directly produce hundreds – maybe thousands – of different cyclized structures from essentially one simple starting material.”
Supramolecular catalyst utilized by the Terpenecat project.
Substrate conformation
A major focus now is learning how these hundreds or thousands of different products can be selectively produced. A key step in this is controlling the conformation of the substrate; Professor Tiefenbacher again draws on the example of folding a cable. “Conformation would just be to put the cable into the right shape. With terpenes, first we fold the starting material, and then during the cyclization process it’s glued together,” he explains. From here, researchers can then look to learn how to construct terpene structures like taxol and artemisinin, which are used as anti-cancer and anti-malarial drugs, respectively. “One of the goals in the project is to make new derivatives of artemisinin. We aim to take this anti-malarial agent, and to modify specific parts of it,” continues Professor Tiefenbacher. “That means we would start with a slightly modified starting material, and try to cyclize it to the artemisinin framework. So, after folding the cable in the right conformation and gluing it together, we would have a new artemisinin derivative.” This work could open up new avenues of research in terpene chemistry. Controlling terpene cyclization with man-made catalysts could eventually lead to simplified terpene production, yet the project is still in its early stages, so Professor Tiefenbacher is continuing to develop new catalysts. “So far we have found one catalyst which is able to perform this cyclization, but it’s not really selective. We are trying to understand the
principles which enable this catalyst to perform this cyclization,” he says. A lot has already been achieved in terms of understanding why this catalyst is catalytically active, and researchers are now looking further ahead. “Over the next few years we plan to modify the selectivity of this catalyst and investigate what other kinds of products could be formed in this way,” outlines Professor Tiefenbacher. “Eventually, we want to be able to design and construct catalysts which are able to selectively cyclize a terpene in a desired manner.”
Evaluating the biological effects of nanoparticles is a hugely complex task, and it is rarely performed both in great depth and in a cost-efficient manner. The 3DinvitroNPC project is developing a 3-dimensional method of assessing the viability and functionality of nanoparticles for medical applications, as Dr Anna Laromaine explains.

In Vitro:
Cells on a surface - 2D
Cells on scaffold - 3D
Cells in suspension
New dimension in nanoparticle research
Nanoparticles are increasingly used in medicine today, with applications in drug delivery, medical imaging and in vitro biosensors, to name just three areas. These nanoparticles need to be thoroughly characterised before they can be used in biomedical applications, an issue that lies at the core of the 3DinvitroNPC project. “The idea behind the project is that instead of evaluating nanoparticles on a 2-D cell culture, we will develop 3-D biological approaches,” explains Dr Anna Laromaine, the project’s Principal Investigator. This will provide a more realistic environment in which to test nanoparticles and assess their likely effects than existing methods. “With a 3-D cell culture, we could evaluate questions like how nanoparticles interact with cells, how they aggregate, whether they degrade, and how they interact with membranes,” outlines Dr Laromaine.
Nanoparticle testing
The core goal of the project is developing a platform to test nanoparticles, building on Dr Laromaine’s previous experience of developing 3-D cell scaffolds. Aerogels of biomaterials are being used to create biodegradable, transparent scaffolds which accurately mimic the biological environment; the material itself is porous, which Dr Laromaine says is an important issue. “Human tissues are porous, and they need sufficient nutrients, while cells have to be able to grow within the system,” she says. A method called super-critical drying is used to remove liquid and maintain the material’s porosity. “When you dry a porous material that has been submerged in water, the capillary forces actually make the pores close in the structure. By using super-critical drying, we prevent the pores in this structure from collapsing,” continues Dr Laromaine. Researchers are also working with another model system, C. elegans, which is also used to screen nanoparticles in the laboratory. These types of approaches are quicker and can be more efficient than existing methods of assessing nanoparticles, says Dr Laromaine. “This is much quicker than in vivo approaches. It’s simpler experimentally, and you can still get very complex information,” she explains. Researchers assess the viability and functionality of nanoparticles, aiming to evaluate how these nanoparticles change within a biological in vivo structure. “Are
these nanoparticles in the same state? Are they dispersed in a similar way to how they were first delivered? What is the coating of those nanoparticles? We approach these questions from the materials point of view,” says Dr Laromaine. These issues are critical to the application of nanoparticles in biomedicine. Researchers will assess the viability of FexOy nanoparticles, which could be applied as magnetic labels of cells, a means of tracking the nanoparticle. “When the cells uptake those nanoparticles, you can then pinpoint where the cell is, and not only the
nanoparticle. This is important, as you can then monitor the bio-distribution of those cells,” outlines Dr Laromaine. More detailed characterisation of nanoparticles could open up new applications in biomedicine, but the immediate focus for the project will be on the platforms. “We aim to more fully validate the platforms and screen different types of nanoparticles, then we can look for more targeted applications. One could be developing iron oxide nanoparticles as a food supplement for anaemia, for example,” says Dr Laromaine.

This research was partially funded by the Spanish Ministry of Economy (MAT2012-35324 and MAT2015, Ramon y Cajal program (AL, RyC-2010-06082), FPU program (LGM, FPU12/05549), Severo Ochoa Program (SEV-2015-0496) co-funded by European Social Funds), the Generalitat de Catalunya (2014SGR213), People Program of the European Commission (grant agreement no. 303630, co-funded by the European Social Fund), the Christian Boulin fellowship 2015 from EMBL (LGM), L’ORÉAL-UNESCO For Women In Science-Spain 2016, the COST Actions HINT (Action No. MP1202) and GENIE (Action No. BM1408-A).

Dr Anna Laromaine
Institut Ciència de Materials de Barcelona (ICMAB-CSIC)
Campus UAB - 08193 Bellaterra - Spain
T: +34 935 801 853 Ext 367
E: email@example.com
W: www.icmab.es/nn

Dr Anna Laromaine is a tenured researcher at the Institute of Material Science of Barcelona (ICMAB). She is currently working on a project involving the synthesis and evaluation of nanoparticles in 3D environments.
Building a picture of biological robustness
Complex biological systems are subject to many perturbations, yet they show great robustness during development. Researchers in the RobustNet project are using C. elegans as a model system to investigate the underlying factors behind this robustness, and also to model infections by emerging pathogens, as Dr Michalis Barkoulas explains.

A biological system is subject to many perturbations during development, including both environmental and genetic variations, yet such systems are able to cope with this variability and produce a consistent output. Based at Imperial College in London, Dr Michalis Barkoulas is the Principal Investigator of the RobustNet project, an EU-funded initiative investigating the underlying mechanisms behind biological robustness. “In particular we’re looking at developmental robustness – for example, thinking about how a fertilised egg is transformed into a multi-cellular individual. We take it for granted that this is going to work, but we shouldn’t lose sight of the fact that this is pretty remarkable, given all the perturbations that biological systems face,” he outlines.
STUDYING MECHANISMS OF DEVELOPMENTAL ROBUSTNESS AND NOVEL OOMYCETE INFECTIONS IN C. ELEGANS
Dr Michalis Barkoulas
Group Leader | Imperial College Department of Life Sciences
SAF building, office 607, SW7 2AZ, London
T: +44 (0)207 5945227
E: firstname.lastname@example.org
W: https://www.imperial.ac.uk/people/m.barkoulas
W: https://mobile.twitter.com/barkoulab?lang=en

Dr Michalis Barkoulas obtained his PhD on plant evolution and development from the University of Oxford. He started working on questions related to C. elegans developmental patterning as a postdoctoral researcher at the Ecole Normale Superieure in Paris. He is currently a Lecturer at Imperial College in London and his group investigates the robustness of various developmental systems and host-pathogen interactions.

C. elegans
The C. elegans nematode is being used in the project as a model system in research. C. elegans is a eutelic organism, meaning that it produces a precise number of cells; complete lineage maps have been developed for all different cell types,
enabling researchers to trace the developmental history of any cell in the adult all the way back to the zygote. “Based on these reproducible developmental lineages, we can study the mechanisms that are actually behind this robustness,” says Dr Barkoulas. One major area of
interest in the project is the consistency of cell numbers in the population. “If you compare animals in controlled settings, they typically have the same number of cells,” continues Dr Barkoulas. C. elegans nematodes all have the same genetic composition. However, mutations can be induced where certain genes and pathways are perturbed, leading to animal-to-animal variation in the population. Dr Barkoulas and his colleagues are using genetics to discover the full spectrum of mutations that can give rise to variability in the model organism. “We are trying to find out what are the genes leading to developmental variability and how they work,” outlines Dr Barkoulas. These screening techniques are unbiased, so researchers hope to identify both pleiotropic factors, which are not specific to a particular developmental system, and more specific factors for the tissues investigated. “These screens provide a lot of data on new mutations; the question then is how do we prioritise them and how do we identify those that are likely to help us derive more general principles,” says Dr Barkoulas. The wider goal in this research is to investigate how genes stabilise developmental outcomes, in order to establish a framework for phenotypic buffering. This could lead to new insights into the underlying causes of disease, and how mutations cause specific conditions. “A lot of disease can be viewed in the context of the buffering of normal physiological states. For example, we study cell numbers, which is very relevant to cancer,” says Dr Barkoulas. “We hope that some of our findings, such as identifying points of fragility in C. elegans gene networks, will be relevant for other biological systems and disease in the long run. For example, cancer is notoriously robust against therapeutic treatments but it is still unclear how cancer cells gain this robustness.”
Oomycetes
Researchers are also using C. elegans to model infections from a particular class of understudied organisms called oomycetes, the most famous of which is Phytophthora infestans, which causes potato blight. For the first time, Dr Barkoulas and his colleagues have two oomycetes in culture infecting C. elegans, which opens up new areas of research. “We study questions related to innate immunity. How does C. elegans recognise the presence of these pathogens in the environment? What sorts of responses does it trigger as protection? And how do these pathogens manage to circumvent these responses to kill nematodes?” he says. This research holds wider importance in terms of understanding disease. “This is particularly important as some of these oomycetes also infect humans, especially in tropical regions, and these infections are currently incurable,” explains Dr Barkoulas.
Steps towards a treatment for heart failure
Patients who have experienced structural heart remodelling, where the wall of the heart becomes thicker, are at greater risk of developing heart failure in future. The Beta3_LVH project aims to assess the efficacy of a drug called mirabegron in preventing heart failure in patients at risk of developing the disease, as Dr Nancy Van Overstraeten explains.

The prevalence of heart failure with preserved ejection fraction (HFpEF) continues to rise across the developed world. While patients with HFpEF are still able to pump blood relatively efficiently, there are structural and functional signs to indicate they may be at risk of developing more serious problems in future. “Despite preserved ejection fraction, cardiac remodeling can be thought of almost as an early stage in the development of heart failure when it leads to altered filling properties of the heart,” explains Dr Nancy Van Overstraeten. Based at the Université catholique de Louvain, Dr Van Overstraeten is a translational research coordinator managing the Beta3_LVH project, a multi-centre European initiative testing a new treatment. “We are trying to demonstrate the efficacy of a drug called mirabegron, which is already on the market to treat bladder disease. This drug is known to activate the human beta3 adrenergic receptors, which are also present in the heart,” she explains.
Structural heart remodelling
The focus in the project is on patients who have already undergone structural heart remodelling, where the wall of the heart becomes thicker, causing it to become more rigid and its filling to be impaired. At the moment, there is no treatment available to prevent this from developing into heart failure, but Dr Van Overstraeten says that the data so far on mirabegron are promising. “Preclinical data from our laboratory on animals showed that when the human beta3 adrenergic receptor is genetically expressed in the mouse heart, the stress-induced wall thickness decreases,” she says. Researchers are now looking to test the efficacy of mirabegron in a clinical trial. “The major points we will assess will be the left ventricular mass index and the change in diastolic function. Is the diastolic function restored after treatment?” continues Dr Van Overstraeten. “We aim to demonstrate that mirabegron can be effective in preventing heart failure.” Researchers from nine European centres are currently recruiting for the clinical trial, sponsored under a Horizon 2020 programme, looking to identify people who are showing morphological signs of structural cardiac remodelling, as indicated by echocardiographic techniques. While the age range for the trial is 18-90, these signs are more common in elderly people, so they are likely to form the core of the trial group; Dr Van Overstraeten and her colleagues also aim to assess the impact of mirabegron on their quality of life. “One of the main symptoms of heart failure is that patients are tired. During exercise
- when they climb stairs for example, or when they do everyday physical exercise - they are tired and short of breath. So we will see if their exercise capacity is improved as a result of this treatment,” she says. This would represent a significant development, as currently there is no specific treatment available for HFpEF. Current treatment centers on reducing risk factors, such as obesity and hypertension. “This will be the first time that a drug has really been used to prevent the development of HFpEF,” says Dr Van
Overstraeten. Researchers are in contact with drug development companies, looking towards potentially using beta3 adrenoceptor agonists to prevent heart failure; however, Dr Van Overstraeten says she and her colleagues will remain focused on more fundamental research in future.

A multi-centre, randomized, placebo-controlled trial of mirabegron, a new beta3-adrenergic receptor agonist, on left ventricular mass and diastolic function in patients with structural heart disease (Beta3_LVH)

Patients with cardiovascular risk factors, e.g. hypertension and obesity, are at risk of developing heart failure with preserved ejection fraction (HFpEF), a highly prevalent disease in the elderly, mostly female, population. There is currently no specific, defined treatment for HFpEF beyond control of risk factors. Activation of cardiac and vascular beta3-adrenergic receptors (B3AR) represents a new concept and a novel target for structural cardiac disease. B3AR expression and coupling have been demonstrated in human myocardium and vasculature.

Professor Jean-Luc Balligand
Université catholique de Louvain
1, Place de l’Université
B-1348 Louvain-la-Neuve (Belgium)
T: + 32 2 764 52 62
E: Jl.Balligand@uclouvain.be
W: http://www.beta3lvh.eu/home-
Professor Jean-Luc Balligand is Head of the FATH Pole within IREC, a practicing Physician at the Cliniques Universitaires Saint-Luc, UCL, and also teaches cardiovascular physiology and pharmacology at UCL Medical School. His research group is investigating new signaling pathways regulating myocardial remodeling.
Signalling the way to new therapies
Changes in cell signalling are thought to be an important factor in the development of chronic lymphocytic leukemia (CLL), which is among the most common forms of cancer affecting adults. Research into the underlying mechanisms behind these changes could lead to both a deeper understanding of the disease and improved treatment, as Professor Hassan Jumaa explains.

The
molecular mechanisms that regulate the development of lymphocytes are an area of great interest in medical research, in particular B-lymphocytes, which generate the antibodies which protect us against infectious diseases. Alongside investigating the underlying mechanisms behind this process, Professor Hassan Jumaa and his colleagues at Ulm University are also interested in abnormalities in lymphocyte development. “If this development is not working well then cells can become transformed or malignant, and grow in an uncontrolled way,” he explains. “This leads to new questions – what are the signals that lead to this transformation? Which genes or gene products are involved? How can we interfere with their growth so that we can identify targets for therapy or diagnosis? These are the general questions that we are investigating.” Chronic lymphocytic leukemia This research holds particular relevance to our understanding of chronic lymphocytic leukemia (CLL), one of the most common types of cancer among adults. In CLL, a transformed B-lymphocyte expands in an uncontrolled manner and makes too many of the clonal B-lymphocytes, which have given up their physiological function and, instead, can hamper the ability of residual untransformed B-lymphocytes to protect against disease. “By secreting antibodies, the B-lymphocytes normally label pathogens, which become visible to phagocytes, which in turn respond and eat up the pathogen,” explains Professor Jumaa. The B-lymphocytes have an intrinsic machinery for mutation, so they can improve their affinity for a specific pathogen by mutating their genome; while this is important to generating high affinity antibodies, it can lead to other changes. “When the B-lymphocytes mutate their genome to generate these high-affinity antibodies, they might induce mutations that might lead to other transformations,” says Professor Jumaa.
Figure 1: Activation of BCR-dependent signaling in CLL B cells. (A) Signaling of conventional BCRs is activated by binding of self- or foreign antigen (Ag) to neighboring BCRs. (B) In CLL B cells, BCR signals are generated by an inter-molecular interaction between the variable regions of neighboring BCRs, independent of external antigens. This mode of BCR activation is therefore referred to as antigen-independent or cell-autonomous signaling.

The development of CLL and other forms of leukemia is associated with these kinds of changes in the basic mechanisms of cells. While the B-lymphocytes need to mutate to generate efficient antibody responses, there is a price. “Sometimes mistakes happen, which are usually removed by control mechanisms. But if these mechanisms fail, then the cells might become malignant and not secrete antibodies in the case of CLL. So, the malignant cells give up their original function, meaning that people with CLL are immune deficient,” outlines Professor Jumaa. A prime focus now is investigating the mechanisms behind the survival or transformation of the cells that cause CLL, building on earlier basic research. “We suggest that these cells have acquired cell autonomous mechanisms for
proliferation and survival, so mechanisms independent of the outside environment,” continues Professor Jumaa. These changes allow the cells to activate their signalling machinery without requiring an outside ligand to initiate the process; researchers now aim to learn more about the underlying factors behind this. A key area that Professor Jumaa and his colleagues are investigating is acquired changes in the B-cell antigen receptor, which lead to the autonomous activation of signalling. “We think it is most likely that these changes occur during immune responses. We have identified a sub-group of patients in which a mutation in the receptor is essential for the self-aggregation of receptors, and for the induction of cell-autonomous signalling,” he explains. “If we can learn more about the molecular details, we might be able to interfere with this mechanism and develop a specific therapy against the disease, as this autonomous signalling appears to be specific to CLL. Our investigations might improve the otherwise poor prognosis of this sub-group.” This goal forms an important part of the wider agenda, with Professor Jumaa keen to explore the clinical implications of this research. Alongside exploring fundamental mechanisms at the B-cell antigen receptor, Professor Jumaa also
aims to develop effective therapies against CLL, building on a deeper understanding of these underlying mechanisms. “The idea is to develop antibodies with which we could precisely target and remove the cells with the specific changes that we’ve identified. This would be the first really CLL-specific therapy,” he outlines. “It would not be effective for all patients, as every CLL sub-group has its own specific change, which needs to be characterised in order to develop a specific therapy. For the sub-group with poor prognosis mentioned above, we are in the process of developing and characterising such specific reagents, to recognise the mutated site and, hopefully, to remove the malignant cells.”
“If this development is not working well then cells can become transformed or malignant, and grow in an uncontrolled way. This leads to new questions – what are the signals that lead to this transformation? Which genes or gene products are involved? How can we interfere with their growth?”

There is also the possibility of using this approach in the prevention of CLL among groups at higher risk of developing the disease. CLL affects mainly older people, who could be screened for the presence of such cells once the reagents for detecting the malignant cells of the sub-group with poor prognosis are available. If necessary, these cells could be removed before the disease develops. “We are primarily focused on developing a therapy specifically for this sub-group of patients that we’ve identified, but this research could also hold implications for the treatment of other groups in future,” says Professor Jumaa. The primary focus at the moment, however, is this specific sub-group, because the change that has been identified is clear, and it’s known to be associated with the key mechanism. “If we could develop an antibody or a reagent that recognises this change, then we will have achieved our aim,” continues Professor Jumaa. This applied research is central to improving treatment, yet Professor Jumaa also plans to pursue more basic work that can reveal fundamental insights into the behaviour of cells and the underlying causes of CLL. While it is important to maintain close links with clinicians if the disease is to be more fully understood,
Professor Jumaa believes this needs to be balanced by more fundamental research. “If you focus just on the patients, you might miss the key mechanisms,” he points out. “So if you understand the biology, and look at the cases that might be related to the biology, then this is an unbiased way of understanding diseases. I would say, for me, that this is the right way to go. So our focus will remain on basic research and connecting to our partners in the clinic, to see what the problems are, and where our improved understanding can be applied in the cases that we see.”
At a glance

Full Project Title
Role of autonomous B cell receptor signalling and external antigen in the pathogenesis of chronic lymphocytic leukaemia [CLL] (Autonomous CLL BCRs)

Project Objectives
Chronic Lymphocytic Leukaemia (CLL) is a common type of blood cancer. It is characterised by the uncontrolled growth of B-lymphocytes, which normally produce antibodies to fight infections. The European Research Council Advanced Grant Autonomous CLL BCRs aims at characterising the mechanisms that cause immune cells to convert into malignant cancer cells.

Project Funding
Funded by an ERC Advanced Grant (ERC-ADG). EU contribution: EUR 2 256 250

Project Partners
• Nicholas Chiorazzi, MD, Head, Karches Center for Oncology Research, The Feinstein Institute for Medical Research, Hofstra Northwell School of Medicine, 350 Community Drive, Manhasset, NY 11030
• Massimo Degano, PhD, Biocrystallography Unit, Division of Immunology, Transplantation and Infectious Diseases, IRCCS San Raffaele Scientific Institute, Milan, Italy
• Paolo Ghia, PhD, Università Vita-Salute San Raffaele, Milan, Italy
• Kostas Stamatopoulos, PhD, Institute of Applied Biosciences, Center for Research and Technology, Thessaloniki, Greece
• Stephan Stilgenbauer, MD, Department of Internal Medicine III, Ulm University Hospital, Albert-Einstein-Allee 23, 89081 Ulm, Germany

Contact Details
Project Coordinator
Professor Hassan Jumaa
Ulm University
Albert-Einstein-Allee 11
89081 Ulm
Germany
T: +49 (0)731 500 65200
E: email@example.com
W: http://cordis.europa.eu/project/rcn/204856_en.html
Professor Hassan Jumaa
Professor Hassan Jumaa is a Full Professor (W3) and Chair of the Institute of Immunology, University Hospital Ulm. He received the Georges-Köhler-Award of the German Society for Immunology (DGfI) in 2004 and has held his current position since 2013.
Europe’s eHealth Innovations: An interview with Pascal Garel

EU Research interviews Pascal Garel, Chief Executive, European Hospital and Healthcare Federation (HOPE), about eHealth in Europe. HOPE’s mission is to promote health for citizens and to push for a uniformly high standard of hospital care in the European Union. Can eHealth provide healthcare solutions for a growing, ageing population? By Richard Forsyth

EU Researcher: In the broad spectrum of eHealth categories – e.g. training, diagnosis, healthcare delivery etc. – where is the most promise, and which do you see as the most important?
Pascal Garel: Healthcare has been slower than some other industries to feel the full force of digital technology. But from wearable sensors providing real-time data on an individual’s wellbeing, to the ability to sequence a person’s genome at limited cost within 24 hours, digital technology promises greater visibility over personal health and offers the possibility of earlier, more targeted treatment when people fall ill. These devices are part of a shift towards more informed patients taking greater control over their own health, accessing some kind of digital dashboard of health data. The most promising element of all is health analytics and Big Data in health: the transformation of data for the purpose of providing insight and evidence for decision and policy-making. Big Data refers to volumes of data larger and more complex than traditional data processing can handle; this requires the use of distributed systems and advanced methods of data analysis. The second most promising element is electronic health records (EHRs), real-time patient-centred records that provide immediate and secure information to authorised users. EHRs typically include a record of the patient’s medical history, diagnoses, treatment, medications, allergies and immunisations, as well as radiology images and laboratory results. The fact that this information is in digital format makes it easier to search, analyse and share.
With telehealth, medical services are delivered from a distance. This encompasses remote clinical diagnosis and monitoring. Telehealth also includes a wide range of non-clinical functions encompassing prevention, promotion and curative elements of health, and it involves the use of electronic means for healthcare, public health, administration and support, research and health education. Mobile technologies are starting to support health information and medical practices. The main strength of mHealth is its potential to reach wide geographical areas through the use of portable devices. Mobile health is incorporated into healthcare services such as health call centres or emergency number services, and also includes functions such as lifestyle and well-being apps, health promotion and wearable medical devices or sensors. Finally, eLearning in health uses electronic technology and media for training and education, which can improve the quality of education and also increase access to learning in geographically isolated locations or in locations with insufficient training facilities. This will contribute to increasing the number of trained professionals with specialised or general skills.
EUR: Which countries have well-developed eHealth initiatives and systems – can you give examples, and do these provide models for other countries?
Pascal Garel: HOPE is organising a European conference on 16 November 2017 in Düsseldorf precisely on this issue. But even if we have already identified good practices to be presented during this event, from Spain, Estonia, Denmark, etc., we must recognise that it is easier to find good practices at hospital level or at regional level than at national level. Not a single country can be found that covers all innovative and efficient uses of information technology. And anyway, considering the cultural and historical roots of health and social systems, good practices are not necessarily transferable. In this regard, an interesting initiative has been taken by the European Innovation Partnership on Active and Healthy Ageing, which is to organise a kind of twinning to overcome those transfer issues.

“mHealth can contribute to the empowerment of patients: they could manage their health more actively, and live more independently, thanks to self-assessment or remote monitoring solutions”
EUR: With a growing global population, and resources that are stretched in many countries – is eHealth critical to coping with the next decades?

Pascal Garel: Europe’s population is ageing. The proportion of older people in our countries is increasing, due to fewer children as well as a longer life expectancy. Healthcare costs in Europe are increasing. These costs are, in most European countries, a growing component of GDP, and in some cases still a growing part of public finances, representing up to 12% of GDP. Another major trend is that about 40% of the European population above the age of 15 are reported to have a chronic disease, and this proportion climbs to 66% of the population over 65, who have at least two chronic conditions. What’s more, 70% of healthcare costs are spent on chronic diseases, and this figure is expected to rise in the coming years. For this reason, EU Member States are trying to achieve affordable, more efficient, less intrusive and more personalised care for citizens. The use of information and communication technologies can be of great help, and eHealth is seen as a major driver to maintain quality health services in an affordable way. eHealth solutions and technologies are expected to increase significantly in the upcoming years. Such solutions involve a broad group of activities that use electronic means to deliver health-related information, resources and services. These include supportive eHealth policy, legal and ethical frameworks, infrastructure development and developing the capacity of the health workforce through training. In the world of Big Data, applications may be either prospective data monitoring or retrospective data analysis, and may contribute to increasing the effectiveness and quality of treatments by: earlier disease intervention, a reduced probability of adverse reactions to medicines, fewer medical errors, determination of causalities, understanding of co-morbidity, cross-linkage of healthcare providers and professionals, intensification of research networks, and the fusion of different networks such as social networks, disease networks or medicine networks.

It provides, as well, widening possibilities for the prevention of diseases by identification of risk factors for disease at population, subpopulation and individual levels, and by improving the effectiveness of interventions to help people achieve healthier behaviours in healthier environments. The improvement of pharmacovigilance and patient safety through the ability to make more informed medical decisions, based on information delivered directly to patients, is also significant, as is the prediction of outcomes, e.g. the containment and improvement of chronic diseases, global infectious disease surveillance through evolving risk maps, and a better understanding of demographic challenges and trends and of disease transmission pathways. For knowledge dissemination, it can help physicians, for example, to stay current with the latest evidence guiding clinical practice. Finally, it can contribute to a reduction in inefficiency and waste, with improvement in cost containment.

EUR: Can eHealthcare help with mental health?

Pascal Garel: This is clearly what was found in the European Commission and Member States joint action on mental health that was recently completed. One of the main objectives of its working group on Depression and Suicide was to include eHealth. In taking action against depression and to prevent suicide in different target groups (adolescents, young adults, middle-aged and older people), the integration of e-mental health in implementing evidence-based interventions and improving the sustainability of good practices has a clear added value; for example, by promoting trans-national approaches to e-support for minority groups present in several European countries, or by integrating eHealth interventions into the package of health services and clinical practice of health professionals.

EUR: How has mHealth helped rural areas and less developed countries deliver healthcare, and what can we expect in the future in this context?
Pascal Garel: Mobile health is a rapidly developing field: over 100,000 mHealth apps are currently available on the market. It has the potential to play a part in the transformation of healthcare and to increase its quality and efficiency. Indeed, mHealth can contribute to the empowerment of patients: they can manage their health more actively and live more independently, thanks to self-assessment or remote monitoring solutions. The provision of mHealth can also support healthcare professionals in treating patients more efficiently, as mobile apps can encourage adherence to a healthy lifestyle. This is already visible in developing countries, sometimes at a faster pace than in our own. We are not yet at the optimal level of use of mHealth for remote areas, but several pilots are in place; it may take time for them to become mainstream. At European level, in April 2014, the European Commission launched a public consultation alongside its Green Paper on mobile health, to help identify the right way forward to unlock the potential of mobile health in the EU. It gathered input from interested stakeholders on barriers and issues related to the use of mHealth. Together with the Green Paper, the Commission also published legal guidance on EU legislation in the field for app developers, medical device manufacturers, digital distribution platforms, etc. The Privacy Code of Conduct, the mHealth assessment guidelines, and other research and innovation in mHealth are among the results.
EUR: Will there be a day when eHealthcare – telemedicine, ePrescriptions and so on – is the norm and standard globally? If so, in what kind of timeframe do you predict this could happen?
Pascal Garel: Due to the diversity of resources, of cultures and of many other obstacles, this will certainly take more time than expected. In any case, health literacy and, more precisely, digital literacy is a major challenge. Systems must be designed to meet the needs of patients and those who care for them. Many people are already undertaking some health transactions online, such as ordering repeat prescriptions, checking hospital reviews or booking appointments for hospital and other healthcare services. With increasing amounts of health information being presented online, hospital and healthcare services need to ensure that groups with lower levels of internet use, such as older people and more deprived groups, do not miss out. Digital exclusion will need to be addressed, including by catering differently for those who are accessing no, or limited amounts of, information and support online.

EUR: There have been large scale eHealth disasters – take the UK’s NHS £11bn fiasco, where the system was pulled. What are the biggest challenges to large scale eHealth projects? For instance, is it mixing two very different disciplines and groups of experts – IT engineers and health professionals?

Pascal Garel: Large-scale eHealth deployment projects face a major challenge when called to select the right set of standards and tools to achieve sustainable interoperability in an ecosystem that includes both legacy systems and new systems reflecting technological trends and progress. There is no single standard that would cover all the needs of an eHealth project, and there is a multitude of overlapping and perhaps competing standards that can be employed to define document formats, terminology and communication protocols, mirroring alternative technical approaches and schools of thought. eHealth projects need to answer the important question of how alternative or inconsistently implemented standards and specifications can be used to ensure practical interoperability and long-term sustainability in large-scale eHealth deployment. In the eStandards project, in which HOPE is involved, 19 European case studies reporting from R&D, large-scale eHealth deployment and policy projects have been analysed. Although this study is not exhaustive, reflecting on the concepts, standards and tools for concurrent use, and on the successes, failures and lessons learned, a paper offers practical insights on how eHealth deployment projects can make the most of the available eHealth standards and tools, and how standards and profile developing organisations can serve users while embracing sustainability and technical innovation. Another key issue to mention here is cybersecurity. It must play a crucial role in the development of eHealth, to build the trust required in society for these services to flourish. The cybersecurity investment required by actors in eHealth will necessarily be comparatively higher than in other sectors, considering the direct costs derived from incidents. As an example, the overall impact of cyberattacks on hospitals and healthcare systems is estimated to be nearly six billion per year in terms of financial costs.

EUR: What is the most exciting eHealth project or initiative, in your opinion, out there that you know about today?
Pascal Garel: Just to name one, among the projects in which we are involved, ICT4Life is a three-year project financed by Horizon 2020, the EU Framework Programme for Research and Innovation. It kicked off in Madrid on 19 January 2016, with the ambition to provide new services for integrated care, employing user-friendly ICT tools, ultimately increasing quality of life and autonomy at home for patients with Parkinson’s, Alzheimer’s and other dementias, as well as for their caregivers. This initiative brings together nine partners representing academia, industry and users’ groups, all committed to improving patients’ lives and advancing Europe’s leadership role in personalised services for integrated care. The partners of this well-balanced and multidisciplinary consortium are: Artica Telemedicina (Spain), which leads the project, the Polytechnic University of Madrid (Spain), the Madrid Parkinson Association (Spain), Netis Informatics Ltd. (Hungary), E-seniors (France), the Centre for Research and Technology Hellas (Greece), Maastricht University (Netherlands), the European Hospital and Healthcare Federation (Belgium) and the University of Pécs (Hungary). The motivation behind ICT4Life comes from the need to find solutions that develop the concepts of self-care, active patients and integrated care. To reach this goal, ICT4Life will conduct breakthrough research and radical innovation and will implement the ICT4Life Platform, which will deliver services aimed at increasing the quality of life and the autonomy of elders in their own homes, nursing homes, day care centres and hospitals. It will support health professionals and formal and informal caregivers in the provision of integrated care to people affected by cognitive impairments at an early stage. The Platform will provide proactive and patient-centred care using robust and secure communication channels and dedicated digital interfaces, and it will be tested in real operating environments through extensive pilots.
The value of interoperability in healthcare

Healthcare services are typically organised on a national level, yet there are cases where patients require care while abroad. Cross-border eHealth interoperability will help ensure clinicians have the data they need to manage such patients safely, yet it is also important to establish a rigorous business case if these services are to be sustained, as Professor Dipak Kalra explains.

The individual Member States of the EU run their own national health services in their own ways, yet the European Commission (EC) does play a role in supporting the provision of cross-border care. The EC is presently establishing a European technical infrastructure and services to enable the secure communication of patient medical summaries and electronic prescriptions between European countries in authorised healthcare situations. However, while there are often cases where people require healthcare outside their own country, the numbers are dwarfed by those of national citizens using their own domestic healthcare services. “It is not sustainable to set up specific business models or develop special systems and services that only support this small patient number,” points out Professor Dipak Kalra, the Principal Investigator of the VALUeHEALTH project. An initiative funded under the Horizon 2020 programme, VALUeHEALTH was set up with a clear remit. “We have investigated and now produced a business model and business plan for the sustainability of cross-border eHealth services,” says Professor Kalra. “That is, cross-border exchange of information between Member States, in relation to supporting the health of citizens who either cross borders and then need healthcare, or are deliberately referred across a national border to receive healthcare.”
eHealth services

This means exchanging patient-related data and supporting healthcare professionals to maintain some degree of continuity of care, in potentially multiple locations, which is a challenging task given Europe’s cultural and linguistic diversity. The initial cross-border services mainly support emergency care situations, which impact on relatively small patient numbers. It can be difficult for Member
States to justify their contributions to the costs involved in cross-border eHealth services in comparison to their ICT spending on within-border services, which encouraged Professor Kalra and his colleagues to look for mutually beneficial scenarios. “The starting philosophy of the project was to look for win-win scenarios in which an investment by countries in cross-border services would actually also support them with their own within-border healthcare communications,” he outlines. The project partners worked with a wide range of European experts to identify situations in health and social care in which interoperability is most needed. “We developed a portfolio of business use cases, scenarios of health information exchange, which would be plausible and useful both within and across borders,” explains Professor Kalra. A set of prioritisation criteria was developed to assess these use cases, and eventually two were prioritised, one of which was safe prescribing. When a medical professional is assessing a patient, it is clearly important to have the relevant background information about their medical history before issuing a new prescription. “The interoperability case to share information is to allow somebody
who is treating a patient to safely issue a new prescription – to ensure they have enough information about the patient that could inform a safe prescribing decision,” says Professor Kalra. The second use case was to go beyond the content of the current European emergency care summary, to make it efficient and useful in supporting continuity of care for patients with common, long-term conditions, such as diabetes. “The aim is to prioritise the information that would support somebody caring for a patient with a long-term condition such as diabetes, to provide reasonably sound continuity of care for a patient,” explains Professor Kalra. “So in both situations we are dealing with an unfamiliar patient-clinician interaction.” The clinician may have no local records on the patient’s medical history, which will affect the quality of care they can provide. Professor Kalra describes a hypothetical example of a patient with diabetes who has experienced a hypoglycaemic attack while abroad. “Their blood sugar has dropped suddenly and unexpectedly – they’ve felt very dizzy and fallen on the floor in a shopping centre in a tourist town, and then been taken to hospital by ambulance,” he outlines. The clinician can restore the blood sugar back to normal levels fairly
quickly, but the bigger question is whether this is just a one-off event, or indicative of a bigger problem that needs more care and attention in the short-term, before the patient travels back home. “Do we need to admit this person to hospital and observe their blood sugar over a 24 hour period?” asks Professor Kalra. “This would lead to significant disruption for the patient and significant cost for the hospital, so there’s a big difference between discharging a patient and keeping them in hospital. What if the clinician had access to background medical information about the diabetes? What information would support them in making a more accurate decision?” This is the kind of issue the project is investigating. Effective, efficient exchange of information on patients with long term conditions could lead to improvements in care both within and across borders, so it is closely aligned with
the project’s wider agenda. “If we can implement a European-scale way of exchanging this information, then any country that doesn’t already have such a mechanism in place could re-use the same infrastructure,” points out Professor Kalra. Researchers defined these information flow scenarios, and from that went on to examine business models and value chains, which Professor Kalra says is important to the long-term prospects of cross-border eHealth services. “If this is to be successful and to be sustained, you need to assess the value across multiple stakeholders and the business benefits of adopting and sustaining this level of health information interoperability. This analysis has required the project to adopt robust business modelling methodologies. You need to look at all the stakeholders in the eco-system – some of those are going to have to invest financially to keep services going,” he stresses. “There are complementary investors in this area; for example, the ICT industry can play an important role.” Many interfaces are required to enable interoperability, and it will also be important to present the information clearly to clinicians. The data itself will also hold importance as a source of records which can be used to learn more about a condition, for example diabetes. “Which treatments are proving optimal? In which situations does health deteriorate unexpectedly, requiring urgent intervention? Could that be anticipated? Public health systems could find this kind of information useful in planning services,” says Professor Kalra. The project has constructed several stakeholder value chains, which show how value is realised at each interaction. “We’ve developed those value chains, a qualitative business modelling framework, and then on top of that we’ve undertaken some novel cost-effectiveness modelling. The aim is to look at the benefit to an example health system if it was able to better treat patients in urgent, unexpected scenarios,” continues Professor Kalra.

“We have investigated and now produced a business model for the sustainability of cross-border eHealth services. That is, cross-border exchange of information between Member States, to enable all European citizens to have safer healthcare all over Europe”
Economic benefits

The focus in the project is on the economic benefits of interoperability, which is of course the measure by which infrastructure projects tend to be judged. The financial return could relate to costs saved or care interventions avoided, yet this is not easy to quantify. “We need to convince Member States that if they choose to put money into establishing a set of electronic digital services that support cross-border information flows for patients with long-term conditions, then this will give them a national healthcare benefit too, and a financial return,” outlines Professor Kalra. Member States have already agreed to collectively share the cost of a pan-European infrastructure for sharing an emergency, basic medical summary, yet Professor Kalra believes that this is not enough on its own to benefit patients. “We’re going to make recommendations to the European Commission and to its Member States, through the eHealth network, to recommend that they further invest in enriching the summary, to include the data needed to improve treatment of several long-term conditions,” he says.
VALUeHEALTH

At a glance

Full Project Title
Establishing the value and business model for sustainable eHealth services in Europe (VALUeHEALTH)

Project Objectives
VALUeHEALTH is establishing how eHealth interoperability can create and deliver value for all stakeholders, for a sustainable market in scaling up cross-border services. We are developing an evidence-based business plan for interoperability, beginning with CEF support and later sustainable revenue streams for developing and operating self-funding priority pan-European eHealth Services beyond 2020.

Project Funding
Estimated Project Cost: €999.818
Requested EU Contribution: €849.755

Project Partners
• Please see http://www.valuehealth.eu for full details.

Contact Details
Project Coordinator, Professor Dipak Kalra
European Institute For Health Records
c/o University Hospital Gent - Building 5K3
Unit of Medical Informatics and Statistics
De Pintelaan 185 - 9000 Gent (Belgium)
T: +32 475 97 3819
E: firstname.lastname@example.org
W: http://www.eurorec.org/
Professor Dipak Kalra
Professor Dipak Kalra is President of the European Institute for Innovation through Health Data, while he also serves as President of the European Institute for Health Records (EuroRec), Professor of Health Informatics at University College London and Visiting Professor of Health Informatics at the University of Gent.
Establishing trust in the use of health data Analysis of Electronic Health Records (EHR) and efficient information sharing can lead to improved treatment of disease, yet it is also important to take account of privacy and security concerns in the re-use of data. We spoke to Professor Dipak Kalra about the European Institute for Innovation through Health Data’s work in supporting best practice in the trustworthy re-use of electronic health records There are vast quantities of health data available in Europe today, collected within the health systems of all Member States. These data have the potential to offer a rich set of resources for researchers to investigate diseases and improve treatments in a cost effective way. This can bring important benefits to patients, healthcare systems and healthcare funders. However, while analysis of high-quality, interoperable, health data can lead to new insights, medical innovations and improved treatments, it is also important that the data is used responsibly, a challenge which was part of the motivation behind establishing the European Institute for Innovation through Health Data (i~HD). “The institute was set up because many stakeholders recognised that one of the major concerns around the use of health data is trustworthiness, particularly with respect to privacy protection,” says Professor Dipak Kalra, the institute’s President. The institute was founded on a not-for-profit basis to develop and promote best practices in research when re-using electronic health records, both by industry and academia. It is a multi-stakeholder organisation including patients, healthcare providers, funders and decision makers, pharma and the ICT industry and academia. “One of the threads in which we have been most active has been in developing codes of practice. We have developed underlying principles of good practice for using health data,” outlines Professor Kalra. 
Experts at the institute are developing standard operating rules that can be adopted by technology providers, hospitals, and pharmaceutical companies, all of whom make extensive use of health data. It is also important to consider the technology that is used in the course of research, a topic which Professor Kalra and his colleagues are addressing. “We’ve also developed a method of assessing software products that support connecting researchers with electronic
health records, to check that those technology products do include adequate privacy protection measures. This is a technical and organisational assessment and we have started to award the Quality Seal to organisations and products that meet our standards,” Geert Thienpont explains. The institute has developed a Quality Seal for Research Platforms (QS4RP), which provides reassurance to stakeholders on security and privacy issues, while socio-technical research into related issues is continuing. The institute is also involved in several other enabling initiatives. “We are beginning to work on educational materials that support stakeholders in understanding the value of health data for research and how it can be adequately protected,” says Professor Kalra. “In another project, we are looking at how custodians of research data can make this available to other research organisations (data sharing): how this can be done in a trustworthy way.”
The institute was set up because we identified the need to create a multi-stakeholder platform to maximise the value of health data for the benefit of patients
Health records The wider goal in this work is to connect the different stakeholders involved in the analysis and use of electronic health records, in order to enable good re-use of data. One major scenario of interest is clinical trials, where a pharmaceutical company has a medicinal product that is ready for large-scale clinical trial testing, and needs to establish that trial. “They need to connect to a number of healthcare sites, some of which may become clinical research sites for that study. Securely analysing large volumes of anonymised hospital electronic health records can help the clinical trial team to identify where there are patients with the specific condition for the study, and therefore where they could conduct the trial,” outlines Professor Kalra. The way in which pharmaceutical companies currently identify relevant hospitals, and in which the hospitals then identify patients who could participate in the study, is relatively ad-hoc and error-prone, states Professor Kalra. “Many patients who could have been invited to join a study are missed, and as a consequence the clinical trials take longer than expected, as they take a long time to find the right patients to recruit,” he explains. “That increases the cost of the trial and leads to delays in completing the clinical trial, which then translates into a delay in getting the drug into health systems. This adds to the cost of the drug and delays the benefits of it. We also know that there are many patients who would like to be involved in a clinical trial, but do not manage to connect to a trial relevant to their condition.” By using electronic health records more efficiently, pharmaceutical companies can potentially locate the hospitals with ideal participants for a clinical trial more quickly. The same profiling tools that help the company to find the right hospitals can then be used internally within a hospital, to help the hospital find the right patients. “A technical platform supporting the clinical research process could, for example, tag all the patients who have a matching health profile, and the hospital can then contact those patients. This recruitment process can therefore become pro-active, rather than just waiting for suitable patients to routinely attend an outpatient clinic,” explains Professor Kalra. Analysis of health records can also help researchers identify what
The European Institute For Innovation Through Health Data
At a glance Full Institute Name The European Institute for Innovation through Health Data (i~HD) Vision Enriching knowledge and enhancing care through health data. Website www.i-hd.eu
course of treatment is best suited to an individual patient. “You can actually study many anonymised patient profiles, in a single hospital or across a region or on a larger scale, and look at which care pathways and treatment plans result in better outcomes,” says Professor Kalra. “You’re basically sub-profiling patients.” A greater level of precision in profiling is likely to lead to healthcare improvements in terms of identifying the right course of treatment and assessing its likely impact. Many hospitals are improving their ICT infrastructure, and as more data is gathered on more patients, new opportunities arise to analyse it. “We’re seeing that there are opportunities to use data in innovative ways, as we’re collecting more data electronically and it’s being held in better quality systems,” says Professor Kalra. It is also necessary to gather information on healthy patients in order to identify how treatment could be improved. “You need control groups as well as disease groups, so you have to look at a large number of patients in order to identify specific findings that may benefit a specific subgroup of patients,” continues Professor Kalra. “We need society to be comfortable with this scaling up, with researchers using this routinely collected data. It is important to reassure patients, privacy protection
Realising the Value from Health Data - Improving Care and Research
regulators and the wider public, that this clinical research eco-system can be handled in a trustworthy way. We are working, in collaboration with multiple stakeholder groups, to define these trustworthy practices, promote their adoption, and then to be able to offer some assurance to society. Our contacts with patient organisations, and research findings, all suggest that the public are broadly happy for their health data to be used for health related research, provided that appropriate safeguards are applied.” The institute is working closely with hospitals to address these areas of concern and to establish good practice. A workshop was held in Brussels in February, attended by representatives from 50 hospitals, all keen to make better use of their electronic health records. “These hospitals want to work together towards responsibly handling privacy protection, through our institute. They also want to work together to improve the quality of their data, so that they have better data, from which they can then work smarter,” outlines Professor Kalra. “We wish to work with hospitals and other custodians of health data to look at the best ways in which they can place trust in the research communities that want to use their data to investigate new research questions,” says Professor Kalra.
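As an illustration of the kind of patient-profiling step Professor Kalra describes, the sketch below tags anonymised records that match a trial's eligibility profile. All field names, criteria and data are invented for illustration; they are not drawn from i~HD or any real platform.

```python
# Hypothetical sketch: tagging anonymised patient profiles that match a
# clinical-trial eligibility profile. Field names and criteria are
# illustrative assumptions, not taken from any real system.

def matches_profile(patient, criteria):
    """Return True if an anonymised patient record meets every criterion."""
    return all(rule(patient) for rule in criteria.values())

# Illustrative eligibility profile for a trial.
criteria = {
    "age":       lambda p: 50 <= p["age"] <= 75,
    "diagnosis": lambda p: "type2_diabetes" in p["conditions"],
    "exclusion": lambda p: "renal_failure" not in p["conditions"],
}

patients = [
    {"id": "A1", "age": 62, "conditions": {"type2_diabetes"}},
    {"id": "A2", "age": 44, "conditions": {"type2_diabetes"}},
    {"id": "A3", "age": 70, "conditions": {"type2_diabetes", "renal_failure"}},
]

# Tag matching patients so the hospital can contact them pro-actively.
tagged = [p["id"] for p in patients if matches_profile(p, criteria)]
print(tagged)  # ['A1']
```

In practice such matching would run inside the hospital, over pseudonymised records and behind the privacy safeguards the institute promotes; this toy version only shows the shape of the pro-active recruitment step.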
Members http://www.i-hd.eu/members/ Executive board • Dipak Kalra, President (i-HD) • Mats Sundgren (AstraZeneca) • Veli Stroetmann (Empirica) • Irene Schlünder (TMF) • Sebastian Semler (TMF) • Geert Thienpont (RAMIT) • Pascal Coorevits (EuroRec) • Bart Vannieuwenhuyse (Janssen Pharmaceuticals) Contact Details Professor Dipak Kalra c/o University Hospital Gent - Building 5K3 Unit of Medical Informatics and Statistics De Pintelaan 185 - 9000 Gent (Belgium) E: Dipak.email@example.com W: www.i-hd.eu
Professor Dipak Kalra
Professor Dipak Kalra is President of the European Institute for Innovation through Health Data.
September 21-22, 2017 MADRID, SPAIN
Geert Thienpont is a founding member of i~HD and responsible for the QS4RP (Quality Seal for Research Platforms)
The European Institute For Innovation Through Health Data
Annual Conference 2017
Building the foundations of tomorrow’s healthcare The growing number of people suffering from more than one chronic disease is set to put a heavy burden on European health- and social care systems over the coming years, leading to an urgent need for new integrated care models. Professor Maureen Rutten-van Mölken tells us about the Horizon 2020 EU-funded SELFIE project’s work in generating evidence on the effectiveness of Integrated Chronic Care models The question of
how to provide care to people suffering from chronic diseases is a prominent issue in many European countries as healthcare challenges continue to evolve, putting established models under strain. Growing numbers of people have multiple morbidities, leading to an urgent need for new Integrated Chronic Care (ICC) models that start with a holistic understanding of the person with multi-morbidity in his own environment, followed by person-centered, pro-active, and tailored care provision. Especially relevant in the case of multi-morbidity is that continuity is ensured, which includes smooth and monitored transitions between professionals and organisations and attention to potential treatment interactions. This is an issue central to the SELFIE project. “We want national authorities to learn from the promising practices being used in other countries,” says Professor Maureen Rutten-van Mölken, the project’s Principal Investigator. This starts with agreeing on a conceptual framework that can be used to systematically describe, develop, implement and evaluate promising practices. “We have developed such a framework, which can be found on the SELFIE website (www.selfie2020.eu) and is published in a special issue of Health Policy on multi-morbidity (https://doi.org/10.1016/j.healthpol.2017.06.002),” outlines Professor Rutten-van Mölken. “We have used the framework to write comprehensive descriptions of 17 promising integrated care programmes in the 8 countries that participate in SELFIE. These so-called ‘thick’ descriptions try to go beyond a description of the facts and explain what lies beneath the surface: what makes things work or not. The thick description reports can be found on the SELFIE website. We have grouped the 17 promising programmes into 4 categories: 1) population health management programmes (n=6), 2) frail elderly programmes (n=5), 3) programmes for problems in multiple life domains, like health, housing, and financial
problems (n=3), and 4) programmes for palliative and oncology patients (n=3). There is a lot of variety among the 17 integrated care programmes, yet there are also some marked similarities, like the focus on collaboration between the health and social care sector. In many programmes we also see the emergence of new specialist roles, like nurses who take on the role of case managers or counsellors for vulnerable people. They are the central contact point
for these people, creating individualised care plans, and helping them to navigate through the health- and social care systems. One major challenge is integrating these new roles with the current system, and ensuring that they collaborate effectively with existing providers. “In the Netherlands for example, there is a growing role for elderly care physicians in primary care. This is a trained physician, who previously mainly worked in a nursing home – they specialise in providing care to frail, elderly people,” says Professor Rutten-van Mölken. “They commonly participate in the multi-disciplinary team meetings that are organised across many of the programmes, in which the involved professionals discuss the particular cases,” continues Professor Rutten-van Mölken. “It is interesting to see that in one of the Dutch frail elderly programmes, the older person, plus his informal caregiver, are both participating in the multidisciplinary team meetings at the GP practice, to achieve a process of shared decision-making between formal providers, informal caregivers, and the frail elderly,” she outlines. However, the role of the informal caregiver varies widely across the programmes. “In Eastern Europe, it’s less common to involve the informal caregiver in a structured way in the decision-making about the care plan.”
With MCDA, we move beyond the quality-adjusted life year as an outcome measure, and measure output in terms of the Triple Aim
At a glance
Integrated care Evaluating these programmes is not easy, as integrated care often involves multifaceted interventions that target patients, providers, and organisations. The programmes are continuously adapted and improved. Their effectiveness is impacted by the behaviour of those receiving and providing the interventions and by many contextual factors. Moreover, the programmes intend to improve a variety of different outcomes. Standard evaluation methods are not sufficient for evaluating these complex interventions, so Professor Rutten-van Mölken and her colleagues aim to broaden and improve them. “We work with multi-criteria decision analysis (MCDA). With MCDA, we try to move beyond the quality-adjusted life year as an outcome measure, and to measure output in terms of the Triple Aim. That is, improving patients’ health and wellbeing, improving patients’ experience with care, and reducing the costs or the cost-increase,” she explains. A core set of outcomes has been agreed on, relating to areas like physical functioning, psychological wellbeing, social relationships and participation, resilience, and enjoyment of life (covering aim 1), person-centeredness and continuity of care (aim 2) and total health and social care costs (aim 3), which will be used to evaluate these 17 programmes. In addition, other outcomes have been identified for each of the four categories of programmes. “For example, with programmes for frail, elderly people, we added outcomes like autonomy, the burden of medication they experienced, the burden of informal caregiving, the proportion of long-term institutional admissions, and fall incidents,” outlines Professor Rutten-van Mölken. This broad set of outcomes reflects the complexity of providing care for these people. “It’s not always about improving the patient’s health. 
Sometimes, maintaining independence, having meaningful social interactions and ensuring that the patient has a good quality of life are more important, as the diseases themselves cannot be cured anymore,” says Professor Rutten-van Mölken. The importance attached to these outcomes varies across different stakeholders and countries. The SELFIE researchers are measuring the importance
attached to each of the outcomes, taking a range of views into account. “We obtain these weights from what we call the five P’s, the five groups of stakeholders – Patients with multi-morbidity, Partners (informal caregivers), Payers, Providers and Policy-makers,” outlines Professor Rutten-van Mölken. “We then combine them with the effectiveness of the programmes on the above mentioned outcomes.” This will provide the basis to evaluate the effectiveness of different programmes, information which will be highly important to decision-makers, for example health insurers and municipal authorities. “By using MCDA, we can give stakeholders very informed, transparent, and well-founded information on the effectiveness of the programmes, taking into account the importance of the various outcomes for the different stakeholders,” says Professor Rutten-van Mölken. For payers and public authorities, reducing costs or the cost-increase is of course a major concern, particularly given forecasts that healthcare spending is set to rise to 20 percent of GDP in future. Using primary care more effectively, and limiting secondary care only for those who really need it, could help reduce costs, yet this depends on effective collaboration. “Good examples are the population health management programmes, for example in Catalonia, where they strongly invest in prevention, patient-engagement, better coordination between specialised and primary care (vertical integration), as well as community-based coordination among all actors involved in both health and social services (horizontal integration).” “Among the unique features of the Catalan system is the ability to monitor a wide range of outcomes using the Catalan Health Surveillance system that includes the entire population of that region,” says Professor Rutten-van Mölken. “Currently the evaluation studies are ongoing. 
Simultaneously, we are conducting the weight-elicitation studies to assess the relative importance of the different outcomes. In the last phase of the project we will bring all this together in the MCDA and try to draw conclusions about the effectiveness of the programmes,” outlines Professor Rutten-van Mölken. This will be followed by developing implementation strategies.
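The MCDA step Professor Rutten-van Mölken describes can be sketched as a weighted sum: each programme's performance on the agreed outcomes is multiplied by importance weights elicited from a stakeholder group. The scores and weights below are invented for illustration; they are not SELFIE data.

```python
# Minimal MCDA (weighted-sum) sketch. All numbers are invented toy values,
# not SELFIE results; outcome names loosely follow the Triple Aim.

outcomes = ["wellbeing", "experience_of_care", "cost_reduction"]

# Hypothetical performance scores (0-1) of two integrated care programmes.
scores = {
    "programme_A": {"wellbeing": 0.70, "experience_of_care": 0.60, "cost_reduction": 0.40},
    "programme_B": {"wellbeing": 0.55, "experience_of_care": 0.80, "cost_reduction": 0.50},
}

# Hypothetical importance weights elicited from one stakeholder group
# (e.g. patients); the weights sum to 1.
weights = {"wellbeing": 0.5, "experience_of_care": 0.3, "cost_reduction": 0.2}

def mcda_value(programme):
    """Weighted-sum MCDA value of a programme for this stakeholder group."""
    return sum(weights[o] * scores[programme][o] for o in outcomes)

for name in scores:
    print(name, round(mcda_value(name), 3))
```

Because each of the five P's would supply its own weight vector, the same score table can yield a different ranking per stakeholder group, which is exactly the transparency the project is after.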
Full Project Title Sustainable intEgrated care modeLs for multi-morbidity: delivery, FInancing and performancE (SELFIE) Project Objectives The aim of the SELFIE project is to improve care for persons with multi-morbidity by proposing evidence-based, economically sustainable, integrated care models that stimulate cooperation across health and social care sectors. The project also aims to propose appropriate financing/payment schemes that support the implementation of these models. Project Partners The Netherlands (coordinator): Institute of Health Policy & Management, Erasmus University Rotterdam • Germany: Department of Health Care Management, Technical University of Berlin • Norway: Department of Economics, University of Bergen • Austria: Department of Economics and Finance, Institute for Advanced Studies • Croatia: Department for Development, Research and HTA, Agency for Quality and Accreditation in Health Care and Social Welfare • Hungary: Syreon Research Institute • Spain: Consorci Institut D’Investigacions Biomediques August Pi i Sunyer, Hospital Clinic de Barcelona • The United Kingdom: Manchester Centre for Health Economics, University of Manchester Contact Details Professor Maureen Rutten-van Mölken E: firstname.lastname@example.org W: www.selfie2020.eu The SELFIE project was launched on September 1st 2015, and will continue for four years. This project (SELFIE) has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 634288. The content of this article reflects only the SELFIE groups’ views and the European Commission is not liable for any use that may be made of the information contained herein.
Professor Rutten-van Mölken
Maureen Rutten-van Mölken is professor of Economic Evaluations of Innovative Health Care for Chronic Diseases at the Erasmus School of Health Policy & Management and the Institute for Medical Technology Assessment of the Erasmus University in Rotterdam. She was trained as a health scientist and has additional training in health economics and epidemiology. She obtained a PhD in health economics at Maastricht University. She has almost 30 years of experience working in Health Technology Assessment, with a special interest in economic evaluations of complex interventions like integrated care.
Ada 2020 brings personalised healthcare a step closer With medical information accessible at the touch of a button, researchers are looking to harness the power of technology to help diagnose disease accurately and efficiently. Dr Martin Hirsch tells us about the work of the Ada2020 project in developing a decision support tool, research which could help reduce the costs of healthcare A GP may be presented with a wide variety of cases during the course of the working day, encompassing everything from the common cold through to more exotic diseases. As GPs tend to act as the first point of contact in healthcare systems, they are often responsible for the initial diagnosis, yet this can be difficult, particularly when a patient presents with quite unusual symptoms. “A doctor cannot reasonably be expected to keep 8,000 rare diseases in mind,” points out Dr Martin Hirsch. Based at Ada Health in Berlin, Dr Hirsch is the Principal Investigator of the Ada2020 project, an EU-funded initiative developing a visual reasoning tool called Ada for both medical professionals and the general public. “We integrated data about rare diseases into the tool and we enabled doctors to take rare diseases into consideration in diagnosis,” he says. The majority of GPs only have a few minutes with each patient however, so in many cases they focus more on alleviating the symptoms than precisely diagnosing the condition, while specialists are also under significant time pressures. It’s often the patients themselves and university hospitals who take the time to accurately diagnose a disease, so Dr Hirsch and his colleagues in the project differentiated the software accordingly. “We developed a patient version on the one hand, and an expert version on the other,” he explains. The expert version can cooperate with the patient version, an approach which is designed to help support diagnostic decision-making and train the system to grow more intelligent. 
“We aim to produce a decision support system that helps patients better understand their health. Then they can send over the assessment to a doctor, who has more options in their part of the tool, and they can closely work together,” says Dr Hirsch.
Probabilistic Reasoning At the core of the software is a probabilistic reasoning engine that works in concert with a medical knowledge base that includes detailed information about the relationship between specific symptoms and disease. The data required for the visual reasoning tool are the symptoms, the diseases and the probabilistic relationship between these two. “If you have this symptom, how probable is this disease? And if you have
this disease, how probable is this symptom?” outlines Dr Hirsch. While Ada’s approach is based on the best available methodologies for modelling this relationship, Dr Hirsch says these still do have some shortcomings. “The basic assumption with most current approaches is that the observables are independent of each other. However, in diseases, the observables – the symptoms – are not always in fact independent of each other,” he explains. Researchers at Ada in Berlin have addressed this by adding another layer to the tool to augment this approach, looking at the pathological state in the body which lies behind the symptoms of a disease. This allows researchers to identify which symptoms are not independent of each other, while Dr Hirsch says it also brings other benefits. “We will show that this improved approach has a much higher level of diagnostic accuracy than previous methods,” he explains. With more information about the underlying causes of symptoms, the tool can also take the time course of a disease and the masking effects of chronic medication into consideration, and can be used to identify the most effective course of treatment. “We will be able to make assessments and give advice regarding treatment in a very personalised way,” continues Dr Hirsch. This approach could help improve efficiency in the healthcare system. It is
We aim to produce a decision support system that gives patients access to earlier, more relevant health information. Then they can send over the assessment to a doctor, who has more options in their part of the tool, and they can closely work together to focus on outcomes and prevention
At a glance Full Project Title Ada2020 Visual Reasoning Support for Healthcare Professionals
estimated that around 45 percent of GP visits could be handled more efficiently via other means; Dr Hirsch and his colleagues are looking to use their tools to give GPs more precise information about specific cases in an even more efficient way. “We’re testing a new approach where we say: ‘ok, let’s try to make a pre-assessment with our software, with a chat-bot’. The chat-bot takes all the time necessary to interview the patient, drawing on the knowledge of cases in the system,” he explains. A GP might not have the time to devote this level of attention to each case, but with a reasoning tool, they can get the key information more efficiently and then decide on the next steps. “The results of the pre-assessment can be handed on to a specialist doctor, and they can look at it in their professional environment,” says Dr Hirsch. There has been a lot of interest in this tool from GPs, while Dr Hirsch is also keen to widen access to the software, and ultimately put Ada in the pockets of more individual users. With the costs of healthcare rising, in line with demographic pressures, Dr Hirsch believes that empowering patients is key to improving efficiency and adapting healthcare systems to modern demands. “We decided to empower patients and give them tools,” he
outlines. This is part of a wider trend; with vast amounts of medical data now available online, the patient-doctor relationship has fundamentally changed. “Patients in future will be empowered by artificial intelligence, by mobile sensors, by lab tests that are delivered directly to them. They will then ask doctors to discuss the case with them, and to make the correct prescriptions,” continues Dr Hirsch. The app so far has proved popular in app stores across the globe, with over 1 million health assessments generated, and in some cases it has been credited with prompting users to seek immediate medical attention. On the expert side, a pilot study has been established in the department for rare diseases at Hannover University Hospital, which is designed to give medical professionals access to more information. “Sometimes it takes doctors weeks, even months, to get to the right diagnosis,” explains Dr Hirsch. Information about comparable previous cases is invaluable in these circumstances. “Doctors want to be able to feed their experience into a system so that their colleagues with similar cases are informed,” says Dr Hirsch. “We’ve now added a ‘share cases’ functionality, so that experts can publish and share information about their cases using this tool.”
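The baseline reasoning Dr Hirsch describes, updating disease probabilities from observed symptoms under the assumption that symptoms are independent given the disease, can be sketched as a naive Bayesian update. All priors and likelihoods below are invented toy numbers, not Ada's knowledge base, and a real engine would also model the dependencies between symptoms that this simple version ignores.

```python
# Toy naive-Bayes sketch of symptom-to-disease reasoning. The numbers are
# invented for illustration and the independence assumption is exactly the
# simplification criticised in the article.

priors = {"flu": 0.05, "measles": 0.001}            # P(disease)
likelihood = {                                       # P(symptom | disease)
    "flu":     {"fever": 0.90, "rash": 0.05},
    "measles": {"fever": 0.85, "rash": 0.95},
}

def posterior(symptoms):
    """P(disease | symptoms), assuming symptoms independent given disease,
    normalised over the (toy) set of candidate diseases."""
    unnorm = {}
    for disease, prior in priors.items():
        p = prior
        for s in symptoms:
            p *= likelihood[disease][s]
        unnorm[disease] = p
    total = sum(unnorm.values())
    return {d: p / total for d, p in unnorm.items()}

print(posterior(["fever", "rash"]))
```

Ada's extra layer of pathological states sits between the disease and symptom levels, so correlated symptoms are no longer double-counted the way this flat model double-counts them.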
Project Objectives Ada has developed a unique visual reasoning tool for medical professionals that offers diagnosis decision support at the point of care. Its award-winning user interface is designed to structure the patient consultation in a more efficient way, providing physicians with earlier, more comprehensive health information and helping them assess more complex medical conditions. Developed over six years of fundamental research in reasoning, medical diagnosis and artificial intelligence, Ada captures the knowledge of medical experts in a sophisticated reasoning engine that continues to learn from new inputs and multiple closed feedback loops. The Ada2020 project fulfilled its original purpose and went beyond it, creating a new model of medical reasoning that will enable doctors and medical researchers to detect and understand individual conditions on a whole new level. Project Funding This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 674459. EU contribution: EUR 2 408 315,00. Contact Details Dr Martin Hirsch ADA HEALTH GMBH Adalbertstraße 20, 10997 Berlin, Germany W: https://ada.com/ada2020 https://itunes.apple.com/au/app/ada-personalhealth-companion/id1099986434?mt=8
Dr Martin Hirsch
Dr Martin Hirsch grew up in a family of scientists, and is a grandson of the celebrated Nobel Laureate Werner Heisenberg. He studied theoretical medicine before working as an independent researcher. His special interests include cognitive neuroscience, semantic knowledge representation and the use of technology to support human thinking and decision-making.
Bridging the research gap between China and the EU The research community has an important role to play in controlling animal diseases like avian influenza and classical swine fever. Daniel Beltran-Alcrudo tells us about the LinkTADs project’s work in coordinating research between the EU and China, work which will help lay the foundations for continued investigation into animal diseases across regions The nature of
modern farming and international trade heightens the risk of animal diseases spreading rapidly, sometimes jumping thousands of kilometres at once as illustrated by the rapid expansion of avian influenza. The European Union and China both hold deep scientific expertise on transboundary animal diseases (TADs) and zoonoses (those which can be transmitted to humans); now the LinkTADs project aims to strengthen research links, bringing together partners in the EU and China. “The idea behind LinkTADs is to get research groups in both regions working together on common animal health projects,” says Daniel Beltran-Alcrudo, the project’s Coordinator. The scope of the project was very broad, determined to a large degree by the interests of the partners. “The partners defined the topics - e.g. diseases of interest - and then workshops, exchanges and meetings were organised based on those interests,” explains Beltran-Alcrudo. “A wide range of topics were discussed at those workshops and meetings, and the projects that have come out of it are also very diverse, but always within the wider area of animal health.” A number of major research priorities have been identified, one of which is in epidemiology, the study of the patterns of disease transmission. While China is very advanced in areas like diagnostics and vaccine development, the one area in which there is a relative skills gap is in epidemiology. “We don’t always know
how a disease spreads, how it behaves. That means asking questions like: what percentage of animals are affected? What are the market chains? Are animals moving between places? In what places do animals congregate, and where do you need to test them?” explains Beltran-Alcrudo. A second key priority was to coordinate the work that is being done on developing diagnostics and vaccines with the work of epidemiologists, so that diagnostics and vaccines can be applied more effectively. “We aimed to get these two groups working together, which previously hasn’t always been the case. That was another big focus of the project,” outlines Beltran-Alcrudo.
Many different projects have been established to investigate issues around animal diseases, in terms of improving surveillance, prevention and control. However, with this high volume of research, it becomes increasingly difficult for scientists to keep track of existing knowledge and identify potential collaborators for new studies. “It would be difficult for a researcher in Sweden to know who in China is working on a specific topic for example. LinkTADs bridges that gap,” says Beltran-Alcrudo. The project enables scientists to come together and discuss topics of interest, identify technological gaps and potentially establish new collaborations to address important questions in research. “There are eleven
Relationship network within LinkTADs and with external partners.
Summary of LinkTADs’ achievements.
partners in LinkTADs, and we aimed to make sure that those groups were working together and sharing information,” outlines Beltran-Alcrudo. “We also invited researchers from outside the project to the workshops and meetings, based on the interests of our partners.”
Disease prevention The wider goal in this is the prevention and control of transboundary animal diseases, some of which also represent a threat to human health. A number of transboundary animal diseases are currently present in the Chinese countryside and other areas; Beltran-Alcrudo says the project’s priority is investigating those diseases which cross borders. “The focus of LinkTADs is on transboundary animal diseases,” he stresses. “We have been involved in some activities on diseases that are really entrenched in a particular country, but we’re looking primarily at diseases that are spreading from country to country and region to region.” Researchers first investigated which animal diseases were of interest to both the EU and China, eventually identifying a list of ten, which includes porcine reproductive and respiratory syndrome (PRRS), African and classical swine fever,
and avian influenza. The project aims to help coordinate research into these different diseases, holding workshops and meetings to identify major priorities and help ensure resources are allocated efficiently. “For example, we invited researchers to a workshop where we looked into the gaps on the diagnostic side for a number of diseases, then we did the same on the epidemiology side. What would a joint project look like? What would be the major benefits?” continues Beltran-Alcrudo. Overall, eleven technical workshops were held, spanning epidemiology, laboratory and policy issues, while seven webinars were also broadcast over the course of the project, covering mostly specific diseases or policy questions. “We tried to give researchers from both sides as many opportunities as possible to get to know each other,” says Beltran-Alcrudo.

The focus of LinkTADs is on transboundary animal diseases. We’re looking primarily at diseases that are spreading from country to country and from region to region

The project’s workshop on Status Analysis and Identification of Potential Collaboration on Animal Diseases, held in April 2015 in the Chinese city of Qingdao, provides a good example of the importance of close collaboration. A primary aim of the workshop was to propose collaborations between partners on epidemiology and laboratory research in priority areas. “Here we managed to coordinate researchers on diagnostics and on epidemiology to come up together with a quick gap analysis of what’s needed in eight priority diseases,” outlines Beltran-Alcrudo. This work has already had a significant impact. “By the end of the two-day workshop, we had project concept notes for each disease, including priority gaps to be addressed, main activities, partners and their roles, institutional partners, length of the project, etc.,” continues Beltran-Alcrudo. “One of the projects, on Japanese encephalitis, has actually been funded already.” This close collaboration can help lay the roots of continued research into transboundary animal diseases. While the project’s three-year term has now concluded, Beltran-Alcrudo hopes there will be a lasting legacy in the form of continued research and close collaboration between European and Chinese scientists. “Now that the researchers know each other, they are planning projects and working on research together. So in the coming months more papers will be published by partners on both sides, and more projects will be funded,” he outlines. The project website will also continue to help researchers find funding for their work. “This will allow researchers to quickly search what funding mechanisms are available. Also, we have a network of professionals that will remain active, offering advice to those who want to go into animal health research,” says Beltran-Alcrudo. “They can contact the members of the network, who include representatives of the different institutions involved in LinkTADs. We have also established exchange programmes to encourage knowledge sharing, and are involved in training courses across a number of different areas.”

At a glance
LinkTADs Kick-off Meeting
Full Project Title
Linking Epidemiology and Laboratory Research on Transboundary Animal Diseases and Zoonoses in EU and China (LinkTADs)
Project Objectives
• To identify the priority areas where joint actions are needed.
• To link the research activities carried out by European and Chinese research programmes.
• To ensure wide-ranging networking of scientific communities and stakeholders.
• To provide a long-term vision and achieve coordinated planning of future common research.
• To contribute to the international policies of the EU.
• To improve the research capacity of organizations by supporting young researchers through exchange programmes and training.
• To share the results and methodologies within and outside the consortium.
Project Funding
1,000,000 EUR
Project Partners
Please see website for full details: http://www.linktads.com/about/partnership#sthash.1Q14VC39.dpbs
Contact Details
Daniel Beltran-Alcrudo
Animal Health Officer (veterinary epidemiologist)
FAO, Regional Office for Europe and Central Asia
Benczur utca 34, Budapest 1068
E: Daniel.BeltranAlcrudo@fao.org
W: http://linktads.com/#sthash.c4Viq5QI.dpuf
Policy The wider goal in this research is to help improve disease prevention and control in
both China and the European Union. The rapid growth of the Chinese economy has led to increased demand for animal protein, and today China is the world’s largest livestock producer and consumer, yet as has been noted, a lot of the production takes place in backyard settings. “This could be people who have very few animals, with very low animal health regulations and security. It’s all marketed in a very informal way, and the animals may even cross borders into neighbouring countries without any controls,” says Beltran-Alcrudo. This has significant implications in terms of disease exposure; the project’s data will help underpin policy on the prevention and control of trans-boundary animal diseases. “On the policy side, we aim to help get both regions working together, to help make sure that policies are equivalent on both sides,” continues Beltran-Alcrudo. “We’re also looking at how research is prioritised, and aim to make sure that interests on both sides are represented.”
Daniel Beltran-Alcrudo is a veterinary epidemiologist at the Food and Agriculture Organization (FAO). He has worked on several projects aiming to assist national veterinary services in the surveillance, prevention and control of various transboundary animal diseases, including African swine fever, avian influenza, lumpy skin disease and Rift Valley fever.
Getting the most out of waste water Phosphate is essential to life on earth, yet there are not limitless supplies, leading researchers to look at methods of recovering it. Researchers in the E-motion project are developing a method which combines modified polymers with an electrical field to recover key nutrients from waste water, as Dr Louis de Smet explains The majority of waste water treatment techniques are focused on improving the quality of the water, yet high-value nutrients like phosphate ions are often present in waste streams, and there is a great deal of interest in recovering them efficiently. Researchers in the E-motion project are developing a novel, highly specific method to separate key nutrients from waste water and recover them. “I want to combine modified polymers with an electrical field, to steer the separation,” outlines Dr Louis de Smet, the project’s Principal Investigator. As an organic chemist with a strong focus on surface chemistry, Dr de Smet has long experience of designing and characterising materials, in particular polymers, and now he aims to introduce specific properties to these polymers. “These properties will help these polymers to perform the specific role of separating certain ions from the aqueous environment,” he explains.
The properties of these polymer coatings should be at least two-fold – they should be selective towards the target ion, which in my case is phosphate, but at the same time they should also have the ability to release it again, enabling recovery

Organic materials This work centres around combining organic materials with an electrical field, with the goal of attracting negatively charged phosphate ions to a positively charged electrode. Water essentially flows through two parallel porous electrodes, one positive and the other negative, which will then attract those species that have the counter charge. “Phosphates are negatively charged ions – so these negatively charged ions will be attracted by a positively charged electrode,” says Dr de Smet. Researchers aim to use the tailor-modified polymers to generate a very thin coating on top of a porous electrode that ideally will only allow phosphate ions to pass through, acting as a kind of filter; Dr de Smet and his colleagues are studying fundamental aspects of this process. “Is the ion that we are targeting indeed the only one going through the filter?” he asks.
The porous nature of the electrodes means they have a very high surface area, and ions can then be electro-adsorbed. As explained, a polymer coating on top of such a porous electrode acts as a filter; Dr de Smet and his group are investigating different approaches to modifying the electrode surface with the polymer coating. “The properties of these polymer coatings should be at least two-fold – they should be selective towards the target ion, for example phosphate, but at the same time they should also have the ability to release it again, enabling recovery,” he outlines. Researchers now aim to find the right balance between the selectivity of binding, which typically goes hand in hand with a high binding strength, but at the same time ensuring that the binding process is reversible. “That’s one of the most important challenges from a chemistry point of view,” says Dr de Smet.
The goal is to design, synthesise and characterise ion-selective coatings, yet there are many technical challenges to deal with before selective ion transport can be achieved. At this stage, Dr de Smet and his colleagues are focusing on preparing macromolecules that can be used in very thin membrane films. “We are immobilising these tailored macromolecules, then coating them onto model substrates. The next step would be to do the same with a porous electrode, and also implement electrical fields, to see whether we can achieve the desired transport properties,” he outlines. This research could also lead to new insights into the recovery of other ions, including lithium ions, potentially opening up other applications in battery design, solar fuel devices and fuel cells. “For example, there are a lot of lithium ions in certain salt-lakes, but here selectivity is also an important issue,” says Dr de Smet. “Currently, the main focus is on phosphate though.”

Electro-motion for the sustainable recovery of high-value nutrients from waste water (E-motion)
ERC-CoG-2015 - ERC Consolidator Grant
Fe3O4 Nanoparticles Coated with a Guanidinium-functionalized Polyelectrolyte Extend the pH Range for Phosphate Binding, Paltrinieri, L.; Wang, M.; Sachdeva, S.; Besseling, N.A.M.; Sudhölter, E.J.R.; de Smet, L.C.P.M. J. Mater. Chem. A, 2017, in press. DOI: 10.1039/C7TA04054G (Front Cover Article)
Contact Details
Dr Louis de Smet
Laboratory of Organic Chemistry
Wageningen University
Stippeneng 4, 6708 WE Wageningen
T: +31-15-2782636
E: email@example.com
W: www.louisdesmet.nl
Dr Louis de Smet is an Associate Professor in Organic Analytical Chemistry at Wageningen University in the Netherlands. He is also a Senior Advisor to Wetsus, European centre of excellence for sustainable water technology in the Netherlands. His research work focuses on the molecular design of selective materials, mostly to recover high-value materials from waste water.
Algal lipids open a window into climate records Analysis of algal lipids in both lakes and on the sea surface can help scientists to reconstruct past temperature records, from which new insights can be drawn into the likely future evolution of the climate. We spoke to Dr Jaime L. Toney and Dr Antonio Garcia-Alix about their research into using algal lipids to extend human instrumental records further back in time A type of
lipid produced by algae, alkenones are highly responsive to water temperature, and have long been used as fossils to reconstruct past climate records. Based at the University of Glasgow in the UK, Dr Jaime L. Toney is the Principal Investigator of the ALKENoNE project, an ERC-backed initiative aiming to analyse these lipids and build more detailed climate records. “The main objective in the project is to record temperature and precipitation from a number of lakes in Canada in a quantitative way, so that we can extend human instrumental records further back in time,” she explains. The number of double bonds in alkenones varies according to the water temperature at the time the algae were growing; Dr Toney and her colleagues are analysing alkenones from over 100 lakes in Canada, which she says have some interesting features. “These lakes have salinities which range from being completely fresh to having four times the salinity of ocean water. Because they are spread across a large latitude range, they also have a large temperature gradient of about 9 degrees,” she outlines. This variability allows researchers to calibrate the biomarkers or lipids that are found in the lakes to all these different environmental conditions, which is a key part of the project’s agenda. The main factor that complicates temperature
BECS PhD student and Nuffield Placement summer intern splitting lake sediment cores. ©Jaime L. Toney, PI of ERC-funded ALKENoNE project. calibration with respect to lakes is that researchers need to know the time of year at which these compounds were produced. “For instance, if the algae are producing these compounds in the Spring, then you’re going to be recording temperatures from Spring, if they’re producing them in Summer, then you’re going to be recording temperatures from Summer, and so on,” explains Dr Toney. Previously Dr Toney
developed a temperature calibration by collecting water samples from different depths within a single lake; now she aims to investigate whether this calibration is applicable to this larger set of lakes in Canada. “Ultimately we’d like to understand the algae that produced these alkenones in sufficient depth to come up with a single calibration that can be used regardless of location,” she says.
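The double-bond signal described above is conventionally distilled into an alkenone unsaturation index before being converted to temperature. Below is a minimal sketch of that conversion: the function names are ours, and the linear coefficients are illustrative values borrowed from a widely used marine calibration, not the lake-specific calibration the project is developing.

```python
# Sketch of how an alkenone unsaturation index converts to temperature.
# U37K' = [C37:2] / ([C37:2] + [C37:3]): the fraction of di-unsaturated
# C37 alkenones relative to di- plus tri-unsaturated ones.
# Slope and intercept below are illustrative (from a widely used marine
# calibration); lake calibrations like those ALKENoNE is building differ.

def unsaturation_index(c37_2: float, c37_3: float) -> float:
    """U37K' from relative abundances of di- and tri-unsaturated C37 alkenones."""
    return c37_2 / (c37_2 + c37_3)

def temperature_from_index(u37: float, slope: float = 0.033,
                           intercept: float = 0.044) -> float:
    """Invert a linear calibration U37K' = slope * T + intercept to get T (deg C)."""
    return (u37 - intercept) / slope

# Warmer water -> relatively more of the di-unsaturated alkenone,
# so a higher index maps to a higher reconstructed temperature.
warm = temperature_from_index(unsaturation_index(70.0, 30.0))
cool = temperature_from_index(unsaturation_index(40.0, 60.0))
assert warm > cool
```

Recalibrating the slope and intercept for lakes of varying salinity, and knowing in which season the index was "recorded" by the algae, is precisely where the project's field and culture data come in.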
Information sharing Effective collaboration and information sharing is central to this wider goal. Dr Toney and her colleagues work with a number of different groups, including scientists at the University of Regina in Saskatchewan who have gathered environmental data from hundreds of lakes in the surrounding area. “We have a really robust database that we can refer back to when we’re trying to look at changes within the lakes and what they’re recording,” she outlines. Researchers are also collaborating with the NAOSIPUK project, an initiative working to reconstruct the North Atlantic Oscillation over the Holocene period. “We’re looking at the sampling of surface sediment in lakes, to get information about the
alkenone or diol composition of these sediments,” says Dr Antonio Garcia-Alix, the NAOSIPUK project’s Principal Investigator. “This composition is mainly related to temperature changes – if we have a thorough record of temperature in these lakes, we can look to establish a proper calibration for these proxies.”

The main objective in the project is to record temperature and precipitation from a number of lakes in Canada in a quantitative way, so that we can extend human instrumental records further back in time

The ALKENoNE project’s research includes lab-based work, with scientists also culturing algae in the laboratory to further refine the temperature calibration. This is technically demanding work, as it’s important to isolate the algae from bacteria and other organisms. “We’ve had the most success with what are called enrichment cultures, where we essentially take mud from the bottom of a lake, and we add it to a nutrient media. As it starts to grow, we remove individual organisms or algae that we see under the microscope, and then try and get individual cells to grow and multiply, so that we have just a pure strain of that algae,” says Dr Toney. There are fewer variables to consider with these laboratory-generated samples, allowing researchers to draw more direct inferences. “It ground-truths the relationships you see in the field that might be complicated by salinity, sunlight, or other factors,” explains Dr Toney. “We want to show that what we’re seeing in the lab is also what we’re seeing in the field.” This is a challenging goal, as the alkenones are produced by multiple different species of haptophyte algae. Researchers are applying genomics techniques to investigate how many
different species of algae are producing the alkenones, and then using the labgenerated cultures to make sure that the different species have the same calibration. “We can ground-truth that with the short
sediment cores that we have, that overlap the instrumental record. So we can reconstruct temperature back over the last hundred years or so, where we have human records of temperature change, and we can make sure that the proxy matches up to the instrumental record,” continues Dr Toney. “We’re also looking at other biomarkers, including diols from algae

High-resolution sediment core from the Northern Great Plains lakes. ©Jaime L. Toney, PI of ERC-funded ALKENoNE project.
Alkenone-producing algae from the North Great Plains. ©Bob Andersen, Emeritus Director, Provasoli-Guillard National Center for Marine Algae and Microbiota (NCMA).
At a glance
Full Project Title
Algal Lipids: the Key to Earth Now and aNcient Earth (ALKENoNE)
Project Objectives
The ALKENoNE project uses state-of-the-art, interdisciplinary techniques to model the relationship between biomarkers and the environment. Measuring and modelling biomarkers from lake sediment cores will generate past temperature and hydrologic records that will assess how past extreme temperature, drought, and flood events inform future drought risk in the Canadian Prairies.
Project Funding
Funding comes from an ERC Starting Grant (2015-2020, 637776, €1.2M) to Jaime L. Toney for the project ALKENoNE – Algal Lipids: the Key to Earth Now and aNcient Earth, and a PhD studentship through the NERC IAPETUS Doctoral Training Partnership.
Project Partners
• Professor Peter Leavitt (https://www.uregina.ca/science/biology/people/faculty-research/leavitt-peter/index.html)
• Professor Yoshihiro Shiraiwa (http://plmet.biol.tsukuba.ac.jp/index-en.html, https://www.researchgate.net/profile/Yoshihiro_Shiraiwa2)
• Dr. Ian Watson (http://www.gla.ac.uk/schools/engineering/staff/ianwatson/#/researchinterests)
Contact Details
Project Coordinator: Dr Jaime L. Toney, Senior Lecturer (Associate Professor) in Organic Geochemistry
University of Glasgow, Geographical & Earth Sciences, G12 8QQ
T: +44 (0)141 330 6864
E: firstname.lastname@example.org
W: http://environmentalbiomarkers.co.uk
W: http://www.gla.ac.uk/schools/ges/staff/jaimetoney/
Dr Jaime L Toney
Dr Jaime L. Toney leads the Biomarkers for Environmental and Climate Science (BECS) research group at the University of Glasgow, UK, which develops and applies the study of organic molecular fossils, called biomarkers, to understand past and present environmental and climate change.
Fluorescence genomic techniques allow for visualization of alkenone-producing algae (green) and autofluorescing diatom algae (red). ©Jaime L. Toney, PI of ERC-funded ALKENoNE project.
and tetraethers from bacteria - we know from previous studies that those are related to temperature and pH. If we have all of these different biomarkers and can look at them in the same sample, then we can then start questioning whether there’s something else that influences the biomarker. So it’s really a robust test.”
Bacterial proxies The group plans to spend some time over the next few months collecting sediment cores from a number of lakes in Canada. The initial goal is to reconstruct records of the last 2,000 years and look at how they overlap with instrumental records, aiming to validate the proxies, before looking further back. “We want to go back further
in time and look at changes that have occurred over longer time periods,” says Dr Toney. Most of these lakes formed when the landscape was covered by an ice-sheet, and the landscape has only been exposed for between 8,000 and 12,000 years, so there are a lot of data. “Based on other sites that we’ve cored in the region, we expect that we will be able to get up to 15 metres worth of sediment, covering the past 12,000 years or so,” stresses Dr Toney. “After our field season we’ll bring the sediment cores back to Glasgow. We’ll start out by radiocarbon-dating them to understand how old they are, then we’ll start collecting samples from the cores and reconstructing the past climate.” This includes both reconstructing the longer record, looking as far back as the last 12,000 years, and also over much shorter timescales, around the last 2,000 years. Detailed records of changes over this period could eventually enable researchers to more precisely investigate the impact of human activity on the climate. “By reconstructing the climate over the past 2,000 years in very high resolution, we can see how trends have changed during the Anthropocene period,” outlines Dr Toney. Understanding how the climate changed in the past could also help scientists understand how it is likely to evolve in future, which is another aspect of the project’s research. “We aim to understand how the climate is changing in this particular region, in terms of not only temperature variations, but also in assessing future drought risk,” says Dr Toney.

Ultimately we’d like to understand the algae that produced these alkenones in sufficient depth to come up with a single calibration that can be used regardless of location
The project’s findings could be used to inform the development of climate models designed to forecast future changes. While local data is an important element in climate modelling, Dr Toney says it’s important to also consider the wider picture, and the influence of other climate events. “For instance, El Niño is an event that occurs in the tropical Pacific Ocean, but it is strongly correlated with certain events that happen in Canada, and even here in Europe,” she points out. In the immediate term, Dr Toney and her colleagues are continuing their genomics and sedimentary analysis work, after which they will look to publicise their findings more widely. “We’ll publish a number of papers describing how these different biomarkers are related to the environmental parameters, and which ones could potentially work as proxies,” she says.
Photograph by Alice Mitchell
Probing the roots of family language The family is a universal part of the human experience, yet societies differ widely in whom they class as part of the family, variations which extend to the surrounding language, cognition and social norms. We spoke to Professor Fiona Jordan, Dr Alice Mitchell, Dr Catherine Sheard, Sam Passmore and Dr Péter Rácz about the VariKin project’s work in investigating the roots, boundaries, and explanations of this diversity We all have relatives, yet there is great diversity across different languages and cultures in whom we class as family and the terms we use to describe these relations. “In English we use ‘brother’ and ‘sister’ for our siblings, differentiating their gender and distinguishing them from, for example, cousins – but we don’t have a gendered term to differentiate between male and female cousins like in French. In other languages, you’d use the words for brother and sister for some but not all of your cousins. There are many languages in which the words for mother and aunt are the same. So there’s really intriguing variety,” says Professor Fiona Jordan. Based at the University of Bristol in the UK, Professor Jordan is the Principal Investigator of the VariKin project, an ERC-backed initiative investigating the
roots of this kinship diversity. “My aim, as an inter-disciplinary scientist, is to bring different disciplines of the human sciences—such as psychology, linguistics, biology and anthropology—together to try and create a multi-faceted explanation for why we see such variety within what is really a central part of the human experience.” Language diversity forms an important piece of the puzzle that Professor Jordan’s project is investigating. There are thought to be somewhere between 6,000 and 10,000 languages in the world, grouped into approximately 150 language families, yet much of this diversity is precarious, as Professor Jordan explains; “If as English speakers all we knew about was our own way of organising our kinship system, we would think that gender and
generation are important but not which side of the family you’re from. In fact it’s very different in other societies.” Interdisciplinarity is also key. “An understanding of kinship requires investigation into language, society, and biology—one approach won’t capture the whole of the story,” continues Professor Jordan.
Child acquisition and understanding of kinship Researchers aim to investigate these differences in three sub-projects within VariKin, using a range of methods to look at kinship diversity at individual, community, and global levels. One subproject is aimed at understanding how children learn the kinship system of their local community, and focuses on the Datooga-speaking people. Located in
Photograph by Alice Mitchell
a relatively remote part of Tanzania, Datooga people organise most aspects of their lives according to family relationships. “They have a broad, complex set of terms for family, encompassing quite distant relatives, like second cousins,” outlines Professor Jordan. Generally, the smaller-scale a society is, the more important family is as a way of structuring social, political and economic life. “In British society we place strong importance on nuclear family kinship terms, but that often falls off quite rapidly once you get beyond your grandparents, your aunts, uncles and cousins. Beyond that we tend not to use specific kin terms to refer to more distant relatives,” says Professor Jordan. “There will probably be a broader definition of the family in other communities, perhaps related to factors like the size of the community and the way in which people make their living.” Datooga communities are typically polygynous, with many different generations living in the same household.
“There may be several wives living in the same household along with their children, and there may also be some grandchildren living there,” explains Dr Alice Mitchell. A post-doctoral researcher, Dr Mitchell has previously carried out linguistic research with Datooga speakers, and will head to East Africa again to live with Datooga families, doing ethnographic and linguistic research. “I’m going to look at how Datooga-speaking children talk about and experience kinship, working with children from the age of around four to twelve or fourteen, recording their language use,” she outlines. “The second part of this research is more quantitative – I’ll be doing tasks with kids, to get at their knowledge of kinship. So I’ll be asking them to define certain kinship terms, to identify who people are in relation to themselves, and looking at how their ability to do that might change as they get older.” A second sub-project in VariKin is focused on looking at patterns of usage
in both written and spoken language. This research stems from earlier work on how often different parts of vocabulary are used, and whether the frequency of usage is related to how quickly words change over time. “It turns out that the more frequently we use a word, the slower it is to change,” says Professor Jordan. But until recently, it hasn’t been possible to compare the usage patterns of kinship terms across multiple languages. While we might assume close relatives are talked about more often, we don’t know if this holds across different languages. Now, there are vast amounts of data available on the internet, in all sorts of languages, and new computational tools; Dr Péter Rácz is analysing these huge data sets, aiming to detect patterns of usage. “For example, there may be five or six ways in which cousins and siblings could be named in a language. We’re looking at those usage patterns, and seeing whether there are broad correlations with other societal structures, beliefs, and values, such as
marriage rules, religiosity, and gender equality, to explain why and how the terminology is used,” he outlines. “There are now sophisticated methods available to detect patterns of use and see how they hold up across the world, across all societies and cultures.”
Kinship language evolution Languages have shared ancestry – for example, French, Italian and Spanish are all Romance languages that are descended from Latin. This presents a challenge and an opportunity in terms of investigating the evolutionary processes behind the transformation of kinship systems. “The first rule in statistics is that your data has to be independent,” says Dr Catherine Sheard, an evolutionary biologist also working on the VariKin project. She is joined by Sam Passmore, a PhD candidate; both are drawing on techniques from biology, using phylogenetic models to investigate the transformation of kinship systems.
In English we use ‘brother’ and ‘sister’ for our siblings, differentiating their gender and distinguishing them from, for example, cousins – but we don’t have a gendered term to differentiate between male and female cousins like in French. In other languages, you’d use the words for brother and sister for some but not all of your cousins. There are many languages in which the words for mother and aunt are the same. So there’s really intriguing variety

“In evolutionary biology we use phylogenetic ‘tree’ techniques to control for the non-independence of biological species from one another by historical relatedness,” explains Dr Sheard. “We’re borrowing these well-developed techniques from evolutionary biology to say ‘OK, languages also evolve in a tree-like pattern’. Can we take these techniques, working on trees, to model what might be going on and look at changes over time? We’re building up data sets of kinship terminologies, from which we’ll extract information about kinship systems in 500 different languages.” There are also more specific goals within each of the sub-projects in VariKin. One major area of interest to Professor Jordan is the question of how children learn about the family, which she says has previously been neglected. “There’s very little literature cross-culturally,” she outlines. The VariKin project will make an important contribution in these terms, which could then act as a basis for further research. “One of the things we want to come out of VariKin is a kind of toolkit for other fieldworkers to go and take out to different communities in different parts of the world, and then add incrementally to the very small generalisations that we hope to make,” says Professor Jordan. “This is what researchers have done on how children acquire concepts of colours, for example, so maybe we could do that for kinship. From this variety, we want to establish a set of hypotheses that perhaps other disciplines could test in the future.” This approach will allow researchers to test more ‘nuanced’ hypotheses about human behaviour than would otherwise have been possible. With phylogenetic models of cultural evolution, Professor Jordan and her colleagues can tease out information about the causes of language and social changes. “These methods give
us a time dimension, with an opportunity to correlate language change data with specific events or time periods. So with this information about societies, we can project back in time to what sorts of kinship systems existed in the past,” she outlines. By taking a cross-cultural approach, researchers hope to identify the main influences on the global distribution of kinship diversity. “Is it simply the case that if your language is part of a specific language family then you’re more likely to have a certain kind of kinship terminology system? Or does your way of life have something to do with it? Is it that you’re allowed to marry certain kinds of cousins, or inherit from your matrilineal ancestors?” asks Professor Jordan. “Is it society or something about shared cultural history that really underpins the diversity we see in kinship terminology?”
Full Project Title Cultural Evolution of Kinship Diversity: Variation in Language, Cognition, and Social Norms Regarding Family (VariKin) Project Objectives Why do human societies differ in whom they class as family? Why are cousins classed with siblings in some societies but not others? Accounting for the variable ways that cultures classify kin is an enduring puzzle. The VariKin project takes a cultural evolutionary approach to variety and unity and engages different fields – cultural phylogenetics, corpus linguistics, and cross-cultural child development. Project Funding ERC-StG-2014 - ERC Starting Grant EU contribution: EUR 1 233 672 Contact Details Project Coordinator, Professor Fiona Jordan Department of Anthropology and Archaeology, University of Bristol Bristol BS8 1UU United Kingdom T: +44 117 954 6078 E: email@example.com W: http://excd.org
Professor Fiona Jordan
Professor Fiona Jordan is a Professor in Anthropology at the University of Bristol in the UK. Her primary research interests are in cultural evolution and diversity, particularly in kinship and language, with expertise in phylogenetic methods and the Austronesian-speaking populations of the Pacific.
How does migration affect innovation? The reality of the contribution of migrants is widely overlooked. With all the talk in the media about how migrants impact on country resources, we investigate the concept that migration is a stimulus for economic prosperity, specifically in relation to innovation, within Europe. We take a closer look at the two European countries that have the highest migrant populations, to try and find answers. By Richard Forsyth
Some important facts about the positive aspects of immigration are being missed by the mainstream media. It’s time we took a step back to inspect how immigration plays a key role in economies. In the context of nurturing innovation, immigration is being identified as a driving force. It’s a complex thing to measure but there are ways to correlate data and find indicators. First, let’s get a feel for the numbers and look at the influx of migrants coming into Europe, which countries have the most
migrants and where migrants are moving from. According to a report released in March 2017 by Eurostat, in 2015, a total of 4.7 million people migrated to one of the EU-28 Member States. By 1 January 2016, there were 35.1 million people residing in an EU Member State born outside the EU, 20.7 million citizens of non-member countries living in the EU and 16 million people living in one of the EU Member States with citizenship of another EU Member State. Germany took in the highest number of immigrants, followed by the UK, then Italy, Spain and France, in that order.
The German acid-test As Germany has arguably the most open borders, let’s start by taking a closer look at what that means to innovation for the country. A recent study entitled Foreign Human Capital in Germany: Considering the Benefits of Workers from other Countries was published in May this year, undertaken by the company Movinga. Finn Hänsel, the MD of Movinga, explained: “There are reasons to believe that foreign labour can boost the economy rather than harm it.” Research was conducted into each of the 16 federal states. The number of firms receiving venture capital, the number of patent applications, the unemployment rate, and the percentage of each state’s population born in another country were all examined. The findings show that German states with a higher percentage of foreign-born citizens see higher levels of innovation. They also illustrate that attracting more people from other countries does not mean higher unemployment. In order to analyse the possible benefits of foreign human capital, the diagrams below compare the key indicators of innovation and economic prosperity (firms accepting venture capital, patent applications, unemployment) with the percentage of the population born in another country. All data used for this report was provided by The Organisation for Economic Co-operation and Development (OECD) and the German Federal Statistical Office (Destatis). With 81.4 million citizens, Germany is Europe’s largest
Figure 1 - Distribution of foreign-born workers in Germany
country by population. It is also the nation with the largest foreign-born population in Europe, with more than 7.8 million (9.6%) originating from another country. However, this diversity is not evenly spread across Germany’s 16 federal states: five states have more than 10% of citizens who are foreign-born, whereas five states have a foreign-born population of less than 3%. This disparity is illustrated in Figure 1. Figure 2 shows that the city states such as Berlin and Hamburg, which have a higher percentage of foreign-born citizens, are also home to a higher number of firms receiving venture capital. Similarly, Figure 3 shows that the two federal states with the most patent applications (Bayern and Baden-Württemberg) are also diverse demographically, with around 10% of their populations being foreign-born. Figures 1, 2 and 3 also convey that the federal states with fewer firms receiving venture capital and lower numbers of patent applications, like Sachsen-Anhalt and Mecklenburg-Vorpommern, have smaller foreign-born populations. These findings illustrate that people born in other countries are of great economic value, and that an attitude of openness to foreign-born citizens is important in order to support innovation, research, development and growth. The relative weakness of the federal states with smaller numbers of people born in other countries suggests that they could boost innovation and their general economic performance by attracting more talent born outside Germany.
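The comparison the study describes can be sketched as a simple correlation between a state’s foreign-born share and an innovation indicator. The numbers below are invented for illustration only – they are not the Movinga, OECD or Destatis figures:

```python
# Illustrative only: six fictional states, made-up values.
foreign_born_pct = [9.8, 10.2, 2.5, 3.1, 12.0, 2.8]
patents_per_100k = [95, 110, 20, 25, 80, 18]

# Pearson correlation coefficient, computed from first principles
n = len(foreign_born_pct)
mx = sum(foreign_born_pct) / n
my = sum(patents_per_100k) / n
cov = sum((x - mx) * (y - my) for x, y in zip(foreign_born_pct, patents_per_100k))
sx = sum((x - mx) ** 2 for x in foreign_born_pct) ** 0.5
sy = sum((y - my) ** 2 for y in patents_per_100k) ** 0.5
r = cov / (sx * sy)
```

A value of r close to 1 would indicate the kind of positive association between diversity and innovation indicators that the study reports; correlation alone, of course, does not establish causation.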
Figure 2 - Number of firms receiving venture capital
Will Brexit mean brain-drain? The UK is poised to leave the EU with the so-called Brexit, a decision which came from the 2016 referendum. The country has the second biggest immigrant population in the EU, after Germany. It makes up a large part of the labour force, 17 percent in total – that amounts to 5.4 million non-UK born workers out of 31.6 million working people. When looking at how leaving the EU might specifically, directly affect innovation (leaving out the issue of missing significant European funding) the clearest alarms have been in terms of less collaborative research with foreign scientists and fewer international university students basing themselves in the UK and working on scientific projects. It is expected that a segment of non-native UK scientists and research students may leave the UK due to Brexit – in what is termed a ‘brain-drain’, referring to the migration of talent away from the country. Whilst there are likely assurances for citizenship for those already in the UK, Brexit has the potential to push UK science outside of collaborations between EU countries and, with barriers in place to further recruitment and movement, it may become a less attractive base. Setting innovation aside for a moment and looking at the overall economy, if these migrants were not there, there would be a noticeable dip in economic activity. For an idea of the financial impact, consider new research conducted for Universities UK by Oxford Economics, which revealed from an analysis of 2014-15 that international students and their visitors had a knock-on impact of £25.8 billion in gross output in the UK. As well as tuition fees, foreign students spend their money off-campus on goods and services. Beyond their spending and their achievements in research, such skilled citizens who have migrated to the UK provide technical skills needed in business and, importantly, they launch and influence startups.

Innovators, inventors and entrepreneurs have been noticeable in the migrant populations in the UK. CReAM – the Centre for Research & Analysis of Migration – published a fact sheet called What do we know about Migration – Informing the debate. The document highlighted that ‘immigration improves innovation, trade and entrepreneurship. In most OECD countries, immigrants are more likely than natives to start new businesses. In the UK, immigrants are more likely to be self-employed.’ This was backed up by The 2015 Global Entrepreneurship Monitor (GEM) which analysed start-up activity in the UK and found that immigrants were three times more likely to be entrepreneurial than people born in Britain. It revealed that over half a million people from 155 countries have settled in the UK and launched businesses – some with more than one business. It’s clear in this case that, when it comes to entrepreneurship, migrants are statistically braver than natives at starting new businesses. For innovation to flourish you need the dual strengths of research and entrepreneurship, and both have proved strong where immigration has been accepted and encouraged.

Figure 3
Is freedom of movement that important? A study in the Harvard Business Review, titled Why Mass Migration Is Good For Long Term Economic Growth, by Vincenzo Bove and Leandro Elia, looked at countries around the world for the effects that came from migration. They found that the poorer the destination country, the more potential for economic growth… ‘Richer countries are closer to the technological frontier than poorer countries, so the adoption of new technologies should be faster in developing economies, and the labour force’s skills and knowledge should increase at a faster rate. In other words, the more developed the destination country is, the less economic impact we are likely to see from migration…’ However, they continue, ‘…Overall our evidence suggests that immigration-fuelled diversity is good for economic growth. The main recommendation that political leaders and organisational practitioners can take away from [our] findings is to increase openness to workers from as many origins as possible, to reap the large benefits of having an increased range of skills, ideas, and innovative solutions.’ For EU countries, freedom of movement is a fundamental underpinning strategy of growth. The idea of easing the flow of skilled migrants through Europe and employing them as professionals has long been perceived as a method of fostering innovation. Whether the EU grows stronger or fractures further hinges, to some degree, on public perceptions of how immigration is affecting their country. Whilst it is indeed a complex issue, reversing the flows of people and discouraging collaboration will likely have negative effects on innovation and growth. As a final footnote, consider that according to an OECD report published in 2014, migrants accounted for 70% of the increase in the European workforce since 2004 – addressing labour market imbalances.
This alone suggests there is a real challenge ahead: if multiple borders become prohibitive to foreigners and migration becomes sluggish, industry could face unbridgeable labour gaps from having to rely on native recruitment alone. Research, innovation, growth and simply having the manpower have long been the pillars of economic progress. Diminish your human assets and you will inevitably see a decline in all of these drivers.
Mathematicians have long used representation theory to study linear symmetries and algebraic structures; now researchers are developing new theoretical tools to approach the subject and gain new insights. We spoke to Dr Dmitry Gourevitch about his work in relative representation theory, which brings together elements of several branches of mathematics.
Background images describe the action of the group of 3x3 matrices with unit determinant over real and p-adic local fields on its quotient by its maximal compact subgroup, courtesy of Prof. A. Aizenbud and his students Y. Hendel and I. Glazer.
Branching out across maths disciplines Many objects in mathematics, physics, and other sciences possess natural symmetries, and over the course of history a number of theories and conceptual frameworks have been developed to study them. Based at the Weizmann Institute of Science in Israel, Dr Dmitry Gourevitch specialises in a field called representation theory, which concerns the study of linear symmetries. “There are lots of symmetrical figures, like circles, squares, cubes, while many molecules are quite symmetrical for example,” he says. A key topic in Dr Gourevitch’s research is the study of groups, a specific type of algebraic structure which is closely related to the idea of symmetry. “A group is a set with an invertible operation, such as multiplication or addition. Numbers (integral or real) are one example of a group, while another example is a group of permutations of a deck of cards – the set of all the possible ways to reorder it. There are all kinds of groups – like symmetries, they usually act on something,” outlines Dr Gourevitch. “Then there are also subgroups to consider.”
Representation theory Researchers are investigating this topic further in the RelRepDist project, an ERC-backed initiative which aims to develop new theoretical tools to describe and study these groups in greater depth. “It’s difficult to compute something that is non-linear, yet the study of linear models should be much easier, and indeed it is,” continues Dr Gourevitch. “Group representation theory is the study of linear symmetries. A representation of a group is a way to present a quotient of the group as a group of symmetries of a linear space.”
The study of representations of finite groups started around 120 years ago. All the simple finite groups were classified around 10 years ago, and in some sense so were all of their representations. The next step is the study of infinite compact Lie groups. For example, the group of all possible rotations of space is a compact Lie group, since it is closed, bounded and smooth. By now, this subject has also become quite classical. The majority of the groups Dr Gourevitch and his colleagues are interested in are infinite and non-compact, however, like the group of all invertible 3 by 3 matrices. They consider representations of such groups in symmetries of infinite-dimensional spaces. This field has been intensively studied over the years, and significant progress has been made. “On one hand, the irreducible representations have been classified to some extent by Langlands,” says Dr Gourevitch. These answers are not perfect, however, as the description is very complicated and implicit. “The existing classification describes some representations as small parts of huge spaces.”
Generalized functions, or distributions The main tool being used by Dr Gourevitch and his colleagues is the theory of distributions, or generalized functions. This theory originated in the 1920s with the work of the physicist Dirac, who used in his work the so-called ‘delta function’ (δ). Dirac described it as a function that has zero value everywhere except the point zero, while the value at zero is infinite, and due to this infinity the total integral becomes one. Mathematicians at first believed that such a function could not exist, since the value at a single point has no effect on the integral. However, physicists continued working with this function and obtained meaningful results. Finally, in the 1950s the French mathematician Laurent Schwartz developed the notion of a generalized function, showing that while Dirac’s delta makes no sense as a function, it is a well-defined generalized function, or distribution. In the 1960s the theory was enhanced by Gelfand, and since then by many others. Distributions became a classical tool, but the theory is far from complete. One of the goals of the project is to study distributions with a given group of symmetries, following Gelfand, Kazhdan, Harish-Chandra, Bernstein and others.
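In modern notation (a standard textbook formulation, not anything specific to RelRepDist), a distribution is a continuous linear functional on smooth, compactly supported test functions, and Dirac’s delta is simply the functional that evaluates a test function at zero:

```latex
\langle \delta, f \rangle = f(0),
\qquad \text{written formally as} \qquad
\int_{-\infty}^{\infty} \delta(x)\, f(x)\, \mathrm{d}x = f(0).
```

This makes Dirac’s heuristic rigorous: δ has no pointwise values, but its action on every test function is perfectly well defined.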
p-adic numbers The project’s research also touches on some other important mathematical concepts, including real and rational numbers. The group of real numbers includes all rational numbers. “Rational numbers are simple fractions. For example 5/7 is a rational number, as is -5/7, yet the square root of 2 (√2) is not a rational number, as it cannot be presented as a simple fraction. It’s still a real number however, and it appears in the real world, as the length of the hypotenuse of a right-angled triangle for example,” explains Dr Gourevitch. “Then π is another real number. A real number may be difficult to precisely quantify. For example π can be computed to thousands of decimal places; while 22/7 is a fairly accurate approximation for π, it’s still an approximation. A real number is defined by successive approximation, i.e. by a sequence of rational numbers that approximates it to any given level of precision” outlines Dr Gourevitch. “Thus, a real number is defined using rational numbers.” “Surprisingly, it is also possible to use rational numbers to define a totally different field of numbers - the so-called p-adic numbers.” says Dr Gourevitch. “To do that one uses other absolute values. The
notion of absolute value seems to be simple: it is the “size” of the number, without its sign. For example, the absolute value of 5 is 5, while the absolute value of -5 is also 5. However, if we just list the basic properties of the absolute value and ask whether there are other functions with these properties, the answer is apparently ‘yes’ – there is a p-adic absolute value for any prime number p. For example, the 2-adic absolute value is defined to be 2 on ½, to be ½ on 2, and to be 1 on all odd integers. A 2-adic number is a binary sequence that has infinitely many digits before the binary point and finitely many digits after it, as opposed to a real number that has finitely many digits before the decimal point and infinitely many digits after it. While these numbers may seem strange, they help to find integer solutions for algebraic equations. They are also useful in other parts of number theory – a wide field of mathematics that has attracted scholars since ancient times.” “Within the scope of the RelRepDist project, I would like to find an approach to the study of symmetric distributions that will work uniformly for algebraic spaces defined over real numbers and those defined over p-adic numbers,” says Dr. Gourevitch.
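The 2-adic examples above can be checked mechanically. A minimal sketch, written for this article rather than taken from the project, computes |x|_p by counting the powers of p dividing a rational number:

```python
from fractions import Fraction

def p_adic_abs(x, p):
    """p-adic absolute value |x|_p = p**(-v), where v is the exponent
    of the prime p in the factorisation of the rational number x."""
    x = Fraction(x)
    if x == 0:
        return Fraction(0)  # by convention |0|_p = 0
    v = 0
    num, den = x.numerator, x.denominator
    while num % p == 0:   # each factor of p in the numerator raises v
        num //= p
        v += 1
    while den % p == 0:   # each factor of p in the denominator lowers v
        den //= p
        v -= 1
    return Fraction(p) ** (-v)
```

On integers this agrees with the text: |1/2|₂ = 2, |2|₂ = ½, and every odd integer has 2-adic absolute value 1, however large it is.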
At a glance Full Project Title Relative representation theory and distributions on reductive groups over local fields (RelRepDist) Project Objectives Let F be a local field, e.g. the field of real numbers or the field of p-adic numbers. Let G be an algebraic reductive F-group, e.g. the group of invertible n-by-n matrices over F. Objectives: 1. Develop tools for the study of G-equivariant distributions on spaces with G-action, that will work uniformly for all F.
2. Study the action of G on spaces of functions on quotients of G by its unipotent subgroups. Project Funding The project is funded under the ERC-StG-2014 - ERC Starting Grant 637912. Dr Dmitry Gourevitch is the incumbent of the Dr. A. Edward Friedman Career Development Chair in Mathematics. Project Co-authors • Professor Avraham Aizenbud • Professor Siddhartha Sahi
Contact Details Project Coordinator, Dr Dmitry Gourevitch Faculty of Mathematics and Computer Science The Weizmann Institute of Science 234 Herzl St. PO Box 26 Rehovot 7610001 Israel T: +(972) 8-934-2171 E: firstname.lastname@example.org W: http://cordis.europa.eu/project/rcn/193540_en.html
Dr Dmitry Gourevitch
Dr Dmitry Gourevitch is Assistant Professor in the Department of Mathematics at the Weizmann Institute of Science in Israel. He previously held academic positions in both the US and Israel, before taking up his current position in 2011.
Relative representation theory, or non-commutative harmonic analysis Dr Gourevitch is investigating how representations of certain groups behave when restricted to sub-groups. This subject is called relative representation theory, or symmetry breaking. While representation theory was originally considered to be primarily related to algebra, as groups and linear spaces are algebraic objects, Dr Gourevitch says his research also touches on several other branches of mathematics. “When we study infinite-dimensional representations, functional analysis comes in. Then, when we study relative representation theory, geometry also comes in. So it’s at the crossroads of three major parts of mathematics – analysis, geometry and algebra,” he outlines. Another name for relative representation theory is non-commutative harmonic analysis. It is a generalization of the classical harmonic analysis that originated in the works of Fourier 200 years ago, and which has since found numerous applications in physics, engineering and computer science. For example, the Fourier transform is constantly used by our cell phones and computers. Thus, one may hope that applications arising from non-commutative harmonic analysis will be identified in due time.
RelRepDist project The RelRepDist project also holds wider relevance – to number theory, string theory and integral geometry. While the field of integral geometry has its roots in fundamental investigation, it has given rise to tomography techniques, which today are commonly used in medical imaging. Relative representation theory could inform future technical development, yet this is of course difficult to predict, and the project is
motivated more by intellectual curiosity than the possibility of applications. A number of collaborations have been established, and Dr Gourevitch plans to pursue further research in this area in future. “I was contacted by some physicists working in string theory. They had a precise mathematical question, and we are investigating this question together,” he says. This is complex work, and research in this area demands deep knowledge, so training is essential to the long-term development of the field. Dr Gourevitch works with four PhD students and two post-doctoral trainees in his group, while he is also organizing international activities to help lay the foundations for continued research. “Together with Profs. Aizenbud, Bernstein and Lapid, I’m holding a two-month activity in May and June with the support of the ERC and some Israeli funds. Over 50 researchers came from across the world to attend,” he says. “The ERC funding gives us a unique opportunity to hold several workshops in adjacent fields, in back-to-back time periods. Our workshops consist of seminars and mini-courses, thus allowing the leading scientists in the area to present not only their latest results, but also deeper insights, summarizing the whole development of the field in the 21st century.”

The Weizmann Institute of Science, Israel.
Laying the foundations to model physical problems The rapid development of numerical methods over the last ten years holds important implications for modelling systems in engineering and applied sciences. Professor Gianluigi Rozza tells us about the AROMA-CFD project’s work in developing reduced order modelling techniques, which could help overcome current limitations The idea of using reduced order modelling techniques in numerical simulations to reduce complexity and computational times is not entirely new, dating back around thirty years, but they are still not widely applied in engineering and applied sciences problems. In the past this was partly due to a lack of devices with sufficient computational resources, and partly because numerical analysts and computational scientists had not yet addressed the more difficult problems; moreover, Professor Gianluigi Rozza of the AROMA-CFD project says the application of model order reduction techniques to practical industrial problems is also still relatively limited. “The research community has historically been more devoted to numerical analysis research than to exporting the methodology and exploiting it in computational science and engineering,” he explains.
Reynolds number This is an issue the AROMA-CFD project aims to address by further developing reduced order modelling techniques, with a particular focus on computational fluid dynamics. A large part of this work relates to the application of model order reduction to a specific class of physical problems involving fluids, such as gases and liquids, as well as cardiovascular flows. “There is a strong limitation in this related to the fact that we are not able to simulate what we call a high Reynolds number, which is a non-dimensional quantity in fluid dynamics. It relates to the ratio between inertial forces and viscous forces,” continues Professor Rozza. “A high Reynolds number roughly means that there is a higher velocity, whereas a low
Reynolds number means that more of a viscous effect is involved.” A high Reynolds number would be used to simulate the fluid dynamics around an aircraft in flight for example, while a lower Reynolds number would be used for a problem like modelling blood flow in the human body. These are very different problems, with very different shapes and dimensions, but Professor Rozza says they can be unified within a single paradigm, improving efficiency. “At the moment, when someone
does a numerical simulation, they don’t have a parametric design. So every time that you change the shape or the configuration of your problem or your system, you have to re-do almost everything,” he explains. “We are now preparing a reference configuration, a reference shape that is properly parametrized.” This configuration can be changed for specific industrial problems, providing the foundation for model development. The same paradigm can be used for medical problems, including modelling blood flow in certain parts of the cardiovascular system. “We know about specific shapes in the cardiovascular system, like the aortic arch and the carotid bifurcation,” says Professor Rozza. Everybody’s cardiovascular system is different, yet with good parameterisation a reference configuration can be developed, again providing the basis for accurate models. “We can build all possible configurations, using the same paradigm and the same kind of parameterisation,” stresses Professor Rozza. This approach could potentially be applied in other contexts as well, with researchers testing its potential in automotive design and construction. In future, researchers plan to move towards a proof-of-concept and to demonstrate the versatility of this methodology. “We would like to develop a more unified framework,” says Professor Rozza. The next step will be to further develop a multi-physics approach, which Professor Rozza says will be central to applying this research more widely. “This research could be an integral part of the new industrial design paradigm. We are integrating computer-aided design with research and development, optimization and control, and exporting supercomputing to industry and to hospitals thanks to reduced order methods and modern devices, like smartphones and tablets,” he says.

Advanced Reduced Order Methods with Applications in Computational Fluid Dynamics (AROMA-CFD) Professor Gianluigi Rozza SISSA, Mathematics Area, mathLab International School for Advanced Studies Scuola Internazionale Superiore di Studi Avanzati Office A-435, Via Bonomea 265, 34136 Trieste, Italy T: + 39 040 3787 451 E: email@example.com W: http://mathlab.sissa.it/rbnics W: http://people.sissa.it/~grozza Professor Gianluigi Rozza is a Tenured Associate Professor in Numerical Analysis and Scientific Computing at the International School for Advanced Studies in Trieste, Italy. He is SISSA Director’s delegate for technology transfer and industrial cooperation, and is the author of 100 scientific publications.
Illuminating the path to market for innovators in photonics The RespiceSME project is proposing new approaches to boost the effectiveness of European photonics SMEs. Project Manager, Samantha Michaux, talked to EU Research about the aims, methods and overcoming the challenges to stimulate photonics SMEs in Europe Photonics applications are
vast and all-pervasive, but few people outside photonics research and innovation fully comprehend its reach and potential. “The word photonics is not widely understood in industry, yet it is a branch of science that has a huge impact on our lives and on our economy. For instance, the smart technology we rely on today owes a lot to photonics. We see the results of photonics in the automotive industry, in manufacturing, in applications for the energy sector, for environmental studies and more,” explained Samantha Michaux, Project Coordinator of RespiceSME. The fields of optics and photonics have experienced dramatic technical advances
over the past several decades and have cemented themselves as key enabling technologies across many different industries. Photonics is the science and technology responsible for generating, manipulating and detecting light waves and particles of light, known as photons. Far from being a mysterious fringe science, photonics technology is all around us every day – it’s in the TV remote control, the DVD player, fibre optic connections, the shop barcode scanners, smartphones and laptops, our in-car sensor systems and in lighting systems. For more specialist uses we see photonics in medical instruments, satellite imagery, security systems, manufacturing (laser cutting) and energy production, such as
PV solar cells, for example. The global market value of photonics technology in the world economy is 300 billion Euros, with a projected market value of more than 600 billion Euros by 2020. The potential, therefore, for SMEs in photonics should be huge, yet there is often a disconnect between creating the initial innovations and reaching the marketplace. When that gap between innovation and industry isn’t bridged, opportunity and potential can go to waste. This is where RespiceSME steps in, with a robust formula that enables productive business development strategies. “We provide cluster managers and representatives of organisations
supporting SMEs with methodologies and tools to analyse the business goals and processes of photonics SMEs effectively, and support them in defining a sustainable innovation strategy. Furthermore, we focus on the analysis of value chains to identify potential gaps from which new business opportunities can emerge.”
Targeted collaborations
“We have gone to great lengths to identify and approach the most promising SMEs in photonics across Europe and offer them this unique support in business development. The 10 partners in the project are mainly photonics innovation clusters, joining forces to increase the competitiveness of European photonics SMEs. Each partner has three businesses under their guidance, so 30 businesses in all stand to gain from this. For the SMEs this is valuable, as this kind of strategy consultancy would otherwise come at great cost to the company if provided by profit-oriented consultancies,” says Michaux. The process begins with an innovation audit to assess the business innovation processes. RespiceSME then stimulates the
innovation potential by developing a sustainable innovation strategy and defining recommendations which lead to an action plan for strengthening the innovation capacity of high-tech photonics SMEs. RespiceSME helps exploit photonics innovation capacity by analysing the value chains most relevant to high-tech photonics SMEs, and optimises the value of inter-sectoral applications of photonics by promoting better understanding and exploitation of interdisciplinary value chains and sector roadmaps. For this purpose, the project implements a Value Chain Analysis for the commercial sectors of Environment/Energy, Transport and Manufacturing. “We help with everything from forming their identity, finding the best ways to exploit their products and helping them to understand which sectors and industries could really benefit from what they are producing. We facilitate access between SMEs and research and technology organisations (RTOs) to provide them with R&D facilities and expertise. Additionally, we inform SMEs about the different financial instruments (public and private) available for innovation funding,” says Michaux.
At a glance
Full Project Title: RespiceSME in a Nutshell (RespiceSME)
Project Objectives: The RespiceSME project aims to reinforce the innovative capacity of Europe’s photonics Small and Medium Enterprises (SMEs), clusters and national platforms by stimulating targeted collaborations in and beyond photonics.
Project Funding: This project has received funding from the European Union’s Horizon 2020 Research and Innovation Programme under Grant Agreement No. 687961.
Consortium Members:
• Steinbeis 2i GmbH – Germany
• OpticsValley – France
• Foundation for Research and Technology Hellas – Greece
• OptecNet Deutschland e.V. – Germany
• PhotonicSweden – Sweden
• Photonics Austria – Austria
• SECPhO – Light Technologies Cluster – Spain
• National University of Ireland – Galway, Ireland
• Laser & Engineering Technologies Cluster – Lithuania
• Knowledge Transfer Network Limited – United Kingdom
Contact Details:
Project Manager, Samantha Michaux
Steinbeis 2i GmbH (S2i), Germany
T: +49 721 935 19-123
E: firstname.lastname@example.org
W: www.respice-sme.eu
Samantha Michaux has been working for Steinbeis 2i GmbH (S2i), based in Germany, since 2010 and has been Project Manager there for over four years. Her expertise is in project management and consultation in a wide range of sectors including ICT, Transport & Logistics, Energy, Automotive, EU Financing, Technology Transfer, Business Development, Internationalisation and Innovation Management. Prior to this she studied at the Johannes Gutenberg University Mainz. Samantha is qualified in Business Development and speaks four languages with professional proficiency: French, English, Spanish and German.
Sourcing talent and funding
An important part of the RespiceSME project involves aligning education with innovation by collecting and updating information on photonics courses and training activities for the next generation of photonics specialists. In this framework, the consortium organises a workshop for innovation seekers and companies, at which tools and measures will be presented to better scope the requirements of industry (in particular SMEs) and so improve and update the curricula of physics and photonics courses. The project also provides feedback to the European Commission, with data and research around Regional Research and Innovation Strategies for Smart Specialisation (RIS3), which build on each region’s strengths, competitive advantages and potential for excellence. These analyses should provide valuable information on the strategic decisions being made to support the innovation capacities of photonics SMEs and thus inform the design of regional funding programmes.
RespiceSME is a unique initiative to help photonics SMEs struggling with the various gaps that hinder their development or the exploitation of their innovation capacity. These gaps mainly reflect limited access to education, public institutions, policy makers and other industries. Innovation clusters are building bridges to overcome the gaps and barriers that prevent effective innovation. They are connected to their SMEs and also to all the institutions that SMEs need to access in order to stay competitive in the market.
A deeper picture of friction
Frictional interfaces are all around us, in both the natural world and manufactured objects, yet some questions about their behaviour remain unanswered. Dr Julien Scheibert tells us about the Cascade project’s work in developing a comprehensive picture of friction at all scales, work which holds important implications for both science and industry.
There are frictional
interfaces all around us in everyday life, in the natural environment, in manufactured objects, and in our own bodies. These interfaces can be in two kinds of states, as Dr Julien Scheibert, the Principal Investigator of the Cascade project, explains. “They’re either stable – the stuck state – and don’t move relative to each other, or they’re in a slipping state where the interface slides,” he says. The transition between these two states formed the primary research focus for the Cascade project.
Frictional interface
This transition does not happen instantaneously, as was previously believed. In a sheared frictional interface, for example a seismic fault, there is always a region which is weaker than another, and hence will start to slip earlier. “This is what we call slip nucleation – it involves a small region of the interface. This region will grow progressively, and eventually it will be able to invade the whole interface. Only at that point will there be macroscopic relative motion between the two bodies,” explains Dr Scheibert. “This growth of the slipping region defines two sub-regions of the interface – the slipping region, which is growing, and the stuck region, which is shrinking. The frontier between these two regions is what we call a micro-slip front.” A number of questions around the dynamics of these fronts remain unanswered, in particular their speed and direction, and the force at which they nucleate. The project followed a two-pronged strategy to address these questions. “One was experimental, using high-speed camera acquisition and a digital image correlation method to monitor the dynamics of the slip field. The other was numerical. There was a series of unexplained observations in the literature which we wanted to understand, so we developed new models of friction and tried to reproduce the observations and interpret them,” says Dr Scheibert.
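The slip-nucleation picture Dr Scheibert describes can be illustrated with a toy one-dimensional model in the spirit of classic spring-block chains. This is a sketch of ours, not the Cascade project’s actual friction models: blocks with heterogeneous static strength are sheared by a slowly rising load; the weakest spot fails first, sheds load onto its neighbours, and the slipping region grows until it invades the whole interface.

```python
import numpy as np

# Toy 1-D slip-nucleation model (illustrative only, not the Cascade
# project's code): blocks on a frictional surface, sheared by a slowly
# increasing uniform load. Each block has its own static strength.
rng = np.random.default_rng(0)
n = 50
strength = 1.0 + 0.2 * rng.random(n)   # heterogeneous friction thresholds
strength[n // 2] = 0.8                 # a deliberately weak spot
load = np.zeros(n)                     # shear load carried by each block
slipped = np.zeros(n, dtype=bool)

nucleation_site = None
history = []                           # size of the slipping region over time
while not slipped.all():
    # raise the uniform drive just enough to fail the weakest stuck block
    load[~slipped] += np.min(strength[~slipped] - load[~slipped])
    # cascade: each failing block sheds part of its load to its neighbours
    while True:
        failing = np.flatnonzero(~slipped & (load >= strength - 1e-12))
        if failing.size == 0:
            break
        if nucleation_site is None:
            nucleation_site = int(failing[0])
        for i in failing:
            slipped[i] = True
            shed = 0.4 * load[i]
            load[i] -= 2 * shed
            if i > 0:
                load[i - 1] += shed
            if i < n - 1:
                load[i + 1] += shed
    history.append(int(slipped.sum()))

print("nucleation site:", nucleation_site)   # the weak spot fails first
print("growth of slipping region:", history) # the stuck region shrinks
```

Running this shows slip nucleating at the weak block and the slipping region growing monotonically until the whole interface moves, mirroring the nucleation-and-growth scenario described above.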
Front types
This approach was designed to gain new insights into the dynamics of micro-slip fronts, in particular the differences between fast and slow fronts. One well-known type is the quasi-static front, which remains in mechanical equilibrium at all times, a condition which governs how the crack propagates. “If you drive your system and you stop shearing it, then the front will stop, because you have no more energy available for the front to go on,” explains Dr Scheibert. Researchers have since discovered fronts that are highly dynamic, and so behave differently. “The main difference is that when this type of front has nucleated it cannot be stopped. Even if you stop shearing, it has sufficient elastic energy stored in the system to continue propagating,” says Dr Scheibert. Researchers have also observed differences within the dynamic regime, with both very fast fronts and much slower fronts in the same system. The project developed a new friction model that was able to reproduce these two dynamic fronts – slow and fast – in the same system. “We were able to understand why there can be a transition between slow and fast fronts,” explains Dr Scheibert. The wider objective in this research is to develop a comprehensive picture of friction at all scales, which will hold important implications across a number of disciplines. “In mechanical engineering, it’s very important
to understand the dynamics of shear cracks, while it’s also central to studying earthquakes, so many fields are interested in this topic,” says Dr Scheibert.

CASCADE: Frictional shear crack dynamics along heterogeneous interfaces
The objective of the CASCADE project is to advance knowledge in the field of friction, by providing a comprehensive picture of the onset of sliding and, in particular, of the dynamics of shear cracks along heterogeneous frictional interfaces. The strategy is based on a tight, quantitative dialogue between experiments and numerical/theoretical approaches.
Project Collaborators: CNRS, University of Oslo and Academy of Sciences of Ukraine.
Project Funding: Marie Curie action, Career Integration Grant, FP7.
Contact Details:
Dr Julien Scheibert, CNRS scientist
Laboratoire de Tribologie et Dynamique des Systèmes
Ecole Centrale de Lyon
36, Avenue Guy de Collongue, 69134 Ecully cedex, France
T: +33 47 218 6226
E: email@example.com
W: http://perso.ec-lyon.fr/scheibert.julien/
Dr Julien Scheibert is Chargé de Recherche (Researcher) at CNRS. He develops experimental and modelling tools to investigate the rupture dynamics of heterogeneous interfaces, in various fields including biomechanics, mechanical engineering and geophysics. In 2009, he received the Branly prize from the French Federation of Scientific Societies.
Granular materials play an important role in many industrial and natural processes. Researchers are combining theoretical, computational and experimental techniques to both investigate the fundamental behaviour of these materials and design new acoustic devices, as Dr Georgios Theocharis explains
A clearer picture of granular materials
The development of
new materials often leads on to further technical innovations, as scientists seek to harness their properties to develop new technologies and devices. Granular crystals can act as important building blocks in materials development, while they also hold intrinsic scientific interest, both of which are key motivating factors behind the work of the Comgransol project. “The main objectives of the project are on the one hand to achieve a better understanding of the physics of driven granular media. Then secondly, to design granular-based acoustic devices for the control of acoustic propagation,” says Dr Georgios Theocharis, the project’s Principal Investigator. These granular materials are effectively collections of macroscopic grains; analysis of these grains and their behaviour within wider structures can lead to new insights in physics and materials science, underlining the wider importance of Dr Theocharis’s research. “Understanding the physics of contact forces in these materials enables us to develop a deeper understanding of wave physics in granular solids, and also of the physics of granular materials in general,” he outlines.
Granular solids
This area forms a central part of the project’s overall agenda, with Dr Theocharis and his colleagues combining theoretical, computational and experimental techniques to gain new insights into the physics of granular materials. Experiments are being performed on metallic grains, such as stainless steel; there are two main reasons behind the decision to use this type of
material. “We are using external magnetic fields to magnetize the grains and thus design particular granular structures, so it is necessary to use magnetisable grains. On the other hand, we’ve found metallic grains with a very smooth surface, which is very important for us. The interaction of grains happens due to contact forces. Thus, the high quality of smoothness of their surface means we don’t have to use more complicated processes,” explains Dr Theocharis. Researchers are also looking at the behaviour of different granular structures. “We’ll investigate waveguides in different network configurations, such as branched waveguides, L-shaped waveguides, or two dimensional crystals in different geometries, like square or honeycomb crystal geometries,” says Dr Theocharis. The great advantage of using these granular solids, in comparison to other solid structures, is the strong nonlinear response. This nonlinearity is derived
from the geometry of these solids, from contact deformations of the particles. “Our goal is to take advantage of this strong nonlinear response and combine it with the proper geometry of the structure – using different network configurations – to achieve an advanced level of control over wave propagation,” explains Dr Theocharis. Although adding nonlinearity makes the wave physics much more complicated, it also opens up the possibility of enabling more precise control of waves. “There are a plethora of
nonlinear phenomena like harmonic generation, soliton propagation and bifurcations, which can be used to help control waves. So having a medium with a strong nonlinear response can strongly enrich the control of waves,” continues Dr Theocharis. Researchers are studying both homogeneous and heterogeneous granular crystals as part of this work. From these foundations, researchers hope to learn more about the behaviour of different complex granular structures, like networks of waveguides or two-dimensional crystals, which could in future inform the design and development of advanced acoustic structures. “For the moment, we are basing our studies on granular structures made of spherical particles in contact, but developing new, even more complicated structures – using advanced 3D printing techniques – with preselected interaction forces could lead to significant advances in terms of the development of certain mechanical devices,” says Dr Theocharis. There is also a purely theoretical dimension to the project’s research. Dr Theocharis and his colleagues are investigating questions of fundamental interest, such as how granular nonlinearities influence disorder-induced localization, or how acoustic waves propagate in complex solid structures like polydispersed granular solids. “In this case, we start with simplified models to answer the questions we set ourselves, and then we progressively add different features that make the problem more and more difficult, but also more and more realistic,” says Dr Theocharis. The Hertzian contact law is sufficient to capture part of the dynamics of these granular structures, yet Dr Theocharis says this approach will not be able to fully answer more complex questions, so in future it will be necessary to consider additional nonlinear contact laws. “More complicated contact laws should
eventually be studied, especially when one studies the propagation of waves which are not longitudinal,” he acknowledges.
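The Hertzian law mentioned above can be stated concretely: for two elastic spheres pressed together with overlap δ, the restoring force grows as δ to the power 3/2 rather than linearly, which is the source of the strong nonlinearity of granular contacts. A minimal sketch follows; the material values for stainless-steel beads are illustrative assumptions, not parameters from the project.

```python
import math

def hertz_force(delta, R1, R2, E1, E2, nu1, nu2):
    """Hertzian normal contact force between two elastic spheres.

    F = (4/3) * E_eff * sqrt(R_eff) * delta**1.5, where E_eff and R_eff
    are the effective modulus and radius of the contacting pair.
    """
    if delta <= 0:
        return 0.0  # spheres out of contact carry no force
    E_eff = 1.0 / ((1 - nu1**2) / E1 + (1 - nu2**2) / E2)
    R_eff = 1.0 / (1.0 / R1 + 1.0 / R2)
    return (4.0 / 3.0) * E_eff * math.sqrt(R_eff) * delta**1.5

# Illustrative values: 5 mm diameter stainless-steel beads
# (E ~ 200 GPa, Poisson ratio ~ 0.3 -- assumed, not from the article)
R, E, nu = 2.5e-3, 200e9, 0.3
F1 = hertz_force(1e-6, R, R, E, E, nu, nu)   # 1 micron overlap
F2 = hertz_force(2e-6, R, R, E, E, nu, nu)   # 2 micron overlap
# Doubling the overlap multiplies the force by 2**1.5 ~ 2.83,
# not by 2 as it would for a linear spring.
print(F1, F2, F2 / F1)
```

The kink at zero overlap (no tensile force) and the 3/2-power stiffening together give granular chains their strongly nonlinear acoustics.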
Computational research
The results of this experimental research could hold wider relevance in terms of the project’s computational work. More detailed computational methods could in future enable researchers to predict the properties of a metamaterial with a specific granular structure. “Simplified models are used at the moment to capture the main physical wave phenomena in different types of granular structures. Other granular features though, like misalignments of the particles in waveguides, or rearrangements of particles under strong excitations, should also be taken into consideration. The latter could lead either to mechanical failure of the structures, or could even be used for more advantageous wave control devices,” outlines Dr Theocharis. One major application Dr Theocharis envisions for this research is in the design of advanced elastic devices for the control of elastic wave propagation. “For example, elastic waves and waveguides have been used widely in RF telecommunications. To move in this direction, we will need to scale down these granular structures to the microscopic regime,” he says. This forms part of Dr Theocharis’s future agenda, with plans to investigate microscopic granular structures consisting of microspheres, work which could be relevant to RF telecommunications applications. Research in the fundamental physics and emerging applications of granular crystals is flourishing, and Dr Theocharis intends to continue his work in this area. “I plan to extend these studies to other mechanical structures with complicated configurations that could be built using the advanced 3D techniques that are currently available,” he outlines.
At a glance
Full Project Title: Complex dynamics of driven granular solids: Physics and applications (Comgransol)
Project Objectives: Dr Theocharis’s research has focused on the interface between nonlinear dynamics and condensed matter physics. The main objectives of the project are:
• a better understanding of the physics of driven granular media
• the design of granular-based acoustic devices for the control of acoustic propagation
Project Funding: G.T. acknowledges financial support from FP7-CIG (Project 618322 ComGranSol).
Project Collaborators:
• Dr F. Allein, Dr V. Achilleos, Dr V. Tournat and Prof. V. Gusev, Laboratoire d’Acoustique de l’Université du Maine
• Prof. Ch. Skokos, University of Cape Town
Contact Details:
Georgios Theocharis, Researcher at CNRS
Laboratoire d’Acoustique de l’Université du Maine, UMR-CNRS 6613
Université du Maine, Av. Olivier Messiaen, 72085 Le Mans cedex 9, France
T: +33 143 61 1412
E: firstname.lastname@example.org
W: http://georgiostheocharis.weebly.com/
V. Achilleos, G. Theocharis, Ch. Skokos, Energy transport in one-dimensional granular solids, Phys. Rev. E 93, 022903 (2016).
F. Allein, V. Tournat, V. E. Gusev, G. Theocharis, Tunable magneto-granular phononic crystals, Appl. Phys. Lett. 108, 161903 (2016).
Dr Georgios Theocharis
Researcher at CNRS
Dr Georgios Theocharis is a researcher at the French National Centre for Scientific Research (CNRS) in Le Mans. He gained a PhD in physics from the University of Athens in 2008, after which he held several academic positions in America, before taking up his current role in 2012.
Sifting the salt from the water
It is predicted that by 2025 around 1.8 billion people across the world will be living in regions affected by water scarcity. Researchers at the University of Manchester have developed a new type of graphene-based membrane that can quickly filter salts from water, potentially making seawater safe to drink. By Patrick Truss
The discovery of a method of producing graphene in 2004 generated a great deal of interest in both the academic and commercial sectors. While the physical properties of the material were identified long ago, in particular its electrical and thermal conductivity, it was not until the early years of this century that researchers at the University of Manchester in the UK managed to isolate graphene. Now that the material itself has been isolated, researchers across the globe are looking to exploit its properties and identify ways in which it can be applied in the real world. A 2-dimensional layer of graphite that is just an atom thick, graphene is the thinnest material on earth, yet at the same time also the most conductive, with remarkable strength. This has set off a renewed wave of interest in graphene research and commercialisation. Recent research from nanomaterials broker Fullerex shows that there are 142 graphene producers across 27 countries, with China alone holding around two-thirds of global production capacity. A great deal of research is also focused on identifying practical applications for the material. Universities and businesses in the US and China are both thought to have filed large numbers of patents around the use of graphene-based technologies, while the UK lags behind, despite its key role in the initial development of the material.
The National Graphene Institute in Manchester is central to UK efforts to now capitalise on the wider potential of the material and translate research advances into commercial development. Home to more than 200 graphene researchers, from graduate students to Nobel laureates, the Institute also works in partnership with industry. There are regular opportunities for researchers at the Institute to meet scientists in other disciplines and collaborate with industry, which is fundamental to sharing knowledge and identifying potential commercial applications. These extend across a diverse range of fields, including in energy, composite materials, and in water filtration. The material itself is hydrophobic, yet researchers at Manchester found that membranes comprised of stacks of graphene oxide were impermeable to all gases and vapours, except water. Since this initial discovery, researchers have been working to improve graphene-based filters, aiming to enable precise and rapid filtering of salts and organic molecules from water.
Water scarcity
This holds particularly significant implications given wider concern about water scarcity, an issue which already affects a significant proportion of the global population. The UN estimates that around 1.2 billion people live in areas already affected by physical scarcity, a number which is set to rise
further over the coming years in line with the wider effects of climate change. This may seem surprising given that around 70 percent of the earth’s surface is covered by water, yet the vast majority of it is saline water, with only a relatively small proportion drinkable freshwater. As the global population continues to grow and the global middle class expands, pressure on existing water resources intensifies. It is estimated that over the past forty years the global population has doubled while water use has quadrupled, leading to intense pressure on water resources. The US Intelligence Community Assessment of Global Water Security found that, by 2030, humanity’s annual global water requirements will exceed current sustainable supplies by 40 percent. The global economy is currently estimated to use around 2,600 cubic kilometres of water a year, and the gap between freshwater supply and demand is likely to widen further. While desalination plants have been established in many areas, conventional methods are typically quite expensive and energy-intensive to operate. This is leading to a growing awareness of the need to both manage existing supplies more effectively and harness the power of technology to develop new methods that could help alleviate water scarcity concerns. A team of researchers at the University of Manchester has now succeeded in developing a membrane capable of sieving common salts from water, raising the prospect of seawater being filtered to rapidly supply people with fresh water. The researchers used a chemical derivative called graphene oxide,
which could be significantly cheaper to produce than single-layer graphene. These graphene-oxide membranes have already been shown to be able to filter out certain molecules, yet their pores were previously too large to sieve out common salts. Now researchers are able to precisely control the pore size in a membrane, so that common salts can be sieved out and removed from salty water. This brings the prospect of using membrane technology in water filtration a step closer. “Realisation of scalable membranes with uniform pore size down to atomic scale is a significant step forward and will open new possibilities for improving the efficiency of desalination technology,” says Professor Rahul Raveendran Nair, one of the leaders of the study. “This is the first clear-cut experiment in this regime. We also demonstrate that there are realistic possibilities to scale up the desired approach and mass produce graphene-based membranes with required sieve sizes.” The technique is limited to the laboratory at this stage, yet its potential is clear in terms of addressing water scarcity, while other companies are also exploring other options for improving water filtration. For example, one approach is coating polymer membranes with graphene oxide to improve the filtration process and speed up water transfer.
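The sieving principle at work here is size exclusion: a species passes only if its effective (hydrated) size fits through the membrane’s channels. The toy sketch below illustrates that logic; the hydrated radii are commonly quoted textbook-style values and the channel size is a hypothetical tuned pore, not figures from the Manchester study.

```python
# Size-exclusion sketch of membrane sieving (illustrative only).
# Hydrated radii in angstroms are rough, commonly quoted values,
# not measurements from the study described above.
hydrated_radius = {
    "water": 1.4,   # effective radius of a water molecule
    "Na+": 3.6,
    "Cl-": 3.3,
    "Mg2+": 4.3,
}

def passes(species, channel_radius):
    """A species passes if its hydrated radius fits within the channel."""
    return hydrated_radius[species] < channel_radius

# A channel tuned between the size of water and the smallest hydrated
# ion lets water through while blocking the dissolved salts.
channel = 2.0  # angstroms, hypothetical tuned pore radius
permeate = [s for s in hydrated_radius if passes(s, channel)]
print(permeate)  # only water passes
```

This is exactly the “physical restriction of interlayer spacing” mechanism that makes precise pore-size control the key engineering step.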
Global water summit Many of these technologies and new methods were discussed at the 2017 Global Water Summit, which was held recently in Madrid. The summit is held annually, providing an opportunity for representatives from municipalities, water companies and
industry to network, collaborate, and work together to address the key challenges around water scarcity. This is not limited just to new technologies around water recovery, important though that is, but also takes in the general business climate and access to finance. Water is of course essential to life, yet there is wide variety in the business models used across the world, and in many countries private companies play a central role. The summit brought together leaders in technology and finance to explore the possibilities around using technology to improve the provision of clean water and to help remove obstacles to development. With the UN estimating that around 1.6 billion people face economic water shortage, there is a clear need for innovative solutions. This shortage is partly due to the process of desertification, as fertile land is lost, yet it is thought that there is enough freshwater on the planet overall to meet demand. However, this water is distributed unevenly, and certain areas of the globe are already experiencing water scarcity, in particular in sub-Saharan Africa. Many different ideas have been put forward to alleviate the problem, including more efficient water management in the agricultural sector and improved irrigation technologies. In some countries, water scarcity is not caused by lack of physical access to water, but rather by insufficient investment in the infrastructure required to meet demand. The economic circumstances in the countries that make up the Gulf Cooperation Council (GCC) are very different. While the
region is relatively affluent, it does not enjoy abundant water resources, and the local climate makes it challenging to manage existing resources effectively, leading the GCC states to invest heavily in desalination plants. The Middle East as a whole is home to around 70 percent of the world’s desalination plants, demonstrating their wider importance, yet the economics are becoming more challenging. In desalination, concentrated wastewater is pumped back into the sea, causing the surrounding waters to become ever saltier and the process to become more expensive. Some observers believe this is likely to have long-term consequences. “In time, it’s going to become impossible to use desalination in a way that makes economic sense,” said Gökçe Günel, an anthropologist at the University of Arizona. “The water will become so saline that it will be too expensive to desalinate.” Water scarcity is also a prominent issue in other areas of the world, including China, with a recent Greenpeace report shedding new light on the scale of the problem. More than 85 percent of surface water in Shanghai was found to be unsafe to drink, while the figure rose to 95 percent for the port city of Tianjin. Whatever the underlying causes of water scarcity, or the surrounding social and economic circumstances, water filtration could represent a highly attractive and financially viable option for addressing the issue, potentially providing an effective, efficient and rapid way of enhancing the supply of clean water.
Water separation
The research from the University of Manchester holds rich potential in these terms, potentially providing an economic means of filtering salts from seawater and a more sustainable supply of clean water to areas in need. There’s a world of difference between the laboratory and a large-scale commercial operation, however, so there are many hurdles still to negotiate. Some steps have been made in this direction, with the University of Manchester working on a collaborative research programme with the Masdar Institute of Science and Technology, based in Abu Dhabi. The programme will receive funding of $500,000 from the UK (1.5 million AED), spread across projects looking at various engineering applications, including water desalination. This is key to the wider commercialisation of graphene. “Graphene has huge potential for applications in a large range of sectors, and we are delighted to be collaborating with the Masdar Institute for Science and Technology on these important areas of research,” said James Baker, Graphene Business Director at the University of Manchester. “Our partnership with Masdar Institute is crucial to the commercialisation of graphene and we look forward to seeing ground-breaking research develop into exciting applications with potential industrial partners as a result of this activity.” The Masdar Institute is also involved in the development of the Graphene Engineering Innovation Centre (GEIC), a £60 million facility which is set to open in 2018. Masdar will contribute approximately
half of the cost of the facility, with funding also from various UK-based sources, testament to the level of interest in the material. A number of other initiatives have also been established across the world, exploring a wide variety of potential applications of the material, yet water filtration will continue to attract a lot of interest as policy-makers grapple with the problem of water scarcity. Graphene membranes could aid millions of people, and researchers now aim to improve them further. “The selective separation of water molecules from ions by physical restriction of interlayer spacing opens the door to the synthesis of inexpensive membranes for desalination,” wrote Ram Devanathan of the Pacific Northwest National Laboratory. “The ultimate goal is to create a filtration device that will produce potable water from seawater or wastewater with minimal energy input.”
Getting to the heart of antimatter Experiments on antimatter have historically been difficult to realise, as highly sophisticated facilities are required, but the new ELENA ring at CERN promises to open up new avenues in research. We spoke to Professor Carsten Welsch about the AVA project’s research into fundamental questions around antimatter The imbalance between
matter and antimatter in the observable universe is one of the enduring mysteries in the physical sciences. An antimatter particle can be described as a sort of mirror particle of what is found in conventional matter. When the two meet – a matter particle and its antimatter counterpart – they annihilate into pure light, with a 100 percent conversion of mass into energy. The observable universe is composed largely of conventional matter, yet it is thought that the big bang should have created equal amounts of matter and antimatter, raising a number of fundamental questions about our understanding of physical laws and energy production. Why didn't the big bang create equal amounts of matter and antimatter? Is there some level of uncertainty in the overall production of energy? Is there something wrong with our current understanding of the fundamental laws of physics? The Accelerators Validating Antimatter (AVA) project aims to make an important contribution in these terms, bringing together academic and commercial partners to train 15 fellows in antimatter research, laying the foundations for continued development. Researchers will utilise the new ELENA research infrastructure at CERN to investigate fundamental questions around antimatter – work which could lead to new scientific and technical breakthroughs. We spoke to project coordinator Professor Carsten Welsch, Head of Physics at the University of Liverpool and member of the Cockcroft Institute, about the project's research, the challenges they face, and the implications of their work for the future of science and industry.
Antimatter research
EU Researcher: What are the fundamental difficulties in researching antimatter?
Professor Carsten Welsch: It really starts with the production of the antimatter in the first place, as you need a very large accelerator facility.
Inset Top Left: Scintillating fibre detectors (Image courtesy of M. Doser, CERN). Main Image: Laser based beam diagnostics (Image courtesy of University of Liverpool).
The big problem is that, according to Einstein's famous equation E=mc², in order to create a massive particle like an antiproton you first need a reasonably high energy in your particle beam. This high energy means that when you produce the antiproton, it will basically come out of that generation process as a very hot particle – high in velocity and energy – and that's exactly the type of particle that we don't want for research.
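To illustrate the scale involved (an illustrative calculation, not a figure from the AVA project): antiprotons are typically produced by firing a proton beam at a fixed target, and conservation of baryon number means the lightest possible final state is p + p → p + p + p + p̄. Relativistic kinematics then sets the production threshold:

```latex
s \;=\; 2\, m_p c^2\, E_{\text{lab}} + 2\, m_p^2 c^4 \;\ge\; \left(4\, m_p c^2\right)^2
\quad\Longrightarrow\quad
E_{\text{lab}} \;\ge\; 7\, m_p c^2 ,
```

i.e. a beam kinetic energy of at least \(6\, m_p c^2 \approx 5.6\ \mathrm{GeV}\) – hence the need for a large accelerator, and hence antiprotons that emerge far too energetic for precision experiments.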
When an antiproton and a positron meet, they form antihydrogen, which is antimatter. That would then allow us to compare antihydrogen and hydrogen in very fundamental ways

EUR: How will the ELENA infrastructure improve on existing research infrastructure?
CW: This production process that I've described generates antiprotons at very high energies, which are then decelerated – but even at the lowest energy, they are still much too fast. Previously, a degrader foil was used to degrade the energy even further, so that the antiprotons could then be trapped in a trapping device. But with that foil, you lose more than 99.9 percent of the antiprotons that have been produced. So ELENA builds a bridge – it decelerates the antiprotons further, to almost at rest, but in a very controlled way, so that it provides a high-quality beam of cold antiprotons. That's going to make experiments much easier in future – it's really a game-changer.

EUR: Does this mean that you can get more antiprotons?
CW: Yes, we can get more antiprotons, but we can also get more of them at a much better beam quality, and the better the beam quality, the deeper we can look into antimatter. This really will – I believe – enable us to gain new insights into antimatter. For example, once we have captured these antiprotons, they can be merged with a cloud of positrons – the antiparticle of the electron. When an antiproton and a positron meet, they form antihydrogen, which is antimatter. That would then allow us to
compare antihydrogen and hydrogen in very fundamental ways, looking into the atomic structure, doing laser spectroscopy and measuring the atomic levels.
AVA project
EUR: What is the overall goal of the AVA project?
CW: There are three different work packages in the project, and the aim is to bring together different antimatter research communities, who before had possibly not talked enough to each other. One work package looks at the design and optimisation of the accelerator, another looks into the diagnostics that are required to measure the beam, and the third addresses novel experiments that haven't previously been possible.
EUR: How is the research organised within the three work packages?
CW: There are five different projects within each of the work packages, which will be carried out by different host institutions. Here at the University of Liverpool, we will have one fellow looking into beam stability in these low-energy storage rings, but the simulation results and the tools that will be developed could also be applied to other facilities – they won't be limited to antimatter beams.
EUR: What implications does this research hold for industry?
CW: This is very much a frontier research project, but that's not to say that there aren't potential applications arising from it. Take for example Positron Emission
Tomography (PET) technology – the scanners that are used in hospitals. A radioactive isotope is injected into a patient – in the body of the patient, it produces an electron and a positron, so a particle and its antiparticle. Both of them are used to image metabolic processes in patients.

EUR: So is there interest from the commercial sector in your research?
CW: There are 25 partners in the project, of which roughly a third are universities, a third are research centres, and a third is from industry. Our industrial partners are primarily looking into the development of diagnostics and detectors – antimatter beams at low energies are very difficult to measure. The goal for industry is that once they have demonstrated that their detectors work with such a complicated beam, then they can also take that kind of technology and apply it in other fields, and hopefully find new markets and new customers.

EUR: What do you hope to achieve over the course of the project?
CW: The primary focus is on training the fellows – we hope to recruit fifteen highly qualified and dynamic individuals from all over the world, who will be working on these research challenges. Over the next three years, we aim to train them in these fields and establish a strong network between them, so that there's a coherent group of fellows who like working with each other and producing outstanding research results. Beyond that, I hope that we can build bridges between these different research communities, who at the moment work rather independently of each other. This would be a good platform to start new collaborative research projects.
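The PET principle Professor Welsch refers to can be stated compactly: the emitted positron annihilates with an electron in the tissue, producing two photons.

```latex
e^{+} + e^{-} \;\longrightarrow\; \gamma + \gamma ,
\qquad
E_{\gamma} \;=\; m_e c^2 \;\approx\; 511\ \mathrm{keV}.
```

Momentum conservation sends the two photons out almost back-to-back, so detecting them in coincidence defines a line through the annihilation point – the basis of PET image reconstruction.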
At a glance
Full Project Title
Accelerators Validating Antimatter physics (AVA) – A European Training Network
Project Objectives
AVA will enable an interdisciplinary and cross-sector programme on antimatter facility design and optimisation, advanced beam diagnostics and novel antimatter experiments.
Project Funding
Funded by the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 721559.
Beneficiary Partners
• University of Liverpool, UK
• CERN European Organization for Nuclear Research, Switzerland
• CIVIDEC Instrumentation GmbH, Austria
• Cosylab d.d., Slovenia
• FOTON s.r.o, Czech Republic
• FZJ Forschungszentrum Juelich, Germany
• GSI Helmholtzzentrum für Schwerionenforschung GmbH, Germany
• Max Planck Society (MPG) via the Max Planck Institute for Nuclear Physics (MPIK), Germany
• OEAW Austrian Academy of Sciences/Stefan-Meyer-Institute, Austria
• Stahl-Electronics, Germany
• University of Manchester, UK
Contact Details
Professor Carsten P Welsch
Head of Department
Department of Physics
University of Liverpool
L69 7ZE Liverpool, UK
T: (+44) 79 73 24 79 82
E: email@example.com
W: http://www.ava-project.eu
W: http://www.quasar-group.org
Prof. Carsten Welsch
ELENA - © CERN - Maximilien Brice. Professor Carsten Welsch studied at the Universities of Frankfurt and Berkeley. His further career brought him to RIKEN in Japan, the Max Planck Institute for Nuclear Physics in Germany and to CERN. He joined the University of Liverpool in 2008. In September 2016 he became Head of the Physics Department.
Next generation detector to probe new physics
A type of elementary particle, neutrinos hold enormous scientific interest, as they enable researchers to probe physics beyond the standard model. Dr Kostas Mavrokoridis tells us how the Ariadne project's work in developing a next generation neutrino detector could open up new avenues of investigation in particle physics.
A type of elementary particle that lacks an electric charge, neutrinos are an area of enormous scientific interest, and researchers are now aiming to develop effective detectors to analyse them further. As neutrinos don't have an electric charge, they don't present any information to a detector on their own, yet Dr Kostas Mavrokoridis says it is possible to detect them through their interactions with other particles. “We can measure the products of neutrinos,” he explains. Based at the University of Liverpool in the UK, Dr Mavrokoridis is the Principal Investigator of the Ariadne project, an ERC-backed initiative aiming to develop a next-generation liquid argon neutrino detector. “Once a neutrino interacts with the liquid argon detector, it's going to give you other particles. These other particles have a charge – and as they are charged particles, we can record them,” he continues. “We want to precisely record these particles. In their path, these particles leave tracks, and from these tracks, you can then do energy calibrations, look at complex vertices and do new physics.” This kind of research could lead to new insights into physics beyond the current standard model. While the standard model predicted that neutrinos don't have mass, research has since shown that this is not in fact the case. “We know now that neutrinos do have mass – that was part of the Nobel Prize in 2015,” says Dr Mavrokoridis.
The standard model in general is very robust, but not when it comes to neutrinos, and researchers now aim to gain further detail about their properties; major topics of interest include dark energy, dark matter and the asymmetry between matter and antimatter. “Why are we living in a matter-dominated universe and not in an antimatter-dominated universe? What caused that?” asks Dr Mavrokoridis. “We have found that other types of particles and their anti-particles don't behave in the same way. Now we need to find if that's
the case for neutrinos – if they don’t then there’s a charge parity violation there. There are theoretical models to describe the behaviour of neutrinos, yet there is scope to improve the precision of the parameters.”
ARIADNE scale model
Once the neutrino interacts with the liquid argon detector, it's going to give you other particles. These other particles have a charge – and as they are charged particles, we can record them
Ariadne detector
The Ariadne detector could have a significant impact in these terms, revolutionising the design of future experiments and opening up new research opportunities. Ariadne is designed as a
two-phase detector, meaning that there is a liquid, and on top of the liquid there is also the gas phase of the argon. “When a particle passes through the detector it ionises the argon – so it frees the electrons. So we get all these free electrons, then we apply an electric field, and the electrons drift to the surface of the liquid, then you apply a higher electric field, and extract them to the gas phase,” says Dr Mavrokoridis. There is a device on top of the gas phase called a Thick Gas Electron Multiplier (THGEM), which amplifies the electrons still further. “The THGEM has very small holes – around 500 microns – and within these holes, we then accelerate these electrons that we’ve just taken out to the gas phase of the detector,” outlines Dr Mavrokoridis. “If you accelerate electrons very fast in gas then you create even more electrons. So you have a multiplication, a cascade of electrons.” This type of experiment can generate large quantities of data. The unique point about Ariadne is that instead of dealing with potentially millions of strips, as would be the case with a conventional detector, electrons are processed in the holes on the THGEM to also generate light. “On top of generating electrons – or charge – we also generate light in the 128 nanometre wavelength, so it’s a vacuum ultraviolet light. Then we have a camera on top of that detector – and this camera is sensitive enough to take a picture of this light. Essentially now we are able to photograph the tracks of the particles,” explains Dr Mavrokoridis. This is potentially an easier and more efficient way to track particles, while Dr Mavrokoridis says this could also open up new avenues of scientific investigation. “As you are generating more light than charge in the THGEM holes, this could mean the detector will be more sensitive to lower energy levels. That potentially allows you to also go to lower energy physics – such as dark matter,” he says. 
“This type of detector is a very similar technology to a dark matter detector, although the read-out is different.”
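The multiplication Dr Mavrokoridis describes follows the familiar Townsend avalanche law, in which the number of electrons grows exponentially with the distance travelled in the gas. A minimal sketch, assuming illustrative values for the Townsend coefficient and the hole depth (these numbers are hypothetical, not ARIADNE's measured parameters):

```python
import math

def avalanche_gain(alpha_per_mm: float, gap_mm: float) -> float:
    """Townsend avalanche: N(d) = N0 * exp(alpha * d).

    Each electron accelerated through the gas frees further electrons,
    so the charge grows exponentially over the multiplication gap.
    """
    return math.exp(alpha_per_mm * gap_mm)

# Illustrative numbers only (hypothetical, not ARIADNE's measured values):
# an effective Townsend coefficient of 18/mm over a 0.5 mm THGEM hole.
gain = avalanche_gain(alpha_per_mm=18.0, gap_mm=0.5)
print(f"single-stage gain = {gain:.0f}")  # e^9, roughly 8100 electrons per primary
```

The exponential dependence is why even a sub-millimetre hole can turn a handful of drifted electrons into a measurable pulse of charge and scintillation light.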
The current focus however is on developing the detector, validating the technology and assessing its effectiveness with respect to detecting neutrinos. The project is currently in the construction phase, and the plan is to build the detector on a fairly rapid timescale, after which it will be taken to CERN for validation. “A primary aim in Ariadne is the characterisation of the technology for the neutrino sector. So we will do that at CERN on a charged particle beam,” outlines Dr Mavrokoridis. Beyond probing some of the most fundamental questions in physics, the Ariadne detector could potentially also be applied in medical imaging to help diagnose disease. “Ariadne can be used to take photographs of gamma particles, and by taking photographs of these low-energy gammas, potentially you will be able to tell where they came from. This is something that could be used in medical imaging,” explains Dr Mavrokoridis. “The current positron emission tomography (PET) scanners use positrons to detect gamma rays in the body.” A better level of resolution of where these gamma rays are coming from would enable medical professionals to identify smaller tumours, which could lead to earlier diagnosis. Dr Mavrokoridis says the project is looking to demonstrate the feasibility of this approach to medical imaging, but at this stage they are not looking to take it any further. “We'll try to see if it is feasible. Ariadne will be able to tell you – ‘well actually, if you give me a source in that position, I can trace back to that position with that kind of accuracy,’” he outlines. While the development of the
At a glance
Full Project Title
ARgon ImAging DetectioN chambEr (ARIADNE)
Project Objectives
The ERC Starting Grant ARIADNE project aims to develop optical-based LAr TPC readout technologies for direct imaging of secondary scintillation light produced by electron avalanches in THick Gas Electron Multiplier (THGEM) holes within two-phase Neutrino Detectors. This approach has the potential to revolutionize future giant LAr Neutrino experiment design, offering high-resolution imaging and a lower energy threshold, thus enhancing the potential for new Particle Physics discoveries.
ARIADNE Operation Principle

detector holds important implications in terms of medical imaging, Dr Mavrokoridis is keen to stress that this is not its sole application, and researchers will be keen to use it in investigating new physics. “Currently there’s a lot of momentum behind the development of liquid argon detectors amongst the neutrino research community, so the Ariadne detector is going to have a tremendous impact if it’s successful,” he enthuses.
Project Partners
• European Organization for Nuclear Research (CERN), Geneva, Switzerland
Contact Details
Project Coordinator, Dr Konstantinos Mavrokoridis
Lecturer in Experimental Particle Physics
Department of Physics, Oliver Lodge Laboratory
University of Liverpool
L69 7ZE, UK
T: +44 (0)151 794 3378
E: firstname.lastname@example.org
W: hep.ph.liv.ac.uk/ariadne
Dr Konstantinos Mavrokoridis
Prototype Assembly
Images of Muon tracks taken with the ARIADNE Demonstrator.
Project Funding
European Research Council (ERC) Starting Grant for ARIADNE
Dr Konstantinos Mavrokoridis is a lecturer in experimental particle physics at the University of Liverpool. Having started his career in Dark Matter searches, he now focuses on detector development for Neutrino Physics. He has established and runs the Liverpool Neutrino Liquid Argon program, developing new detector technologies for future giant LAr Neutrino experiments.
Emerging symmetry in cuprate superconductors
The phenomenon of superconductivity was discovered over a century ago, yet it was long thought to be confined to the realm of extremely low temperatures. This changed with the discovery in 1986 of a material which acted as a superconductor at 35 Kelvin, and researchers now aim to understand the underlying basis of high temperature superconductivity, as Dr Catherine Pépin explains.
The phenomenon of superconductivity was discovered in 1911, and it remains a source of fascination to scientists, who continue to build the theoretical foundations to underpin future research. A superconducting material shows exactly zero electrical resistance, enabling perpetual motion of electrons without any energy loss. “This means conductivity is infinite – superconductivity is a perfect phenomenon,” explains Dr Catherine Pépin, the coordinator of the Champagne project. While superconductivity was discovered in 1911, it was not until 1957 and the publication of the Bardeen-Cooper-Schrieffer theory that the phenomenon was more fully understood. “The electrons pair two-by-two, forming Cooper pairs, which condense into the quantum vacuum. It’s a macroscopic quantum phenomenon,” says Dr Pépin. “When you accelerate an electric charge, it should emit some electro-magnetic radiation, but this doesn’t happen in a superconducting material. Remarkably, in a conducting ring using superconducting materials, there is no energy loss and the conduction of the current is eternal. This idea of perpetual motion was a dream of the Greek philosophers, and nature has given us a realisation of it.” The original carrier in the conduction of electricity is an elementary particle – an electron – yet in a superconductor it’s not an electron that conducts electricity, but rather a Cooper pair. “This is two electrons together, so it becomes more like a wave.
So each particle knows exactly what the other is doing, and then the whole field behaves as if everything is in phase, in synchronicity,” explains Dr Pépin. The quantum vacuum protects the phenomenon of superconductivity in these materials. The Maxwell equations that constrain the movement of electrons in metals are not valid in these situations, and are replaced by the London equations, under which Cooper pairs can conduct electricity without any loss of energy. Perfecting this phenomenon at room temperature could therefore revolutionise energy provision, greatly improving efficiency, while Dr Pépin says it could also hold implications beyond the energy sector. “A perfect conductor repels electromagnetic fields. A corollary of that is that you can create very strong magnetic fields, which is a realm that is just starting to be studied,” she outlines. For 70 years it was thought that superconductivity was confined to the regime of very low temperatures, but this changed with the discovery in 1986 of a material called lanthanum barium copper oxide (LBCO), the first high temperature superconductor. “This material showed superconductivity at a much higher temperature than the other superconductors, at 35 Kelvin,” says Dr Pépin. “More high-temperature superconducting ceramic materials were discovered in 1987, including yttrium barium copper oxide (YBCO) and others.

The SU(2) theory predicts the emergence of a variety of phases with varying temperature (T) and the concentration of carriers (p). In particular, it predicts the appearance of topological defects: skyrmions in a pseudo-spin space, which can be ordered, as in the red phase, or disordered, as in the yellow phase. In the green region, these defects flatten and form a superconducting phase, depicted in green, in which the electrical resistivity vanishes.
Researchers were able to raise the temperature at which this phenomenon is observable by more than 120 degrees, to 136 Kelvin at the end of 1987.” There are significant differences between these materials and standard superconductors however. In standard superconductors, the pairing mechanism between the two particles in a Cooper pair is due to the vibrations of the lattice, to the interaction between electrons and phonons, but this is not the case with high-temperature superconductors. “In this new class of superconductors, phonons are not the main driving force for this pairing mechanism,” says Dr Pépin. “The underlying mechanism behind superconductivity in this class of materials is different to the standard metallic superconductors.”
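The London equations mentioned above can be written compactly; together with Maxwell's equations they imply that a magnetic field decays exponentially inside a superconductor over the London penetration depth \(\lambda_L\):

```latex
\frac{\partial \mathbf{j}_s}{\partial t} = \frac{n_s e^2}{m}\,\mathbf{E},
\qquad
\nabla \times \mathbf{j}_s = -\frac{n_s e^2}{m}\,\mathbf{B}
\quad\Longrightarrow\quad
\nabla^2 \mathbf{B} = \frac{\mathbf{B}}{\lambda_L^2},
\qquad
\lambda_L = \sqrt{\frac{m}{\mu_0\, n_s e^2}} .
```

Here \(n_s\) is the superfluid (Cooper-pair) carrier density; the first equation expresses lossless acceleration of the supercurrent, and the second underlies the expulsion of magnetic fields that Dr Pépin describes.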
Cuprates
This area forms a central part of Dr Pépin’s agenda, with researchers aiming to develop a new theory to describe a specific type of superconducting material called cuprates. The notion of elementary excitations is fundamental to this research; in particular, it is not clear
whether the electron can be considered as a free particle, an elementary building block from which a new theory can be built. “In some theories, the system is strongly coupled from high-energy to low-energy – meaning that you cannot identify electrons. These theories are known as strongly-correlated electron theories,” explains Dr Pépin. This point of view has dominated over the last thirty years, but now Dr Pépin is developing an alternative theory dominated by the presence of an emerging symmetry, which is more complex than that for the standard theory of superconductivity. “The phenomenon of spontaneous symmetry breaking is a fundamental part of theoretical physics, from high-energy physics to condensed matter physics,” she explains. “In superconductivity, the spontaneous symmetry which breaks is a rotation symmetry which couples to light. It’s the same symmetry as that which describes electro-magnetism. This phenomenon of U(1) symmetry breaking is the same as for the Higgs Boson.”
Researchers plan to investigate the accuracy of this new theory through interactions and collaborations with experimental groups. These interactions are always easier when the theory under investigation is underpinned by a physical principle like a symmetry. “When a theory is controlled by a symmetry it is easier to make an effective model. For example, the Ginzburg-Landau theory was developed to describe superconductivity, before the underlying mechanism was found. And this worked well,” says Dr Pépin. “We also plan to test the accuracy of the theory through the prism of experimental data.” There are also plans to look at the building blocks of cuprates in experiments, which it is hoped will lead to new insights into the origin of superconductivity in these materials. “These compounds fascinate the research community, and experimentalists are working very hard to understand them. They have made a lot of progress, using techniques like Raman spectroscopy, time-resolved
X-rays, and scanning-tunnelling microscopy (STM), and huge sets of experimental data have been generated,” says Dr Pépin. “It’s very important for us to test the accuracy of our theory on experimental data. This is a very strong constraint.”

When you accelerate an electric charge, it should emit some electro-magnetic radiation, but this doesn’t happen in a superconducting material. In a conducting ring using superconducting materials, energy is not lost and the conduction of the current is eternal. This idea of perpetual motion was a dream of the Greek philosophers, and nature has given us a realisation of it

The key idea in Dr Pépin’s research is that at low enough temperatures the physics of cuprate superconductors is still controlled by an emerging symmetry, but the symmetry group in this case is the first non-Abelian group, the SU(2) group. SU(2) symmetry originates in these compounds due to the competition between two types of order – superconductivity and a modulation of the electric charge. “These two competing orders are related by the emerging symmetry. So the complexity and the symmetry relies on the presence of a competing order in the phase diagram, infinitely coupled to superconductivity,” outlines Dr Pépin. “Certain topological defects might emerge from this coupling, and they could proliferate. This is something which we will take account of in the project.”

The goal in this research is to understand how cuprates work, with the wider, longer-term objective of increasing the temperature below which they superconduct, to eventually reach room temperature; Dr Pépin says this would have far-reaching implications. “If we succeed in perfecting this phenomenon at room temperature then it will completely change our lives. We will be able to really conduct electricity without any energy loss. So it will completely change energy provision,” she stresses.
At a glance
Full Project Title
Charge orders, Magnetism and Pairings in High Temperature Superconductors (Champagne)
Project Objectives
In the quest to find room temperature superconductors able to carry current without loss of energy, we propose that the physics of a class of materials, the cuprates, is governed, close to the quantum vacuum, by an emerging SU(2) symmetry relating the superconducting state to the charge sector. A single gap equation describing a “non-abelian” superconductor is tested against a wide range of experiments.
Project Funding
ERC-ADG-2015 – ERC Advanced Grant
Project Partners
Xavier Montiel, Corentin Morice, and Debmalya Chakraborty (Postdoctorates at the IPhT)
Contact Details
Dr Catherine Pépin
Institut de Physique Théorique
Orme des Merisiers
CEA - Saclay
F-91191 Gif-sur-Yvette - FRANCE
T: +33 (0)1 69 08 72 18
E: email@example.com
W: http://ipht.cea.fr/Pisp/catherine.pepin/index_fr.php
X. Montiel, T. Kloss, C. Pépin, Phys. Rev. B 95, 104510 (2017)
T. Kloss, X. Montiel, V. de Carvalho, H. Freire, C. Pépin, Rep. Prog. Phys. 79, 084507 (2016)
K.B. Efetov, H. Meier, C. Pépin, Nat. Phys. 9, 442 (2013)
Dr Catherine Pépin
Dr Catherine Pépin is a Researcher in Theoretical Condensed Matter Physics at the Institut de Physique Théorique, based near Paris. Her main research interests are zero temperature phase transitions, in particular heavy fermion systems and high Tc superconductors.
Conflict and care: a complex relationship
The First World War left many ex-servicemen with serious disabilities, for which they required long-term medical and social care. New formal and informal structures developed during the interwar period as a result, which affected traditional perceptions of gender roles in the provision of care, as Dr Jessica Meyer of the MenWomenCare project explains.
Many of the men returning from the front after the end of the First World War had suffered serious disabilities, and so required long-term medical and social care. While in a military context care was often provided by men, once the conflict had ended new formal and informal structures developed, an area of great interest to Dr Jessica Meyer, the Principal Investigator of the MenWomenCare project. “We’re looking at domestic circumstances, and how they relate to the state and charitable institutions. What is the nature of these relationships? How are they structured by gender?” she outlines. In many cases women took on a lot of care-giving work in the home, while doctors and other staff in charities and state institutions tended to be male. “What does this say about how medicine was practised, about how those with disabilities were cared for, and about our perception of care-giving as a gendered practice?” asks Dr Meyer.
National Archives
Researchers are looking at files from the National Archives on First World War Pensions awards to investigate these and other questions. These files often contain a variety of different documents, including official documents, medical reports, and letters from the pensioners themselves, from which Dr Meyer and her colleagues in the project hope to gain new insights.

Men, Women and Care: The gendering of formal and informal care-giving in interwar Britain
Objectives: To develop a database of material contained in the 22,756 personal pension files from the First World War held by the National Archives, London. This will be used to explore how formal and informal structures of care for disabled ex-servicemen developed in the interwar years, and how these structures were both themselves gendered and helped to gender caregiving practices.
Dr Jessica Meyer
Legacies of War
University of Leeds
T: +44 (0)113 343 4194
E: firstname.lastname@example.org
W: http://menwomenandcare.leeds.ac.uk/
Jessica Meyer is Associate Professor of Modern British History in Legacies of War at the University of Leeds, specialising in histories of masculinity, disability, and popular fiction in the era of the First World War in Britain. She is currently completing a monograph on the experiences of Royal Army Medical Corps rankers during the war.

Personal pension file for Denis Cavanagh created by the Ministry of Pensions.
We’re creating a database which pulls out a lot of the demographic information and will include things like the nature of the pensionable disability, how much money men received in a pension, and whether or not they were living at home or in institutional settings

“We’re creating a database which pulls out a lot of the demographic information and will include things like the nature of the pensionable disability, how much money men received in a pension, and whether or not they were living at home or in institutional settings,” she explains. “It will also tell us whether they were receiving care in institutional settings or in other forms of non-state institutions. We’ll then use that to do some quantitative analysis. So for example, how many men in Bradford were being pensioned for the loss of a limb? How many lived at home?”

The project’s database will enable social historians to investigate these kinds of questions more efficiently, while this research also holds relevance today in terms of the state’s responsibilities towards those it sends into harm’s way. The huge numbers of disabled men returning from the First World War ushered in an era of greater state intervention in medical care, and in some ways could be seen as the precursor of the welfare state. “It’s also the precursor to the idea of the military covenant, which structures how the disabled from current conflicts are dealt with,” says Dr Meyer. The introduction of conscription in 1916 was an important event in these terms. “If you’re going to require citizens to serve in the military, to put themselves in harm’s way, then you have a responsibility to support their dependents if they suffer a disability as a result,” continues Dr Meyer. “This introduces the idea of state responsibility for servicemen.” This continues to shape the contemporary military covenant and the treatment of veterans of current conflicts.

The University of Leeds has recently established a three-year partnership with History & Policy, an organisation which aims to inform current policy. “They train historians and put them in contact with policy-makers, both at local and national level, around questions where the historical background might hold relevance,” explains Dr Meyer. This allows policy-makers to draw on the knowledge of researchers around historical events, giving greater perspective and depth to their decisions. “The analysis we’re doing will add that depth and knowledge, and will enable policy-makers – as they work with charities – to think about what aspects of care for the disabled need to be funded,” says Dr Meyer. “This might initially be around disabled people returning from current conflicts, but then there are also wider questions about the disabled in society.”
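The kind of quantitative question Dr Meyer describes maps naturally onto simple database queries. A hypothetical sketch – the field names and records below are invented for illustration, not the project's actual schema:

```python
# Hypothetical illustration of the kind of query the project's database
# makes possible -- the field names and records are invented, not the
# MenWomenCare schema or data.
records = [
    {"town": "Bradford", "disability": "loss of limb", "lives_at_home": True},
    {"town": "Bradford", "disability": "shell shock",  "lives_at_home": False},
    {"town": "Leeds",    "disability": "loss of limb", "lives_at_home": True},
]

# How many men in Bradford were pensioned for the loss of a limb,
# and how many of those lived at home?
bradford_limb = [r for r in records
                 if r["town"] == "Bradford" and r["disability"] == "loss of limb"]
pensioned = len(bradford_limb)
at_home = sum(r["lives_at_home"] for r in bradford_limb)
print(pensioned, at_home)  # 1 1
```

Once the pension files are encoded as structured records, such counts can be produced for any town, disability, or care setting in the database.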
Laying the foundations of tomorrow’s detection techniques
Analysis of cosmic neutrinos could allow researchers to gain new insights into fundamental questions around the evolution and fate of the universe, yet detecting these particles is a complex challenge. We spoke to Dr Alina Badescu about the CosNeD project’s work in helping to lay the foundations of a new technique to detect cosmic neutrinos in natural salt mines
A type of elementary particle, cosmic neutrinos hold great scientific interest, enabling researchers to probe some of the most fundamental questions around the evolution and fate of the universe. Neutrinos are very hard to detect however, and even harder to detect at extremely low energies, a topic that is central to the work of the CosNeD project. “The project is focused solely on cosmic neutrinos,” says Dr Alina Badescu, the project’s Principal Investigator, based at the University Politehnica of Bucharest. Detecting ultra-high-energy cosmic neutrinos has long been a major goal in research, yet this is a technically demanding challenge. “The main difficulties in detecting neutrinos are due to their small flux number, and the small probability of interaction,” explains Dr Badescu. “The latter point is exactly what makes them so interesting, as they will travel through space undeflected, indicating the direction of the highly energetic source that produced them.” The project is investigating a new technique for measuring high-energy cosmic neutrinos, centred around their radio detection in natural salt mines. These salt mines have some natural advantages in terms of detecting cosmic neutrinos, as Dr Badescu explains. “We have chosen a salt dome because the
detector volume is very large and salt is so transparent to radio waves that the spacing of detectors can be sufficiently large to enhance the effective volume and event rate. Beyond that, a salt mine is more radio-quiet than other places in the world, which reduces artificial signals considerably, as the soil above the dome absorbs most of the radio noise. Lastly, the density of salt is higher than the density of ice or water, thus the particle shower dimension is smaller,” she outlines. Salt deposits in Romania are among the biggest in Europe, which is another important consideration. “The construction of such a detector would be a world first,” says Dr Badescu.
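Salt’s transparency to radio waves is what sets the feasible antenna spacing: a radio signal in a dielectric falls off with 1/d geometric spreading multiplied by an exponential factor exp(-d/L_att), where L_att is the field attenuation length the project aims to measure. A minimal Python sketch of this trade-off; the attenuation lengths and the detection threshold are assumed, illustrative values, not measured CosNeD figures:

```python
import math

def received_amplitude(a0, d, l_att):
    """Amplitude after distance d (m): 1/d geometric spreading times
    exponential attenuation exp(-d / l_att), with attenuation length l_att (m)."""
    return a0 * math.exp(-d / l_att) / d

def usable_spacing(l_att, a0=1.0, threshold=1e-4, d_max=5000.0):
    """Largest spacing (m) at which the signal still clears the threshold."""
    d = 1.0
    while d < d_max and received_amplitude(a0, d, l_att) > threshold:
        d += 1.0
    return d

# Assumed, illustrative attenuation lengths in metres (not measured values).
for name, l_att in [("transparent salt", 250.0), ("lossier medium", 40.0)]:
    print(f"{name}: usable antenna spacing ~ {usable_spacing(l_att):.0f} m")
```

With these assumed numbers the more transparent medium supports antenna spacings several times larger, which is precisely the effective-volume and event-rate advantage Dr Badescu describes.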
Salt neutrino detector
“Unirea” Salt Mine (Photograph by Radu Sandovici)
There are many technical hurdles to negotiate first however. The salt neutrino detector is a complementary approach, still in the conceptual stage, that deals with the measurement of a very wide formation of natural salt with antennas. “The network of antennas can determine the energy of the primary particle through the measurement of the amplitude of the radio EM field, and it can reconstruct the direction of the primary particle through the measurement of the occurrence times of the radio impulses. In order to compensate for the small interaction probability, a huge volume of detecting material is required that is
found in naturally occurring bulk dielectrics, such as natural salt domes,” explains Dr Badescu. The first step in the project would be to experimentally determine the radio attenuation length in salt, after which researchers can look towards wider objectives. “We are also interested in modelling, both theoretically and empirically, the radio wave propagation in layered, heterogeneous media,” outlines Dr Badescu. This work could hold important implications for geophysics and industry, for example in modelling the vertical profile of soil to ensure highways are constructed on stable ground. The development of indirect radio techniques for detecting cosmic neutrinos could also enable researchers to probe more deeply into fundamental questions in physics. “The enduring impact of this project will be to solve some of the outstanding conceptual problems in the origin of the high energy cosmic particles, relating to their production, acceleration and propagation mechanisms,” says Dr Badescu.
CosNeD
Research programmes represent a main priority at the University Politehnica of Bucharest. The University has invested in a top-performance research centre, CAMPUS, and academic staff at the university are involved in many international research projects. The university also offers educational programmes tailored to the needs of international students. Dr Badescu’s faculty offers two bachelor programmes taught in English (one in Telecommunications and one in Electronics) and one masters programme (“Advanced wireless telecommunication”).
Dr Alina Badescu
University Politehnica of Bucharest
T: +40 72 396 7698
E: email@example.com
W: http://www.radio.pub.ro/erc/
Alina-Mihaela Badescu obtained her M.Sc. degree in Radio and Optical Telecommunications in 2006, and in Radio Astronomy in 2008. In 2011 she obtained her PhD degree in the radio detection of astroparticles. She has worked on several research projects in the field of radio detection and telecommunications.
She is affiliated with the University POLITEHNICA of Bucharest, Faculty of Electronics, Telecommunications and Information Technology.
A new viewpoint on scientific progress
Scientific progress is based to a large degree on the search for truth, yet the history of science is also marked by significant conceptual revolutions that changed the way scientists view the natural world. We spoke about these issues with Professor Michela Massimi, Principal Investigator on the ERC Consolidator Grant Perspectival Realism. Science, Knowledge, and Truth from a Human Vantage Point
The aim of much scientific endeavour is to search for objective truth, to develop theories that help researchers understand the way the world functions; yet the history of science has also been marked by radical conceptual changes that challenged accepted viewpoints. The Perspectival Realism project aims to develop a novel view in philosophy of science that combines perspectivism and realism. “The question I’m asking is: can we be realists about science while also acknowledging that theories change, and that there have been major scientific revolutions over time?” explains Professor Michela Massimi, the project’s Principal Investigator. The work of Thomas Kuhn is central to addressing this question. His 1962 book, The Structure of Scientific Revolutions, radically challenged the realist’s intuition that scientific progress is a linear accumulation of scientific discoveries that eventually takes us closer to the final truth about nature. Instead, Kuhn regarded science as characterised by periods of what he called ‘normal science’, ‘crises’, and
‘scientific revolutions’. “When there’s an increasing number of anomalies that cannot be explained by the accepted scientific paradigm, a period of crisis eventually leads to a scientific revolution. A prominent example is the passage from Ptolemaic astronomy to Copernican astronomy,” says Professor Massimi.
Beyond Standard Model physics and the Dark Energy Survey
The discovery of new physics Beyond the Standard Model (BSM) – if there is such physics, as some physicists believe – would represent another significant conceptual change, says Professor Massimi. “Scientists have been looking for the existence of BSM particles, some of which may include candidates for what we call dark matter,” she says. “Cosmological evidence suggests that our universe is made of dark energy and dark matter, to a large degree – but we still lack a full understanding of the nature of dark matter and dark energy; nor have we yet found any direct experimental evidence for them.”
Researchers at CERN have been looking for evidence of supersymmetric particles and other possible BSM particles. The search for dark energy also continues in cosmology via large surveys, such as the DES (Dark Energy Survey). “How do scientists devise experiments and models to look for new physical entities, whose exact nature may be different from what their models suggest?” asks Professor Massimi. This question forms an important part of the project’s overarching goal, with Professor Massimi and her team undertaking fieldwork at CERN and DES to understand the modelling challenges facing physicists. “We are trying to understand how scientists are coming up with new methods, assumptions and modelling techniques,” she outlines. A prime motivation behind this project is defending the realist view that scientific truth matters, and that there are good reasons for being realist about science despite the situated nature of scientific knowledge. The perspectival component of the project suggests that we can’t build an omniscient view of the natural world, precisely because we are limited human beings. “We have limited resources and our knowledge is always situated in a given scientific perspective,” says Professor Massimi. Hence, the project aims to combine perspectivism and realism: “My goal is to develop a fully-fledged philosophical view called ‘perspectival realism’ where we can spell out in detail how one can be a realist about science, while also taking on board lessons from the history of science and current scientific practice,” points out Professor Massimi.
Perspectival Realism. Science, Knowledge and Truth from a Human Vantage Point
The project has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (ERC Consolidator Grant H2020-ERC-2014-CoG 647272, Perspectival Realism. Science, Knowledge, and Truth from a Human Vantage Point).
W: http://www.perspectivalrealism.org
W: http://www.perspectivalrealism.org/publications/
Professor Michela Massimi
Principal Investigator
E: Michela.Massimi@ed.ac.uk
W: https://sites.google.com/site/philosophymassimi/
Professor Michela Massimi is Professor of Philosophy of Science at the University of Edinburgh. She is the PI on the ERC Consolidator Grant “Perspectival Realism”. From 2011 to 2016 she was Co-Editor-in-Chief of the British Journal for the Philosophy of Science. She is currently Vice-President of the European Philosophy of Science Association. She has authored numerous articles in the history and philosophy of modern science.