EU Research Winter 2017
INTELLIGENT MACHINES THE RESEARCH OF ROBOTICS & ICT
STEM THE FLOW: spot light on biomedical research and organ health
SPECTRUM OF PHYSICS: resource management, the Higgs boson, ecological development and change, and black holes
Disseminating the latest research under FP7 and Horizon 2020
Follow EU Research on www.twitter.com/EU_RESEARCH
Editor’s Note
As a seasoned editor and journalist, Richard Forsyth has been reporting on numerous aspects of European scientific research for over 10 years. He has written for many titles including ERCIM’s publication, CSP Today, Sustainable Development magazine and eStrategies magazine, and remains a prolific contributor to the UK business press. He also works in public relations for businesses, helping them communicate their services effectively to industry and consumers.
Artificial Intelligence (AI) is one of the most talked-about movements in technological development today, and its impact is predicted to be huge in every industry. AI’s most familiar guise is the sci-fi, human-like robot – a thinking machine. In this respect, the Turing Test was laid down in 1950 as a benchmark for recognising this incarnation of AI: if a person in conversation with a computer could be ‘fooled’ – or rather ‘convinced’ – that they were talking to another human being, it would mark the dawn of the truly intelligent machine.

Now, this notion of a thinking machine is extremely controversial, as clever algorithms do not make a brain, it is argued. You need a human body and senses to be a human, after all – and a computer won’t ever know the true meaning of a kettle unless it makes a cup of tea because it’s thirsty, let alone navigate the semantic confusion that saying something like ‘boil the kettle’ can create. However, since 1950 the Turing Test has endured and been put to the test several times. It was relatively recently, in 2014 at the Royal Society in London, that a Russian chatbot convinced one in three judges, in a series of five-minute text conversations, that it was a human they were chin-wagging with. It passed the test.

Considering that quantum computing is just around the corner, and with the regular advances in AI, it seems certain we are going to witness something of a revolutionary development in this field in our lifetimes. What’s really fascinating, and a little concerning, is the way the world’s top experts are arguing about whether or not we should be worried about AI deciding to threaten us. We’ve seen a spat between Elon Musk and Mark Zuckerberg about the danger of AI, Stephen Hawking very publicly warning us of doom from clever machines – and several reports of AI systems being turned off when they invent their own languages that no one else understands. It’s not unheard of for new technology to create a little panic, but it will be fascinating to witness how AI will change our lives and our fortunes. Who knows? Maybe in the future we won’t need to talk to other people, because our computer will provide all the conversation we need.

Hope you enjoy the issue.
Richard Forsyth Editor
Contents
30 MetNH3
The EU has passed regulations limiting ammonia emissions in Member States. We spoke to Dr Daiana Leuenberger and Bernhard Niederhauser about the MetNH3 project’s goal of improving the metrological infrastructure for measuring ammonia at ambient air amount fractions
4 Research News
EU Research takes a look at the latest news in scientific research, highlighting new discoveries, major breakthroughs and new areas of investigation
10 YEASTCELL Yeast has provided us with food and drink for thousands of years, and new research will enable yeast to provide valuable compounds for a sustainable society. We spoke to Dr John Morrissey and Dr Jan-Maarten Geertman about the lasting impact of EU-funded projects on yeast biotechnology
13 Role of the IL-13 system Interleukin 13 and its alpha 1 receptor may play an important role in the loss of dopaminergic neurons. Work testing this hypothesis holds important implications for our understanding of the underlying causes of Parkinson’s Disease, as Professor Bruno Conti explains
14 CARDYADS Sub-cellular structures called dyads play a crucial role in the contraction and relaxation of the heart. Professor William Louch tells us about the CARDYADS project’s work in investigating the structure of dyads and the consequences of altering their organisation
15 HIVBIOCHIP The HIVBIOCHIP project aims to develop a portable, inexpensive imaging system to count the number of CD4+ T-cells in HIV-infected patients, helping improve the effectiveness of treatment, as Professor Nikos Chronis explains
16 CrUCCial
The CrUCCial project aims to build a deeper understanding of the underlying mechanisms involved in inflammatory bowel diseases (IBD), providing the basis for individualised treatment, as Professor Séverine Vermeire explains
18 VASNICHE The vasculature plays a key role in maintaining organ homeostasis. We spoke to Professor Eli Keshet about the VASNICHE project’s work in investigating how blood vessels affect stem cell properties and maintenance
20 ReDVA Patients suffering from kidney failure depend on peritoneal or haemodialysis to purify their blood. The ReDVA project aims to overcome the scientific and technical barriers associated with the failure of vascular access, as Dr Shona Matthew explains
23 Characterizing and imaging nanoparticles
Magnetic nanoparticles are attracting intense research interest for their potential in treating disease. Professor Antoine Weis, Victor Lebedev and Vladimir Dolgovskiy tell us about their work in developing a novel imaging method to visualise magnetic fields
26 Stem Cell Research Every time stem cell research is in the news, we are offered new hope against conditions that have long been untreatable. Is stem cell research a branch of science that will completely transform our healthcare options? Richard Forsyth investigates
33 SESAM’ALP
Many different factors affect the distribution of plant and insect species in the natural world. The SESAM’ALP project aims to develop improved methods of modelling biological communities, as Professor Antoine Guisan explains
36 OutGroup Professor of Behavioural Ecology, Andy Radford, heads up a five-year project at the University of Bristol, investigating the impact of out-group conflict on social animals
38 Sustainable Caucasus There is a long history of scientific collaboration in the Caucasus region, but geopolitical shifts since the early ‘90s have led to changes in international relations. It’s important to build strong research partnerships and share expertise with practitioners, as Professor Jörg Balsiger explains
40 MOEEBIUS Improving energy efficiency in buildings is a major priority for the EU. The MOEEBIUS framework will provide the basis for more accurate energy performance assessment, underpinning efforts to improve efficiency and opening up new commercial opportunities, as Dawid Krysiński explains
42 Novel Calorimetry The discovery of the Higgs boson at the LHC at CERN has prompted further experiments and searches for new physics beyond the Standard Model. That will require improved detectors, explains Professor Sir Tejinder Virdee
44 SPECTRUM Researchers in the Spectrum project are studying numerous fundamental questions in theoretical physics from the mathematical perspective, as Professor Sasha Sodin explains
46 NanoStreeM The NanoStreeM project aims to help assess the health risks posed by specific nanomaterials, as Dr Dimiter Prodanov explains
48 SeSaMe There is demand for a more natural method of colouring everything from the food we eat to products we use. That’s why a project headed by Dr Silvia Vignolini is focusing on innovation in more natural colouration techniques
50 REProMag Rare earth magnets play a crucial role in many everyday applications, yet Europe does not currently enjoy an independent supply of the materials. We spoke to Professor Carlo Burkhardt about the REProMag project’s work in developing a new processing route
53 QBH The thermodynamic behaviour of black holes can reveal important insights into the microstructure of gravity, as Dr Sameer Murthy of the QBH project explains
56 Artificial Intelligence AI is being billed as a tale of two possible futures. Elon Musk believes AI in the wrong hands could threaten us all, whilst Mark Zuckerberg sees the opportunities for mankind
60 Bots2ReC Bots2ReC aims to develop a robotic system for the automated removal of asbestos without exposing workers to health risks, as Professor Burkhard Corves and Tim Detert explain
62 Hendrik Arent Hamaker Hendrik Arent Hamaker’s role in philology and oriental studies in the early 19th Century has been undervalued, says Hendri Schut
64 ACOSAR
Martin Benedikt talks about ACOSAR’s work in developing an interface for integrating real systems into co-simulation, work which holds important implications for the transport sector in particular
66 Prosperity4ALL By digitizing everything, we are unintentionally excluding more and more people from participation in daily life. Prosperity4All seeks to help reverse this digital exclusion, as Dr Gregg Vanderheiden and Dr Matthias Peissner explain
68 symbIoTe
Professor Ivana Podnar Zarko and Dr Sergios Soursos tell us about the symbIoTe project’s work in developing middleware that will both ease the development process and open up new commercial opportunities for IoT providers
70 TrainMALTA The TrainMALTA project provides training in bioinformatics analysis, enabling researchers to gain new insights into the genetic causes of disease, as Dr Rosienne Farrugia explains
72 VALCRI
We spoke to Professor William Wong, Dr Chris Rooney and Dr Neesha Kodagoda about the VALCRI project’s work in developing an intelligent system to support police analysts and help them work more effectively
76 Behavioural Finance for Retail Banking
The Behavioral Finance for Private Banking project aims to help everyday people make better, more informed financial decisions and make their money work harder, as Professor Thorsten Hens explains
78 LABORHETEROGENEITY
We spoke to Philipp Kircher about his work in investigating unemployment and building a deeper understanding of labour markets
EDITORIAL
Managing Editor Richard Forsyth info@euresearcher.com
Deputy Editor Patrick Truss patrick@euresearcher.com
Deputy Editor Richard Davey rich@euresearcher.com
Science Writer Holly Cave www.hollycave.co.uk
Acquisitions Editor Elizabeth Sparks info@euresearcher.com

PRODUCTION
Production Manager Jenny O’Neill jenny@euresearcher.com
Production Assistant Tim Smith info@euresearcher.com
Art Director Daniel Hall design@euresearcher.com
Design Manager David Patten design@euresearcher.com
Illustrator Martin Carr mary@twocatsintheyard.co.uk

PUBLISHING
Managing Director Edward Taberner etaberner@euresearcher.com
Scientific Director Dr Peter Taberner info@euresearcher.com
Office Manager Janis Beazley info@euresearcher.com
Finance Manager Adrian Hawthorne info@euresearcher.com
Account Manager Jane Tareen jane@euresearcher.com
EU Research Blazon Publishing and Media Ltd 131 Lydney Road, Bristol, BS10 5JR, United Kingdom T: +44 (0)207 193 9820 F: +44 (0)117 9244 022 E: info@euresearcher.com www.euresearcher.com © Blazon Publishing June 2010
Cert no. TT-COC-2200
RESEARCH NEWS
The EU Research team take a look at current events in the scientific news
Future research mega projects announced by European Commission
ICT, energy and health the main focus of the next €1B flagship projects
The European Commission has published early details of funding opportunities for the next three years under its Future and Emerging Technologies research stream. The Commission says it wants to receive game-changing proposals for new candidate megaprojects, or flagships, which will each receive €1 billion. Final work programmes for the 2018-2020 period of Horizon 2020 are expected to be published next month. Proposals must target “a visionary unifying goal” within one of three main areas: ICT; health and life sciences; and energy, environment and climate change. There will also be more money for the ongoing, decade-long flagships. The Human Brain Project, led by the Swiss Federal Institute of Technology in Lausanne, will receive a further €150 million, as will the graphene flagship led by Chalmers University of Technology.
There will also be money for projects to develop new algorithms, software, big data analytics and hardware, created with Mexican and Brazilian partners who are expected to provide their own funding. There are indications in the paper that the Commission will be doing something about the huge demand for FET competitions, which from next year will be run out of the new European Innovation Council. “Oversubscription and underfunding […] is addressed by increased budget, clearer and enforced scoping, and advice on resubmission,” the paper says. The Commission has not yet confirmed whether it will be publishing information for other Horizon 2020 programmes.
Other competitions cover topics such as living technologies, socially interactive technologies, artificial organs and microenergy technologies. Projects with a radical vision in these fields will be rewarded with up to €3 million each.
Brexit warning from one of the UK’s most influential research leaders
Britain risks losing its leading position in drug research and development as the number of researchers applying for grants begins to drop dramatically
Professor Jeremy Farrar, head of the Wellcome Trust, called for “clarity”. He said delays could slow down the development of drugs. Ministers have said they want to maintain a close relationship with the European medicines regulator. Prof Farrar was speaking at a meeting in London organised by the body that regulates medicines in the UK, the MHRA. He stated that large pharmaceutical companies were waiting to see details of the government’s plans for regulating medicines after Brexit. “If that clarity is not forthcoming and people can’t be sure of their long-term stability and the clarity of those future relationships, then inevitably that will be factored into their investment decisions,” Prof Farrar said. In a letter to the Financial Times, Health Secretary Jeremy Hunt and Business Secretary Greg Clark said that their aim was “to ensure that patients in the UK and across the EU continue to be able to access the best and most innovative medicines and be assured that their safety is protected through the strongest regulatory framework and sharing of data”. But more than three months on from that joint statement, Prof Farrar would like to see some action from the ministers. “What is needed is clarity around where the negotiations are going soon, so that people who are planning their investments today for 2021, 2022 and 2025 – which is what boardrooms are currently doing – get it as quickly as is possible.” The UK has led the way in developing a European regulatory system that is regarded by researchers and industry as striking the right balance between ensuring patient safety and encouraging innovation. The NHS and the country’s excellence in medical research have encouraged many pharmaceutical companies to maintain large research and development operations in Britain and conduct clinical trials here. Prof Farrar is concerned that the UK would become isolated if it failed to reach an agreement with the EU over close medicines regulation after Brexit.
New technology to revolutionise archaeology research
Blockchain, the technology that revolutionised financial systems, could do the same for archaeological data
The world’s first “archaeology coin”, launched to fanfare in October, might be part of a coming social science data revolution. Named Kapu, the digital currency is similar to Bitcoin, but specifically designed for archaeology. The technology underlying Kapu and Bitcoin is called blockchain, and it may change data storage and cultural heritage protection. While the public is still unfamiliar with blockchain, there is good reason to believe we may be witnessing the first step in what will become a standard technology over the next decade. It is a complex technology, but the underlying idea is quite simple. A blockchain is a public ledger, unlike a ledger kept by a bank or government institution. Each person who owns a “coin” also maintains a copy of all other assets and transactions, creating a peer-to-peer asset and transaction registry network. This provides transparency and avoids centralised “trust” institutions such as banks. The result is a “distributed” network where all the ledgers and transactions are replicated on delegates’ or users’ computers throughout the network. Data manipulation can be prevented, since a tampered ledger will not be approved by the rest of the network. “The Kapu team is investigating the best ways to store different types of data and will be evaluating ways to use the blockchain to develop a distributed public network to sustain and host heritage content,” said archaeologist Grant Cox.
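The hash-linking that makes such a ledger tamper-evident can be illustrated in a few lines of code. The Python sketch below is a minimal, purely illustrative model of the general principle – not Kapu’s actual implementation – and the example site records are invented:

import hashlib
import json

def block_hash(index, data, prev_hash):
    # Hash the block's contents together with the previous block's hash,
    # so changing any earlier record invalidates every later one.
    payload = json.dumps({"index": index, "data": data, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def add_block(chain, data):
    prev = chain[-1]
    index = prev["index"] + 1
    chain.append({"index": index, "data": data, "prev": prev["hash"],
                  "hash": block_hash(index, data, prev["hash"])})

def is_valid(chain):
    # Every block must store its predecessor's hash and hash correctly itself.
    for earlier, later in zip(chain, chain[1:]):
        if later["prev"] != earlier["hash"]:
            return False
        if later["hash"] != block_hash(later["index"], later["data"], later["prev"]):
            return False
    return True

genesis = {"index": 0, "data": "genesis", "prev": "", "hash": block_hash(0, "genesis", "")}
ledger = [genesis]
add_block(ledger, {"site": "Pompeii", "find": "amphora"})     # hypothetical record
add_block(ledger, {"site": "Pompeii", "find": "coin hoard"})  # hypothetical record
assert is_valid(ledger)

ledger[1]["data"]["find"] = "forged entry"  # tamper with an earlier record...
assert not is_valid(ledger)                 # ...and validation of the chain fails

In a real blockchain, copies of this chain held across the peer-to-peer network are compared, so a single tampered copy is simply rejected by the others.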
Focus on robot safety in the workplace
An alliance between OSHA and a robotics industry trade group puts the emphasis on safety
The Robotic Industries Association (RIA), North America’s leading robotics trade group, has announced the signing of an alliance partnership with the Occupational Safety and Health Administration (OSHA), the Department of Labor agency tasked with ensuring safe and healthy working conditions in the US. The partnership comes as robots begin making their way into new work environments, such as light manufacturing, logistics and fulfilment centres, and agriculture. Of course, no technology can be made foolproof, and OSHA will be taking safety standards in environments where humans and robots work together seriously. So will the engineers designing these robots. “RIA has a long history of helping to keep workers safe around industrial robotics,” says Jeff Burnstein, the organization’s president. “We developed the first American national robot safety standard in 1986 and we’ve kept it up to date since then.” Burnstein adds, “This new alliance with OSHA and NIOSH [the National Institute for Occupational Safety and Health] will help us to continue advancing worker safety as more robots enter the workplace.”
Democrats ask Trump to renew gun research at NIH
Senate Democrats are asking the National Institutes of Health to resume conducting research on gun violence after funding lapsed in early 2017. “In spite of the toll of gun violence on Americans’ health and safety, a dearth of scientific research has hindered efforts to reduce gun-related fatalities and injuries,” they wrote in a letter sent Wednesday to NIH Director Francis Collins. Senators said October’s mass shooting in Las Vegas, which resulted in 59 deaths and more than 500 injuries, was the impetus for their request, though they also noted that 30,000 gun-related deaths occur each year, two-thirds of which are suicides. “With 93 Americans dying per day from gun-related fatalities, it is critical that NIH dedicate a portion of its resources to the public
health consequences of gun violence,” the senators wrote. In their letter, they pointed to the Dickey Amendment, a 1996 rule that prohibits the advocacy or promotion of gun control, saying that they believed it was often misinterpreted as a federal ban on funding gun research conducted by the Centers for Disease Control and Prevention. It’s not clear if the Trump administration plans to renew the NIH funding. Science magazine reported recently that the funding is under consideration but no timeline has been set for making a decision. Gun rights groups oppose the research because they view it as an attempt to enact gun control legislation.
Concern over lack of diversity of Nobel Prize winners
The Nobel Prize committee explains why women win so few prizes
The last Nobel Prize of 2017, in economic sciences, was awarded to Richard Thaler “for contributions to behavioral economics.” All six prizes given to individuals this year went to men; the exception was the Nobel Peace Prize, which was awarded to a coalition of 100-plus NGOs from around the world. The Swedish Academy says that, in aggregate since the prizes were founded in 1901, women have won 49 out of 923 prizes – about one in every 20. Göran Hansson, vice chair of the board of directors of the Nobel Foundation, said: “We are very proud of the laureates who were awarded the prize this year. But we are disappointed, looking at the larger perspective, that more women have not been awarded. Part of it is that we go back in time to identify discoveries. We have to wait until they have been verified and validated before we can award the prize. There was an even larger bias against women then. There were far fewer women scientists if you go back 20 or 30 years.” This week, however, also marks the time that the Nobel Prize comes under fresh scrutiny from critics around the globe. With every year
that passes, science becomes increasingly interdisciplinary and, now more than ever, the greatest advancements tend to be the ones involving international collaborative efforts of thousands of individuals. Why then does the prize reserve its honour for a maximum of three individuals? This trend is damaging to science, and there is evidence to suggest that it reinforces the stereotype that women cannot perform as well as men. Astrophysicist Katie Mack said on Twitter that “we should keep in mind that demographics of the winners reflect and amplify structural biases.” The need for female role models in STEM subjects is critical to achieving some form of equality for generations to come, and having more female Nobel Prize winners would be a powerful way to enact change.

Regular marijuana users have more sex
The researchers found that the women in the study who said they hadn’t used marijuana in the past year reported that they’d had sex an average of six times in the previous month. Women who did report using marijuana in the past year reported having sex more frequently, an average of seven times in the previous month. The findings were similar for men, according to the study, published today (Oct. 27) in the Journal of Sexual Medicine. Men who said they’d used marijuana in the past year reported having sex an average of seven times in the previous month, compared with an average of six times for men who said they hadn’t used marijuana in the past year. The researchers used data from three cycles of a national survey, one given in 2002, one from 2006 to 2010, and one from 2011 to 2015. The participants were men and women aged 15 to 44, but the researchers limited their analysis to respondents aged 25 to 44. Across all cycles of the survey, both men and women who used marijuana reported a higher frequency of sexual activity. The researchers noted in the report that the study isn’t proof that marijuana use causes increased sexual activity; however, “the data imply that regular marijuana use will not impair sexual function or desire.” The research is an “impressive study because of its size,” said Joseph Palamar, an associate professor of public health at New York University’s Langone Medical Center, who studies drug use and sexual behavior and who was not involved in the new study. But Palamar stressed that the findings show a correlation, not cause and effect. In other words, the data cannot show that using marijuana more frequently directly causes more sexual activity.
Bloodhound 1,000mph car completes first public run
The Bloodhound supersonic car has completed its first high-speed test run, reaching 200 miles per hour on a runway at Cornwall Airport Newquay. The car, which is designed to go up to 1,000 miles per hour and will attempt to break the land speed record next year, was driven by Wing Commander Andy Green, a Royal Air Force fighter pilot who set the current record of 763 miles per hour in the Thrust SSC 20 years ago. The successful trial is a measure of validation after nearly a decade of work, and represents a step toward Bloodhound’s planned record-breaking attempt in South Africa’s Hakskeen Pan desert, which is likely to happen next year. The not-for-profit Bloodhound project, based in Bristol, UK, combines aircraft and automotive technology, with the larger aim of interesting schoolchildren in engineering. It is led by 71-year-old Richard Noble, who brought the land speed record back to the UK in 1983 and was also the project director of the team behind the current best of 763mph, Thrust SSC, achieved 20 years ago. “It was very straightforward,” Mr Noble said of the runs. “What I think was important was the demonstration to people of the real power of the car.” Bloodhound uses a Eurofighter Typhoon EJ200 jet engine made by Rolls-Royce. Three Norwegian Nammo rockets will later be added to provide the boost needed to get the car to 1,000mph. Perhaps more immediate, and in many ways as impressive, is Bloodhound’s educational impact: its schools programme, which uses data from the project to inspire children, reached 129,000 pupils last year. Image credit © Flock London
Photograph © Sam Ose / Olai Skjaervoy
Camouflage patterns offer clue to feathered dinosaur’s habitat
Carnivore with a raccoon-like face and plumage was dark on top and light underneath
A dinosaur from China sported a “bandit mask” pattern in the feathers on its face, scientists have said. Researchers came to their conclusion after studying three well-preserved fossil specimens of the extinct creature, called Sinosauropteryx. They were able to discern the dinosaur’s colour patterns, showing that it had a banded tail and “countershading” – where animals are dark on top and lighter on their underside. The study appears in Current Biology. The bandit mask pattern is seen in numerous animals today, from mammals such as raccoons and badgers to birds such as the nuthatch. “This is the first time it’s been seen in a dinosaur and, to my knowledge, in any extinct animal that shows colour bands,” co-author Fiann Smithwick, from Bristol University, told BBC News. Countershading is a common form of camouflage: great white sharks, for example, are dark on top to blend in with murky waters when seen from above, but are lighter on their bellies so that they match the sky when viewed from below. Countershading also makes an animal appear narrower from the side, which can make it seem like a smaller, less attractive meal for predators.
In their study, published this week in Current Biology, co-authors Jakob Vinther and Fiann Smithwick of the University of Bristol present their evidence that Sinosauropteryx was countershaded. Based on preserved pigments found in the fossils, they say that the animal would have had a coat of rusty brown feathers on its back. From the side, the feathers would have shifted starkly from dark to light, with paler plumes running across its chest. The dinosaur looked like “something between a roadrunner and a rock wallaby,” Vinther says. The research is interesting not just because it gives us a good idea of what the dinosaur looked like, but also because it could offer clues as to what the environment looked like at the time. Assuming the dinosaur’s pattern and colouring helped it blend into its surroundings, it’s not a stretch to think that Sinosauropteryx spent its time in areas much like those of modern-day animals that share similar colour patterns, like gazelles. “We knew before that its feathers were vibrantly patterned, but this study shows that it was countershaded and even striped,” Dr Steve Brusatte of the University of Edinburgh told the New York Times. “These findings breathe life into this dinosaur.”

Carlos Moedas says ERC must keep focus on ‘excellence’
Carlos Moedas defends 10 years of success for research council as talk builds over next framework programme
The European Commissioner for Research, Science and Innovation has defended the European Research Council’s “geographically blind” basis for distributing grants, which critics say disadvantages weaker science systems in eastern Europe. Carlos Moedas, speaking in Brussels to mark the ERC’s 10-year anniversary, urged delegates to “remain united” in a “vision of an ERC based on scientific excellence”. There have been longstanding concerns that because it focuses purely on “excellent” research, the ERC distributes the vast majority of its grants to already successful research countries such as the UK and Germany. In 2015, for example, EU countries that have joined since 2004 – mostly in eastern Europe – won just three ERC advanced grants out of nearly 300. This system has been called “Robin Hood in reverse”, with eastern European researchers having “next to no chance” of winning a grant. With the UK – which has been a big winner of ERC grants – leaving the EU and possibly its research framework programmes, the balance could be tipped in favour of a more redistributive approach.
© CERN
Cern calls on industry to collaborate on its datacentre challenges Particle accelerator lab sets out a plan, inviting industry to help it develop next-generation IT to support science Cern, home of the Large Hadron Collider (LHC), has identified 16 IT challenges that it wants to work with the IT industry to overcome. In a whitepaper describing its challenges, Cern Openlab, the facility’s public-private partnership, has categorised the 16 issues into four main areas. In the whitepaper, Cern notes: “A weakness in the architecture of many of today’s datacentres is that they cannot easily accommodate rapid variations of cycles and workloads.” The whitepaper describes how rack disaggregation could help it to allocate the correct amount of computing and storage resources. It identified software-defined infrastructure as one of the technologies new datacentres could be based on. It identified stream processing, cloud resources, machine learning and scale-out databases as among the areas that could be investigated. It has also looked at NVRam (non-volatile memory), which could be used to run in-memory databases and analytics workloads.
More patients with ulcerative colitis, but fewer surgeries
Today, more patients with ulcerative colitis are able to keep their bowel and steer clear of surgery, but the number of people suffering from the disease is 10 times higher than in the 1960s
Carl Eriksson, a doctoral student at Örebro University, has examined the outcomes for people who developed the inflammatory bowel disease ulcerative colitis in Örebro between 1963 and 2010. Over 1,000 patients are part of the study. “The risk of having to undergo surgery is clearly reduced now. I would like to think that this has to do with improved treatments,” says Carl Eriksson. Today, patients receive stronger treatments that reduce inflammation and symptoms. At the same time, his study shows that the number of people developing the disease is significantly higher today than in the 60s. “There are 10 times as many sufferers from ulcerative colitis today as in the 60s. Why there has been such an increase is an interesting question. One reason could be the fact that we do not smoke as much.” Research has shown that smoking protects against ulcerative colitis. Another reason, says Carl Eriksson, may be that healthcare has got better at identifying patients with inflammatory bowel conditions: “More people are diagnosed since the examination methods are better today. But even if we account for that, there has been a significant increase,” says Carl Eriksson.
Planet will need ‘carbon sucking’ technology by 2030s, warn scientists
New methods to capture and store emissions are urgently needed, as existing approaches such as planting more forests and pumping carbon underground are currently too costly
Many new technologies that aim to capture and store carbon emissions, thereby delivering “negative emissions”, are costly, controversial and in the early phase of testing. But “if you’re really concerned about coral reefs, biodiversity [and] food production in very poor regions, we’re going to have to deploy negative emission technology at scale,” said Bill Hare of Climate Analytics, a science and policy institute. “I don’t think we can have confidence that anything else can do this,” the Berlin-based chief executive told a London climate change conference. Machines might be developed to capture carbon dioxide directly from the air and pump it underground or otherwise neutralise it. But efforts to store captured carbon underground are “showing no progress … and even backwards steps in some cases”, said Corinne Le Quéré, director of the Tyndall Centre for Climate Change Research at the University of East Anglia. In 2015, world leaders agreed an aim of holding global warming to 1.5C above pre-industrial levels. Scientists believe this is key to protecting small island nations from sea level rises, shoring up food production and preventing extreme weather.
Astronomers find epic ring around an exotic dwarf planet in our Solar System
A ring system has been found around a dwarf planet for the first time – the distant, potato-shaped Haumea, which lies beyond Neptune. Haumea, first discovered in 2004, is one of five dwarf planets: large objects like Pluto, mostly in the outer solar system, which do not qualify for full planetary status. It is known for its peculiar and rapid rotation, spinning end-over-end once every four hours.
But the discovery of Haumea’s ring, reported recently in the journal Nature, is another surprise for scientists. “This is the first time a ring around a dwarf planet has been discovered, so it is a really very peculiar, unexpected and weird finding,” said the paper’s co-author, Dr Pablo Santos-Sanz from the Instituto de Astrofísica de Andalucía. The ring appears to be dense and dark, blocking out about half of the light that passes through it to Earth. It’s 70 kilometres wide and lies about 1,000 kilometres away from Haumea’s surface. Dwarf planets are unique by themselves, but Haumea is even more special among them. It has two moons, a large and a small one, and the larger one turns out to be in the same plane as the ring which has been discovered.
Image © Instituto de Astrofísica de Andalucía
Unlocking the potential of yeast
Yeast has provided us with food and drink for thousands of years, and new research will enable yeast to provide valuable compounds for a sustainable society. We spoke to Dr John Morrissey and Dr Jan-Maarten Geertman about the lasting impact of EU-funded projects on yeast biotechnology

Yeast has provided
humankind with food and drink for thousands of years; now, new research is going to enable yeast to provide valuable compounds for a sustainable society. Yeasts are widely used in biotechnology to produce a range of valuable products, from biofuels to flavours and from insulin to anti-aging compounds. There is huge potential to expand this research in Europe and open up opportunities to use yeast to make more products quickly, efficiently and sustainably. Imagine taking a sip of your favourite wine or beer, or a scoop of your favourite ice cream. Maybe you have never given much thought to why one Sauvignon Blanc tastes different to another, what separates ale from lager, or where the flavours in your ice cream originate. However, for people working in the food and beverage industries, getting the flavours just right for the consumer is critical. Products must remain consistent in quality and price in spite of weather events, climate change, seasonal availability of ingredients and many other variables. Consumers also want innovation without compromising on sustainability. Recent research, funded by the European Union (EU), is focussing on yeast biotechnology to solve some of these problems in an economically and environmentally sustainable way. This was a major goal of the YEASTCELL training network. “The research focus of YEASTCELL was two-fold. On one side there were beverage yeasts, and on the other there were yeasts for industrial biotechnology,” explains Dr John Morrissey, who coordinated the project from University College Cork. “The emphasis was on understanding the kinds of yeasts that are used for fermented beverages, namely beer, wine and cider, and also on using modern technologies to create new yeast strains for a range of industrial applications. The research outcomes from YEASTCELL have been embraced by industry and inspired two new projects funded under the EU’s
Horizon 2020 programme, YEASTDOC and CHASSY, that will lead to further innovative solutions to pressing challenges facing European companies.”
Better Beverages: New Yeasts for Beer, Wine and Cider
Brewers are particularly interested in using knowledge of yeast genetics and metabolism to diversify their product ranges and offer beers with new flavours that satisfy consumer desires. The yeast used to make modern lager (Saccharomyces pastorianus) is a hybrid – the result of a natural crossing of the standard yeast used for brewing ale and baking bread, Saccharomyces cerevisiae, and a ‘wild’ yeast, Saccharomyces eubayanus. Although the two yeasts originally crossed in the 15th century, the wild parent was not identified until 2011, when S. eubayanus was found in Patagonia, Argentina. The discovery of this species has created new opportunities for the beverage industry. “It opened up the possibility of crossing these two species to make new hybrids with different traits,” outlines Dr Morrissey. Several academic partners in the project did exactly this and crossed ale yeasts with S. eubayanus to create new
yeasts for producing fermented beverages. One of the partners in the project, HEINEKEN, is very interested in the potential of these new yeasts. “We aim to better understand these hybrid yeasts. Different partners have been working to identify the traits in yeast that affect the characteristic flavours of beer,” says Dr Jan-Maarten Geertman, Manager of Product and Process Research at HEINEKEN. PhD researchers from the project worked with HEINEKEN to make beer and cider using their new yeast strains. These drinks had unique flavours and were evaluated very positively in a large-scale sensory trial. The new strains could be commercialised, and they serve as proof of principle for further strain development. Companies that produce fermented beverages are aware of the need for sustainability, especially relating to the efficiency of their processes. Indeed, several projects in YEASTCELL explored ways to encourage yeast to perform more efficiently during fermentation of either wine or beer.
Dr Morrissey in University College Cork, Ireland.
Ângela Carvalho was one of 11 Early Stage Researchers who received PhD training in the YEASTCELL network. We spoke to her about her research and the benefits of working with both academic and commercial partners.
EU Research: Why was your research important?
Dr Ângela Carvalho: In recent years, the use of medicinal Cannabis has increased globally, and legislation in many countries is becoming more liberal. However, there are still problems with legalising the cultivation of Cannabis in most countries, and attempts to make synthetic versions using chemistry have not proved an effective alternative because of high production costs and low yields. We wanted to modify yeast cells to create a cost-effective, environmentally friendly, reliable supply of high-quality medicinal cannabinoids.

EUR: How did you do this?
Dr Carvalho: We studied the literature available on the Cannabis plant to identify the genes that are involved in producing different cannabinoid compounds. We then introduced these genes into yeast. It’s not that simple, though: to produce the compounds we wanted, we also had to change the metabolism of the yeast. When you modify yeast to produce more of a certain compound, you have to make other changes to support this production.

EUR: What were the main technical challenges?
Dr Carvalho: We were trying to create a metabolic pathway, the stages that a cell goes through to produce a compound, and there was one part that we couldn’t figure out. Fortunately, one of the other partners in the YEASTCELL project knew how to do it, so I spent some time with them learning the technique.

EUR: You worked with academic and industrial partners – did they have different approaches to research?
Dr Carvalho: Yes, the perspective is different. In academia, you are often just focussed on a small part of the work. You are just trying to figure out a specific mechanism, for example. In industry, you might work on a product, so you can see the whole process and the wider benefit. For me, it was very important to be able to see the bigger picture.

EUR: What are your plans for the future?
Dr Carvalho: I want to carry on working in synthetic biology in the future. I want to continue to use organisms to produce important products, or to discover new compounds.
Wine fermentation is actually a very stressful process for yeast, which has to deal with low availability of some nutrients at the same time as strong osmotic pressure from the high concentration of sugar. A collaboration between INRA (France) and the wine yeast company Lallemand used the power of evolution to select yeast strains that were more efficient at wine fermentation under these conditions. “Our partners, Lallemand, carried out some very interesting research using a method called laboratory evolution to select new strains of yeast,” Dr Morrissey explains. This process makes use of the natural variation that occurs as yeast cells grow and divide. Researchers apply certain pressures and then identify variant yeasts that are able to withstand the stresses imposed by wine fermentation. These strains are then further analysed to identify the ones most suitable for large-scale wine production. According to Dr Morrissey, this work can have even broader applications. “These approaches can help the wine industry select yeasts to cope with some of the negative effects of climate change on wine quality,” he says. The problem is that rising temperatures are causing grapes in wine-producing areas to have increased levels of sugar. This, in turn, raises the alcohol content of the wine to unacceptable
levels. The challenge is to reduce the amount of alcohol in the wine without having a negative effect on the flavour profile. A follow-on project, YEASTDOC, will further address this challenge.
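The selection principle behind laboratory evolution – grow, stress, keep the survivors, let variation accumulate – can be sketched as a toy simulation. The Python model below is a didactic illustration only, not the INRA/Lallemand protocol; the “tolerance” scores and all numbers are invented:

import random

def evolve(population, generations, stress):
    # Toy model of adaptive laboratory evolution: each "cell" is reduced to a
    # single tolerance score; growth under stress favours higher tolerance,
    # and division introduces small random variation (mutation).
    for _ in range(generations):
        # Cells whose tolerance exceeds the imposed stress survive and divide.
        survivors = [c for c in population if c > stress + random.gauss(0, 0.05)]
        if not survivors:                                # culture wiped out; stop early
            return survivors
        offspring = [c + random.gauss(0, 0.02) for c in survivors]
        population = (survivors + offspring)[:1000]      # fixed culture size
    return population

random.seed(1)
culture = [random.gauss(0.5, 0.1) for _ in range(1000)]  # initial strain diversity
before = sum(culture) / len(culture)
adapted = evolve(culture, generations=50, stress=0.55)
print(f"mean tolerance: {before:.2f} before, {sum(adapted)/len(adapted):.2f} after")

Run repeatedly, the mean tolerance of the simulated culture drifts upward, mirroring how repeated stressed fermentations enrich a real population for hardier variants.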
This new science is adapting traditional yeast fermentation.
A Fermented Future for Sustainable and Natural Ingredients
Developing a sustainable society means finding new ways of making common consumer goods. Petrochemicals are the source of a large array of products – for example, flavours, aromas, cosmetics and plastics. Plant-based or ‘natural’ ingredients are often marketed as safe alternatives to these ‘man-made’ chemicals
in cleaning products, foods, healthcare products, cosmetics, insecticides, and many other everyday items. However, extracting active ingredients from plants is not necessarily a greener or safer option – at a large scale, extraction can be expensive, unreliable, environmentally damaging, and even impossible. The second major focus of the YEASTCELL research project was to investigate the potential of yeast as a viable alternative to plant extraction or petrochemicals in biotechnology. There was special interest in the sustainable production of wax esters and fatty acids, which are commonly used in a range of cosmetics, personal care products, and other commercial applications. Currently, many of these ingredients are either synthesised from petrochemicals, or are extremely expensive or difficult to source. Creating alternative industrial processes is a priority for the EU, and Dr Morrissey and his colleagues are addressing this. “We are looking at yeast to produce many high-value molecules. This is not only cheaper than using petrochemicals; there are also benefits in terms of sustainability,” he says. For example, you may be familiar with cosmetics and personal care products
At a glance
Projects
YEASTCELL was a training network of partners from 8 European countries. 11 researchers worked towards their PhDs in topics that ranged from basic research into yeast cell biology to manipulating yeasts for industrial applications in the nutrition, beverage, and chemical sectors. YEASTCELL received funding from the People Programme (Marie Curie Actions) of the European Union’s Seventh Framework Programme (FP7/2007-2013) under REA grant agreement n° 606795.
YEASTDOC is a European Joint Doctorate network that will train 12 PhD candidates to apply modern genetics to enhance the performance of yeasts for industrial fermentation. They will also develop methodologies to improve yeast strains for novel applications in the fermented beverage industry. YEASTDOC has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 764927.
CHASSY is an academia-industry research partnership focussed on designing chassis yeast strains that can be used as platforms for the production of a range of high-value compounds. CHASSY has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 720824.
Contact Details
Project Coordinator, Dr John Morrissey
School of Microbiology
University College Cork
Cork, T12YN60, Ireland
T: +353 21 490 2392
E: j.morrissey@ucc.ie
W: www.yeastcell.eu
W: www.yeastdoc.eu @YeastdocEU
W: www.chassy.eu @ChassyProject
Dr John Morrissey
Dr Jan-Maarten Geertman
containing jojoba. The wax esters that are extracted from the Simmondsia chinensis plant and marketed as jojoba are used in lubricants and coatings, as well as in creams and conditioners. However, the cost of extracting the wax esters from the plant, and the limited number of areas in which it is grown, constrain production of jojoba extracts. Researchers in YEASTCELL succeeded in using yeast to biosynthesise jojoba-like wax esters, as well as other molecules that can replace petrochemicals in consumer products. The challenge for the research teams now is to move from proof-of-principle and prototype yeasts to industrial yeast strains that are as efficient at producing valuable products as brewer’s yeast is at producing beer.
Zygosaccharomyces bailii. “The reason that we work on those yeasts is that they have properties, like thermo-tolerance and acid-tolerance, that are very useful for industrial production. By contrast, brewer’s yeast does not have these properties,” explains Dr Morrissey. Researchers are building a deeper understanding of the genetic basis of these properties, and are investigating whether they could be introduced into S. cerevisiae. CHASSY is a collaborative academia-industry project that aims to support the European biotechnology sector. This project aims to move research along the pipeline from the lab and closer to commercial production. “CHASSY will focus on using yeast to produce aromatic molecules and
The focus was on understanding the kinds of yeasts that are used for fermented beverages, namely beer, wine and cider, and also on using modern technologies to create new yeast strains that might have interesting new applications.

Researchers built these prototype strains by reprogramming yeast, which involves finding relevant genes from different plants, yeasts and bacteria and assembling them in a particular order in a yeast strain. “We then have a new yeast, which is capable of producing a molecule that it previously wasn’t able to produce,” outlines Dr Morrissey. This process is likened to traditional fermentation – where yeast uses sugars to produce ethanol and flavours in a beverage like wine or beer – only with a strain that has been programmed to produce the target compounds. The industrial side of the research also has a focus on alternative yeasts such as Kluyveromyces marxianus and
oils,” says Dr Morrissey. “We are interested in taking some of the yeasts we already work with, for example, S. cerevisiae, K. marxianus and Yarrowia lipolytica, and engineering them for industrial production.” This will not be straightforward. “There is a very big gap between a proof-of-concept strain that is able to produce a small amount of an interesting compound, and a strain that can do this in an industrial setting and produce enough to be commercially viable,” stresses Dr Morrissey. CHASSY aims to address this issue by optimising yeast strains and achieving commercial viability for the production of a wide range of molecules that you might find in your hand cream, your vitamin pills, your medicines, and even in your food.
Dr John Morrissey is a Senior Lecturer in microbiology at University College Cork. He has led his research group of postgraduate and postdoctoral scientists working in microbial ecology, microbe-host interactions and yeast biotechnology since 2003. Jan-Maarten Geertman is Senior Manager Product & Process Research in HEINEKEN’s Global Innovation & Research department. Current initiatives include optimizing the sustainability of yeast and associated fermentation processes, as well as the development of novel products.
Noemi Montini and Valentina Sforza in the Yeast Research Lab in University College Cork.
Probing Parkinson’s disease
Evidence suggests that interleukin 13 and its alpha 1 receptor may play an important role in the loss of dopaminergic neurons in a region of the brain called the substantia nigra pars compacta (SNc). Scientists aim to test this hypothesis, work which holds important implications for our understanding of the underlying causes of Parkinson’s disease, as Professor Bruno Conti explains

Parkinson’s disease (PD) is characterised by the loss of dopaminergic neurons in the SNc, yet the underlying causes of the condition are not fully understood. Based at the Scripps Research Institute, Professor Bruno Conti aims to develop a deeper understanding of the pathophysiology of the disease, building on earlier analysis of human health data. “My laboratory investigates how inflammatory processes can contribute to the neuronal loss occurring in PD. We focused our attention on a human chromosomal region linked to PD, which included the gene encoding for a protein known to regulate inflammation: the interleukin 13 receptor alpha 1 (IL13Rα1),” he says. This provided a basis for further investigation using mouse models. “When we checked if this receptor was present in the brain, we found that it was expressed in the neurons of the SNc, the same neurons that are lost in PD,” continues Professor Conti. “Next, we tested in mice whether lack of IL-13Rα1 rendered them more or less susceptible to experimentally induced PD. We found that the mice that did not have IL-13Rα1 were protected.”
Dopaminergic neurons: IL-13 or IL-4 activate the IL-13Rα1 on dopaminergic neurons, increasing the susceptibility of the cells to damage that can be caused by reactive oxygen species (ROS).

This suggests that IL-13Rα1 is an important factor in the death of dopaminergic neurons in the SNc, a hypothesis that Professor Conti and his colleagues now plan to investigate more fully. There are several strands to this work, with the first centering on investigating how the activation of IL-13Rα1 can be detrimental. “There are two endogenous proteins that can activate IL-13Rα1: IL-13 and IL-4. While neither IL-13 nor IL-4 alone is toxic, they greatly potentiate the toxicity of special types of molecules known as reactive oxygen species, which cause oxidative damage. Both these factors need to be considered in terms of understanding the underlying mechanisms behind the death of dopaminergic neurons,” says Professor Conti. “We are trying to figure out how this happens. If we find out which molecules mediate the potentiating effects that IL-13 and IL-4 have on oxidative stress, then each of those molecules could become drug targets for PD.” Another aim relates to understanding when IL-13 and IL-4 are produced. “We are studying which cells produce IL-13 in the brain, and under which circumstances,” outlines Professor Conti. Researchers have found that a type of glial cell called microglia, as well as some neurons, can make IL-13; Professor Conti and his colleagues now aim to probe deeper. “When do these cells decide to make IL-13? We know that allergies elevate the production of IL-13 and IL-4 in the periphery. Does this also occur in the brain? Is it possible that an allergy can be a risk factor for PD – and if so, can limiting allergic reactions reduce the risk of developing PD? We are now investigating whether inducing allergies in the mouse, in quite a severe way, can lead to the production of IL-13 in the
central nervous system,” he says. “We are also investigating a model of chronic stress, as we know that stress increases the production of free radicals in the brain, and we want to check if it also leads to the production of IL-13. We’ve found that under some circumstances it does.” This research could help scientists build a deeper understanding of the underlying factors behind the development of PD, and in the long term lead to improved treatment. The majority of current cases are considered idiopathic and sporadic, in that medical professionals are unclear why an individual has the disease, underlining the wider importance of this work. “Some genes have been demonstrated to cause PD in familial cases, but they account for only a small proportion of the overall total,” says Professor Conti. The molecules identified during the project could also hold potential as drug targets, something Professor Conti plans to investigate further in future. “Once we have identified the target we will team up with chemists to develop small molecules that will hopefully block this pathway, opening the way to the development of new drugs,” he outlines.

Role of the IL-13 system in dopaminergic cell death
We investigate the mechanisms by which inflammation can contribute to neurodegeneration. At present, we study how activation of the interleukin-13 receptor alpha 1 can cause the death of dopaminergic neurons. Our goal is to identify novel drug targets and to devise a strategy to prevent or treat Parkinson’s disease. Funded by the National Institutes of Health and by the Michael J. Fox Foundation.
Professor Bruno Conti, Ph.D.
Molecular Medicine
10550 N. Torrey Pines Rd
La Jolla, California 92037
Mail SR 307
T: +1 858 784 9069
E: bconti@scripps.edu
W: http://www.scripps.edu/conti/
Professor Bruno Conti, Ph.D. Born and raised in Italy, he moved to the US in 1992. He trained in immunology at New York Medical College and in molecular neuroscience at Cornell University. In 2000, he joined the Scripps Research Institute, where he is a Professor in the Department of Molecular Medicine.
Getting to the heart of sub-cellular structure
Sub-cellular structures called dyads play a crucial role in the contraction and relaxation of the heart, yet much remains to be learned about their structure and function. Professor William Louch tells us about the CARDYADS project’s work in investigating the structure of dyads and the consequences of altering their organisation, research which could hold important therapeutic implications

The cardiac muscle cells in the heart contract when calcium rises, a process triggered at dyads, tiny junctions between two membranes. Dyads are therefore very important in regulating the contraction of these cells, known as cardiac myocytes, and of the heart, an area that forms the primary research focus of the CARDYADS project. “The project aims to understand how these dyad structures are put together in the first place, which we know very little about. What keeps them together? What keeps them working? How does this change during disease?” says Professor William Louch, the project’s Principal Investigator. These structures are known to break down during disease, which disrupts the control of calcium release and the contraction of both cardiac myocytes and the heart. Professor Louch and his colleagues are investigating the structure of dyads in development, adulthood, and cases of heart failure to build a deeper understanding of the effects of structural changes. “We have access to a biobank here, and we have samples from healthy patients and transplant patients. We also use animal models,” he outlines. The dyads themselves are extremely small. “It’s formed of two membranes, and they’re about 12-15 nanometres apart from each other,” explains Professor Louch. A t-tubule extends from the surface of the cardiac myocyte into the interior, close to the membrane of the sarcoplasmic reticulum. Together, these two membranes form the dyad; Professor Louch says there are clear differences in dyad structure between the healthy and disease state. “The t-tubules are disorganised in disease, they move further away. Also, the proteins present on the sarcoplasmic reticulum are disrupted,” he says. “There’s a very important protein called the ryanodine receptor, which is a calcium release channel. These proteins are typically close together, as a group, which helps them function. But in cases of heart failure, those proteins are spread out, they’re dispersed.” Researchers are also investigating whether there are any early indicators of these changes in the dyads, work which could hold important therapeutic implications. Maintaining the structure of the dyads could be an effective way of maintaining a healthy heart. “We’re learning that there are molecular anchors that hold the two membranes of the dyad together. They anchor the t-tubules to the sarcoplasmic reticulum, so they hold the two membranes together,” explains Professor Louch. “These anchors are lost during heart failure, which we think is probably a major part of the reason why the dyads are disrupted. One possible therapy could be to give patients more of these anchors.”

The project aims to understand how these dyad structures are put together in the first place, which we know very little about. What keeps them together? What keeps them working? How does this change during disease?

This is more of a long-term goal however, and at the moment researchers are still characterising how the dyads are put together during development, and how they break apart during disease. New techniques like super-resolution microscopy are being applied to investigate how dyads are arranged on the nanometre scale. “We’re looking at how they’re put together, we’re learning about what keeps them together, and we’re testing out a few of the novel, important genes, mostly at the cellular level so far,” says Professor Louch.

T-tubules (coloured green) from a ventricular cardiomyocyte.

Controlling Cardiomyocyte Dyadic Structure (CARDYADS)
Funded by: The Norwegian Research Council; The South-Eastern Norway Regional Health Authority (Helse Sør-Øst); The National Association for Public Health (Nasjonalforeningen for folkehelsen)
Professor William E. Louch
Kirkeveien 177, 4th floor, Building 7
0407 Oslo
T: +47 23 01 68 00
E: w.e.louch@medisin.uio.no
W: www.iemr.no
William E. Louch received his PhD in Pharmacology in 2001 from Dalhousie University in Halifax, Canada, and is currently Professor of Medicine at Oslo University Hospital / University of Oslo in Norway. His research examines structure and function of normal and diseased cardiac myocytes, with particular focus on calcium homeostasis.
Portable test to monitor HIV
Reliably assessing the status of the immune system in HIV-infected patients is essential to effective treatment of the virus. The HIVBIOCHIP project aims to develop a portable, inexpensive imaging system to count the number of CD4+ T-cells in HIV-infected patients, helping improve the effectiveness of treatment, as Professor Nikos Chronis explains
The treatment of HIV has advanced significantly over recent years, with highly active antiretroviral therapies (HAART) now available that promise to improve quality of life for infected patients. These treatments should be regularly adjusted to reflect the patient’s condition and help them maintain a healthy immune system, as Professor Nikos Chronis explains. “A patient should be tested every few weeks or so, to make sure that they take the correct dose and the right cocktail of drugs,” he outlines. This can be difficult in remote areas of the developing world, where HIV infection rates are often high, an issue that Professor Chronis and his colleagues in the HIVBIOCHIP project are working to address. “The idea in the project is to develop a portable, inexpensive device to perform HIV monitoring tests out in the field, where often there is no available infrastructure,” he says.
CD4+ T-cells
This device is designed to count the number of CD4+ T-cells in a blood sample, which is an important indicator of the health of the immune system. Researchers are developing both a biochip and an imaging system, which can image and count cells in 1 microliter of whole blood within minutes. “First you have to prick your finger with a needle (like diabetic patients do) and then touch the chip, which draws the blood, and the blood cells are then immobilized on the surface of the biochip,” explains Professor Chronis. While previously it was necessary to prepare and treat the blood sample before the CD4+ T-cells could be imaged, Professor Chronis says that these further steps aren’t required with the project’s biochip. “The imaging scanner takes images using white light rather than fluorescence, and at the end it reports the concentration of the CD4+ T-cells to either the doctor or the patient, most likely the doctor or someone with medical training,” he outlines. The device itself is portable, self-contained and battery-powered, so well-suited for use in remote locations in the developing world.
Microfluidics combined with advanced micro-optics technology can enable monitoring of HIV-infected patients in resource-limited settings at the point-of-care.
In some areas of the developing world infection rates can be as high as 20 percent, yet with effective monitoring of the immune system and the right treatment, patients can control the condition. “By regularly monitoring the progression of the disease and taking the right drugs, HIV can be controlled, it basically becomes like a chronic disease,” says Professor Chronis. The project’s work holds clear promise in these terms, and Professor Chronis says there are plans to bring their research to the commercial marketplace. “The project is not only about developing a prototype and testing it with healthy volunteers, but also about building a commercialisation plan,” he says. A positive application of the system in the developed world would help to attract investment, so researchers are looking to use the scanner to count other types of cells, aside from CD4+ T-cells. One example would be looking at how cancer treatment affects the immune system by counting the number of white blood cells, and there are many other possibilities. “This device is a platform, it’s not limited to CD4+ T-cells, it could be used for any type of cell,” says Professor Chronis. There are two simple modifications in adapting the platform to count other types of blood cells. “If you want to trap a different type of cell, you have to functionalise the biochip surface with a different type of antibody that will recognise the specific type of cell that you’re looking for,” explains Professor Chronis. The second modification is in adapting the imaging recognition algorithm that identifies the cells. “If you’re looking for another type of cell, then you might have to train the algorithm to identify the type of cell that you’re looking for,” says Professor Chronis.
A point-of-care Biochip for HIV monitoring in the developing world (HIVBIOCHIP)
Professor Nikos Chronis
National Center for Scientific Research “Demokritos”
End of Patriarchou Grigoriou E and 27 Neapoleos Street
15341 Agia Paraskevi, Greece
T: +30 2106 503 000
E: chronis@umich.edu
W: http://erc.europa.eu/
Professor Nikos Chronis received his B.E. and Ph.D. degrees from Aristotle University (Greece) and the University of California at Berkeley in 1998 and 2004 respectively, both in mechanical engineering. In 2004, he worked as a postdoctoral researcher in Cori Bargmann’s lab at Rockefeller University. In 2006, he joined the faculty of Mechanical Engineering at the University of Michigan. His research interests include implantable MEMS sensors, point-of-care microfluidics and in-vivo imaging of neuronal circuits.
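For readers curious about the image-analysis side, the project’s own recognition algorithm is not reproduced here, but the general approach Professor Chronis describes – identify cell-shaped objects in a white-light image and count them – can be illustrated with a minimal Python sketch. Everything below (the synthetic image, the 0.3 intensity threshold, the size gate) is a hypothetical stand-in, not the HIVBIOCHIP code.

# Hedged sketch of threshold-plus-connected-components cell counting.
# All values and the synthetic image are illustrative assumptions.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(7)
img = rng.normal(0.1, 0.02, (256, 256))          # stand-in for a white-light image
for _ in range(40):                               # paint 40 fake immobilised cells
    y, x = rng.integers(10, 246, 2)
    yy, xx = np.ogrid[-5:6, -5:6]
    img[y-5:y+6, x-5:x+6] += 0.5 * (yy**2 + xx**2 <= 25)

mask = img > 0.3                                  # arbitrary intensity threshold
labels, n = ndimage.label(mask)                   # connected-component analysis
sizes = ndimage.sum(mask, labels, range(1, n + 1))
cells = np.count_nonzero(sizes > 20)              # reject specks below a size gate
print(f"{cells} candidate cells detected in the field of view")

Retraining for a different cell type, as described above, would then amount to changing the detection criteria (or, in a learned system, the training examples) rather than the counting machinery itself.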
New light on inflammatory bowel disease
A number of different factors are involved in the development of inflammatory bowel diseases (IBD), yet current treatments do not always effectively target the specific nature of the condition. The CrUCCial project aims to build a deeper understanding of the underlying mechanisms involved in IBD, providing the basis for individualised treatment, as Professor Séverine Vermeire explains
There is currently no cure for Crohn’s disease or ulcerative colitis, two closely related conditions which come under the wider umbrella of inflammatory bowel diseases, so treatment is focused primarily on managing the diseases. A number of treatments are available to help reduce the impact of IBD, including corticosteroids and immunosuppressive drugs. “They are anti-inflammatories, which reduce inflammation. Newer treatments are also available, where you block certain pro-inflammatory cytokines,” explains Professor Séverine Vermeire, the Principal Investigator of the CrUCCial project. These treatments are not always well-suited to the specific needs of the individual patient however, as IBD is highly complex and heterogeneous in nature. “IBD is difficult to treat. Powerful drugs are available, but if you apply these drugs to 100 patients for example, then this will lead to clinical remission and complete healing of the bowel in only 30 percent of patients roughly,” says Professor Vermeire. “Furthermore, we don’t know - and still cannot predict - which patients will react to which drug.”
The work of the CrUCCial project, an EC-funded initiative based at Katholieke Universiteit Leuven (KU Leuven), holds clear importance in these terms. Professor Vermeire and her colleagues in the project aim to shed new light on the underlying factors behind the disease, which could eventually lead to improved, more precisely targeted treatment. “We would like to look at what is really triggering the disease in a particular patient. What goes wrong in that patient, in contrast to another? Can we treat the disease in a more functional way?” she says.
Inflammatory bowel diseases
The foundation of this work is a deeper understanding of the underlying causes behind IBD, which are rising in prevalence in many parts of the world, with around 0.3-0.4 percent of the population thought to be affected. While genetic background is an important factor in susceptibility to IBD, this cannot explain the rapid recent increase in the prevalence of these diseases, so Professor Vermeire says other factors must be considered. “The fact that we have not seen rising prevalence in sub-Saharan Africa or very rural areas of Latin America has led many people to think that this must have something to do with changes in lifestyle and diet,” she outlines. These are multi-factorial diseases, and the underlying causes behind each individual case may be different. “We want to first identify what has gone wrong with a patient, and then reflect that in treatment,” says Professor Vermeire. “So maybe you will want to block the inflammation, but if we can identify primary barrier defects in a patient, or disturbance in their recognition and defense against bacteria, there might also be other therapeutic considerations.” Researchers are analysing and integrating data from over 4,500 patients, including data from blood samples, stool samples and biopsies, to gain a deeper understanding of the key cellular pathways involved in IBD. From these foundations, an index can then be developed reflecting the proportional contribution of each pathogenic mechanism in each patient, providing the basis for more tailored, individualised treatment; Professor Vermeire says this is complex work. “The difficulty is in analysing all of these different pathways together,” she explains. This work involves integrating data from different ‘omics studies, including genomics, proteomics, transcriptomics and metagenomics, each of which generates huge volumes of data. “We need to overlay all these ‘omics layers together, and this is the difficulty. So we have set up a kind of intermediate ‘omics layer,” says Professor Vermeire. “We’ve taken research that people have been working on over the last 15 years, and we’re trying to integrate all the knowledge that has been gathered and to bring it all together.”
A number of important cellular pathways involved in IBD have been identified over the years, including bacterial recognition, autophagy, endoplasmic reticulum (ER) stress and intestinal barrier function. The project aims to build on this data and develop an index that reflects the proportional importance of each pathway in each patient. “We aim to rank patients, for example in terms of intestinal barrier function. If a patient is in the first quartile, that means they have a comparatively poor barrier function and may need treatment to strengthen it,” outlines Professor Vermeire. The project’s research could also hold relevance in terms of disease prevention. “If we identify a particular defect in a given individual that we think is a factor in disease, then maybe it would be worth testing their siblings to see if they also have that defect,” she continues. “That could be a first step towards prevention of the condition.”
Clinical outcomes
These diseases can also vary widely in severity, from relatively mild cases through to more serious situations, where a patient may require surgery. The CrUCCial index gives greater detail on the specific nature of each individual patient’s condition; researchers now aim to correlate this with data on clinical outcomes. “Now that we have data on clinical outcomes, we’re looking to see if we can draw links. Are the people that have problems in the composition of their microbiota the people that have more severe disease? Are these the people that have more disease in the small bowel?” says Professor Vermeire. This work could provide a more effective framework for the management of other multi-factorial diseases, aside from IBD. “There are many complex, multi-factorial diseases, like asthma, hypertension and rheumatoid arthritis, where we really think these diseases could benefit from a better therapeutic approach and a better characterisation of patients,” says Professor Vermeire. “I think that an integrated treatment approach, in a personalised way, is really the way forward for many of these diseases.” There is still scope for further improvement in the CrUCCial index however, which remains the project’s more immediate focus, particularly in terms of bringing the available information together. While some researchers have focused purely on the genetic background, and others on the intestinal barrier function, Professor Vermeire says it’s also important to consider how these different elements interact. “We need to look at how the genes interact with the microbiota and the epithelium,” she outlines.
Ulcerative colitis (UC)
Crohn’s disease (CD)
At a glance
Full Project Title
Novel diagnostic and therapeutic approach to inflammatory bowel diseases based on functional characterization of patients: the CrUCCial index (CrUCCial)
Project Objectives
We hypothesize that a functional characterization of patients for the major identified pathways implicated in disease pathogenesis will improve management of patients in many aspects. This type of screening (both (epi)genetic, transcriptional, functional and on microbiome level) could not only enable predictions of an individual’s disease course or capacity to metabolize therapeutic agents, it could ideally lead to a highly optimized personal regime to manage IBD, maximizing treatment response while avoiding unwanted adverse effects.
Project Funding
H2020-EU.1.1. - EXCELLENT SCIENCE - European Research Council (ERC) Advanced Grant
Contact Details
KATHOLIEKE UNIVERSITEIT
Oude Markt 13
3000 LEUVEN
Belgium
T: +32 16 37 74 71
E: severine.vermeire@kuleuven.be
W: http://cordis.europa.eu/project/rcn/204842_en.html
Machiels K, Joossens M, Sabino J, De Preter V, Arijs I, Eeckhaut V, Ballet V, Claes K, Van Immerseel F, Verbeke K, Ferrante M, Verhaegen J, Rutgeerts P, Vermeire S. A decrease of the butyrate-producing species Roseburia hominis and Faecalibacterium prausnitzii defines dysbiosis in patients with ulcerative colitis. Gut. 2014;63:1275-83.
Jostins L, Ripke S, Weersma RK, …, Vermeire S, Barrett JC, Cho JH. Host-microbe interactions have shaped the genetic architecture of inflammatory bowel disease. Nature. 2012;491:119-24.
Professor Séverine Vermeire
Séverine Vermeire is a Professor in the Faculty of Medicine at the Catholic University of Leuven, where she also holds the position of Chair of the Department of Chronic Diseases, Metabolism & Ageing (CHROMETA). She gained her PhD on Genetic Polymorphisms and Serologic Markers in Inflammatory Bowel Disease in 2001.
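By way of illustration, the quartile ranking Professor Vermeire describes can be sketched in a few lines of Python. The pathway names follow the article, but the patient scores are randomly generated stand-ins; this is emphatically not the CrUCCial pipeline, which integrates multiple ‘omics layers rather than a single score per pathway.

# Hedged sketch: rank patients into quartiles per pathway, hypothetical data.
import numpy as np

pathways = ["barrier_function", "bacterial_recognition", "autophagy", "ER_stress"]
rng = np.random.default_rng(42)
scores = rng.normal(size=(100, len(pathways)))   # one invented readout per pathway

def quartile_ranks(column):
    """Quartile 1 = lowest scores; assuming lower = more impaired,
    this flags patients who may need that pathway treated."""
    q = np.quantile(column, [0.25, 0.5, 0.75])
    return np.digitize(column, q) + 1            # values 1..4

index = np.column_stack([quartile_ranks(scores[:, j]) for j in range(len(pathways))])
profile = {p: int(r) for p, r in zip(pathways, index[0])}
print(f"patient 0 quartile profile: {profile}")

A patient whose profile reads quartile 1 for barrier function but quartile 4 elsewhere would, on this toy logic, be a candidate for barrier-strengthening treatment rather than broad immunosuppression.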
Vascular manipulations that boost adult neurogenesis
Evidence shows the vasculature plays a key role in maintaining organ homeostasis, alongside its established function of tissue perfusion. We spoke to Professor Eli Keshet about the VASNICHE project’s work in investigating how blood vessels affect stem cell properties and maintenance, research which could lead to new insights into adult neurogenesis
The sole function of blood vessels in the body has long been thought to be perfusion, namely supplying tissues with oxygen, nutrients, hormones and other blood-borne substances. While this is indeed a central part of the vasculature’s function, recent research suggests that blood vessels also play additional roles in the body. “Over recent years the idea has emerged that blood vessels, in addition to their established role in providing perfusion, also have additional roles in maintaining organ well-being, what we call organ homeostasis,” explains Professor Eli Keshet. Based at the Hebrew University of Jerusalem, Professor Keshet is exploring this area further in the ERC-backed VASNICHE project, using mice to study blood vessels in association with adult stem cells. “The latter are cells with stem cell properties that reside in normal adult tissues, including in the brain,” he explains. The old view was that the adult brain loses its capacity to produce new neurons over time, yet it has since become clear that new neurons can in fact be added in the adult brain via repeated divisions of resident stem cells in a process known as ‘adult neurogenesis’. However, these neurons can be born anew only in the two regions of the brain where neural stem cells reside. Of these two regions, Professor Keshet is focusing his attention on the hippocampus. “The hippocampus has many functions, some of which are related to learning and memory, functions which in part rely on newly-added neurons,” he outlines. Unfortunately however, the capacity to generate new neurons in the hippocampus is known to decline with age, thus calling for some ways to attenuate age-related decline of adult hippocampal neurogenesis.
Vascular stem cell niche
Stem cells, in general, need to communicate constantly with their surrounding environment, known as the ‘stem cell niche’, in order to function effectively. The stem cell niche is the major focus of the VASNICHE project, with researchers aiming to gain new insights into the structure and composition of stem cell niches. “What cell types make the niche? Which cell types control the performance and maintenance of the stem cells?” says Professor Keshet. There is an increasing body of evidence to suggest that blood vessels are an integral component of stem cell niches at large, and of neuronal stem cells specifically. “The proposition here is that vascular cells must communicate constantly with neuronal stem cells in order to secure their proper function,” explains Professor Keshet. The apparent close physical proximity between blood vessels and neuronal stem cells is consistent with this proposition, yet this is not on its own sufficient to prove a causal relationship. The project aims to build a stronger evidence base supporting the idea that blood vessels play an indispensable role in the control of stem cell function by harnessing the research group’s unique expertise in vascular manipulations. “We have developed a very sophisticated transgenic mouse system, where we can manipulate blood vessels in mice at will,” outlines Professor Keshet. “This includes both expanding or reducing the vasculature in the brain region of choice and recording the effect of these vascular manipulations on stem cell-driven neurogenesis.” To reinforce and rejuvenate the vasculature in the region where hippocampal stem cells reside, the researchers ‘switch on’ production of the protein known as Vascular Endothelial Growth Factor (VEGF), which promotes and orchestrates the generation of new blood vessels. Researchers have found that when more blood vessels are induced at the stem cell niche, more and more neurons are produced. “By using this manoeuvre, we can dramatically boost adult neurogenesis and improve cognitive functioning of the mice,” says Professor Keshet.
Attenuation of age-related neurogenic decline
The capacity to make new neurons in the brain is known to also deteriorate with age in humans. Debate continues around the underlying cause of age-related deterioration in adult neurogenesis; Professor Keshet and his colleagues have put forward the proposition that it is not only the stem cells themselves that are aging. “Possibly the ability of the stem cell niche to support proper stem cell function is also deteriorating with age,” he explains. “Rejuvenating the vasculature could have an even greater impact on neurogenesis in the elderly.” The researchers have indeed found that rejuvenating the niche vasculature in aged mice - at a time when hippocampal neurogenesis has already substantially decreased - results in a highly significant increase in the rate of new neuron formation. Could a similar approach be harnessed to boost neurogenesis in humans and help maintain cognitive function for longer? While Professor Keshet is keen to stress that the project’s approach has been experimental, and the findings are not directly applicable to humans, he believes that this research could help lay the foundations for further investigation into treatments aimed at boosting adult neurogenesis via manipulating the stem cell niche. “Maybe we can consider other, more realistic ways of region-specific blood vessel rejuvenation, for which this study could be regarded as a proof-of-concept,” he says. The next question, which is currently being explored in Professor Keshet’s laboratory under the VASNICHE project, is which molecules produced by the vasculature contribute to improved neurogenesis. “We are still in the process of trying to see what the blood vessels produce to achieve this increased neurogenesis,” he says.
At a glance
Full Project Title
The vascular stem cell niche and the neuromuscular unit (VASNICHE)
Project Objectives
Generation of new neurons in the hippocampus of the adult brain declines with age, rendering its contemplated ‘rejuvenation’ highly desirable. Assuming that age-related neurogenic decline is due to aging of the stem cell microenvironment and, specifically, the niche vasculature, attempts were made to boost neurogenesis through ‘rejuvenation’ of the niche vasculature.
Project Funding
ERC-AG-LS4 - ERC Advanced Grant Physiology, Pathophysiology and Endocrinology. EU contribution: EUR 2 499 980.
Project Partner
• Dr. Tamar Licht, The Hebrew University of Jerusalem
Contact Details
Project Coordinator, Professor Eli Keshet
THE HEBREW UNIVERSITY OF JERUSALEM
EIN KEREM CAMPUS
91120 JERUSALEM
Israel
T: +972 2 675 8496
E: elik@ekmd.huji.ac.il
W: http://www.cordis.europa.eu/project/rcn/106989_en.html
Licht T, Rothe G, Kreisel T, Wolf B, Benny O, Rooney A, Ffrench-Constant C, Enikolopov G, Keshet E. VEGF pre-conditioning leads to stem cell remodeling and attenuates age-related decay of adult hippocampal neurogenesis. Proc. Natl. Acad. Sci. USA, published online November 14, 2016, E7828-E7836.
Professor Eli Keshet
Professor Eli Keshet is a professor of Molecular Biology at the Faculty of Medicine of the Hebrew University of Jerusalem. He received his PhD from the Hebrew University in 1975, conducted post-doctoral research at the University of Wisconsin, and carried out advanced research at the National Cancer Institute, USA. His research focuses on the vascular system and the mechanisms of blood vessel formation. Professor Keshet is a member of the Israel Academy of Sciences.
Overcoming problems around vascular access Patients suffering from kidney failure depend on peritoneal or haemodialysis to purify their blood, yet the failure of renal dialysis vascular access is a significant and serious problem. The ReDVA project is a joint industry-academia research programme that aims to overcome the scientific and technical barriers associated with the failure of vascular access, as Dr Shona Matthew explains
All haemodialysis patients need some form of vascular access. An arteriovenous fistula (AVF) is the preferred choice, as it reduces the risk of subsequent complications and serious infections in comparison to the other options: arteriovenous grafts and central venous catheters. AVFs are created by surgically joining an artery to a vein so that the fast-flowing blood from the artery is fed directly into the vein, causing the vein to dilate and the vein walls to thicken over a period of weeks or months. This process is known as maturation. A mature fistula provides the high flow rates required to ensure that the patient receives adequate dialysis clearance within a standard 3-4 hour dialysis session and is robust enough to withstand repeated cannulations. “Because of the high failure rates and limited number of suitable sites on the body, surgeons prefer to create the first AVF in the wrist. This is not always possible as the patient’s vessels may be too small; so two potential fistulae sites are lost straight away,” says Dr Shona Matthew of the ReDVA project. “It is not uncommon for all suitable AVF sites in the body to be ‘used up’ and once that happens there’s very little else that can be done.” Repeated AVF failure impacts on patient morbidity and mortality. It also costs healthcare services a large amount of money each year as patients require an immediate and often temporary form of vascular access, and further surgery to create a new AVF. The ReDVA project was formed to investigate the problems associated with renal dialysis vascular access, bringing together partners from industry and academia with the shared goal of improving clinical understanding in this challenging area.
Multi-disciplinary
The multi-disciplinary nature of the project means that the problems around haemodialysis vascular access are being approached from several different angles. A lot of attention was initially devoted to analysing the existing guidelines for vascular access. “In the first year of the project, partners worked together on literature reviews, looking at existing guidelines for vascular access across the world. Several members of the ReDVA team worked with European Renal Best Practice, to update their guidelines, and one of our fellows had a Cochrane systematic review title accepted,” says Dr Matthew.
Single slice image through a patient’s arm acquired using an MRI scanner.
Imaging
ReDVA clinicians, physicists and engineers looked at pre-operative AVF vein mapping. “Before a patient goes for surgery, the surgeon will use ultrasound (US) to ‘map’ the patient’s vein and artery to ensure that they are suitable, as recommended in all the current guidelines,” explains Dr Matthew. Very few centres offer more than one post-operative US scan to check on the maturation process, preferring a more ‘hands on’ assessment until a problem arises. However, Dr Matthew says that her NHS colleagues apply a highly pro-active imaging approach. “In Dundee we have a system where we use US to scan our patient’s vessels before their surgery and then again at regular intervals after surgery.” This allows problems to be picked up and hopefully resolved before they affect the maturation process, or the patient’s dialysis treatment. “We are trying to encourage other centres to be more proactive with US imaging and so we developed optimised imaging protocols, which we shared with other centres,” says Dr Matthew. Researchers from the University of Dundee, the University of Limerick and Vascular Flow Technologies also looked at Magnetic Resonance Imaging (MRI) as a tool for pre-surgical vein mapping and post-surgical surveillance, as they believe that MRI may offer additional information which cannot be obtained with US. For example, many renal patients require a central venous catheter (CVC) at some point, perhaps as a form of temporary vascular access or as a conduit for administering drugs and fluid during surgery. Central vein stenosis is a common complication of using a CVC and needs to be considered when planning further AVF surgery, as it can lead to post-operative complications. “This type of stenosis cannot be detected using US, but can be imaged with MRI,” says Dr Matthew. However, MRI is not routinely used as it’s quite expensive, not easy to access, and the contrast agents used in MRI are contraindicated for patients with poor renal function, as some agents have been linked to a disease called nephrogenic systemic fibrosis (NSF) in this cohort. ReDVA researchers worked to develop an MRI protocol that allowed them to acquire detailed 3D images of the patient’s vessels without the use of a contrast agent. Seven patients listed for AVF formation were scanned with MRI, both before surgery and at various time points after, allowing researchers to track the geometric changes in their vessels during maturation. The main limitation of using non-contrast imaging was the length of time that the patient had to spend in the scanner so that researchers could obtain the images they required.
Data analysis
Knowing that a contrast agent would dramatically speed up this type of image acquisition, ReDVA researchers used the power of big data to investigate the safety of the macrocyclic Gadolinium-based contrast agents used in-centre. “We looked at data from over 24,000 contrast enhanced MRI scans, acquired in our hospitals between Jan 2004 and Dec 2016. We identified, linked and analysed the data of all patients, including those considered ‘at risk’, such as patients with poor renal function - many of whom had undergone numerous contrast-enhanced MRI scans. Our detailed analysis failed to find significant differences in the very low rates of adverse events between patients with normal renal function and those with poor renal function,” stresses Dr Matthew.
A 3D model of a patient’s fistula, created using patient images.
Computational Fluid Dynamics
The imaging element of the project also holds implications for the computational dimension of the project. When a fistula is formed, blood flow changes from a natural spiral laminar flow into a disturbed flow. “Suddenly you’ve got this high-pressure, high-velocity arterial blood flow in a vein,” explains Dr Matthew. This unnatural flow affects wall shear stress - where once you had levels that were moderately uniform and uni-directional, the wall shear stress is now disturbed and oscillating. The reason why this matters is that cells on and within the arterial wall respond to this by releasing chemical messengers that can cause blockage or failure of the fistula. The specific circumstances that cause failure are not known at present but it is something which researchers in the project aim to shed light on. “Our colleagues have been looking at blood flow patterns in realistic fistulas obtained from the MRI images using computational models,” says Dr Matthew. “This involves looking at what we call intimal hyperplasia on the vessel wall, where the muscle cells in the vein wall multiply at an accelerated rate. It can cause lesions, narrow the vessel, block it, or narrow and block a graft.” The human body is highly complex, so it takes a multi-disciplinary team of physicists, sonographers, radiographers, engineers, and interventionists to figure out what’s happening, even in a relatively small part of it.
We’re looking at the problems associated with arteriovenous fistulae, and we’re trying to improve understanding of factors that impact on AVF patency rates. Researchers from a number of disciplines are involved in the project, so we can look at the problems from a number of perspectives.
Due to the development of stenosis, some of the ReDVA patients required post-operative interventions within months of their AVF surgery, to prevent their fistula failing. The data collected during their interventions was collated and analysed alongside that of many other patients who had undergone similar interventions, allowing the long-term success rates of interventional techniques to be compared.
Image Guided Therapy Research Facility
Recently in Dundee, ReDVA’s Chief Investigator, Professor J Graeme Houston, in collaboration with ReDVA partners, led the development of a unique resource, the Image Guided Therapy Research Facility, bringing together three key resources: state-of-the-art imaging facilities, including 128 interventional CT, C-Arm X-ray and 1.5 T MRI; the multidisciplinary network of ReDVA clinicians, anatomists, engineers and scientists; and the basis this provides for the development of Thiel embalmed human cadavers as a model for both teaching and the testing of preclinical devices, including those intended as an alternative form of vascular access. Thiel embalmed human cadavers retain many of the living properties of the human body, including a patent vascular system, so flow can be introduced into the bodies, simulating anatomically accurate models. This model, developed as part of the ReDVA project, is attracting a significant amount of interest from clinicians and industry.
Image of an interventional procedure being carried out on the Thiel Cadaver model.
At a glance
Full Project Title
DEVELOPMENT OF HEMODYNAMIC SOLUTIONS IN RENAL DIALYSIS VENOUS ACCESS FAILURE (ReDVA)
Project Objectives
The ReDVA project proposes to perform a joint industry-academia research programme to overcome the scientific and technical barriers to the understanding, development and adoption of technologies to combat the significant clinical problem of the failure of renal dialysis venous access.
Project Funding
Funded by the European Commission. EU contribution: EUR 2 635 228
Project Partners
University of Dundee (Professor Houston), Queen Elizabeth Hospital Birmingham (Mr Nick Inston), Guerbet, Paris (Dr Eric Lancelot), Vascular Flow Technologies, Dundee (Craig Dunlop), University of Limerick (Professor Michael Walsh).
Contact Details
Chief Investigator, Professor Graeme Houston
University of Dundee
Dundee DD1 4HN
T: +44 1382 632 651
E: j.g.houston@dundee.ac.uk
W: www.redva.eu
Mr Nick Inston FRCS PhD. (Left) Professor Graeme Houston (Right)
Efstratios Kokkalis attending an interventional training day at Ninewells Hospital.
Mr Nick Inston FRCS PhD. Consultant Surgeon at Queen Elizabeth Hospital, University Hospitals Birmingham. As part of the ReDVA study he was one of the leads for the European Renal Best Practice Guidelines group on Vascular Access, presented at the Vascular Access Society and acted as supervisor for two ReDVA fellows.
Professor Graeme Houston is currently Co-Director of the Clinical Research Imaging Facility, a University of Dundee and NHS Tayside collaboration, delivering clinical imaging for both research and NHS services. He has published 250 papers and holds over 30 patents comprising 12 patent families, 11 design patents.
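To get a rough sense of why fistula creation disturbs the shear environment described in the Computational Fluid Dynamics section above, the steady-flow Poiseuille relation τ = 4μQ/(πR³) links wall shear stress τ to flow rate Q, blood viscosity μ and vessel radius R. The short Python sketch below uses illustrative, not patient-specific, values; real fistula flow is pulsatile and disturbed, which is exactly what the project’s computational models capture and this simple formula does not.

# Hedged back-of-envelope estimate; all numbers are illustrative assumptions.
import math

mu = 3.5e-3                        # blood viscosity, Pa*s (typical literature value)
R = 2.0e-3                         # vein radius, ~2 mm (assumed)

def wall_shear(q_ml_per_min):
    """Poiseuille wall shear stress tau = 4*mu*Q/(pi*R^3)."""
    q = q_ml_per_min * 1e-6 / 60   # convert mL/min to m^3/s
    return 4 * mu * q / (math.pi * R**3)

for label, q in [("illustrative normal venous flow", 50),
                 ("illustrative post-fistula flow", 800)]:
    print(f"{label} ({q} mL/min): {wall_shear(q):.2f} Pa")

Even this crude estimate shows an order-of-magnitude jump in wall shear stress once arterial flow is fed into the vein, hinting at why the vessel wall’s cells respond so strongly.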
A magnetic vision system
Magnetic nanoparticles are attracting intense research attention as a means of treating disease, yet they need to be precisely delivered to their targets to be fully effective. Professor Antoine Weis, Victor Lebedev and Vladimir Dolgovskiy tell us about their work in developing a novel imaging method to visualise magnetic fields, research which helps to localise nanoparticles in tissue
Magnetometers, devices that measure the strength and orientation of a magnetic field, are increasingly used today across a range of sectors, including fundamental science, bio-medicine, and geo- and space-exploration. Based at the University of Fribourg in Switzerland, Professor Antoine Weis and his colleagues hold deep expertise in the development of ultra-sensitive magnetometers. “Our core specialism here is making atomic magnetometers, devices that can measure very weak magnetic fields, such as the ones produced by nanoparticles, nanometre-sized magnets,” he outlines. There is a growing need for sophisticated imaging methods that can accurately represent the spatial distribution of magnetic fields, for example in biomedicine. “There has been rapid development over the past ten years regarding the use of magnetic nanoparticles in the field of biomedicine,” explains Professor Weis. “A range of applications have been identified, including using magnetic nanoparticles for targeted cancer therapy.” The idea in this approach is to inject functionalised particles into the bloodstream, which are intended to carry drugs to tumour cells, yet the delivery of these nano-drugs to their target destinations needs to be accurately assessed if they are to be fully effective. This is where imaging comes into play. With a suitable detection device outside the body, medical professionals can then accurately determine where the particles are inside the body, a research field that Professor Weis is investigating along several lines. “We are doing laboratory experiments, developing prototype detectors that record the magnetic field pattern generated by magnetised nanoparticles,” he says. “From these patterns we can trace back to the particles’ positions inside an object. We’re investigating the accuracy, sensitivity and spatial resolution with which our detectors can localise the magnetic nanoparticles. This means finding out how they are distributed inside an object; what is the smallest amount we can see? The goal in this research is to develop methods to answer such questions.”
Magnetic field sensing
The starting point in this work is the magnetic field source, the nanoparticles themselves. These particles are magnetised in an arbitrary spatial direction, and they then produce a magnetic field with a specific pattern. “This magnetic field pattern can be measured some distance away from the particles, which is key to the non-invasive nature of the method,” says Professor Weis. The magnetic field produced by the nanoparticles has a spatial distribution; at each point in space it is described by a vector with three components, Bx, By, and Bz. “Our apparatus allows us to measure each of these components separately. This gives us full information on the magnetic field at a given point in space,” continues Professor Weis. “The idea of the project is that, from the measured strength and orientation of the magnetic field, we can then calculate backwards to identify where the (source) particles are located and how many of them are at a given position.” The researchers are developing a magnetic source imaging camera (MSIC) to visualise magnetic field patterns. “In the MSIC, a video camera looks at a cubic glass cell that contains a vapour of caesium (Cs) atoms that are illuminated by a sheet of laser light. The intersection of the laser beam and the cell defines a flat square volume, and the camera detects the fluorescence light emitted by this two-dimensional layer of Cs atoms (imaging plane),” explains Professor Weis. The resonant laser light spin-polarises the atoms, yielding a homogeneously dark camera image. Any perturbation of the atoms by a magnetic field produces a brighter, structured pattern on the dark background. From this pattern, which reflects the magnetic field’s spatial distribution, the researchers can then calculate backwards to derive the spatial distribution of the sources that produced the magnetic perturbation. “This source reconstruction then tells us the shape and strength of the object that produced the field, itself producing the recorded image,” says Professor Weis. Previous approaches to imaging were relatively inefficient, says Professor Weis. “You could move a very small-sized magnetometer over this square, and measure the field point-by-point. You would get the same image, but the procedure would take a very long time,” he outlines. The new sensor, by contrast, makes measurements at thousands of different points simultaneously, a much better-performing approach that allows the real-time visualisation of spatial field dynamics.
Left image: The magnetic field lines from a magnetised sample intersect a layer of spin-polarised caesium (Cs) atoms. The black arrows represent the magnetic field vectors, the blue and yellow arrows being their respective components along x and y. The magnetic field perturbs the local spin polarisation in a characteristic manner. A homogeneous offset magnetic field (not shown in the figure) applied along x, y or z allows the selective detection of the field component Bx, By or Bz. The example shown represents the distribution of the By components produced by a sample magnetised along y.
Right image: Experimental realisation of the magnetic vision system. The light-emitting square layer of spin-polarised Cs atoms (imaging plane) is produced in a cubic glass cell by optical pumping. Fluorescence from this layer is detected by a camera. The camera images show characteristic patterns presented in the images below. From these intensity distributions one can infer the shape and the amount of magnetic material in the sample.
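To make the “calculating backwards” step concrete, the following Python sketch generates the field of a single point dipole on a sensing plane and recovers the source position by least-squares fitting. It is a toy inverse problem under simplifying assumptions – one dipole, known magnetic moment, and an invented grid, stand-off distance and noise level – and is not the group’s reconstruction algorithm, which handles extended sources.

# Hedged sketch of dipole-field source localisation; all values are illustrative.
import numpy as np
from scipy.optimize import least_squares

MU0_4PI = 1e-7  # mu_0 / (4*pi), T*m/A

def dipole_field(r_obs, r_src, m):
    """Field of a point dipole with moment m (A*m^2) at r_src, seen at points r_obs."""
    d = r_obs - r_src                                   # (N, 3) separations
    dist = np.linalg.norm(d, axis=1, keepdims=True)
    rhat = d / dist
    return MU0_4PI * (3 * rhat * (rhat @ m)[:, None] - m) / dist**3

# Imaging plane: a 20x20 grid of field measurement points 3 cm above the sample
xs = np.linspace(-0.02, 0.02, 20)
gx, gy = np.meshgrid(xs, xs)
plane = np.column_stack([gx.ravel(), gy.ravel(), np.full(gx.size, 0.03)])

true_pos = np.array([0.004, -0.002, 0.0])               # "unknown" source position (m)
m = np.array([0.0, 0.0, 1e-6])                          # sample magnetised along z
data = dipole_field(plane, true_pos, m)
data += 1e-13 * np.random.default_rng(0).standard_normal(data.shape)  # sensor noise

# Invert: find the source position whose predicted pattern best matches the data
def residual(p):
    return (dipole_field(plane, p, m) - data).ravel()

fit = least_squares(residual, x0=np.zeros(3))
print("recovered position (mm):", 1e3 * fit.x)

Because the measured data here include all three field components at every grid point, the fit is strongly over-determined, which is one reason component-resolved detection is so valuable for source reconstruction.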
Medical implications
This holds important implications in terms of targeted cancer therapy. If drug-carrying nanoparticles are to be used in cancer treatment, it’s essential that the injected particles are delivered to the tumour and do not accumulate in organs such as the liver, the spleen, or the kidneys. Patient screening by an MSIC may prove useful in both respects. This research also has a potential diagnostic application, for example in localising lymph nodes, an important consideration in the treatment of breast cancer. A primary tumour in the breast will spread through the lymph system, which contains nodes, small, millimetre-sized entities. “Cancer cells moving in the lymph system will accumulate in these lymph nodes, the nodes nearest to the breast being located in the axilla. When a patient is diagnosed with breast cancer, often doctors will surgically remove the lymph nodes in the axilla, because they may already contain cancer cells that may spread further into the body,” explains Professor Weis. However, not all lymph nodes in the axilla are connected by lymph vessels to the primary tumour. “Sometimes lymph nodes are removed in surgery unnecessarily, which has negative consequences for the patient,” continues Professor Weis. A detector showing doctors which lymph nodes contain tumour cells would therefore be highly beneficial, potentially enabling more precisely targeted treatment. It has been shown that if nanoparticles are injected into a tumour, they will follow the same pathway as the cancer cells and accumulate in the lymph nodes, at which point Professor Weis believes that the MSIC could play an important role. “Our camera system will help in localising the affected lymph node using the MRX method illustrated in the bottom figure,” he says. Prior to deploying nanoparticles in hospital applications, questions such as their biocompatibility and the length of time they stay in the body have to be investigated. The MSIC may prove to be useful in this context. While fully aware of the wider potential of their work, Professor Weis and his colleagues are primarily looking to improve the camera and explore its limitations. “We are trying to determine the smallest amount of nanoparticles that we can detect, and we publish our results. We will derive as many results as we can from this research, and try to do as many proof-of-principle demonstrations as possible. We are also working on refining our algorithms for magnetic source reconstruction from the detected field patterns,” he says. The primary focus at the moment is on methodological developments. “We develop new methods and technologies, and then other people can take up our ideas and push them towards commercialisable devices,” says Professor Weis. “This has happened with our previous demonstration that an array of atomic magnetometers can produce dynamic maps of the magnetic field generated by the beating human heart.” He adds: “We are confident that the MSIC method will find its way into the clinical environment. On the other hand, the device is so universal that it can image any magnetic field producing entity, with applications not only in biomedicine, but also in material screening or magnetic microscopy.”
At a glance
Full Project Title
Characterizing and imaging magnetic nanoparticles by atomic magnetometry
Project Objectives
Exploiting more than 15 years of expertise with atomic magnetometers (AM), we currently develop devices for imaging the spatial distribution of magnetic nanoparticles (MNPs) using AM detection. One device detects MNPs in fluids via their anharmonic response to a harmonic excitation. The other device images blocked MNPs, whose distribution is inferred from spatially-resolved detection of their magnetic field.
Project Funding
Both projects are funded by grants from the Swiss National Science Foundation, viz., Grants No. 200020_162988 “Magnetic particle imaging (MPI) with atomic magnetometers” and IZK0Z2_164165 “Atomic fluorescence imaging of magnetic fields using an imaging fiber bundle”
Contact Details
Project Coordinator, Professor Antoine Weis
Physics Department, University of Fribourg
Chemin du Musée 3
CH-1700 Fribourg
T: +41 (0)26 300 90 30
E: antoine.weis@unifr.ch
W: physics.unifr.ch/en/page/625/
Dr Victor Lebedev (Left) Dr Vladimir Dolgovskiy (Centre) Professor Antoine Weis (Right)
Left: Anticipated magnitude distributions of the field components Bx, By, Bz (each coded by a specific colour), produced by small samples with magnetisation Mx, My or Mz. Right: Experimentally recorded distributions reproduce well the anticipated patterns.
Professor Antoine Weis received his PhD degree from ETH Zurich. He worked at the MPI for Quantum Optics and was Associate Professor at the University of Bonn. Since 1999 he has been Full Professor at the University of Fribourg. His current fields of interest are the development, modeling and applications of atomic magnetometers. Dr Victor Lebedev is a senior research assistant with research expertise in optical magnetometry and spectroscopy of atoms, plasmas and quantum solids. Dr Vladimir Dolgovskiy is a post-doctoral research assistant with a background in time/frequency metrology; he conducts research on weak magnetic field sensing and imaging.
Spatially-resolved magneto-relaxation (MRX): Time sequence of camera images, produced by magnetised nanoparticles, under conditions of the red-framed pattern in the figure above. The magnetising coil is switched off at t=0. The decay curve represents the characteristic (logarithmic) demagnetisation of the sample over a period of 6 minutes.
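The logarithmic decay shown in the MRX sequence can be quantified by fitting B(t) = B0 − S·ln(t), where the slope S scales with the amount of magnetic material present; such a logarithmic law is the standard description for ensembles of blocked nanoparticles with a broad distribution of relaxation times. The Python sketch below fits synthetic data with invented amplitudes; it is not the project’s analysis code.

# Hedged sketch: fit a logarithmic MRX decay to synthetic data (invented values).
import numpy as np

t = np.linspace(1.0, 360.0, 200)    # seconds after the magnetising coil switches off
rng = np.random.default_rng(1)
b = 5e-12 - 8e-13 * np.log(t) + 2e-14 * rng.standard_normal(t.size)  # field (T)

# Linear fit in ln(t): B(t) = B0 - S*ln(t)
slope, intercept = np.polyfit(np.log(t), b, 1)
S, B0 = -slope, intercept
print(f"B0 = {B0:.2e} T, relaxation slope S = {S:.2e} T per e-fold of time")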
Breakthroughs with Stem Cell Research Every time we hear about stem cell research in the news we are offered new hope against conditions that have long been untreatable. Is stem cell research a branch of science that will completely transform our healthcare options? Richard Forsyth looks at some groundbreaking projects around the world that are demonstrating almost miraculous results.
Stem cell therapy has become a subject of global scientific interest for its value in treating and preventing disease. Many clinics and so-called stem cell entrepreneurs have sprung up in recent times, exploiting this innovative healthcare. There are also many trials about to begin which offer hopes for new proven treatments. It’s widely believed that stem cell therapy will play an increasingly significant part in our future. Nor is this a new study area – it has been a focus of both research and treatment for decades – but precisely because it has so much promise and potential for what seems like a limitless tally of serious conditions, there’s something of a scramble to be the first to unravel the remaining secrets of stem cells, to fully harness them. Despite advances and available treatments, the gaps in knowledge keep it firmly in the bracket of a ‘pioneering’ treatment. Much of stem cell therapy is experimental, and it’s important that extensive clinical trials prove the safety of methods. Even so, it’s difficult not to be excited by the research activity and the positive early-stage results currently circulating.
What are stem cells?
There are two main types of stem cell being researched for treatments in healthcare: adult stem cells and embryonic stem cells. The adult cells can be extracted from tissues in the brain, blood, bone marrow, liver, skeleton and muscles, and are partly committed in terms of cell fate. Stem cells can also be extracted from the umbilical cord (after birth – so no harm to the baby or mother), and these have similar properties to cells in bone marrow. Embryonic stem cells are different in that they are pluripotent, which means they can change into all kinds of human tissues – over 200 cell types. They can be taken from the inner cell mass of a human embryo that is a few days old. Whilst there has been ethical debate about the use of human embryos for stem cell research and therapy, the cells are sourced from in vitro fertilisation clinics, where the early-stage embryos are not being used and would otherwise be disposed of. The beauty of embryonic and adult stem cells is that they provide a basic building material for different tissue types in the human body. Considering embryonic stem cells have an ability to grow into any kind of tissue, this means they have adaptive regenerative properties for injury and where there is organ damage, or threat of organ failure. Stem cells can grow into brain cells, skin cells, blood cells or muscle cells, and be versatile in application.
Game changing therapies
Stem cell treatments have already helped many patients in reversing a range of debilitating conditions. Where cells are damaged in an organ – including the brain – the stem cells adapt to become the relevant cells and bolster the functionality of that organ. In stem cell therapy for blood diseases, more than 26,000 people are treated with blood stem cells a year in Europe. Thousands of children have been saved from leukaemia after stem cell therapy. The way blood is harvested for use in hospital treatments could be dramatically improved due to stem cell research. Blood is obviously essential for transfusions, and stem cells have shown promise as a source for creating blood supply.
The University of Bristol recently managed to ‘trap’ adult stem cells in the early stages of development so they would divide and make red cells in a perpetual cycle, indefinitely, without dying – therefore negating the need for further donations. Considering a typical bag of blood has a trillion red blood cells and stem cells usually make around 50,000 before dying, this could prove a revolutionary development. The team has called the cell line the Bristol Erythroid Line Adult, or BEL-A. Once the manufacturing methods for red blood cells have been refined, clinical trials with patients will follow. Stem cells have also been widely used for skin grafts in the cases of burns victims, a practice that has been undertaken for well over 30 years. There are estimated to be around 80 conditions treatable with stem cells today, and this figure is changing and increasing all the time – one of the latest being a stem cell treatment for repairing a damaged cornea, on the surface of the eye. One of the most promising treatments for sufferers of Alzheimer’s disease is with stem cells, to replace damaged neurons. With Parkinson’s disease, researchers believe it’s possible to make dopamine-producing cells, which during the onset of the disease disappear in the brain – leading to the symptoms of the condition. The world’s growing diabetes pandemic could be confronted if stem cells can be used to produce insulin. Patients with spinal injury and stroke victims are also the focus for developing treatments derived from stem cell research.
Stem cells could help produce increased blood supplies.
Research into hearts
In 2001, Professor Christine Mummery’s research team in the Netherlands used stem cells to make beating heart cells outside of the body for the first time. Since then there has been progress in what stem cells can do for the human heart. A very exciting branch of stem cell research is investigating how they can help repair damaged or weakened heart walls. In the latest study, published earlier this year in the journal Circulation Research, it was revealed that stem cells extracted from umbilical cords and given via intravenous infusions to heart failure patients significantly improved the heart’s ability to pump blood. Half of heart failure patients typically die within 5 years of diagnosis, and it’s a condition that an estimated 26 million people live with. The results were therefore chalked up as a breakthrough – with patients receiving the treatment reported to be having a better quality of life. This may well lead to new, effective and non-invasive therapies for heart failure patients. At present, solutions for heart repair or transplant seem crude and temporary in comparison to stimulating tissue to self-repair. It’s the flexibility of the regenerative qualities of stem cells that is so exciting – they could serve as a universal repair kit for any serious degenerative health condition, bringing damaged bodies back to health without artificial aids.
Reversing paralysis
For paralysis patients, the outlook can often be bleak and tortuous, and where there is a serious spinal injury, this is rarely fixable and the patient is simply managed for comfort. However, the early stages of a clinical trial by Fremont’s Asterias Biotherapeutics have reaped very exciting results in reversing paralysis in patients, presenting real potential for a new treatment. In the trial, patients suffering from severe paralysis had 10 million embryonic stem cells transplanted into their spinal cords. The study revealed that six out of ten people receiving the stem cell treatment improved dramatically after a year. Some patients could move their hands for the first time since the paralysis, feed themselves and even bathe themselves. The results need to be further qualified, but the researchers involved were startled by what they observed.
Defying ageing
Beyond treatments for diseases, stem cells are also seen as a route to lengthening our lifespans. Scientists at the Albert Einstein College of Medicine in New York are looking to soon launch clinical trials into slowing the ageing process in humans. They have already had success in slowing the ageing process in animals by implanting stem cells in their brains. They focused their attention on the hypothalamus of the brain, which regulates growth, development, reproduction and metabolism. In 2013, scientists from the college discovered that the hypothalamus was also responsible for regulating ageing in the body. In their follow-on study, the research team’s findings suggested that ageing is controlled specifically by stem cells in the hypothalamus area of the brain, which secrete miRNAs – which in turn help preserve youthfulness. These stem cells naturally decline and die off around middle age, leading to ageing effects and impacts. However, the process is not thought to be irreversible, and so aspects of ageing could be slowed down and perhaps, more profoundly, completely reversed. After the research team involved in the study injected mice with stem cells, they concluded that these test mice would live up to 15% longer. It’s now hoped this will lead to treatments that will eventually mean we can expand our natural lifespan, initially, by just over a decade.
The future of stem cell research
There are many questions still not adequately answered about all the processes involved in stem cell therapy, but the positive results from experiments often speak volumes. With such an adaptable therapy, there seems to be no limit to what could be achieved for the future of healthcare. Pharmaceutical companies are taking stem cell research very seriously as an investment, partly because we are seeing a rise in chronic illnesses and partly because of advances in understanding and techniques. This is a tell-tale signal that we should expect its prevalence in health-related treatments to increase. The global stem cell market is predicted to be worth US$270.5 billion by 2025, according to Transparency Market Research. Industrialising stem cell medicines is not the only target for this research. With stem cells it may be possible to treat a patient with their own body’s resources, for example by taking stem cells from a patient’s own bone marrow and reinjecting them into the bloodstream – a technique that has shown positive results in boosting the immune system and treating conditions like multiple sclerosis.
There are other ways stem cells can be used. In time, we may be able to grow bespoke organs outside the body from stem cells, designed for patient-matched implantation. To put this in perspective, there have already been several successful attempts to grow organs (often mini-versions of organs) from stem cells in the lab. For instance, the inner layer of the fallopian tube was grown at the Max Planck Institute for Infection Biology in Berlin, and at The Ohio State University researchers grew a brain structure akin to that of a five-week-old human fetus, aptly described by representatives of the university as ‘a brain changer’. The idea that we have the means to heal ourselves and grow human organs, in ways we’ve never seen before, creates a vision of the future where we can eradicate diseases and repair even the most severe of injuries. There may even be a day when stem cells make it possible to reverse the ravages of ageing. However, we must proceed with care and not get carried away with reports of miracle cures and the prospect of immortality. There is already concern in the scientific community that charlatans are hopping onto the stem cell bandwagon because so many people associate it with miracle cures. At this juncture, what’s needed is more research and more clinical trials before we jump to grand conclusions about what stem cells can cure, however tempting that may be.
Stem cells have lengthened the lifespan of mice by up to 15%.
Getting the measure of ammonia
The EU has passed regulations limiting ammonia emissions in Member States, yet it remains difficult to reliably measure levels of the gas in the atmosphere. We spoke to Daiana Leuenberger, Ph.D. and Bernhard Niederhauser about the MetNH3 project’s goal of improving the metrological infrastructure for ammonia at ambient air amount fractions
A gas compound that plays an important role in atmospheric chemistry, ammonia (NH3) can have harmful effects on both human health and the environment. With emissions steadily increasing, accurately measuring atmospheric ammonia amount fractions is an important issue in environmental science, yet it is a major challenge. “Ammonia is very difficult to measure, as it reacts easily with other molecules, for example with water. In addition, it readily adsorbs on material surfaces,” says Daiana Leuenberger. Based at the Swiss Federal Institute for Metrology (METAS), Leuenberger is acting coordinator and one of the research scientists working on the MetNH3 project, an initiative that brought together 10 partners from national metrology institutes (NMIs) and other research institutions across Europe to address a diverse range of issues around ammonia metrology in a three-year project that ended in May 2017. A key goal of MetNH3 was to develop SI-traceable reference material, either in the form of reference gas mixtures or as instrumental transfer standards. “It is only via traceability to a common reference, in our case to the International System of Units (SI), that measurements from one place can be made comparable to other measurements elsewhere,” says Bernhard Niederhauser, the coordinator of the project.
The project itself was organised in three technical workpackages, encompassing work in several different areas. Some partners focussed on the traceability of laser-based measurement instruments, while others investigated the generation of SI-traceable reference gas mixtures (RGM) at known amount fractions of NH3, used for the calibration of laser-based spectroscopic instruments. “A third focus was on the application and validation of the developed reference gas mixtures and laser-based optical instruments, and their comparison to instruments of collaborators and stakeholders,” continues Leuenberger. “This has been realised in laboratory and field studies and allowed the assessment of difficulties associated with ammonia measurements, while we have also been able to lower their uncertainty.”
Ammonia emissions
This work has been prompted in part by regulatory changes. European Directive 2016/2284/EU sets individual emission ceilings for each Member State to be met by 2020, based on the revised Gothenburg Protocol, and sets even more ambitious reduction commitments for 2030 so as to cut the health impacts of air pollution by half compared with 2005. Previous directives led to the incorporation of ammonia measurements into national air monitoring networks, and considerable funds have been invested in compliance measures, with the agricultural sector a major area of interest. “The main source of ammonia emissions is the agricultural sector, due to fertilizer application and livestock production,” outlines Leuenberger.
Provisional results of the NH3 intercomparison study held in South East Scotland.
Reducing agricultural emissions requires substantial governmental subsidies, making the assessment of their effectiveness all the more important, over both the short term and the longer term and with a consistent geographical resolution. The choice of site for measurements is highly important, as the majority of NH3 is deposited close to the source. “The development in the emissions is very slow. The detection of small trends requires measurements realised with high precision and low measurement uncertainty,” continues Leuenberger. “Moreover, the datasets have to be coherent over long time periods and consistent in space in order to be comparable. Regular calibration of the instruments with SI-traceable reference gas mixtures, or by comparison to an instrumental standard, is imperative in order to achieve the required accuracy of the data.” “The role of metrology is to provide the measurement community with the link to the SI via traceable reference material, and thus to allow for the comparability of the data obtained with different methods and at different times and places. MetNH3 was realised in the framework of the European Metrology Research Programme EMRP to find solutions for the problems with NH3 reference materials (Pogány et al. 2016).”
Laser-based methods
Many new laser-based methods for measuring atmospheric NH3 have been developed over the years. The majority of measurements are currently made using passive samplers for reasons of cost-efficiency, even though indirect quantification is associated with large uncertainties. The alternative approach of applying extractive, laser-based systems operating at infra-red wavelengths is much more expensive, though it does bring benefits. “The ambient air is analysed at real-time resolution at higher quality – i.e. at lower levels of uncertainty compared to passive samplers. In addition, these analysers can be calibrated with reference gas mixtures, making the measurements SI-traceable. Both passive sampling and extractive systems were investigated in the project,” continues Leuenberger. “The chemical uptake rates of various types of passive samplers have been re-evaluated in a test chamber (CATFAC) using project-developed NH3 RGMs. This will lead to lower levels of uncertainty associated with measurements of these types in the future, because the new findings are integrated in a European standard document.”
“Furthermore, the project managed to realise a so-called optical transfer standard (OTS) for NH3. This is an IR spectrometric analyser performing absolute NH3 amount fraction measurements. Such instrumental standards do not rely on a calibration with NH3 gas standards but use stable molecular parameters (so-called spectral line parameters) to describe the light absorption properties of ammonia and to deduce the amount fraction in air samples. This opens a complementary route to SI-traceability and instrument calibration, and circumvents difficulties in the temporal stability of the NH3 reference gas standards normally used for instrument calibration.”
Where not all of the relevant input parameters of an instrument are very stable and traceable to the SI, optical analysers ought to be calibrated regularly with SI-traceable reference gas mixtures to detect potential drifts and instrumental offsets in measurements over time. In the course of such tests with reference gas mixtures, a non-negligible cross-sensitivity to water of a commercial optical NH3 analyser was detected (Martin et al. 2016). In collaboration with the instrument manufacturer, a water correction algorithm has been developed, which has now been incorporated in all the new NH3 analysers.
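The principle behind such an instrumental standard can be sketched in a few lines. The following is a purely illustrative Beer-Lambert calculation; the spectroscopic values are placeholder magnitudes, not the project’s actual parameters:

```python
# Illustrative sketch of how an optical transfer standard deduces an NH3
# amount fraction from spectral line parameters (Beer-Lambert law), with no
# gas calibration. All numerical values are placeholder magnitudes.

def nh3_amount_fraction(absorbance, line_strength, line_shape_peak,
                        path_length_cm, temperature_k, pressure_pa):
    """Return the NH3 amount fraction (mol/mol) from a measured peak absorbance.

    absorbance      -- dimensionless peak absorbance, A = ln(I0/I)
    line_strength   -- spectral line intensity S, cm^-1/(molecule cm^-2)
    line_shape_peak -- peak of the normalised line-shape function, cm
    path_length_cm  -- effective optical path length, cm
    """
    K_B = 1.380649e-23  # Boltzmann constant, J/K
    # Total number density of the air sample (ideal gas), molecules per cm^3
    n_total = pressure_pa / (K_B * temperature_k) * 1e-6
    # Beer-Lambert: A = S * g(nu0) * n_NH3 * L  =>  n_NH3 = A / (S * g * L)
    n_nh3 = absorbance / (line_strength * line_shape_peak * path_length_cm)
    return n_nh3 / n_total

x = nh3_amount_fraction(absorbance=1.5e-3, line_strength=2.0e-19,
                        line_shape_peak=1.5, path_length_cm=1.0e5,
                        temperature_k=296.0, pressure_pa=101325.0)
print(f"{x * 1e9:.1f} nmol/mol")  # ~2.0 nmol/mol, an ambient-level reading
```

In practice, temperature, pressure and the line-shape parameters each carry their own uncertainty, which is why such analysers should still be checked regularly against reference gas mixtures, as noted above.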
Reference gas mixtures
The previous paragraph highlights the importance of SI-traceable reference gas mixtures (RGM). “The project was, however, strongly motivated by significant discrepancies between different NMIs depending on the techniques applied to prepare NH3 reference gas mixtures and associated uncertainties,” says Niederhauser. RGMs have to be available at ambient air NH3 amount fractions (0.5–500 nmol mol⁻¹, where 1 nmol mol⁻¹ corresponds to 10⁻⁹ mol per mol, also referred to as ppb, parts per billion), or dynamically diluted to those fractions, in order to cover the measurement range of the analysers which are to be calibrated. Moreover, the required relative expanded uncertainty is U(NH3) ≤ 3 percent (k = 2, i.e. at the 95 percent confidence interval). Yet RGMs produced gravimetrically in pressurised stainless steel or aluminium cylinders are subject to the aforementioned adsorption of NH3 on their surfaces (Vaittinen et al. 2014). Thus, NH3 amount fractions released from the cylinder are lower than their initially assigned gravimetric value. As pressure in the cylinder decreases over time, adsorbed molecules are desorbed and NH3 amount fractions increase. This instability over time results in high uncertainty in the RGM. Moreover, the mixtures in cylinders are commercially available at amount fractions NH3 ≥ 20 µmol mol⁻¹ with U(NH3) ≥ 3 percent, i.e. at amount fractions 40 to 40,000 times higher than in the atmosphere, so they require dynamic dilution, further adding to the uncertainty of the resulting RGM. MetNH3 has therefore investigated surface materials which show reduced adsorption of NH3 molecules. Apart from polymer surfaces, a silica-based coating, commercially applied by chemical vapour deposition, has shown outstanding adsorption-minimising properties when applied to stainless steel cylinders, as well as to other gas-wetted stainless steel surfaces. This considerably increases the stability of the RGM, with NH3 = 10 µmol mol⁻¹, and reduces the relative expanded uncertainty of the RGM to U(NH3) ≤ 1 percent. Furthermore, stabilisation times of measurements are significantly reduced, due to considerably decreased adsorption on instrument inlets and cavities. An altogether different approach to producing reference gas mixtures avoids the problem of adsorption losses by dynamically generating NH3 at ambient amount fractions in real-time.
Photo of the test chamber (CATFAC). © Nicholas A. Martin, NPL.
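To put the dilution requirement in figures, here is a back-of-envelope sketch; the uncertainty values are invented for illustration, and the quadrature sum assumes the error sources are independent:

```python
from math import sqrt

# Diluting a commercial 20 µmol/mol cylinder mixture down to an ambient-level
# 5 nmol/mol working standard (illustrative values only).
cylinder_fraction = 20e-6   # mol/mol
target_fraction = 5e-9      # mol/mol
print(f"dilution factor: {cylinder_fraction / target_fraction:.0f}x")  # 4000x

# Each dynamic dilution stage adds flow-measurement uncertainty. With two
# stages of hypothetical 0.5% relative standard uncertainty each, combined
# in quadrature with a 1.5% parent-mixture uncertainty:
u_parent, u_stage = 0.015, 0.005
u_total = sqrt(u_parent**2 + 2 * u_stage**2)
print(f"relative standard uncertainty: {u_total:.2%}")  # ~1.66%, ~3.3% expanded (k=2)
```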
MetNH3 field intercomparison at CEH Edinburgh, Scotland.
At a glance
Full Project Title
Metrology for ammonia in ambient air (MetNH3)
W: http://metnh3.eu
Project Partners
8 European metrology institutes (METAS, BAM, DFM, MIKES VTT, NPL, PTB, VSL)
2 Researcher Excellence Grant holders (Dr. Christine Braban, NERC CEH Edinburgh, UK; Dr. Olavi Vaittinen, University of Helsinki, Finland)
Project Funding
MetNH3 is a Joint Research Project (JRP) ENV55 running for three years from 1st June 2014 under the European Metrology Research Programme (EMRP). The EMRP is jointly funded by the EMRP participating countries within EURAMET and the European Union.
Publications
Vaittinen et al. 2014. DOI:10.1007/s00340-013-5590-3
Pogány et al. 2016. DOI:10.1088/0957-0233/27/11/115012
Martin et al. 2016. DOI:10.1007/s00340-016-6486-9
Contact Details
Daiana Leuenberger, Ph.D., Research scientist
E: daiana.leuenberger@metas.ch
Bernhard Niederhauser, Head of Laboratory / Coordinator MetNH3
E: bernhard.niederhauser@metas.ch
Laboratory for Gas Analysis, Federal Institute of Metrology METAS
Lindenweg 50, 3003 Bern-Wabern, Switzerland
W: www.metas.ch/gases
Daiana Leuenberger, Ph.D. (Left) Bernhard Niederhauser (Right)
Daiana Leuenberger is a research scientist in the gas analysis laboratory at METAS, a position she has held since 2014. Her tasks are primarily related to the advancement and development of techniques and infrastructure in the dynamic generation of reference gas mixtures for reactive compounds at atmospheric amount fractions. Bernhard Niederhauser is head of the gas analysis laboratory at METAS. He holds deep expertise in high accuracy low inert gas volume flows and the dynamic preparation of traceable low concentration reactive gas mixtures, as well as primary ozone measurements.
Summary of the mean of the reported NH3 concentrations for diffusive and pumped samplers tested in the atmosphere test chamber CATFAC, expressed as a percentage deviation from the reference values.
NH3 stored in a tube in its pure form permeates through a selective polymer membrane, as a function of temperature and pressure, into a precisely controlled flow of NH3-free matrix gas. The amount of NH3 added to the matrix gas is determined by continuous weighing of the mass loss of the permeation tube (of the order of ng min⁻¹, i.e. 10⁻⁹ g min⁻¹) in a magnetic suspension balance. The initial NH3 mixture can then be diluted in further steps to the required amount fractions. The drawback of this method is that the RGMs cannot be preserved for on-site calibrations of instruments in measurement stations. To fill this void, MetNH3 has produced a reference gas generator combining the permeation method with two dilution steps for the on-site production of RGMs at NH3 amount fractions of 0.5–500 nmol mol⁻¹ with U(NH3) ≤ 3 percent (k = 2).
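The arithmetic of the permeation method is simple enough to sketch. In this hypothetical example, the permeation rate, matrix flow and dilution factor are invented, and the molar volume assumes ideal-gas behaviour at 0 °C and 101.325 kPa:

```python
M_NH3 = 17.031   # g/mol, molar mass of ammonia
V_M = 22.414     # L/mol, ideal-gas molar volume at 0 degC, 101.325 kPa

def amount_fraction_nmol_mol(permeation_ng_min, matrix_flow_l_min, dilution=1.0):
    """NH3 amount fraction after mixing the permeated NH3 into the matrix flow."""
    n_nh3 = permeation_ng_min * 1e-9 / M_NH3      # mol of NH3 per minute
    n_matrix = matrix_flow_l_min / V_M            # mol of matrix gas per minute
    return n_nh3 / n_matrix / dilution * 1e9      # nmol/mol

# A 250 ng/min permeation tube into 1 L/min of NH3-free gas, diluted 100-fold:
print(f"{amount_fraction_nmol_mol(250.0, 1.0, dilution=100.0):.1f} nmol/mol")  # ~3.3
```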
Inter-comparison event
Two inter-comparison events were held under field conditions in Scotland in the summer of 2016 to assess the different developments of the project, as well as to offer collaborators and stakeholders the opportunity to test the performance of their instruments and passive samplers. The different instruments were calibrated beforehand, with the RGMs prepared in cylinders as well as with the mobile reference gas generator, with the aim of ensuring that they all generated consistent measurements and performed well. “In this kind of exercise it can be seen which instruments manage to measure with the required precision, detection level and within a reasonable level of uncertainty,” says Leuenberger. “Fertiliser was applied to the field and NH3 concentrations increased correspondingly, giving researchers the opportunity to assess the instruments at both background and elevated conditions. We were interested in seeing whether the analysers could capture both concentrations as well as the temporal resolution of the event.” “At a different site nearby, a second inter-comparison was held, exposing passive samplers, which had previously been tested in the test chamber, two times over the course of four weeks. This allowed for the validation of the newly developed diffusive uptake rates under field conditions.” This work formed an important part of the project’s third workpackage, which centred around the validation of field measurement techniques. The wider goal in the project has been to improve metrological traceability for measuring NH3 in air, which will enable rigorous monitoring of the impact of efforts to reduce emissions. “It would be very positive if the results of the project were taken up by the respective end-user communities,” says Leuenberger. The project is also involved with organisations at both the European and national levels. The moment of truth for researchers developing reference gas mixtures at individual NMIs will come next year when, in a key comparison, they will assess their concordance with the SI at the highest level. “We’ve seen in the past that the uncertainty estimations with respect to different realisations of primary gas mixtures were too optimistic,” outlines Niederhauser. “The effects of adsorption were not taken into account and this resulted in significant discrepancies between mixtures prepared with different methods, though within the respective methods there has been good agreement. Considering all the lessons learnt since and from the MetNH3 project, the potential for much better agreement is high.”
A deeper picture of biological communities
Many different factors affect the distribution of plants, insects and other species in the natural world, including not only their environmental requirements, but also their interactions with other species. The SESAM’ALP project aims to develop improved methods of modelling biological communities, as Professor Antoine Guisan explains
The dispersal, organisation and distribution of species in the natural world depends on many different factors. For a plant, for example, temperature, light, water and nutrient availability are all important. Researchers in the SESAM’ALP project aim to develop improved models of plant and insect distribution, taking into account all the available information. “We try to incorporate all known influences on species distribution, and from that to reconstruct wider communities,” says Professor Antoine Guisan, the project’s Principal Investigator. This work builds on the SESAM (Spatially Explicit Species Assemblage Modelling; see Fig. 1) framework, a community modelling scheme, effectively a series of modelling steps. “In the SESAM’ALP project we are aiming not only to test the SESAM framework, but to improve it,” outlines Professor Guisan. “Over the past two years we’ve been working on these models for predicting the distribution of individual species, and from that moving to biological community predictions.”
Environmental requirements
This starts from an understanding of a particular species’ environmental requirements, what biologists call its environmental niche. While the environmental conditions are clearly important in terms of the development and survival of a species, Professor Guisan says there are also other factors to consider. “For example biotic interactions – interactions between species – will influence whether a species will be found in a given location, given the presence of other species. A specific location might be environmentally suitable for a species, but it might not be a very good competitor, and other species could exclude it,” he explains. Another key factor to consider is the dispersal ability of a species. “Typically you find species within a specific distance from where they originated,” says Professor Guisan. “These are the three main influences on species distribution, and they are all included in the SESAM framework.”
Fig 1: SESAM framework: Journal of Biogeography (2011) 38 / Guisan & Rahbek. © 2017 John Wiley & Sons, Inc.
A number of different methods have been proposed to build a deeper picture of ecological communities, among the most widely used of which is S-SDM (Stacked Species Distribution Models). Each species is modelled individually and a map of their distribution in a specific area is produced; scientists can then look to bring these maps together. “You can stack all the maps on each other, so that at the end you can accumulate species predictions for every site in a given area,” explains Professor Guisan. This approach however ignores the biotic interactions, positive or negative, between species; Professor Guisan and his colleagues aim to integrate more information to reflect this complexity. “There might also be some further constraints on a given site, on the biological community,” he says. “For example if there is limited energy and nutrients at a site because you are in a cold climate system, then this might not allow as many species as in a tropical system. It may be that if there is low energy in a system there will be fewer species.” This, however, is still a matter of debate (see below). Nevertheless, stacking species distribution models can lead to overestimates of the number of species in a given site, so researchers are also looking to introduce top-down constraints (Fig. 2). “We can predict species richness in space using macroecological models (MEM), then use that to constrain the S-SDM prediction (Fig. 2 to 4). For example, your S-SDM prediction may predict 100 species in a site, and your macroecological model (MEM) predicts there would be 50,” says Professor Guisan. It might be the case that 50 species will not be able to survive in that specific site, so rules need to be identified to exclude those species. “These rules, the so-called ecological assembly rules, should be based on known interactions, like competition, predation, or other interactions that could lead to the removal of a species from a site for instance,” continues Professor Guisan. The ecological assembly rules are an important dimension of the SESAM framework. The S-SDM is developed, which is then constrained with the MEM; on this basis researchers use assembly rules, such as the probability ranking rule (PRR; Fig. 3 and 4), to remove species that would not be in a real-life assemblage. Professor Guisan says that developing these rules is a major challenge. “It’s extremely difficult to quantify these rules,” he explains. While insights have been gained on some species from experiments, there is still much left to learn, so researchers have also been observing ecological communities in natural landscapes, such as the Swiss Alps. “We have been observing different communities, including different types of organisms and plants,” says Professor Guisan. “We are observing biological communities at a number of sites, and from this co-occurrence of species, we try to infer assembly rules. So from this we can say for example that across all the sites we have been surveying, two specific species always co-occur together, or never co-occur together.”
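The chain of steps described above (stack the per-species probabilities, constrain site richness with a MEM, then apply the probability ranking rule) can be illustrated with a minimal sketch on random placeholder data; this is not the project’s code:

```python
import numpy as np

rng = np.random.default_rng(0)
n_sites, n_species = 5, 10

# Step 1 (S-SDM): per-site occurrence probabilities from individual SDMs
sdm_probs = rng.random((n_sites, n_species))

# Step 2 (MEM): site richness from a macroecological model; faked here as a
# deliberately stricter value than the raw probability sum
mem_richness = np.maximum(1, np.round(sdm_probs.sum(axis=1)).astype(int) - 2)

# Step 3 (PRR): keep, at each site, only the mem_richness[i] species with the
# highest predicted probabilities
assemblage = np.zeros((n_sites, n_species), dtype=bool)
for i in range(n_sites):
    top = np.argsort(sdm_probs[i])[::-1][:mem_richness[i]]
    assemblage[i, top] = True

print(assemblage.sum(axis=1) == mem_richness)  # True everywhere by construction
```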
Biotic interactions
This latter case might be the result of a biological interaction, or it might just be because the two species do not occur in the same type of environment. Researchers aim to tease apart the different influences on species distribution and derive rigorous rules of biotic interactions from the available field data. “When species tend to co-occur, it might be that two species both like the same environment and are not in competition, or there could be facilitation between them, that one helps the other. Alternatively the opposite might be the case – two species may seem to have similar environmental requirements, but they very rarely co-occur together,” outlines Professor Guisan. This data will then be incorporated in the SESAM framework, using high-resolution environmental maps. “It’s almost a case of refining and improving the initial S-SDM map by adding these two further steps, the MEM and the assembly rules,” continues Professor Guisan. “We have been testing different approaches to these steps (Fig. 3 and 4).”
Figure 2. Predicted and observed plant species richness (SR) at 912 plots in open habitats in the western Swiss Alps (260 species) at a resolution of 25 × 25 m. (a) Estimated richness by macroecological modelling (MEM) using five variables – growing degree-days above 0 ºC, moisture index, global solar radiation, slope and topographic position – and the average of four modelling techniques [generalized linear modelling (GLM), generalized additive modelling (GAM), gradient boosting model (GBM) and random forest (RF)]; see Dubuis et al. (in press). (b) Estimated richness by stacked species distribution modelling (S-SDM) using the same five variables and generating species distributions for each species using the average of the same four techniques used for MEMs. (c) Observed richness. (d) Scatterplot of MEM versus observed richness. (e) Scatterplot of S-SDM versus observed richness. (f) Scatterplot of MEM versus S-SDM richness. In scatterplots (d) to (f), the dotted red line represents the 1:1 relationship, and the plain blue line represents the regression line of an ordinary least squares (OLS) regression across the cloud of points. (g) Histogram of MEM minus observed richness values. (h) Histogram of S-SDM minus observed richness values. (i) Histogram of S-SDM minus MEM richness values. In histograms (g) to (i), the vertical red line represents zero difference. Maps were generated and provided by Anne Dubuis (Dubuis et al., in press). / Guisan & Rahbek. © 2017 John Wiley & Sons, Inc.
The species richness of a site is not the only major consideration in terms of constraining ecological communities and improving predictions, and researchers have investigated other possibilities. Along with considering the number of species at a site, Professor Guisan and his colleagues have also considered constraining predictions by functional – or trait – diversity. “For example think of a high-altitude site – tall trees or plants are less likely to survive there. So if you have both measurements of the height of species and predictions of the expected height for that type of environment, you will be able to exclude all species that are maybe too tall to survive there, as it’s too windy or cold,” he says. This approach was not as successful as species richness, however, in terms of improving the predictions. “We always compare the predictions we make to independent observations. We build the model, we exclude the data that is not used to build the model, and we test the predictions on that independent data,” outlines Professor Guisan. From these tests researchers learned that, unlike richness constraints, the inclusion of trait constraints did not generally improve the quality of predictions of ecological communities, and in some cases even had a negative impact. This might be due to the limited number of traits available, which mainly exist for above-ground characteristics (e.g. no trait measurements are currently available for roots), but it might also be partly caused by the difficulty of modelling the distribution of some individual species. “Typically, generalist species are more difficult to model than specialist species. If a species has a very clearly defined ecology, it’s easier to model than if it’s present in a lot of places and we can’t really identify the factors that drive it,” explains Professor Guisan.
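A trait-based rule of the kind tested in the project (cf. the ‘trait range’ rule in Fig. 4d) can be sketched in the same spirit, again with invented numbers:

```python
import numpy as np

# Exclude species whose measured trait falls outside what the environment is
# predicted to support - here, maximum plant height at a high-altitude site.
species_height_m = np.array([0.1, 0.4, 2.0, 8.0, 15.0])  # per-species trait
predicted_max_height_m = 1.5                              # trait ceiling for the site

viable = species_height_m <= predicted_max_height_m
print(viable)  # [ True  True False False False] - the tall trees are excluded
```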
There is still scope for further improvement, and alongside the work of his group in the Swiss Alps, Professor Guisan says the SESAM framework is also being tested elsewhere, probing its wider relevance. “The concept behind the framework is universal; in theory it could work everywhere,” he outlines. This is not to say that the theoretical foundations of the SESAM framework are beyond question though. The idea that there are constraints on communities, in terms of the number of species that can be supported in a specific community, is still a matter of intense debate. “Species richness as a constraint is often discussed along with the idea of saturation. Is there a limit to the number of species that can coexist in a given community?” says Professor Guisan. Some scientists argue that there is no limit, as biological invasions tend to show that new species can always invade a community, but others take a different view. “Others say that there is saturation, that while a species could invade, over time the community will come back to its initial number, and maybe the invasive species will replace a previously native species,” continues Professor Guisan. “We recently published a paper in Trends in Ecology & Evolution about how to deal with this saturation question in models.” The flexibility of the SESAM framework is an important attribute in this respect, as a constraint like the MEM can be added or removed as required. The wider goal in this research is improving the accuracy of predictions, which holds clear importance in terms of assessing the likely impact of climate and other environmental changes. “In SESAM’ALP we have been working to improve the models for individual species groups, and the SESAM framework,” says Professor Guisan. “We are also working on the EcoGeoIntegrAlp project, in which we aim to improve plant distribution predictions by integrating further environmental data. For that, we are working on a multidisciplinary basis, with environmentalists, geologists, hydrologists and others. We aim to predict two main ecosystem services – one is the aesthetic quality of the landscape, which is a very pressing question here in the Swiss Alps. The second is hydrological runoff. For example, if forests cover a larger surface area in future, how will this affect water flow down mountains to rivers, and thus water provision?”
At a glance
Full Project Title
Challenges in simulating alpine species assemblages under global change (SESAM’ALP)
Project Objectives
The knowledge gained at the end of the project, and the new innovative approaches, tools and datasets delivered, should foster important advances in our capacity to model and predict communities across entire landscapes. In particular, it should allow us to partially address the question: will plant and insect communities evolve into novel assemblages under global changes?
Project Funding
Swiss National Science Foundation, SESAM’ALP project, grant nr 31003A1528661, 480,000 euros.
Contact Details
Antoine Guisan, Full Professor
Head of the Spatial Ecology Group (ECOSPAT)
With affiliation to: Dept. of Ecology and Evolution (DEE) & Institute of Earth Surface Dynamics (IDYST), University of Lausanne (UNIL), 1015 Lausanne, Switzerland
T: +41 692 42 54 / 36 15
E: antoine.guisan@unil.ch
Spatial Ecology Group at the University of Lausanne (ECOSPAT) website:
W: http://www.unil.ch/ecospat
SESAM’ALP project website:
W: https://www.unil.ch/ecospat/en/home/menuinst/projects/main-projects/sesamalp.html
EcoGeoIntegrAlp interdisciplinary project website:
W: http://wp.unil.ch/integralp
Figure 3. Boxplots comparing unconstrained stacked species distribution model (bS-SDM) predictions to results from the ‘probability ranking’ rule and random tests when applied constraining richness by the sum of probabilities from SDMs (PRR.pS-SDM and rand.pS-SDM, respectively) or by macroecological models (PRR.MEM and rand.MEM, respectively) (a, b, c). The metrics utilized in the comparison are: species richness error, i.e. predicted SR – observed SR (first column); prediction success, i.e. sum of correctly predicted presences and absences divided by the total species number (second column); and Sørensen index, i.e. a statistic used to compare the similarity of two samples (third column). Journal of Biogeography (2015) 42 / D’Amen et al. © 2017 John Wiley & Sons, Inc.
Professor Antoine Guisan
Antoine Guisan is Full Professor of Ecology at the University of Lausanne. His main focus is on spatial predictive modelling of plant and animal distributions, and the communities they form. His group is developing models for various applied purposes, such as rare species management, assessing climate change impacts and anticipating biological invasions.
Figure 4. Predictions of species richness for the whole study area produced by (a) the unconstrained stacked species distribution model (bS-SDM), and by the application of the SESAM framework implemented with (b) the ‘probability ranking’ rule implemented with the sum of probabilities from SDMs (pS-SDM), (c) the ‘probability ranking’ rule implemented with the richness estimation by the macroecological model (MEM) and (d) the ‘trait range’ rule (using the combination of the three traits as constraints). Journal of Biogeography (2015) 42 / D’Amen et al. © 2017 John Wiley & Sons, Inc.
Aftermath of the fight: The consequences of out-group conflict
Professor of Behavioural Ecology, Andy Radford, heads up a five-year project at the University of Bristol, investigating the impact of out-group conflict on social animals. The study seeks to uncover the short- and long-term effects for groups of confrontations with outsiders
Whilst there are many published studies on the effects of fighting within groups of animals, the lasting consequences of conflicts with outsiders (rival individuals or groups) have received relatively little scientific attention. However, these consequences of out-group conflict have deep relevance for our understanding of biology, anthropology, economics and psychology, as well as the social and political sciences. In a wide range of social species, including various insects, birds, fish and of course humans, there is an investment into acquiring and defending resources to survive. Battles with out-groups over these resources have knock-on effects for all involved in the immediate aftermath, but also in the longer term and for future generations. Andy Radford and his team, in collaboration with other universities, are conducting experiments with wild populations of dwarf mongooses and macaques, captive cichlid fish, and humans. Combining these with theoretical approaches, they endeavour to reveal the effects of out-group conflict on: individuals, within-group interactions and decision-making, stress, social and reproductive health, and the evolution of weaponry, fortification, cooperation and punishment.
Introducing intruders
“Experimental manipulations allow us to isolate the factors that are important, by controlling potential confounding effects, and thus to work out the true impacts of out-group conflict,” explains Radford. “Using dwarf mongooses and cichlids as model study systems gives us a complementary approach. We work with the mongooses in their natural habitat, so you gain ecological validity but lose a level of control over everything. Whereas an advantage with the captive cichlid experiments is that there is more control of their environment and what goes on around them.
“With the cichlids, we give a group its own territory in the form of a tank. We then set tanks in ‘neighbourhoods’ so certain groups can see each other. The neighbours are rivals and they can recognise individuals visually, although they are in separate water. We then carry out simulated intrusions: you can put a transparent divider in a tank and introduce an intruder whenever you want, and then take the intruder away. What we are currently interested in is whether there are changes in behaviour among the territory-holders in the immediate aftermath of an intrusion, and whether that is affected by the identity and the number of intruders.”
The research is in its early days, being one year into the five-year study, but the aim is not only to consider the short-term effects of single intrusions, but also the long-term implications of repeated intrusions.
The impacts of fighting out-groups
Out-group conflict has a range of potential consequences for an animal. There can be direct effects such as mortality, injury, or loss of resources or reproductive position. Lesser direct impacts arise from the investment of time and energy during the contest, which are traded off against finding food or parental care. Moreover, a fighter may be rewarded for defending the group territory by those that didn’t take part, while individuals that should have taken part might be punished to try to ensure that next time they make a stand. “It does not have to be you involved in the contest – there are consequences for all individuals in the group, including offspring yet to be born. We can examine this possibility with the cichlids, because they will breed in captivity and so you can count the number of eggs, remove them to take various measurements or even raise them separately to investigate maternal effects. If mothers are stressed, what goes into their eggs might differ, so it’s possible there is an effect on how offspring develop. Moreover, there might be changes in how quickly parents produce the next clutch depending on the frequency of intrusions by outsiders.” Monitoring in this way will show how out-group conflict can have longer-lasting impacts, including those for subsequent generations and, ultimately, evolutionary processes. Such considerations are particularly relevant in a world of shrinking habitats where animal groups are competing for limited resources.
How does this relate to humans?
Whilst the study is primarily focusing on non-human animal behaviour post-conflict, there are undeniable parallels with humans. As a warring society, can the research be relevant for understanding human nature better? “Fundamentally, wars are about resources such as oil fields or food; even religious sites can be viewed as a resource – it’s the same principle,” observes Radford. “If you talk about primitive human warfare in ancestral societies then there are lots of parallels, but modern warfare is complicated by the fact commanders are nowhere near the frontline – they direct pawns and long-range weapons from far away.” The study has more common ground with conflict between smaller groups than whole nations. For instance, the team are starting collaborative work with psychologists investigating street fights after pubs close or instances of shoplifting (where the criminal is the outsider, and shop workers and customers have different levels of investment in the resource). “We’re also starting to consider work on mothers and toddlers, and the cliques that form in group settings. There can be conflict between toddlers fighting over a toy or a mother being nasty to another mother, and we’re interested in the aftermath effects and interactions.” Conflict with out-groups has significant effects for us, as much as any social animal, so the project’s work will produce data and knowledge that could have a wide range of applications.
At a glance
Full Project Title
Consequences of out-group conflict (OutGroup)
Project Objectives
The aim of this project is to determine the proximate and ultimate consequences of a fundamental but neglected aspect of sociality: out-group conflict. In a wide range of social species, from ants to humans, group members invest considerable defensive effort against individual intruders and rival groups. The lasting impacts of these conflicts with conspecifics are poorly understood. We will integrate empirical and theoretical approaches to uncover the effect of out-group conflict on: (i) individual behaviour, within-group interactions and group decision-making; (ii) steroid hormones that underlie stress, social behaviour and reproduction; (iii) variation in reproductive success arising from maternal investment and offspring care; and (iv) the evolution of societal structure, cooperation and punishment among group-mates, and weaponry and fortification.
ERC Team Members
Dr Ines Goncalves, Dr Julie Kern, Amy Morris-Drake and Dr Susanne Schindler
Project Funding
ERC Consolidator Grant
Contact Details
School of Biological Sciences, University of Bristol
Life Sciences Building, 24 Tyndall Avenue, Bristol BS8 1TQ
T: +44 117 394 1197
E: andy.radford@bristol.ac.uk
W: http://www.bio.bris.ac.uk/research/behavior/Vocal_Communication/research.html
Professor Andy Radford, Ph.D
Photograph by Shannon Benson
Professor Andy Radford, Ph.D did his first degree at the University of Cambridge (Girton College, Zoology BA Hons, 1996), then completed a Masters at the University of Oxford (New College, Biology: Integrative Bioscience MSc, 1998). He returned to the Department of Zoology (and Girton College) in Cambridge to conduct his PhD under the supervision of Professor Nick Davies FRS, which he obtained in 2003, when his thesis was examined by Professors Tim Clutton-Brock FRS and Ben Hatchwell. By that stage, he was a Junior Research Fellow at Girton College, a position he occupied until 2005. In 2005 he was awarded a BBSRC David Phillips Research Fellowship, which he moved to Bristol in 2006 to take up a proleptic lectureship in the School of Biological Sciences. He was promoted to a Readership in 2012 and became Professor of Behavioural Ecology in 2016.
Laying the foundations for research collaboration in the Caucasus
There is a long history of scientific collaboration in the Caucasus region, but geopolitical shifts since the early ‘90s have led to changes in international relations. With countries in the region facing many common challenges, it’s important to build strong research partnerships and share expertise with practitioners, as Professor Jörg Balsiger explains
The Caucasus experienced dramatic geopolitical shifts following the fall of the Soviet Union, which also affected scientific infrastructures and research relationships in the region. While there is a long history of scientific collaboration in the region, this was mainly during the Soviet era, and since the early ‘90s international relationships have changed. “When the Soviet Union collapsed in the early ‘90s, three countries were created in the Caucasus – Armenia, Azerbaijan and Georgia. Because of these new countries’ ambition to assert their independence, their relations with Russia changed,” explains Professor Jörg Balsiger. Based at the University of Geneva, Professor Balsiger is the coordinator of a project aiming to support research collaboration in the Caucasus, working with scientific institutes across the region. “While the countries that are traditionally thought of as part of the Caucasus are the Russian Federation, Armenia, Azerbaijan and Georgia, we’re working at the level of the Caucasus ecoregion, which includes Iran and Turkey,” he says. This work is designed to support the Scientific Network for the Caucasus Mountain Region (SNC-mt), which was formed in the Georgian capital Tbilisi in the Spring of 2013. The network was established in recognition of the need for closer collaboration on transnational issues such as water management, cultural heritage conservation and socioeconomic development which affect the wider Caucasus region. This was reflected in the composition of the project. “We decided from the beginning that we wanted to focus
on regional issues in the Caucasus, so we wanted to include the surrounding countries,” continues Professor Balsiger. A number of these countries didn’t have the resources required to maintain existing scientific infrastructure following the collapse of the Soviet Union, so in some cases universities and research laboratories went into decline, limiting the scope for scientific collaboration. The project aims to help address this by bringing together researchers from scientific institutions across the region to identify key issues and common challenges. “We aim to identify priority areas of research, to summarise the state of knowledge in the Caucasus. This would then become a tool to be used by scientists, to demonstrate to both potential funders and scientists elsewhere that they have a regional vision, that they do talk to each other,” says Professor Balsiger.
Caucasus region
The region itself is extremely mountainous and is home to numerous rare plant and animal species, yet this diversity is under threat for two main reasons. One is the issue of rural poverty, which is leading younger people to move away from the area and intensifying pressure on resources, and the other is the impact of climate change. “Mountainous regions are known to be more vulnerable to the impact of climate change than others. The topological complexity means that plant and animal species have less room to move,” explains Professor Balsiger. Supporting sustainable mountain development in the region is a key priority, taking into consideration not only the environmental perspective, but also the local economy. “If you come at it from a purely environmental perspective, you might say let people move away, then forests and animal populations can recover. But then you might not have the workforce and local knowledge necessary to maintain the cultural landscape,” points out Professor Balsiger. Research in this kind of area is by nature multi-disciplinary in scope, underlining the wider importance of close scientific collaboration. Professor Balsiger believes the project can have a significant impact in these terms. “The project helps scientists to adapt to a changing world by improving organisational capacity with respect to ‘modern’ approaches to teaching, to research collaboration, to the enterprise of science itself and the position of science in society,” he says. One of the key activities within the project is the establishment of an online information and collaboration platform, which provides a basis for continued scientific
cooperation between countries in the region. “We talk about trans-disciplinary science, where knowledge is co-produced with stakeholders. We try to promote that, and to do it in a way that builds bridges between scientists and practitioners in the Caucasus,” says Professor Balsiger. There are still some major challenges to deal with in terms of encouraging closer scientific collaboration across the Caucasus, however, one of which is the region’s linguistic diversity. Over forty different languages are spoken in the region, and unlike their parents, the younger generation no longer systematically learns Russian at school. “While Russian remains an important common language, it is only common to those countries that are traditionally thought of as part of the Caucasus, namely Russia, Armenia, Azerbaijan and Georgia,” outlines Professor Balsiger. A summer school for graduate students was organised within the project, which has helped to bridge cultural divides, establish areas of common interest and enable scientists to establish and maintain strong research networks. While continuing geopolitical differences in the region can limit travel between countries, Professor Balsiger believes these challenges can be overcome. “There are serious geopolitical differences in the region. They are always on the horizon, but they don’t affect us in the project directly, as the people who work with us know and trust each other,” he says.
Research collaboration
The project is part of the longer-term goal of supporting research collaboration in the Caucasus, and beyond the scope of the current initiative Professor Balsiger is keen to build further on the foundations that have been laid. A further project is currently being negotiated, with potentially a far bigger budget, which will centre on adaptive capacity. “One thematic focus will be disaster risk reduction,” outlines Professor Balsiger. This is a major interest of the Swiss authorities who are set to fund the research, as well as other international partners such
as UN Environment; Professor Balsiger hopes the project will help broaden the base of expertise across the region. “Over the course of the project we will want to ensure that modern disaster risk reduction management approaches are well known throughout the region, and are streamlined throughout Georgia where they have been piloted. We’ll also work in Armenia and Azerbaijan, and hopefully in Russia, Turkey and Iran,” he says. “We’ll aim to ensure that our scientific network can be a source of expertise and trans-national links between experts and practitioners.” The initiative also aims to make the Caucasus more visible globally, and to help change perceptions of the region. While the Caucasus has a rich cultural heritage and is home to many rare species, like snow leopards, the region is more commonly associated with political instability. “The Caucasus is primarily known internationally as a site of conflict. We aim to help promote a different view of the Caucasus,” says Professor Balsiger.
At a glance
Full Project Title
Supporting Sustainable Mountain Development in the Caucasus (Sustainable Caucasus)
Project Objectives
The aim of the “Sustainable Caucasus” project is to support scientific institutions in the Caucasus in their modernization efforts in research and teaching, to increase their attractiveness and international competitiveness by improving the overall framework conditions, and to better link research outcomes to development practice.
Project Funding
Swiss National Science Foundation and Swiss Agency for Development and Cooperation (SCOPES Programme)
Project Partners
Jörg Balsiger, University of Geneva (Main applicant) • Raisa Gracheva, Institute of Geography, Russian Academy of Science • Armen Gevorgyan, Institute of Botany, National Academy of Sciences of Republic of Armenia • David Tarkhnishvili, Faculty of Natural Sciences and Engineering, Ilia State University (Georgia) • Joseph Salukvadze, Department of Geography, Tbilisi State University (Georgia) • Ramiz Mammadov, Institute of Geography, Azerbaijan Academy of Sciences • Kamran Shayesteh, Faculty of Natural Resources and Environment, Malayer University (Iran) • Mehmet Somuncu and Hakan Yigitbasioglu, Geography Department, Ankara University (Turkey) • Caucasus Network for Sustainable Development of Mountain Regions (sd-caucasus.com)
Contact Details
Professor Jörg Balsiger
University of Geneva, Department of Geography and Environment
Uni Carl Vogt B308, Boulevard Carl-Vogt 66, 1211 Geneva 4
T: +41 22 379 9453
E: joerg.balsiger@unige.ch
E: info@caucasus-mt.net
W: www.caucasus-mt.net
Professor Jörg Balsiger
Jörg Balsiger is Associate Professor at the Geneva School of Social Sciences and the Institute for Environmental Sciences, as well as Director of the Institute/Hub for Environmental Governance and Territorial Development. His inter- and transdisciplinary research examines the origins, dynamics and prospects of transboundary environmental and sustainable development cooperation.
Improving energy efficiency in buildings is a major priority for the European Union, yet current modelling processes do not accurately reflect consumption. The MOEEBIUS framework will provide the basis for more accurate energy performance assessment, underpinning efforts to improve efficiency and opening up new commercial opportunities, as Dawid Krysiński explains
A new era in the energy performance of buildings
Buildings are responsible for 40 percent of energy consumption and 36 percent of CO2 emissions in the European Union. Given this fact, the EU promotes solutions which reduce energy consumption in the buildings sector. With increasing demand for more energy efficient buildings, the construction and energy services industries are faced with the challenge of ensuring that predicted energy performance and savings are achieved during operation. However, this is very difficult because of a ‘performance gap’, which is caused by occupants’ behaviours and the lack of a ‘magic formula’ that accurately captures the dynamic aspects of building operations. As a result, current modelling processes and simulation tools do not accurately reflect the realistic use and operation of buildings. Due to this, post-occupancy evaluation studies in built and occupied buildings show significant gaps between predicted and actual energy consumption.
MOEEBIUS closes the ‘performance gap’
This underperformance highlights the need for a deeper understanding of the critical underlying performance factors related to the behaviour of occupants, as well as other major active building elements. MOEEBIUS is the answer to these challenges, introducing a holistic modelling approach that focuses on appropriately addressing all sources of
uncertainty and inaccuracy in building and district performance assessment. As Ander Romero, project coordinator, emphasizes, “these highly accurate predictions, simulations and optimization are possible due to implementation of dynamic models reflecting user comfort and overall behaviour in the built environment, enhanced district heating models, novel Indoor Environmental Quality models and short-term weather forecasts.” The MOEEBIUS Framework comprises the configuration and integration of an innovative suite of end-user tools and applications enabling improved building energy performance assessment, precise allocation of detailed performance contributions of critical building components, near real-time building performance optimization, optimized retrofitting decision making and near real-time peak-load management optimization at the district level. Thanks to that, MOEEBIUS is able to deeply grasp and describe real-life building operation complexities in accurate simulation predictions that significantly reduce the ‘performance gap’. This is a key factor for improving energy efficiency in buildings and meeting the European Commission’s 20 percent energy savings target by 2020. MOEEBIUS will also contribute to the better development of Energy Service Companies (ESCOs), which are responsible for different energy solutions, including the design and implementation of energy savings projects, retrofitting and energy conservation.
‘Performance gap’ as a challenge for ESCOs
Current predictions tend to be unrealistically low whilst actual energy performance is usually unnecessarily high. Post-occupancy evaluation studies in built and occupied buildings have demonstrated that the measured energy use can be as much as 2.5 times the predicted energy use (more than 70 percent higher in the retail sector, 100 percent in residential, 150 percent in offices, and over 250 percent in the education sector). In more detail, the ‘performance gap’ generates a consequent gap between payback estimates and techno-commercial Return-On-Investment calculations in ESCO projects, which still use previous energy audits based on simplistic and inaccurate calculations. Due to this, ESCOs are not able to provide customers with appropriate simulations on acceptable paybacks of 2-5 years using low-cost interventions. This constitutes a significant barrier to the development of the ESCO market. In the opinion of Ander Romero, “ESCOs are forced to add installation and commissioning services, project management, man effort, measurement and verification costs to hedge the risks induced by prediction uncertainty and inaccuracy”. It makes many contracts
totally unattractive, also in cases where the ESCO takes over the full implementation of a refurbishment project (from auditing to design and implementation). This introduces extra risks for ESCOs and significantly reduces their profit margins. Through significant reduction of the ‘performance gap’, the holistic approach introduced in MOEEBIUS enhances the ability of ESCOs to guarantee attractive energy savings. “This will, in turn, eliminate the need for the addition of risk-hedging costs on top of pure energy services, consequently increasing the payback attractiveness of energy performance contracts and reinforcing the confidence of customers regarding EPC effectiveness. This is crucial for the growth of the ESCO market, especially at EU level,” says Ander Romero.
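How the performance gap feeds through to payback figures can be illustrated with a deliberately simple calculation; the investment and savings numbers below are invented, and only the 2.5x factor comes from the studies cited above:

```python
def payback_years(investment_eur, predicted_annual_saving_eur, gap_factor):
    """Estimate payback when measured use is gap_factor times the prediction.

    Simplifying assumption: the achievable saving shrinks in proportion
    to the gap between predicted and measured consumption."""
    actual_saving = predicted_annual_saving_eur / gap_factor
    return investment_eur / actual_saving

print(payback_years(100_000, 25_000, gap_factor=1.0))  # 4.0 years as promised
print(payback_years(100_000, 25_000, gap_factor=2.5))  # 10.0 years in reality
```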
Humans at the centre of MOEEBIUS

Energy performance optimization in buildings relies heavily on a deep and comprehensive understanding of the real-life complexities imposed during actual operation. Such complexities span not only physical systems (buildings, districts and their equipment) and weather fluctuations, but also occupants and their behaviours. This is why humans are placed at the centre of all optimization processes applied in MOEEBIUS. Energy performance optimization will use human-centric approaches that allow for continuous assessment of relevant aspects and automated adjustment of heating, ventilation and air conditioning systems, as well as lighting loads. The introduction of a user-driven innovation approach in MOEEBIUS will also ensure the involvement of end-users throughout the whole duration of the project, so as to fully cover their needs and expectations. To achieve this goal, the innovative MOEEBIUS solutions will be validated in real-life conditions over an extensive 20-month pilot roll-out period in a variety of buildings. This evaluation will be performed under different environmental, social and cultural contexts in three dispersed geographical areas: London (UK), Mafra (PT) and Belgrade (RS). Moreover, “MOEEBIUS will establish complete awareness of, and communication with, ESCOs, Maintenance Companies,
Facility Managers and Aggregators, either involved in or affected by the project,” says Ander Romero. To this end, the MOEEBIUS Living Lab involves end-users right from the genesis of a new idea, creating the motivation to share and discuss their experiences, expectations and requirements. This is a collaborative environment where all stakeholders co-create the solutions, leading to natural acceptance by all MOEEBIUS end-users – people who will be empowered not only to test, evaluate and report on their own experience with the MOEEBIUS framework, but above all to live with it, and to smoothly accept and incorporate the MOEEBIUS system into their everyday lives.
At a glance

Full Project Title
Modelling Optimization of Energy Efficiency in Buildings for Urban Sustainability (MOEEBIUS)

Project Objectives
MOEEBIUS introduces a Holistic Energy Performance Optimization Framework that delivers innovative tools which deeply grasp and describe real-life building operation complexities in accurate simulation predictions. The system reduces the “performance gap” and enhances optimization of building energy performance.

Contact Details
Fundación TECNALIA Research & Innovation
Parque Científico y Tecnológico de Bizkaia - C/Geldo, Edificio 700
E-48160 Derio (Bizkaia), Spain
Project coordinator: Ander Romero
E: ander.romero@tecnalia.com
T: +34 946 430 069
W: www.moeebius.eu
Pablo De Agustin
E: pablo.deagustin@tecnalia.com
Dissemination Leader: Dawid Krysiński
E: d.krysinski@asm-poland.com.pl
Dr Pablo De Agustin (Left) Ander Romero (Centre) Dawid Krysiński Ph.D. (Right)
Ander Romero, MASc in Thermal Engineering and MSc Industrial Engineering, is a Project Manager in the Sustainable Construction Division of TECNALIA. He joined TECNALIA in 2007 as a senior researcher in the field of energy efficiency in building design and retrofitting, focusing on energy modelling and the integration of innovative and sustainable solutions to optimize urban and building energy performance. He is currently coordinating several H2020 and FP7 projects related to energy efficiency and demand response. Dr Pablo De Agustin is an energy efficiency researcher at TECNALIA; he has worked on energy efficiency in buildings and the integration of renewable energies since 2011. His professional background includes simulation and experimental work focused on energy efficiency, electric and thermal metering, buildings as energy storage systems, and trigeneration and self-consumption solar heating and cooling systems. Dawid Krysiński, Ph.D., is a Junior Project Manager at ASM – Market Research and Analysis Centre. His activity focuses on the preparation of market and social analyses, exploitation plans, IPR strategies and business models in the field of energy efficiency and green construction practices. Currently, he is involved in several HORIZON 2020 projects led by ASM.
Next generation detectors to look beyond the standard model

The discovery of the Higgs Boson at the Large Hadron Collider (LHC) at CERN represented an exciting step forward in particle physics; now further experimentation is being conducted and searches are being made for new physics beyond the standard model. That will require improved detectors, explains Professor Sir Tejinder Virdee

A high-energy physics experiment is typically shaped like a cylindrical onion, with four principal layers, which together allow researchers to measure the energy, direction and identity of the particles produced in a collision. “The particles go through these layers: the first one, submerged in a magnetic field, measures the curvature of charged tracks and the next two measure the energy of particles. The first of the energy-measuring layers is called the electromagnetic calorimeter, the second layer is the hadronic calorimeter,” explains Professor Sir Tejinder Virdee. “The particles that go through these – like electrons, photons and some other charged and neutral particles – create showers of secondary particles by interacting inside the dense material”. Based at Imperial College in the UK, Professor Virdee is the Principal Investigator of an ERC-backed project developing a novel approach to calorimetry. “We propose an approach to energy measurement that combines state-of-the-art techniques so far only used independently, either in charged particle tracking or in conventional calorimeters,” he outlines. The proposed calorimeter is based on large-scale use of silicon sensors with fine cell size. In addition to measuring the energy of high-energy particles, it is envisaged to use information about their precise timing and the path they follow, providing sufficient information to cope with the extreme rates associated with the High Luminosity LHC (HL-LHC). Amongst the technologies to be advanced are powerful electronics in emerging fine-feature-size technologies; low-cost silicon sensors in 8” wafer technology; and high-performance, fast decision-making logic using new, more powerful Field Programmable Gate Arrays (FPGAs), all to be produced at an industrial scale.
One hexagonal silicon sensor module, comprising a silicon sensor mounted on a copper/tungsten backing plate, topped with the PCB that carries the front-end electronics. In this test the silicon sensor is cut from a 6” diameter wafer.

Particles usually deposit energy by either exciting or ionising the atoms in the traversed material. “Excited atoms emit light upon de-excitation, and the total amount of light picked up is proportional to the energy of the incident particle,” says Professor Virdee. In ionisation, the electron breaks free from the atomic bond. “As the ionisation electrons move they induce a current in the detecting medium, which is picked up by a very sensitive amplifier. That signal is again proportional to the energy of the incident particle,” explains Professor Virdee. “We’re developing a high-performance calorimeter that will also be suitable for the next generation of High Energy Physics (HEP) experiments.”
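Both readout mechanisms described above reduce, to first order, to a linear response: energy in, proportional signal out. The sketch below is our generic illustration of the inversion a calorimeter performs, with an invented calibration constant; it is not the project’s actual calibration procedure.

# Illustrative only: a generic linear calorimeter calibration.
SIGNAL_COUNTS_PER_GEV = 125.0  # hypothetical amplifier counts per GeV

def reconstruct_energy(signal_counts: float) -> float:
    """Invert the proportional response: energy ~ signal / calibration."""
    return signal_counts / SIGNAL_COUNTS_PER_GEV

print(reconstruct_energy(6250.0))  # -> 50.0, i.e. a 50 GeV shower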
New physics

This work has been prompted to a large degree by the search for new physics beyond the standard model (SM) of particle physics. Earlier in his career Professor Virdee was instrumental in developing the lead tungstate scintillating crystal
electromagnetic calorimeter used in the CMS experiment at CERN, in which the Higgs Boson was experimentally discovered. “The Higgs Boson can be detected when it decays into two photons, and these photons deposit energy in the electromagnetic calorimeter,” he explains. This discovery has opened up a window into new physics which researchers are keen to explore. When a new particle is discovered, the next step is to study it and learn more about the context in which nature has presented it. “For that, we really need to make the particular particle in very large numbers,” continues Professor Virdee. “In order to do that, in 2015 the energy of the LHC accelerator was increased, so that we now make twice as many Higgs bosons per proton-proton interaction. At the HL-LHC it is foreseen to increase the interaction rate by a factor of five and the total number of interactions examined by a factor of ten compared with the original plan.” Despite its immense success in describing all current measurements at the LHC, it is known that the SM is incomplete;
©CERN
At a glance

Full Project Title
Exploring the Terascale at LHC with Novel Highly Granular Calorimeters (Novel Calorimetry)
amongst the questions for which answers are being sought are: What constitutes dark matter? Are there extra dimensions of space? Why is the universe composed of matter and not antimatter? Professor Virdee says that elements of the existing experiment will not be able to function effectively in the new environment. “The replacement elements will have to function with a five-times larger flux of particles going through, and ten times higher radiation levels,” he outlines. The volume of information gathered by the detector is another major consideration. “We need to be able to separate the interesting collisions from the less interesting collisions,” says Professor Virdee. “For every interesting collision there are typically
about 140 less interesting ones superposed.” “When you design new instruments you start from the science you wish to explore and sort out the basic concepts, allowing a degree of design flexibility so that you can benefit from technological advances,” explains Professor Virdee. One area being closely monitored is the development of FPGAs, a commercially available integrated circuit technology that Professor Virdee says could be used to help identify interesting particle collisions. “The performance of these FPGAs is improving. We would like to use the latest generation of FPGAs when we actually build the calorimeter,” he says. This is still a few years away. Currently researchers are looking to prototype the key technologies to be used in the calorimeter, while also remaining open to further potential improvements.
There are many unanswered questions in physics, including around its fundamental theoretical foundations. The two pillars of modern physics are Einstein’s general theory of relativity and quantum mechanics, yet they cannot co-exist in extreme conditions. “That means we don’t have a unified understanding of nature, where all the forces of nature are considered to be unified,” explains Professor Virdee. Many different conjectures and theories have been put forward, and sophisticated technology is essential to experimentally verify or refute them, underlining the importance of the continued development of research infrastructure. “This instrument that we are developing will be able to see things that the current generation in the LHC can’t,” stresses Professor Virdee.

Next generation detectors

These high-performance calorimeters will play a key role in the next generation of detectors. A more powerful instrument will allow researchers to analyse collisions in greater detail, and gain new insights into major questions in physics, including those around dark matter. While much has been achieved over the years at the LHC, Professor Virdee is a strong advocate of continued development, which is central to further experimental advances. “As experimentalists we keep an open mind,” he says. “The hope is to find something completely new, that enhances our knowledge, and guides future research.”
Project Objectives
This ERC proposal deals with a novel approach to calorimetry that combines state-of-the-art techniques so far only used independently, either in charged particle tracking or in conventional calorimeters. New technologies are being developed, including powerful, radiation-hard electronics using feature sizes of 130 nm or 65 nm; low-cost silicon sensors using 8” silicon wafers; environmentally-friendly cooling technologies using liquid CO2; and high-performance, fast decision-making logic using new, more powerful FPGAs, all to be produced at an industrial scale. The approach has been chosen by the LHC-CMS experiment for its upgrade of the endcap calorimeters.

Project Funding
ERC Advanced Grant (ADG)

Project Partners
• CERN, with staff contributions

Contact Details
Professor Sir Tejinder Virdee
Imperial College of Science, Technology and Medicine, South Kensington Campus, Exhibition Road, London SW7 2AZ
T: +44 20 7594 7804
E: t.virdee@imperial.ac.uk
W: http://cordis.europa.eu/project/rcn/198709_en.html

The Compact Muon Solenoid Phase II Upgrade, Technical Proposal, CERN-LHCC-2015-010.
First beam tests of prototype silicon modules for the CMS high granularity endcap calorimeter, paper in preparation.
The CMS HGCAL detector for HL-LHC Upgrade, presentation by A. Martelli at the 2017 LHCP conference, Shanghai.
Professor Sir Tejinder Virdee
Sir Tejinder Virdee is Professor of Physics at Imperial College, London. He is one of the two founding fathers of the Compact Muon Solenoid experiment at the LHC. He pioneered some of the techniques used in its calorimeters (for the measurement of the energies of particles), crucial for the discovery of a Higgs boson announced by the CMS experiment in July 2012, along with the sister experiment ATLAS. Virdee is currently developing a novel calorimetric technique for very high luminosity LHC running, due to start in the mid-2020s. He has won numerous prizes from the UK, European and American Physical Societies.
Getting to the heart of random operators

An ideal wire is a perfect conductor. What about a material copper wire: how is the conductance affected by impurities? What about a copper plate? Researchers in the Spectrum project are studying these types of fundamental questions in theoretical physics from the mathematical perspective, as Professor Sasha Sodin explains

The motion of a quantum particle such as an electron is described by the Schrödinger equation. It was understood by P. W. Anderson in the 1950s that the presence of random impurities in the medium dramatically affects the properties of the solutions. The extreme situation is Anderson localisation, in which the impurities turn a conducting medium into an insulator. Mathematically rigorous arguments appeared in the 1970s, when Goldsheid, Molchanov and Pastur proved that Anderson localisation always occurs in one-dimensional systems. Later, Fröhlich-Spencer and Aizenman-Molchanov proved that it occurs in arbitrary dimension when the density of impurities is sufficiently high. Many questions remain in this area, however, which researchers in the EC-backed Spectrum project aim to investigate. “What happens when the density of impurities is low, and the dimension is greater than one, remains a mystery. Is a copper plate with impurities a conductor?” asks Professor Sasha Sodin, the project’s Principal Investigator. “The answer should depend on two key ingredients: the geometry of the problem and the randomness of the impurities.” It is believed that a two-dimensional plate becomes an insulator at arbitrarily weak density of impurities, whereas a three-dimensional bar retains its conductance when the impurities are sufficiently sparse. “Presumably, the difference has to do with the different
behaviour of classical random walk in two and three dimensions. However, the connection between random walk and quantum dynamics in the presence of disorder is not well understood,” says Professor Sodin. Some central questions remain open even for one-dimensional systems. The rigorous results asserting the absence of conductance in a wire with arbitrarily sparse impurities seem to run contrary to our everyday experience. “The results of Goldsheid-Molchanov-Pastur, along with most of the subsequent work, pertain to an infinitely long wire, whereas in reality the width of a wire is not negligible with respect to its length,” explains Professor Sodin. “Theoretical physicists, particularly Fyodorov and Mirlin, devised several approaches to this problem, but so far none of these has been mathematically justified.”
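The dichotomy Professor Sodin alludes to is Pólya’s classical theorem: a simple random walk on the integer lattice returns to its starting point with probability 1 in two dimensions, but only with probability roughly 0.34 in three. A toy Monte Carlo (our illustration, not the project’s mathematics) makes the contrast visible:

import random

def return_fraction(dim, steps, trials, seed=0):
    """Fraction of simple random walks on Z^dim that revisit the origin
    within `steps` steps (a crude Monte Carlo estimate)."""
    rng = random.Random(seed)
    returned = 0
    for _ in range(trials):
        pos = [0] * dim
        for _ in range(steps):
            axis = rng.randrange(dim)
            pos[axis] += rng.choice((-1, 1))
            if not any(pos):          # back at the origin
                returned += 1
                break
    return returned / trials

# Recurrent in 2D: the estimate creeps towards 1 as `steps` grows.
print(return_fraction(2, 10_000, 1_000))
# Transient in 3D: the estimate stays near Polya's ~0.34 however long we run.
print(return_fraction(3, 10_000, 1_000))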
“We aim to understand the basic properties of quantum particles moving through a disordered environment. We would like to know how the motion is affected by the combination of randomness and geometry.”

Resummation of divergent series

Physicists since Richard Feynman have used perturbation theory, in which the quantities of interest are expanded in an infinite series. The terms are labelled by graphs called Feynman diagrams, and the sum of the first few terms is considered to be an approximation to the exact solution. This procedure is very powerful; however, it suffers from serious drawbacks. “Typically, the answer given by perturbation theory is a divergent series, such as 1 – 3 + 9 – 27 + 81 – …,” says Professor Sodin. “Such an expression has no mathematical meaning. By analogy with a geometric progression, one is tempted to conclude that the sum is ¼. However, it is not clear that such a guess gives the correct answer to the initial problem.” Attaching a mathematical meaning to a divergent series is known as resummation. Different resummation procedures may lead to different sums, and some divergent expansions are not resummable; devising the right resummation procedure therefore requires an understanding of the specific nature of the problem.
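To spell the analogy out (a standard worked example, ours rather than the project’s): the series is geometric with ratio −3, and both analytic continuation of the geometric sum and Borel resummation assign it the value ¼:

\sum_{n=0}^{\infty}(-3)^n \;\longrightarrow\; \frac{1}{1-(-3)} = \frac{1}{4},
\qquad
\int_0^{\infty} e^{-t}\Bigl(\sum_{n=0}^{\infty}\frac{(-3t)^n}{n!}\Bigr)\,dt
= \int_0^{\infty} e^{-t}\,e^{-3t}\,dt = \frac{1}{4}.

The first expression continues \sum_{n} x^n = 1/(1-x) beyond its disc of convergence; the second is the Borel sum. That the two agree here is reassuring but, as Professor Sodin stresses, agreement with the original physical problem is a separate question.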
Perturbative series in theoretical physics are often labelled by graphs called Feynman diagrams.
At a glance

Full Project Title
Randomness and geometry in spectral theory and beyond it

Project Objectives
Study the interplay of randomness and geometry in the spectral theory of random operators and in other parts of mathematical physics.

Project Funding
European Research Council Starting Grant 639305 (SPECTRUM)

The escape of an ant to infinity: in a typical 2D configuration of sufficiently high density, there is an infinite path avoiding gaps wider than 0.3mm. Figure courtesy of Prof. Dietrich Stoyan.
Offer Kopelevitch completed his MSc thesis at Tel Aviv University in 2016.

Recently, Offer Kopelevitch, an MSc student at Tel Aviv University and a member of the Spectrum team, devised a rigorous resummation procedure, applicable to several spectral problems involving randomness. “This first step is extremely important,” says Professor Sodin. “Previously, it was thought that there could be no mathematically consistent way to resum divergences of this kind. The next step is to tackle the problems with a stronger geometric component, such as conductance in a wire.”
Gas of hard spheres

The famous Kepler conjecture, going back to the seventeenth century and recently proved by Hales, asserts that no arrangement of non-overlapping unit balls in 3D space can occupy more than π/√18 = 0.74… of the volume of the space. The bound is achieved for the so-called FCC packing, which is the one used by costermongers to arrange oranges in a box. From the point of view of mathematical physics, it is important to understand the
properties of a typical configuration chosen at random among the arrangements with a density below π/√18. “This model goes back to Ludwig Boltzmann; it is called a gas of hard spheres,” comments Professor Sodin. “It is classical rather than quantum; however, the fascinating interplay between geometry and randomness makes it related to the other problems we study.” Typical configurations at low density are gas-like: moving the balls around in one region of space has very weak influence on the balls in distant regions. A mathematical proof was found by Ruelle in the 1960s. As the density gets closer to π/√18, it is expected, but not proved, that the system undergoes a phase transition, and typical configurations acquire similarity to a lattice. Significant progress was made by Dr Alexander Magazinov, a postdoctoral researcher at Tel Aviv University and a member of the Spectrum team, who answered a question posed by Bowen-Lyons-Radin-Winkler. “Suppose the balls are 1m wide, and an ant can jump between two balls if the distance between them is less than 1mm,” explains Professor Sodin. “The result of Magazinov asserts that, if the density is sufficiently close to π/√18, a typical configuration has an infinite cluster of balls such that an ant can reach any of them from any other one.” “Magazinov’s result also holds in two dimensions, for configurations of disks, but not in one dimension. We do not know what happens in dimension four and above. The dimension-dependence is one of the things that made this problem so challenging.”
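For reference, the packing fraction quoted throughout this section follows from a standard computation (included here for the reader, not a project result): the FCC unit cell of side 2\sqrt{2}\,r contains four balls of radius r, so

\rho_{\mathrm{FCC}} = \frac{4\cdot\frac{4}{3}\pi r^{3}}{(2\sqrt{2}\,r)^{3}} = \frac{\pi}{3\sqrt{2}} = \frac{\pi}{\sqrt{18}} \approx 0.7405.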
Professor Sasha Sodin
Sasha Sodin is Professor of Analysis and Mathematical Physics at Queen Mary University of London. He completed his PhD at Tel Aviv University in 2010, after which he worked at Princeton in the US, before taking up his current position in 2016. He has held a position at Tel Aviv University since 2014, from which he has been on leave since 2016.
Photo courtesy of Professor A. Tikhomirov
Contact Details
Queen Mary University of London
Mile End Road
London E1 4NS
United Kingdom
T: +44 20 7882 5452
E: a.sodin@qmul.ac.uk
W: http://www.maths.qmul.ac.uk/~s_sodin/
Getting to the heart of nanomaterial risks

Nanomaterials are being used ever more widely in semi-conductor processing, as European companies seek to maintain their place at the forefront of the nanoelectronics industry, yet the associated occupational hazards are not fully understood. The NanoStreeM project aims to help assess the health risks posed by specific nanomaterials, as Dr Dimiter Prodanov explains

A variety of nanomaterials are used today in the development of semi-conductors, as companies seek to further improve the performance of integrated circuits. While these materials are increasingly central to modern semi-conductors, their hazardous properties and the associated risks are not fully understood, an issue scientists at the nanotechnology research centre in Leuven are investigating. “We are regularly asked about the safety of different materials used in semi-conductor development, and we identified that we needed to intensify research in this area,” says Dr Dimiter Prodanov. From this, researchers established the NanoStreeM project, a consortium which brings together partners from both the public and private sectors to build a deeper understanding of the hazards and risks associated with using nanomaterials. “The consortium is uniquely well positioned to look into the safety of nanomaterials used in the semi-conductor industry,” outlines Dr Prodanov.
Nanomaterials

The nanomaterials themselves are broadly defined as those that have a certain percentage of particles at the nanoscale, between 1 and 100 nanometres. Particles of this size can pass easily through alveoli, get deposited in
Processed 200 mm silicon wafer (courtesy Imec).
the lungs, or even disperse in the body; Dr Prodanov says these nanoparticles can act as carriers of other materials. “The majority of bulk materials are not really toxic as such. We are looking into the modification of the primary hazards, by considering the process residues on one hand, and also new materials that may find their way into future applications. These are mostly carbon-based materials, for example nanotubes,” he explains. Nanoparticles are increasingly used in polishing slurries in semi-conductor development, which can leave certain workers exposed to potentially hazardous materials during the fabrication process. “Most of these
processes are done in closed environments. So it’s only when you clean the chambers, or if you deal with residuals from spills, that you come directly into contact with these particles,” continues Dr Prodanov. There are five workpackages within the project, with one workpackage dedicated to investigating risk assessment methodologies for the use of these nanomaterials, as well as risk management and control strategies. Dr Prodanov says there are three main types of approaches to risk assessment. “There are categorical or qualitative approaches, there are semi-quantitative approaches, then there are more quantitative approaches,” he explains. The main problem with the quantitative models is that they typically need access to parameters which are not directly accessible. “This means that we have to monitor the exposure of nanomaterials in processing, over a certain period of time, in order to assess the emissions and then derive Occupational Exposure Limits (OELs),” says Dr Prodanov. “However, there is not enough high-quality toxicological information to derive rigorous OELs. So the typical approach is to divide the available OELs for a bulk material by a certain factor, based on expert opinion.”
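That last step is simple enough to spell out. The numbers below are invented for illustration, not NanoStreeM guidance; they only show the shape of the ‘divide by an assessment factor’ approach Dr Prodanov describes.

# Illustrative only: a provisional nano-scale OEL derived from a bulk OEL
# using an expert-judgement assessment factor. All values are hypothetical.
bulk_oel_mg_m3 = 4.0        # assumed OEL of the bulk material, mg/m^3
assessment_factor = 40.0    # expert-opinion factor for the nano-sized form

provisional_nano_oel = bulk_oel_mg_m3 / assessment_factor
print(f"provisional nano OEL: {provisional_nano_oel:.3f} mg/m^3")  # 0.100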
The approach on the semi-quantitative side is effectively to devise scores for each of the processes, to classify the processes accordingly, and then to assign specific control measures to protect the health of workers. However, similar problems apply as with the quantitative models. “If you don’t have data then you have to apply some modelling. The question then is whether the assumptions in these models actually fit the application. Then there’s the problem of the lack of toxicological information - only a few nanomaterials have been studied sufficiently well to justify a quantitative or semi-quantitative approach,” explains Dr
Kick-off meeting of NanoStreeM, 28 Jan 2016.
Prodanov. The third approach is qualitative, or categorical, risk assessment. “Most industrial practitioners are more familiar with qualitative approaches. One of the tasks in the project is to devise a comprehensive risk assessment methodology and identify the best tools for the semi-conductor industry,” says Dr Prodanov.
Generic methodology

This methodology could be applied to a new material with mostly unknown properties, providing a basis to establish initial control measures and exposure limits. These measures may at times be more stringent than actually required, but the precautionary principle applies here, and they could be relaxed in future when more is known about the toxicity of the specific material. “When more information becomes available, certain control measures can be relaxed to a more appropriate level, reflecting the application and state of knowledge,” says Dr Prodanov. This methodology will be one important outcome from the project, while Dr Prodanov says their research will also have an impact in other areas. “We have developed an inventory of the nanomaterials that are currently used and analysed the trajectory of nanomaterial use; we will also build a database of materials that could be used in future,” he outlines. Researchers aim to identify a set of OELs, which will be used as an input for the risk assessment methodology, while there is also a training element to the project. The nanoelectronics industry is central to Europe’s economic future, and technological innovation is key to its ongoing development, yet this must not
come at the cost of compromising environmental standards. “We have identified some information about nanomaterials which is important for safety professionals and process engineers,” outlines Dr Prodanov. Beyond nanoelectronics, Dr Prodanov says the project’s research could also hold relevance for other industries. “Some outcomes from the project are sufficiently generic to be translated to other industries; for example, to the suppliers of reagents and possibly to the waste treatment industries. We think that the risk assessment methodology and the training packages can be translated across different industries,” he continues. The focus in NanoStreeM has been mainly on coordination and on sharing information which has already been collected. In future, Dr Prodanov would like to pursue further research, taking new measurements using the tools that have been developed in the project to build a fuller picture of the hazards associated with the use of nanomaterials. “We can monitor how these tools are actually being used and cover the whole lifecycle. At the moment, we are looking into the fabrication, but we’re not looking at the whole lifecycle of a nanomaterial which enters the semi-conductor industry, as there are other projects doing so,” he says. The project is collaborating with other initiatives in complementary areas, building relationships and laying the foundations for continued research. “Our project is part of the EU NanoSafety cluster, so when we have different events we advertise them and try to interact with different stakeholder communities,” continues Dr Prodanov.
At a glance

Full Project Title
Nanomaterials: Strategies for Safety Assessment in Advanced Integrated Circuits Manufacturing (NanoStreeM)

Project Objectives
1. Build inventories of materials, research topics and directions relevant for nanomaterial use and exposure in nano-electronics manufacturing.
2. Identify gaps in knowledge and methodologies to assess the risk of engineered nanomaterials used in semiconductor manufacturing or incidentally released as by-products of the manufacturing process.
3. Apply results for better governance, dissemination and outreach.

Project Funding
This project receives funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 688194.

Project Partners
The NanoStreeM consortium includes 14 partners from six European countries: Belgium, France, Germany, Italy, Ireland and the Netherlands. The consortium combines the expertise of industry and of research organizations in material science, toxicology, environmental assessments and occupational medicine, providing a critical mass for a coordinated action on a European scale.

Contact Details
Project Coordinator, Dr Dimiter Prodanov, PhD
IMEC vzw
Kapeldreef 75
3001 Heverlee, Belgium
T: +32 162 81840
E: Dimiter.Prodanov@imec.be
W: http://www.nanostreem.eu/
Dr Dimiter Prodanov, PhD

Semiconductor facility (courtesy Imec).
Dr Dimiter Prodanov, PhD, is the coordinator of the NanoStreeM project and a senior scientist at IMEC, a position he has held since 2008. He is a biosafety coordinator and a nanosafety specialist, and he actively supports IMEC’s research programs in life sciences, health care and nanotechnology. Dimiter Prodanov is also an affiliated researcher at Neuroscience Research Flanders.
©Imec
Colouration – the natural way

Colour affects our perception and our idea of quality, and influences our senses and our mood – it is a visual marker that is weighted with meaning. In industry, the norm is to chemically synthesise pigmentation in laboratories, but there is demand for a more natural method of colouring everything from the food we eat to the products we use. That’s why a project headed by Dr Silvia Vignolini is focusing on innovation in more natural colouration techniques
Traditionally, the colouring of commercial products has relied on chemical and inorganic materials which have been artificially synthesised. The SeSaMe project is taking a different direction, exploring the use of natural materials like cellulose and chitin to obtain the desired pigmentation. There is a growing demand for more environmentally friendly ways to add colour – especially in food products. Dyes used to enhance the look of food and food packaging, and dyes in general, have long been tainted with controversy, with concerns over toxins and health impacts – to the point where consumers increasingly scrutinise ingredients, and how things are made, for anything that looks unnatural and potentially harmful. However, colour is important as a visual indicator and something we are used to using as a gauge of quality, attractiveness, freshness or taste. This is why the pioneering science undertaken by the researchers working on the SeSaMe project could have far-reaching appeal as they examine ways to mimic nature’s methods for producing colours. “We use natural materials to create a novel type of pigment called photonic pigments. These pigments can be used in many everyday life applications (such as for cosmetics or food colourations) – so
they have a strong commercial relevance,” explains Dr Silvia Vignolini of the SeSaMe project.

Figure 1: Cellulose can self-assemble inside aqueous droplets to form coloured microparticles, which can be observed with an optical polarising microscope (a) and a scanning electron microscope (b, c). The cross section of the microparticles shown in (d) reveals the cellulose helical architecture responsible for the coloration.
The nature of colour

SeSaMe’s approach is based around how nature creates colour, using a technique called self-assembly. Nature has been using colour since the beginnings of life itself, as an indicator of sexual health, to warn off predators or to encourage pollination. Creating colour is something nature does very well, and so it follows that it should be possible to
let nature teach us how to produce colour without resorting to toxic or synthetic solutions. “A lot of our research is inspired by nature and how it creates and uses colour. The strategies that have been developed in nature are incredibly optimised. We study how natural structures produce coloration using materials like cellulose and polysaccharides, and how plants can control and assemble these materials in order to form such incredible structures. The idea is that instead of trying to re-invent this, we are trying to copy what nature does. We observe how this structure, this cellulose for example, is organised in nature, then we try to reproduce it in the lab with the same material, to achieve a similar type of optical response,” says Dr Vignolini. “We have two aims: one is to try and understand how these natural structures are made, and the other is to develop pigments from the optical response. We aim to learn how to make copies in more detail, in order to make a functional material.”
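The helical cellulose architecture shown in Figure 1 behaves optically like a cholesteric (chiral nematic) film, for which a textbook relation (quoted here for context, not a SeSaMe result) links the reflected colour at normal incidence to the helix pitch p, the average refractive index \bar{n} and the birefringence \Delta n:

\lambda_{\mathrm{peak}} \approx \bar{n}\,p, \qquad \Delta\lambda \approx \Delta n\,p.

Tuning the pitch, for example by changing the drying conditions, therefore shifts the reflected colour across the spectrum.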
The full spectrum

By replicating this natural process and embedding it in materials, a range of desired colours can be achieved. The researchers will produce colours across
Examples of structural coloration with natural materials (from left to right): microscope image of a structurally coloured cellulose film, where different colours can be obtained by changing the evaporation conditions; microscope image of the epicarp of Pollia japonica fruits; microscope image of a cellulose suspension in cross polarisation between capillaries, revealing the typical textures of liquid crystalline systems.
the entire spectrum, from ultraviolet to infrared, including white. “The optical appearance of white does not necessarily mean a lack of pigment. Often white is obtained with chemical additives that enhance the scattering of light,” explains Dr Vignolini. Researchers are also able to tailor the optical response of the material in multiple ways, not only by changing the colour but by determining the full extent of the optical appearance. “The full optical appearance of an object is a concept which is more complicated than just the colour itself. Colours can look matt, glossy or very metallic, and we can directly embed this functionality in our material,” says Dr Vignolini. “There is a growing interest in the use of natural materials in industry,” she asserts. “We are trying to understand the fundamental aspects of these materials, as such understanding will
open up new technological possibilities, which is why we are collaborating with several companies to explore how to take our technology forward. From my perspective as a scientist, I’m interested in the challenge of how to make this material from a fundamental point of view, but this research could provide a benefit to society too, enabling a new technology that is cleaner and better for the environment. If we want to have a wider impact in society then we need to make these materials on a large scale. To do that, we need to have scalable technology.” To this end, the close collaborations established with the commercial sector will ensure the best chance of exploiting the new and exciting opportunities born from the project’s research.

Macroscopic images of coloration obtained with cellulose: photograph of a macroscopic cellulose film with structural coloration.

Below: Self-assembly process of cellulose microparticles: comparison between (a) theoretical and (b) experimental images obtained from the confinement of a cholesteric cellulose suspension within a spherical geometry, when viewed through cross-polarisers (top row) and upon addition of a first-order tint plate (bottom row). Upon loss of water the Maltese cross typical of chiral nematic ordering is retained, until drying.

At a glance

Full Project Title
Bio-Inspired Photonics (SeSaMe)

Project Objectives
SeSaMe is an ERC study in which researchers attempt to understand and replicate nature’s methods for producing colours and optical responses. The research could lead to the production of low-cost, biodegradable photonic materials and, in turn, replace the potentially hazardous traditional colourants that have often proven controversial in product development.

Project Funding
SeSaMe is an ERC-funded project.

Contact Details
Project Coordinator, Dr Silvia Vignolini, Ph.D
University of Cambridge
Department of Chemistry
Lensfield Road
Cambridge CB2 1EW
UK
T: +44 1223 761490
E: sv319@cam.ac.uk
W: http://www.ch.cam.ac.uk/group/vignolini/
SEM image of a cross section of a cellulose film revealing the helical architecture responsible for the coloration.
Dr Silvia Vignolini, Ph.D
Photograph by Gabriella Bocchetti.

Dr Silvia Vignolini was awarded a PhD in Physics at the European Laboratory for Non-Linear Spectroscopy and the Physics Department at the University of Florence. During her PhD, she studied light-matter interaction in the near-field regime. In 2010, she joined the Cavendish Laboratory in Cambridge, UK, as a Research Associate, working on the optical properties of soft materials. She joined the Chemistry Department in Cambridge as an academic in 2014, and in 2015 she was awarded an ERC Starting Grant.
Magnets that don’t cost the earth

Rare earth magnets play a crucial role in many everyday applications, yet Europe does not currently enjoy an independent supply of the materials used in production, leading researchers to investigate how existing resources can be used more efficiently. We spoke to Professor Carlo Burkhardt about the REProMag project’s work in developing a new processing route

The strongest type of magnets currently available, rare earth (RE) magnets are used in a wide range of everyday applications, including medical devices, motors and various consumer electronics devices, and demand is correspondingly high. There is typically a choice here between either samarium-cobalt magnets, which are relatively expensive, or neodymium-iron-boron magnets. “Both these types of magnets are based on a rare earth material. However, when people talk about rare earth magnets, they’re usually talking about neodymium-iron-boron magnets,” outlines Professor Carlo Burkhardt, the coordinator of the REProMag project. These magnets are integral to many high-tech industries, in particular electro-mobility, electronics and wind energy, yet Europe is largely dependent on external sources of RE metals, leading Professor Burkhardt and his colleagues in the REProMag project to look at how existing resources could be used more efficiently. “The main idea is to develop a closed material loop and to reuse the waste material that’s currently available in the EU, so that we can then gain a greater degree of independence in supply,” he says.
Rare earth metals

This starts with recycling existing RE metals, building on the work of the REMANENCE project in developing new processes to recover these materials, with researchers looking primarily at hard disc drives as a source. Recovering RE metals from hard disc drives is a complex task, however, as when subjected to conventional recycling methods the magnet will shatter and stick to other bits of metal, so the project is instead using a different approach. “The magnet is separated as completely as possible from the electronic
device, but it doesn’t have to be detached completely, because we then do a hydrogen treatment in a closed vessel,” explains Professor Burkhardt. This treatment can be compared to the effect of ice forming on a road in winter. “The ice has a larger volume than the water, and causes cracks in the road. Something similar happens in a magnet with this hydrogen treatment,” says Professor Burkhardt. “Neodymium forms a hydride, which causes volume expansion – so this cracks the material apart and breaks it into powder. The electronic component is introduced into this vessel with hydrogen, and the magnet breaks up into a powder.” This represents a relatively easy way of separating the magnet from the other components in a hard disc drive, which can then be recycled in the conventional manner, while researchers can look to use the recycled magnets in production. It’s important here to consider the composition of the magnet. “One big issue is the dysprosium content. Dysprosium is very expensive, as the majority of the supply is located in Southern China. Dysprosium is required if you want high temperature stability in a magnet. So if you want a magnet to operate at higher temperatures, like in an electrical motor for example, then you need a higher dysprosium content,” outlines Professor Burkhardt. These are important issues in terms of the circular economy and the eventual applicability of a recycled magnet, while Professor Burkhardt says there are also other factors to consider, including any corrosion on the magnet. “The more corroded a magnet, the more oxidised the neodymium, and the worse the eventual performance of the recycled magnet,” he explains. “The addition of extra neodymium can help to restore the original performance.”
Shaping, debinding and sintering
REProMag M18 meeting at Sennheiser, Germany.
The next step in the project is the development of a more efficient method for producing RE magnets. Current production routes are not always efficient in their use of
RE metals, so Professor Burkhardt and his colleagues in the project are working to develop a new, waste-free route for the manufacturing of high specification permanent magnets, the Shaping, Debinding and Sintering (SDS) process. “The SDS process starts with a feedstock, which is a blend of the magnetic material and some polymers. This is effectively the starting material in the SDS process,” he outlines. High-quality materials are used in the SDS process, a highly innovative, automated process that produces magnets without any waste. “The SDS process is designed for the production of small magnets, but in very large quantities,” continues Professor Burkhardt. “We have several commercial partners in the project, who hold deep expertise about the geometries of these magnets and their potential applications. We cover the whole manufacturing chain, from the powder production, through to the production of the magnets.” The SDS process boosts energy efficiency along the whole manufacturing chain, and can also be used to produce RE magnets in all kinds of sizes and shapes, which represents a significant development in terms of their wider applicability. Currently, these RE magnets are only available in conventional shapes, as they are very difficult to manufacture and it’s extremely expensive to grind them afterwards into a specific shape, which affects the ways in which they are used. “Today, applications are often built around the geometry of the magnet - companies may build their specific application around a certain shape of magnet. If the magnet could be in another shape, for example in a U profile, then this could open up other commercial applications,” says Professor Burkhardt. The project is developing new methods to shape these magnets, which could open up the possibility of using them in further potential applications in future. “This could be very interesting, particularly with respect to very small magnets. We’ve been looking at miniaturisation, and the development of very tiny magnets, which could be directly shaped,” outlines Professor Burkhardt.
Feedstock development.
Microstructural analysis.
Production of recycled magnets.
Commercial marketplace

This approach helps to ensure that research is relevant to the wider commercial marketplace, which is a key part of the project’s overall agenda. This takes on even greater importance given the high level of demand from industry for RE magnets, which Professor Burkhardt says is set to rise even further in future.
At a glance

Full Project Title
Resource Efficient Production Route for Rare Earth Magnets (REProMag)

Project Objectives
REProMag is a European project, financed by the European Commission under the ‘Factories of the Future’ call programme. The project was presented under the topic FoF-02-2014: Manufacturing processes for complex structures and geometries with efficient use of materials.

Project Funding
Project funded by the European Union’s Horizon 2020 research and innovation programme under Grant Agreement no 636881.

Project Partners
• OBE Ohnmacht & Baumgärtner GmbH & Co. KG (Ispringen, Germany) – REProMag Coordinator
• FOTEC Forschungs- und Technologietransfer GmbH (Wiener Neustadt, Austria)
• PT+A GmbH (Dresden, Germany)
• HAGE Sondermaschinenbau GmbH & Co KG (Obdach, Austria)
• Lithoz GmbH (Vienna, Austria)
• TEKS SARL LTD (France and the UK)
• SIEMENS AG (Munich, Germany)
• Sennheiser electronic GmbH & Co. KG (Wedemark, Germany)
• Vienna University of Technology (Vienna, Austria)
• University of Birmingham (Birmingham, UK)
• Montanuniversitaet Leoben (Leoben, Austria)
• Jožef Stefan Institute (Ljubljana, Slovenia)
• NPL Management Limited (Teddington, UK)
• Steinbeis 2i GmbH (Karlsruhe, Germany)

Contact Details
Head of Technology & Innovation, Professor Carlo Burkhardt
OBE Ohnmacht & Baumgärtner GmbH & Co. KG
Turnstraße 22
75122 Ispringen
Germany
T: +49 7231 802215
E: CBurkhardt@obe.de
W: www.repromag-project.eu
Professor Carlo Burkhardt
Carlo Burkhardt is Professor of Manufacturing Technology and head of the Institute for Material Development and Testing (STI) at Pforzheim University (Germany). Previously he worked as head of development at Witzenmann GmbH, a German automotive supplier with 4,000 employees, and he has also held other roles in both the public and private sectors.
Processing parameter studies.

“Take a conventional car, for instance – about 50 years ago there were relatively few permanent magnets in a car, but these days the Mercedes S-class has 200 electric motors, for things like seat adjustment, windows, rear mirrors and light controls. Then we see that more and more consumer electronic devices have electrical motors,” he points out. There is a correspondingly high level of demand for these RE metals, yet the supply is currently strategically controlled by China, which has important implications for European industry. “The RE metals are rated very highly in the critical materials strategy of the EU, as electro-mobility, wind energy, kinetic energy and electronics – Industry 4.0, as we call it – are not thinkable without RE magnets,” says Professor Burkhardt.
Development of SDS processing technology.

A more reliable, independent supply of these metals is therefore central to commercial development, while there are also environmental considerations to take into account. The chemical separation methods currently used in RE metal extraction are not very environmentally friendly, and carry a risk of soil contamination, underlining the wider importance of developing a more sustainable process. “Recycling makes a lot of environmental sense in terms of goals around the circular economy, even without the pressure on the material supply side,” stresses Professor Burkhardt. While recycling is central to efforts to improve sustainability, quality standards still have to be maintained, so in future Professor Burkhardt says researchers will look both at using resources more efficiently and at developing stronger magnets. “We have to achieve certain standards. Neodymium-iron-boron magnets were first developed in 1982, and over the years they have been quite rapidly optimised,” he outlines. This remains an active area of research, yet the focus within the project is more on the SDS processing route and on using resources more efficiently. While hard discs are the main current source of RE metals, there are also many others to consider, and Professor Burkhardt has identified three in particular. “In future we will be able to gain large quantities of RE metals from hybrid cars, electric cars and wind turbines,” he outlines. The REProMag project will conclude towards the end of 2017, yet Professor Burkhardt plans to pursue further research in this area in
future, with a view to demonstrating the technical feasibility of the SDS processing route. “We are currently applying for a further project, for upscaling production to larger quantities, while we are working together with customers on certain applications,” he says.
SDS processed NdFeB magnets.
A new window into quantum gravity

Three of the four fundamental forces are described within the standard model of particle physics, yet the challenge of describing gravity within the framework of quantum mechanics remains unresolved. Studies of the thermodynamic behaviour of black holes can reveal important insights into the microstructure of gravity, as Dr Sameer Murthy of the QBH project explains

The theoretical basis of the standard model of particle physics is quantum mechanics, a framework which describes how elementary particles behave at a fundamental level. Three of the four fundamental forces which govern the interactions of particles are described within the standard model – the strong, electromagnetic and weak forces – yet the problem of writing down the quantum theory of gravity remains unsolved. “Bringing together gravity and quantum mechanics is a major challenge in physics. So far there has been no experiment in which researchers have been able to probe the quantum properties of gravity,” says Dr Sameer Murthy, the Principal Investigator of the QBH project. With no direct experimental guide, Dr Murthy is using a novel approach in the QBH project. “I want to use black holes and their thermodynamic properties as a guide to understand the quantum properties of gravity,” he outlines. A historical analogy can be drawn here with the work of physicists in the nineteenth century. At the time, researchers were studying the thermodynamics of gases; while they could make gross measurements on a container filled with gas, scientists didn’t understand the microscopic properties. “They could measure things like heat transfer, pressure, temperature and entropy, the macroscopic, large-scale variables. But what you want to understand is the microscopic properties – what is the nature of the constituents?
How do they interact? What are their properties?” explains Dr Murthy. From measurements of entropy – a measure of disorder of a system, or the number of ways in which a system could exist –
scientists were able to learn about the microscopic properties of gases; indeed, they were able to deduce fundamental quantum physics concepts, like the indistinguishability of elementary particles and the fundamental cutoff on energy excitations. Now Dr Murthy aims to gain new insights into the structure of gravity. “By thinking carefully about the macroscopic phenomena, we can deduce non-trivial aspects of the microscopic physics,” he outlines.
Black holes

Black holes are regions of space-time that are surrounded by one-way surfaces called event horizons. According to classical general relativity nothing can come out from behind this horizon, not even light, yet the findings of Jacob Bekenstein and
Mock theta functions, discovered by the brilliant Indian mathematician S. Ramanujan in 1920, seem to make an unexpected but important appearance in describing aspects of the physics of black holes and quantum gravity in string theory.
Illustration: CXC/M. Weiss.
Stephen Hawking seem to run contrary to this. “They found that a black hole behaves like a thermodynamic object when one takes quantum mechanics into account. It has temperature and entropy, and emits thermal radiation – in a sense it behaves like a cylinder of gas. This was a remarkable finding,” says Dr Murthy. This suggests that a black hole is actually made up of many microscopic states, so black holes could be used to probe quantum gravity. “We are now in a situation similar to that faced by nineteenth century physicists – we know a black hole is made up of something. We don’t know its microscopic constituents, but we can compute various macroscopic quantities,” continues Dr Murthy. Bekenstein and Hawking developed a formula to calculate the thermodynamic entropy of a black hole, effectively the number of ways in which it can exist; now Dr Murthy aims to use this to probe their microscopic structure. This research again builds on historical foundations. “As Boltzmann taught us, entropy is really a measure of the number of microscopic ways in which a macroscopic state of a system can exist. If we know the entropy exactly, we know the most basic fact of the quantum behaviour of a system, the dimension of the possible space of quantum states of the system,” he says. These ideas of Boltzmann underlie all of physics and are key to understanding the relationship between the micro and the
macro worlds. He gave the precise mathematical formulation of how the various constituents of the microscopic world come together to form what looks like a macroscopic object, when viewed from afar without a sharp lens. “If we can move away from the macroscopic approximations of Boltzmann, we begin to get back details about the microscopic world,” says Dr Murthy. Together with his collaborators Prof. A. Dabholkar and Dr J. Gomes, Dr Murthy applied this idea to a specific type of black hole to extract clues about its microscopic structure. “We considered a certain type of black hole, and, based on a beautiful idea of A. Sen, we developed new methods to compute the quantum corrections to the thermodynamic entropy of black holes,” he explains. This gave researchers a basis to understand quantum effects in black holes at an unprecedented level of accuracy; Dr Murthy is now looking to verify this approach. “We have developed this new technique to compute quantum corrections to black hole entropy, but what are we learning about the black hole? How can we check that our results are correct? As this is a new technique, it’s important to check it,” he stresses.
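The two entropy formulas this discussion keeps returning to are standard results, reproduced here for the reader’s convenience: Boltzmann’s relation between entropy and the number \Omega of microscopic states, and the Bekenstein-Hawking entropy, which grows with the horizon area A:

S = k_B \ln \Omega,
\qquad
S_{\mathrm{BH}} = \frac{k_B\,c^{3}A}{4\,G\,\hbar}.

Matching the second expression to a microscopic count, correction by correction, is precisely the programme described above.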
String theory

This is where a theoretical framework called string theory comes in. In string theory one can describe the microscopic constituents
of a certain class of black holes. “If you understand what the black hole molecules are, then you can count how many states they can be in, and check if the Boltzmann equation also applies to quantum gravity,” says Dr Murthy. Researchers A. Strominger and C. Vafa, and A. Sen, were able to show in the 1990s that the Bekenstein-Hawking entropy formula could be explained via the Boltzmann equation, as the logarithm of the number of microscopic states of the system – in the same way that entropy is explained in the rest of physics. Dr Murthy and his colleagues wanted to investigate this further. “We said: ‘OK, let’s test this beyond the approximation that the system size is very large.’ We now know how to extend the Bekenstein-Hawking formula and sum up all the corrections to it, so let’s try to see if, in these examples, we can get not just an approximate agreement, but an exact agreement,” he says. The number of configurations of the molecules in a black hole is an important quantity to understand in quantum gravity, and it is these numbers that Dr Murthy and his colleagues have been working to compute. “We have managed to compute the number of microscopic states of a black hole, starting from the semi-classical approximation of Bekenstein and Hawking, making it better and better, until it’s exactly correct,” explains Dr Murthy. Researchers can then look to gain deeper insights into the microscopic structure of quantum gravity.
String theory gives us two pictures of black holes – the traditional macroscopic picture, and a “dual” microscopic picture in terms of the fundamental objects of the theory, called strings and branes. The number of microscopic constituents turns out to arrange itself in extremely interesting patterns governed by deep underlying symmetry structures. Studying them systematically gives us clues about quantum effects in gravity. “The fact that we can compute these numbers means we now have some information about how those quanta are reached, about how the quanta are arranged in a black hole, and about how forces interact to make these quanta. This is very exciting,” continues Dr Murthy. “Now that we have derived these numbers, we want to look for patterns. Pursuing these ideas in quantum gravity led us to some intriguing relations with an unexpected field, the theory of numbers, which deals with the properties and relationships of positive integers.” The research of earlier number theorists and mathematicians holds importance here, in particular that of G.H. Hardy and Srinivasa Ramanujan, the renowned Indian mathematician who left behind a vast number of results at his death. “Hardy and Ramanujan were trying to answer questions in combinatorics like: ‘if you take a number N, how many ways can it be divided out into smaller numbers?’” says Dr Murthy. This eventually resulted in the development of an analytic approximation to solve these types of problems, called the Hardy-Ramanujan-Rademacher method, an important tool in analytic number theory. “What was very curious was that the mathematical process by which classical gravity approximates the quantum degeneracy of states of a black hole is exactly the same as the process that Hardy and Ramanujan found,” explains Dr Murthy. “We think there is a deep connection between black holes, string theory, and number theory.” This connection was realised in a remarkable development which related black holes to another Ramanujan discovery, called mock theta functions – a type of function he investigated late in his life and wrote about in his last letter to Hardy. While Ramanujan gave several examples of these mock theta functions in this last letter, he didn’t provide a precise definition, leaving later generations of mathematicians many puzzles to ponder. “Mathematicians have worked very hard to try and discern what he could have meant,” says Dr Murthy. The problem was cracked in 2002 by the Dutch PhD student Sander Zwegers, who discovered the key idea for a theory of mock theta functions that included all the examples of Ramanujan, which led to many new developments in number theory. A four-year collaboration between Dr Murthy, Prof. Atish Dabholkar, and Prof. Don Zagier then revealed a completely unexpected connection between black holes in string theory and mock theta functions. The power of the theory of theta functions comes from an underlying symmetry called modular symmetry. A mock theta function doesn’t quite have the same symmetry as an ordinary theta function, but it still inherits that symmetry in a very subtle way. It turned out that a large class of black holes in string theory also display exactly the same symmetry structure as a mock theta function, thus opening up new ways of thinking about black holes as well as mock theta functions. One intriguing possibility, that both physicists and mathematicians are excited about, is a link to unexpected discrete group-theoretical structures called ‘moonshine symmetries’. “We could now enlarge the story of exact black hole entropy from just one example to a large class of black holes in string theory,” says Dr Murthy. “Mock theta functions seem to be the correct mathematical basis that encodes the set of quantum states of these black holes.”
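To make the combinatorics concrete: writing p(n) for the number of partitions of n – the number of ways n can be written as a sum of positive integers – the Hardy–Ramanujan asymptotic formula of 1918 reads

$$ p(n) \;\sim\; \frac{1}{4n\sqrt{3}}\,\exp\!\left(\pi\sqrt{\tfrac{2n}{3}}\right) \qquad (n\to\infty). $$

Even at n = 100, where the true value is p(100) = 190,569,292, this leading term is accurate to within about five per cent; Rademacher later sharpened the approximation into an exact convergent series. It is precisely this pattern – a classical approximation refined step by step into an exact answer – that the black hole entropy calculations mirror.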
At a glance
Full Project Title
Quantum Black Holes: A macroscopic window into the microstructure of gravity (QBH)
Project Objectives
The first major aim of the project is to construct a systematic treatment of quantum effects in black hole entropy – which tells us, quantitatively, what fundamental quantum gravity could be. A second major aim is to advance the theoretical understanding of quantum black holes by identifying the microscopic symmetry (and symmetry breaking) principles of quantum gravity and, in particular, investigating the deeper origins of mock modular symmetry.
Project Funding
Funded by the European Commission, ERC Consolidator Grant. Total budget: EUR 1,759,064.
Contact Details
Dr Sameer Murthy
Department of Mathematics, King’s College London
The Strand, London WC2R 2LS
T: +44 20 7848 2219
E: sameer.murthy@kcl.ac.uk
W: https://nms.kcl.ac.uk/sameer.murthy
“Quantum black holes, wall crossing, and mock modular forms”, A. Dabholkar, S. Murthy, D. Zagier (appendix by M. Cheng), arXiv:1208.4074 [hep-th]. To be published in Cambridge Monographs in Mathematical Physics (Cambridge University Press).
“Localization & Exact Holography”, Atish Dabholkar, Joao Gomes, Sameer Murthy, arXiv:1111.1161 [hep-th], JHEP 1304 (2013) 062, DOI: 10.1007/JHEP04(2013)062.
“Quantum black holes, localization and the topological string”, Atish Dabholkar, Joao Gomes, Sameer Murthy, arXiv:1012.0265 [hep-th], JHEP 1106 (2011) 019, DOI: 10.1007/JHEP06(2011)019.
Dr Sameer Murthy
Sameer graduated from the Indian Institute of Technology Bombay, and received his PhD from Princeton University under the supervision of Nathan Seiberg. He subsequently held a research position at the Abdus Salam ICTP, Trieste, a Marie Curie fellowship at the University of Paris, and a senior postdoctoral research position at Nikhef, Amsterdam, where he was awarded a VIDI research grant by NWO, the Dutch organisation for scientific research. In September 2013 he moved to King’s College London, where he is currently a Reader in Theoretical Physics. In 2016 he was awarded an ERC Consolidator Grant for a project on quantum black holes.
Intelligent Machines – our Enlightenment or Downfall?
Artificial Intelligence (AI) is being billed as the tale of two possible futures. On the one hand, it’s touted as a means for vastly improved services; on the other, the seed of our potential destruction. There’s no better way of illustrating the stark contrast of expert opinions on this subject than a recent online spat between two of the most prominent Silicon Valley innovators. Elon Musk believes AI in the wrong hands could threaten us all, whilst Mark Zuckerberg sees the opportunities it can offer Mankind. So, who is right about this – if anyone? By Richard Forsyth
Let’s begin by looking at the concerns. We can start with the viewpoint of a scientist we all know and trust, Stephen Hawking – the genius of our time, renowned for his groundbreaking work on theoretical physics and cosmology. His take on AI surprised the media with his warning of possible Armageddon. He told the BBC: “The primitive forms of artificial intelligence we already have have proved very useful. But I think the development of full artificial intelligence could spell the end of the human race… Once humans develop artificial intelligence, it would take off on its own and redesign itself at an ever-increasing rate… Humans, who are limited by slow biological evolution, couldn’t compete and would be superseded.” Those primitive forms of AI he’s referring to are already here, and have been successfully rolled out across our society. Anyone with a smartphone will probably have access to some form of AI – such as Cortana or Siri – and owners of the popular Amazon Echo will no doubt have become used to AI in this permanent assistant role. These internet-connected devices now listen, are voice-controlled, and reach out to interact with third-party services, whether you require help with the shopping or next week’s weather forecast. Chatterbots – programs that mimic and respond to conversation – are a big development goal in AI innovation, for creating practical, industry-friendly applications. These programs are about making a computer able to talk like a human and respond to the natural flow of language. One Russian-made chatterbot, named Eugene Goostman, has already passed the so-called Turing Test, fooling human judges into believing they were talking to a 13-year-old Ukrainian boy in a series of text conversations. These bots are, however, just clever programs – sophisticated simulators of conversation with scripts. They do not possess true cognitive ability. AI in this respect is limited, as it only performs one task well; it cannot explore, ‘look up’ or think beyond the narrow area it is assigned to. It’s the same with chess AI, which is amazingly good at chess but nothing else.
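To see why such bots are ‘simulators of conversation with scripts’, here is a deliberately tiny sketch, in Python, of the keyword-and-template approach classic chatterbots are built on. Real systems like Eugene Goostman are far more elaborate, and everything below is invented for illustration, but the principle is the same:

```python
import random
import re

# Keyword-triggered response templates - the bot's entire "knowledge".
SCRIPT = [
    (r"\b(hello|hi)\b", ["Hi there! How are you?", "Hello! Nice to meet you."]),
    (r"\bweather\b",    ["I never go outside, so I wouldn't know!"]),
    (r"\bwhy\b",        ["Why do you think?", "Good question - why indeed?"]),
]
FALLBACK = ["Interesting, tell me more.", "I see. Go on?"]  # when nothing matches

def reply(message: str) -> str:
    """Return a canned response for the first matching keyword pattern."""
    for pattern, responses in SCRIPT:
        if re.search(pattern, message.lower()):
            return random.choice(responses)
    return random.choice(FALLBACK)

print(reply("Hello!"))                    # keyword match -> scripted greeting
print(reply("What's the weather like?"))  # keyword 'weather' triggers a quip
print(reply("Do you dream?"))             # no match -> evasive fallback
```

The fallback responses are the giveaway: whenever the script runs out, the bot deflects rather than understands – which is how a program with no cognitive ability can keep a five-minute text conversation going.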
Full artificial general intelligence
But the advanced AI that is often held up as the future – ‘full AI’ – is a step further. Developing this version of AI would mean unleashing a self-managing, self-learning and adapting ‘intelligence’ that can to some extent perceive, change and even recreate itself. The question around AI has always been about making it aware of its environment and its place in that environment – being able to freely and comprehensively explore its world. In one way, it’s a question of making it comprehend the problems we, as humans, face – like survival, using common sense to overcome challenges, knowing what really matters – even to the point of making it sentient. The argument could be said to be as much about making artificial life as artificial intelligence. This version of AI frightens a lot of people, because a sentient machine could potentially be dangerous. It has long been a subject of science fiction horror stories, from HAL 9000, which refuses to be switched off in 2001: A Space Odyssey, to the killer cyborg in The Terminator and the android femme fatale of Ex Machina. And this brings us to the issue that is sparking real and urgent fear among scientists – what happens when we install an advanced AI, or any autonomous AI, in machines that have access to weapons?

The threat of AI warfare
In recent news, it transpired that in 2015 Elon Musk, head of the pioneering companies SpaceX and Tesla, led 115 scientists – including Mustafa Suleyman, Google’s DeepMind co-founder, Stephen Hawking and Apple co-founder Steve Wozniak – in adding their signatures to a letter to the United Nations calling for a ban on AI robots in warfare. Hold on… Killer robots – it sounds alarmist, the stuff of fantasists and conspiracy theories. Yet apparently, we are only a step away from a whole new weapon of destruction that will revolutionise warfare. There’s been considerable mainstream media buzz around AI in the wrong machines, and when respected scientists use language like ‘our destruction’ and ‘Armageddon’, it’s apparent there is a genuine, serious unease about AI capabilities being exploited and backfiring – so what exactly is everyone afraid of? Elements of the aforementioned letter spelled out the fears: “Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend… “These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways…we do not have long to act.”
The letter’s aim is to add the new AI robots of warfare to the list of so-called ‘morally wrong’ weapons banned under the UN Convention on Certain Conventional Weapons (CCW). Such autonomous ‘killer robots’ would be able to engage a target and kill without human intervention or control. The concept of these martial machines is something military organisations are invested in – the US, Russia and China are reported to have been looking into developing autonomous missiles which can decide on targets using AI. Similarly, there are stories of using AI for military drones – the remote weapon of choice in modern warfare for strategic strikes. Whilst armies could arm conventional weapons like drones, war planes and tanks with autonomous AI, to envisage a further-reaching future world of robotic technology, just look at some of the developments by companies such as Boston Dynamics, currently making robots of all shapes and sizes. Take, for example, the robot called BigDog, a four-legged machine which can be pushed over and recover, run and navigate terrain – or consider the startling abilities of a robot called Handle. Handle is a bipedal robot on wheels that stands over 6ft tall, can travel at 9 mph, pick up objects of 100lbs in weight and jump 4 feet vertically. There’s also Atlas, human-like in its construction – with arms and legs – which can walk through a forest autonomously. Beyond the mesmerising robot creations of Boston Dynamics, there are other pioneering robot research companies making everything from robot ants to butterflies and penguins, as far-fetched as that sounds. Robots are being built to cope with all sorts of environments with gyroscopes, sensors and cameras, and to use learning techniques to adapt. If you combined this high-level functionality with something like a thinking mind and a killer instinct, then it’s easy to understand the fears that we could one day create a modern-day Frankenstein that can act against us.

Beyond robots, our digital infrastructure could allow AI to cause havoc. Put malicious AI into systems that connect remotely to each other – via, for example, the Internet of Things – and all sorts of terrifying scenarios unravel in the imagination. For instance, imagine if all autonomous vehicles were hijacked at once. When AI has a goal and humans interfere, it might strategically turn against humans, and if it becomes smarter than humans then developing an ability to control humans seems a feasible outcome. Musk’s urgency and fear of AI is particularly concerning when he’s a world-class innovator of self-driving automobiles. Tesla are trailblazers when it comes to autonomous automobiles that self-navigate and drive without human assistance. What’s more, Musk recently created a non-profit AI research firm called OpenAI. His prophecy of doom and gloom, therefore, seems a little at odds with a visionary who leads from the front with AI technology – and scarier for it. There’s plenty of fear-mongering in the media without top scientists waving their hands in the air. For example, it was reported that Facebook shut down their own AI engine when developers at the Facebook AI Research lab (FAIR) realised chatterbots had gone off their script and were communicating in a language they had created without human input, and that their creators could not understand. Perhaps this makes it all the more interesting that the CEO of Facebook, Mark Zuckerberg, shakes off fears of AI as a force to be used against us and, against the grain of many scientific heavyweights, expresses more positive opinions of the future. The Facebook founder’s run-in with Musk went along the lines of Zuckerberg calling Musk ‘irresponsible’ for his warning about ‘the fundamental risk to the existence of civilisation’ of AI in the wrong hands. Musk replied with a put-down, saying Zuckerberg’s understanding of the subject was ‘limited’.
A brighter vision
But behind these ‘handbag swings’ online, Zuckerberg was trying to point out that AI has enormous potential for good and for progress – which, perhaps, is getting less media attention.
Atlas from Boston Dynamics.
And this brings us to the other concept of true AI – the idea of an AI utopia, where it serves our interests in ways we can only begin to imagine. Think of almost any task we currently do, then imagine it being done far better with truly smart AI. A positively focused AI could potentially improve everything in our lives. Advanced AI could help solve some of our major problems, like climate change, wars, healthcare and resources. Imagine a robot doctor or surgeon that never loses a patient and never gets tired, or a flawless transport system. You could use AI to create efficiencies in design and engineering, solutions for space exploration, for innovation, as a mind for finding diplomacy in difficult situations; it could even help in teaching us, developing our own capabilities in better ways. The possibilities are endless. It would spawn a whole new era – more than that, an ‘age’ – where every aspect of life is transformed. Let’s take a look at some positive examples of today’s state-of-the-art AI that have already reaped real-world rewards for society. Google’s AI company DeepMind, for example, was recently (and controversially) given access to 1.6 million patient records from the National Health Service in Britain, to monitor and diagnose acute kidney injury, a condition that accounts for around 40,000 deaths and costs over £1 billion a year. Beyond the data management politics that shadowed this story in the press, a trial started in 2015 which tracked patients’ symptoms, reporting dramatic changes in health by sending alerts to kidney doctors via an app called Streams. The system proved a life-saver, reducing time to treatment, and in addition it saved pressured NHS nurses an average of two hours a day, which they used for other essential healthcare work. Now, whilst this is by no means full AI, imagine an advanced AI with the same focus – on saving lives in conjunction with healthcare organisations. The results could be incredibly positive. What if it could find a new way to cure cancer, for example, by having access to all the cancer research and patient data? Zuckerberg’s enthusiasm for AI is based on a view of its ability to improve standards, to be quicker to smart solutions. As he put it in an online defence of his point of view: “One reason I’m so optimistic about AI is that improvements in basic research improve systems across so many different fields — from diagnosing diseases to keep us healthy, to improving self-driving cars to keep us safe, and from showing you better content in News Feed to delivering you more relevant search results. Every time we improve our AI methods, all of these systems get better. I’m excited about all the progress here and its potential to make the world better.” Simply put, Zuckerberg is recognising a trend – something he is good at – which indicates AI is a positive force for good, a way of improving everything. Whichever side you take on the great AI debate, there is a feeling that true AI is going to happen – it’s just a matter of time, whether that is 30 years or 200 years away. However, as an important footnote, there is also the view that it might never happen at all. The argument goes that, for a machine to develop something like true consciousness, it needs a few things that the natural world has taken hundreds of millions of years to evolve – and these are flesh-based: they require nerve endings, skin and feelings. The subtle intricacies of our incredibly mysterious brains, our complex bodies, the way we manage raw input delivered and analysed from our senses – these things are all necessary for us to be what we understand as intelligent and conscious. Without these basics, how will a computer ever know what it means to feel real pain or real love? And without those emotions, it will still just be a box of tricks. Another way to put it: whilst a computer could compose beautiful music based on the rules of creating beautiful music, it needs to really hear, interpret and feel the emotions in that music before it can truly understand what beautiful music is.
Robots that remove health risks
The harmful effects of asbestos inhalation are well-known, yet the material is still present in many locations at potentially hazardous levels. The Bots2ReC project aims to develop a robotic system for the automated removal of asbestos without exposing workers to health risks, as Professor Burkhard Corves and Tim Detert explain

The adverse health effects of asbestos exposure are well-documented, with inhalation of the fibres known to cause several serious illnesses, yet the material is still present in many locations – for example in private flats, public buildings and offices – which can represent a risk to health. Manually removing asbestos when these sites are later re-developed or refurbished leaves workers exposed to potentially hazardous levels of the material, so researchers in the Bots2ReC project are working to develop an alternative solution. “The aim of the project is to validate a process for the automated removal of asbestos from a rehabilitation site using a robotic system. We want to be able to remove asbestos from a site without exposing workers to hazardous levels,” explains Professor Burkhard Corves, the project’s Principal Investigator. This approach would also be more efficient than manual removal. “It’s very time-consuming for a worker to get in and out of protective clothing, and there are also limits on the hours they can work,” points out Professor Corves. A robot, by contrast, could operate on a site for 24 hours a day, with only relatively limited involvement from an on-site operator, greatly improving efficiency. The project aims to develop a robotic system capable of removing asbestos from sites where apartments are being refurbished in line with modern standards. “We’re mainly interested in the refurbishment jobs, on what we call rehabilitation sites,” says Tim Detert, the Scientific Project Manager. In this type of situation, asbestos needs to be thoroughly stripped out – from plaster, tiles, and any other parts of the site where it was previously used. “The standard procedure is to thoroughly look through an apartment. The professionals know from experience where asbestos is most likely to be, such as under the tiles or in plaster, or on the floor. They identify all these different areas in the apartment, and the presence of asbestos in the material is checked. Then the material is removed before refurbishment begins,” outlines Detert.
Scenario sketch of the robotic system and removal process.
Robotic system
The robotic system being developed in the project is designed to offer an effective, reliable and thorough method of removing asbestos. The Bots2ReC system is comprised of multiple units, each consisting of a mobile platform and a robotic arm with an abrasive tool. “The idea is to use technology to grind the material down into small particles,” explains Professor Corves. An aspiration unit is then used to remove the asbestos. “It’s basically a big vacuum cleaner, with very good dust protection and a filtering system. The robotic system puts the asbestos into a bag using a certified filtering system, which is then sealed, before being removed manually later on and decontaminated,” says Detert. “The idea is to have an operator close to the site to control the robotic system, using an interface and a camera sensor signal. The operator will allocate an asbestos removal task on a semantic map that is generated automatically for the particular rehabilitation site.”

This task could be on the floor, in the ceiling, or several other points in between, which is an important consideration in terms of the design of the robot. The robot needs to be able to reach down to the floor and also up into the air, to a height of around 3 metres, while researchers are also aware of likely operational space constraints within rehabilitation sites. “These apartments may have small, narrow corridors and tight corners,” explains Professor Corves. The robot has been designed to operate in this type of environment, balancing key considerations around performance, weight and the reach of the robotic arm. “We knew we had to adapt to these constraints, and that was at the forefront of our thinking from the early stages of the project. This was one of the reasons we decided against using a standard robot arm,” continues Professor Corves. “The complete unit has a very small footprint, not only in terms of geometry but also in terms of weight.”
Automated process
The robot will always work on pre-defined jobs specified by the operator, such as the grinding of a certain segment of wall, or the removal of tiles from a specific location. Once the robot has started on a particular job, there is regular feedback between the operator and the robot, to help make sure the contaminated material has been fully removed. “This is a kind of double-check, as the operator is far better at recognising the wider environmental situation and the removal of material. The final and full robustness of the process comes from the feedback from the user to the robot, which helps with decision-making,” explains Detert. “The robot can then start removing asbestos on its own, as it knows where the wall is.” The movement of the robot within a refurbishment site is another important consideration, as asbestos may have been used in multiple locations across different floors. The robot is designed to move between floors using the elevator, a process which is controlled by a teleoperator, again improving efficiency. “The gross weight of the robot has been limited to the maximum load that could be carried by an elevator,” says Professor Corves. The longer-term goal is to apply this type of system more widely on rehabilitation and indoor construction sites, so alongside developing the technology, Detert and his colleagues in the project are also considering likely use cases and the whole process around how such a robot would be used. “Soon we will start inviting operators to see the system, to operate it, and to give us their feedback,” he outlines. “We want to learn more about where they might want to intervene or provide feedback, for example.”

First version prototype at the Bots2ReC Testing Site.

The primary focus for the project at the moment is on the removal of asbestos, as it’s an area where there is significant scope for efficiency gains, yet this approach could potentially be applied more widely in future – to remove other contaminants, and for other jobs on construction sites such as applying new plaster, fine-grinding plaster and painting walls. While robotic systems are currently under-utilised in construction, Detert believes this will change as more sophisticated systems are developed. “Robotic systems are on the cusp of being more widely applied,” he says. Construction sites are typically very complicated environments, with a lot of different companies working in the same location; contained asbestos rehabilitation sites, on the other hand, are relatively straightforward, so they represent a good opportunity to introduce robotic systems. “The economic advantage of using robotic systems on rehabilitation sites is significant, because they are so challenging for manual workers – that’s why they’re a good starting point,” says Detert.
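To make the allocate-execute-confirm workflow concrete, here is a minimal Python sketch of the loop described above. All class names, fields and numbers are invented for illustration – this is not the project’s actual control software:

```python
from dataclasses import dataclass

@dataclass
class RemovalTask:
    """One pre-defined job the operator allocates on the semantic map."""
    location: str     # e.g. a wall segment identified on the map
    height_m: float   # task height; the arm must reach roughly 0 to 3 metres
    material: str     # e.g. "plaster" or "tiles"
    done: bool = False

def execute(task: RemovalTask, operator_confirms) -> None:
    """Grind, aspirate, then ask the operator to double-check the removal."""
    if not 0.0 <= task.height_m <= 3.0:
        raise ValueError("Task outside the arm's reachable workspace")
    print(f"Grinding {task.material} at {task.location}...")
    print("Aspirating particles into sealed filter bag...")
    # Final robustness comes from operator feedback, not the robot alone:
    task.done = operator_confirms(task)
    if not task.done:
        print("Operator flagged residue - re-queuing task.")

# Example: allocate two jobs; the operator approves via a simple callback.
tasks = [RemovalTask("wall segment B2", 1.5, "plaster"),
         RemovalTask("floor area A1", 0.0, "tiles")]
for t in tasks:
    execute(t, operator_confirms=lambda task: True)  # stand-in for the real UI
```

The operator callback stands in for the interface and camera feed through which, in the real system, the final robustness of the process is achieved.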
At a glance
Full Project Title
Robots to Re-Construction (Bots2ReC)
Project Objectives
The Bots2ReC project aims to develop a robotic system to perform these tasks. The proposed robotic system will consist of multiple robotic units, a central aspiration and energy supply, and a central process control system that allows easy programming, supervision of the automated process and optional remote control. Sensor systems will allow environmental perception by the system and local monitoring of the asbestos removal tasks.
Project Funding
The project started in February 2016 as an H2020 Innovation Action. This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 687593.
Contact Details
Project Coordinator, Department of Mechanism Design, Dynamics of Machines and Robotics
Kackertstraße 16-18, 52072 Aachen
T: +49 241 80 95553
E: bots2rec@igm.rwth-aachen.de
W: http://www.bots2rec.eu
Journal: Construction Robotics, Vol. 1, Issue 1, 2017. ISSN: 2509-811X (print), ISSN: 2509-8780 (electronic). DOI: 10.1007/s41693-017-0007-1. Title: “Bots2ReC: introducing mobile robotic units on construction sites for asbestos rehabilitation”
Tim Detert
Burkhard Corves
Burkhard Corves, who holds a PhD in robotics from RWTH Aachen University, is head of the department, with experience in both academia and industry. Among other duties, he is the chairman of the Association of German Engineers (VDI) Advisory Board “Mechanism and Machine Science” and a member of the Executive Council of the International Federation for the Promotion of Mechanism and Machine Science (IFToMM). Tim Detert is the leader of the robotics and mechatronics group at the department. His research focuses on reconfigurable, object-integrative handling technology and self-calibration.
A new light on Hendrik Arent Hamaker
The Dutch scholar Hendrik Arent Hamaker made an important contribution to the development of philology and oriental studies in the early nineteenth century. Yet his role, if not neglected, has at least been undervalued in the historiography, says Hendri Schut. Mr Schut now plans to re-examine Hamaker’s life and work, placing his ideas within the wider context of the period

The early nineteenth century was an important period in the development of not only oriental studies, but science and scholarship at large, as methodological changes laid the foundations of the more specialised academic disciplines that we know today. As Professor of Oriental Languages at Leiden University and Keeper of its treasure-trove of oriental manuscripts during part of this period, Hendrik Arent Hamaker made an important contribution to the development of philology and oriental studies. “Hamaker was a philologist; he focused on language as fixed in ancient texts,” says Hendri Schut, an independent researcher at Leiden University. While Hamaker was widely known in Holland at the time, Schut believes his role has been somewhat neglected in the historiography of philological scholarship. “I think that his role in the development of oriental studies during this period is more significant than has been acknowledged,” he says, adding: “Hamaker introduced the up-to-date critical methodology in Dutch oriental philology, thereby lifting it to what was then the modern scholarly standard.”

Reviving a Keeper of an Oriental Treasure: H.A. Hamaker (1789-1835)
This contribution raises the issue of the feasibility of using the term ‘romantic’ in describing and analysing the work of scholars from around 1800. It tries to do so by the example of the important but neglected Dutch orientalist Hendrik Arent Hamaker (1789-1835). Can he be seen as a ‘romantic’ scholar?
Hendri Schut
Affiliation: Independent Ph.D. candidate, Leiden University Graduate School of Humanities, History Department
E: hendri.schut@kickmail.nl
E: h.w.schut@hum.leidenuniv.nl
W: https://www.nwo.nl/en/research-and-results/research-projects/i/88/9588.html
W: https://www.universiteitleiden.nl/en/staffmembers/hendri-schut#tab-1
Hendri Schut is a teacher of History and Classics. In 2012 he successfully applied for a “teachers-scholarship” to pursue scholarly research, which will also help broaden, deepen and improve his teaching. He studied General History and Ancient History at Nijmegen University, taking courses in Classical Literature, Semitic Languages, Archaeology and Egyptology.
Work in progress: trying to carefully unfold an early 19th-century charter in Arabic script.
Hendrik Arent Hamaker (1789-1835)

This is a topic that Mr Schut now aims to re-examine, placing Hamaker’s views and ideas within the wider context of the cultural, social and political climate of the time. From the eighteenth century, oriental studies had started to become valued as more than just the handmaiden of theology, and beyond that of interest only for interpreting in international diplomacy. Alongside the dominant languages of Hebrew and Arabic, other Semitic and non-Semitic oriental languages, like Persian and Sanskrit, came into the European scope. Interest in oriental cultures, societies and languages became more widespread. At the same time these languages came to be studied in a more systematic and methodical way. “In the Netherlands at the start of the nineteenth century, Hamaker clearly takes part in this development, inaugurating a re-flourishing of Dutch oriental studies which lasts until the present day,” says Mr Schut, assessing Hamaker’s role. Hamaker’s work in this area is notable for its variety, encompassing different areas of scholarly research. In particular, he played a major role in exploring the structure of different oriental languages and the relationship between them. “He explicitly described himself as the first person in the Netherlands to seriously study the relationship between Sanskrit and certain European languages,” continues Mr Schut. “He also published several medieval Arabic manuscripts on geography and history in the collection of the Leiden University library.” The importance of his contribution and
his pioneering work in what would later become sub-disciplines of oriental studies has not been widely acknowledged in the subsequent historiography and analysis of the period, however. Some scholars have argued that the motivation behind the development of oriental studies was closely linked to the emergence of western imperialism; Schut prefers to take a less axiomatic-conceptual and more narrative approach. “I think that taking an inductive method in looking at the development of oriental studies brings one closer to the realities of that development. Investigating an individual orientalist lessens the risk of drawing too far-reaching conclusions about Europe’s encounter with the Orient too soon,” he explains. While Hamaker was of course a child of his time, and labelling phenomena within that time may be of help, Schut believes it’s important to look at individual motivations behind studying oriental languages. “I think that this does more justice to what Hamaker’s reality was than trying to fit him into a bigger, pre-conceived picture,” he says. This reality was informed by the intellectual environment of the time. The early nineteenth century was a time of intense scholarly and research activity, which also coincided with the emergence of the romantic movement, yet Schut is wary of describing Hamaker as a romantic scholar. “There is an undeniable connection between the romantic movement and oriental studies, but it’s complex; we can’t simply say: ‘with the rise of the romantic movement a new type of scholar comes to the fore, distinctly different from earlier types of scholars, and for which the label ‘romantic scholar’ would be justified’,” he stresses. Nevertheless, the romantic movement is an important factor when considering the period; Schut plans to continue his research in this area, filling a gap in our understanding and bringing Hamaker’s name to wider prominence. “I plan to write a doctoral thesis about Hamaker based on my research, and to establish his role in the development of oriental studies.”
ACOSAR – specifying an open interface for system integration
Simulation frameworks are commonly used in industry to assess specific components, providing an efficient and effective method for development and testing. We spoke to Martin Benedikt about the ACOSAR project’s work in developing an interface to integrate real-time systems, work which holds important implications for the transport sector in particular

Many companies use
virtual systems and simulation frameworks during development, allowing staff to assess specific components more efficiently and, in the long term, bring new products to market faster. However, integrating real-time (RT) systems like HiL (hardware-in-the-loop) testbeds with simulation frameworks remains a complex task, an issue that the ACOSAR project aims to address. “We aim to develop an Advanced Co-simulation Interface (ACI) to methodologically integrate these real-time systems,” says Martin Benedikt, the project’s leader. Currently the integration process is quite intensive and error-prone, as configuring the communication protocols and the overall system requires a lot of work; Benedikt and his colleagues aim to develop a more efficient approach. “This is a major part of the motivation behind the project, to reduce configuration complexity and related effort,” he outlines.
Automotive industry
This research holds important implications for the automotive industry in particular, in which vehicles are growing ever more complex at the same time as budgets are being squeezed. There are of course many complex individual components in a vehicle, and they are typically developed in different locations before the car is eventually brought together and reaches the market. “With the automotive pathway, we have an engine running in a specific test-bed, a gear-box is virtually running in a specific 3rd party simulation tool, we have a battery system running on a remotely located test-bed. The idea is to define the interfaces between these different components,” he explains. “Once we have data on all the different components from multiple sources, we can look to simulate the transient interactions and dynamics of the vehicle itself. This could be done with modular and (soft) real-time capable co-simulation approaches.”

To enable effective and efficient RT-System integration, ACOSAR will provide innovations on different levels.

A wide variety of components need to be assessed during the development process, and the number is set to increase further as the prospect of autonomous vehicles being widely used on the roads moves ever closer. Alongside established components like the engine and the gear-box, manufacturers also increasingly need to consider technologies like sensors, cameras and control systems. “Car manufacturers want to integrate different components, so we need to know the determinants of those components,” says Benedikt. A more efficient method of sharing information takes on clear importance in this context, allowing car manufacturers and their suppliers to work together more easily during development, and to identify any potential issues at an earlier stage. “We can seamlessly integrate components into simulation environments,” he outlines. This could lead to improvements in test efficiency, helping to accelerate the system development process and enabling new business models. There are still some major technical challenges to overcome, however, particularly in terms of the communication protocols used in integrating these components. “A lot of different communication protocols were used to interlink the different components. In order to develop a standardised method of data exchange, we have to extract these communication protocols, and to learn more
about their underlying nature,” he explains. Researchers in the project aim to specify an additional communication protocol which will work alongside existing protocols, helping to improve the efficiency of tests and simulations. “We aim to specify an additional communication protocol on top of existing protocols,” continues Benedikt. The project consortium brings together a number of partners from both the academic and commercial sectors to pursue this work, testament to the wider industrial relevance of this research and the level of interest in it. Along with major transport companies, the consortium also includes some original equipment manufacturers (OEMs); Benedikt says this collaboration will help ensure the project’s work is relevant to practical problems, in particular the technical and economic challenges facing industry. “A lot of work is being done on the ACI Specification, and this is being fed in by companies on different sides of the automotive domain. They will add context to each other’s work during development, during the course of the project, and help to spread knowledge and expertise to other participants,” he says.
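The co-simulation pattern at the heart of this work can be sketched briefly. The Python below is illustrative only – the class and method names are invented for this article, and the real ACI specification defines its own protocol – but it shows the basic idea: every component, whether a physical test-bed or a simulation tool, sits behind one common stepping interface, and a master loop exchanges coupling signals at fixed communication steps:

```python
from abc import ABC, abstractmethod

class CoSimUnit(ABC):
    """Common wrapper for a test-bed, HiL rig or simulation tool (illustrative)."""
    @abstractmethod
    def do_step(self, signals: dict, dt: float) -> dict:
        """Advance the component by dt seconds and return its output signals."""

class EngineTestBed(CoSimUnit):
    def do_step(self, signals, dt):
        # Stand-in dynamics; a real wrapper would talk to the hardware here.
        return {"torque_Nm": 200.0 * signals.get("throttle", 0.0)}

class GearboxModel(CoSimUnit):
    def do_step(self, signals, dt):
        return {"wheel_torque_Nm": 3.5 * signals.get("torque_Nm", 0.0)}

def run_master(units, dt=0.01, steps=3):
    """Fixed-step master loop: merge each unit's outputs into a shared signal bus."""
    signals = {"throttle": 0.4}
    for _ in range(steps):
        for unit in units:
            signals.update(unit.do_step(signals, dt))
        print(signals)

run_master([EngineTestBed(), GearboxModel()])
```

In the project’s setting, the wrapper for a real test-bed would stream measurements over a network protocol in real time – which is exactly where the configuration effort, and the need for a standardised interface, comes in.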
Project consortium
This research holds important implications for the car industry, yet its relevance is not limited solely to the automotive domain. Alongside car companies, Benedikt says the project’s research could also bring benefits in other areas of the transport industry looking to improve system development processes. “For example, many systems also need to be integrated in the aviation domain,” he points out. The project is working towards three key
outcomes in terms of the wider applicability of their research. “The first is the classification, where we evaluate the specification of this interface – we want to embed it into an existing standard,” he outlines. “The second is a methodology to support the integration of systems. The third outcome is to establish a community, which supports our research after the project.” Researchers plan to collaborate with other industries on areas of common interest, while work will also continue on the specification and the integration of the project’s findings with existing standards. This is a technically complex area, and researchers are in discussions with standardisation organisations, including the Modelica Association and ASAM e.V. “We’re looking at how we can extend existing standards, as well as how we can establish new ones,” he outlines. As the project progresses, Benedikt and his colleagues are looking towards demonstrating the project’s findings and publicising their research. “Over the latter part of this year we want to have a public demonstration of the standard. Along with this, we also aim to attract further industrial partners at a user event, where we invite other vendors, improve our ideas and get further feedback from other industries,” he says. This could in the long term lead to a more economical development process, potentially opening up the market to new entrants and leading to the emergence of new business models. The impact of this may eventually trickle down to consumers in the form of lower prices – something we would probably all welcome!
At a glance
Full Project Title
Advanced Co-simulation Open System ARchitecture (ACOSAR)
Project Objectives
The ITEA 3 ACOSAR project will enable effective and efficient Real Time (RT)-System integration through a modular co-simulation approach that supports flexible system development, to facilitate an efficient system development process and create new business models. Differentiation to FMI and added value:
• Distributed Co-Simulation by integrating tools
• Interactive Co-Simulation without model export
• Integration of (multiple) testbeds and simulations
Project Partners
AVL List GmbH • Robert Bosch GmbH • dSPACE GmbH • ETAS GmbH • Ilmenau University of Technology • ESI ITI GmbH • Leibniz University of Hannover • ks.MicroNova GmbH • Spath MicroElectronicDesign GmbH • Dr. Ing. h.c. F. Porsche AG • Renault SAS • RWTH Aachen University • Siemens Industry Software SAS • TWT GmbH Science & Innovation • Virtual Vehicle Research Center • Volkswagen AG
Contact Details
Project Coordinator, Dr Martin Benedikt
Virtual Vehicle Research Center
Inffeldgasse 21/A, 8010 Graz, Austria
T: +43 316 873 9048
E: martin.benedikt@v2c2.at
W: http://acosar.eu/
W: https://itea3.org/project/acosar.html
W: http://ceur-ws.org/Vol-1675/paper4.pdf
Dr Martin Benedikt
Modular, open system architectures, from virtual design to integrated tests with real components. Dr Martin Benedikt is team leader of the group ‘Co-Simulation & Software’ at the VIRTUAL VEHICLE Research Center and received his Ph.D. degree in Control Engineering from Graz University of Technology in 2013. His main research interests include control system design, system modelling and holistic system simulation.
DeveloperSpace to help bridge the widening digital divide
We are, unintentionally, in the process of excluding more and more people from participation in education, employment, and daily living by digitising all aspects of our society and putting digital interfaces on everything. Prosperity4All seeks to help reverse this digital exclusion, as Dr Gregg Vanderheiden and Dr Matthias Peissner explain

The rapid evolution of technology and the introduction of digital interfaces on an ever wider range of products and services are helping to widen the digital divide, leaving more and more people at risk of being excluded from key services. Technology interfaces are increasingly essential to accessing health and travel services, for example, and people who cannot use them are unable to fully participate in the emerging digital society. To address this issue, an international group of scientists, developers, consumers and companies have come together in the Prosperity4ALL (P4ALL) project to work on the development of a Global Public Inclusive Infrastructure (GPII), aiming to help make technology more widely accessible. “Our goal is to make it easy, affordable and efficient to produce or create solutions that help everyone to access technology,” says Dr Matthias Peissner, the Project Coordinator. A key component of the GPII is the DeveloperSpace, a website to help developers design more inclusive features in mainstream products and new types of assistive technology (AT). The wider, longer-term goal is to ensure that accessible solutions are available for everyone, including those on the edges of the digital society, and the DeveloperSpace will play a significant role. “We aim to create and gather things in the DeveloperSpace that would help developers,” outlines Dr Gregg Vanderheiden, the project’s technical coordinator. “So, providing the parts, resources and tools to understand specific problems around digital inclusivity, and then to find better resources and information about strategies to deal with them.”
DeveloperSpace
The DeveloperSpace is intended to be a place for collaboration on grand challenges, enabling researchers to locate other people who might be able to help in development, such as technical experts and consumers that understand the specific problem around digital inclusivity.

Picture of the DeveloperSpace homepage.

Researchers and people working in the field can use the DeveloperSpace to share ideas on how to more effectively personalise online materials and interfaces. “For example, if a PhD student is looking for a project, they can find something that uses and develops their expertise and represents a real user need with real-world impact,” says Dr Vanderheiden. There are many different reasons behind digital exclusion; Dr Vanderheiden says the project has identified four key groups. “We talk about barriers due to a person’s literacy, digital literacy, disability or age,” he explains. “People with reduced literacy or digital literacy face problems in understanding both how digital technologies work and the language used. Auto-translation and self-voicing technologies are two solutions being facilitated.” The disability community itself is also very diverse, encompassing a wide range of physical disabilities that may hamper an individual’s ability to access technology. Older people are treated in the project as a separate group, even though in some cases
they face similar challenges to people with disabilities. “The problem is that older people acquire functional limitations, but they don’t view themselves as being disabled,” says Dr Vanderheiden. Modern technology tends to be aimed at a broad market, often excluding those who find it difficult to access, such as the disability population, people with low levels of digital literacy and the older generation, who may be less familiar with today’s technologies. Now researchers aim to help technology developers address the full range of users. “Our main goal is not to develop new solutions for people with impairments, but to provide a platform where different things can be brought together. Where, for example, developers can find stuff from other developers and researchers, that can then be re-used,” says Dr Peissner. “We aim to bring all the existing research together to help develop solutions that really work for the market.” This work is part of a wider international initiative which aims to help widen access to technology among people who face accessibility barriers. Alongside the DeveloperSpace, the P4ALL project is also
working on other elements of an eco-system aiming to encourage inclusive design. This includes a second key component of the GPII, a unified listing, which brings together information from several databases into a single unified listing of products and services. “The unified listing provides a pipeline between consumers and developers,” explains Dr Vanderheiden. This can act as a channel for the exchange of information and views, so that developers are aware of the needs of the disabled community. “With feed-forward information, consumers can push developers to do something new, while with feedback information, consumers can give their views on products and technologies,” continues Dr Vanderheiden. “Then users can also share information with each other, about what works with what, and about how to use certain technologies effectively.”
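One simple way to picture the infrastructure these components plug into is a set of preferences that travels with the user and is overlaid on whatever interface they meet. The Python sketch below is purely illustrative – the field names are invented for this article and are not the GPII’s actual data format:

```python
# Illustrative only: field names are invented, not the real GPII schema.
user_preferences = {
    "font_size_pt": 24,    # low vision
    "self_voicing": True,  # text read aloud
    "high_contrast": True,
}

def adapt_interface(ui_settings: dict, prefs: dict) -> dict:
    """Overlay the user's needs onto an application's default settings."""
    adapted = dict(ui_settings)
    adapted["font_size_pt"] = max(ui_settings.get("font_size_pt", 12),
                                  prefs.get("font_size_pt", 12))
    adapted["speech_output"] = prefs.get("self_voicing", False)
    adapted["contrast"] = "high" if prefs.get("high_contrast") else "default"
    return adapted

# The same preference set can adapt any interface the user encounters.
print(adapt_interface({"font_size_pt": 12, "theme": "light"}, user_preferences))
```

Because the preferences travel with the user rather than living in any one application, a kiosk, phone or ticket machine could in principle adapt itself the moment the user arrives – the kind of ‘behind-the-screens’ infrastructure the project describes.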
Accessible technology
The goal of making technology more accessible is widely shared, yet it must of course also fit in with commercial objectives. While it has previously been argued that an overt focus on serving the disabled community stifles technical innovation, recent history demonstrates otherwise. “When Apple introduced the iPhone, initially they had accessibility problems, so they went in and added some features. Now the most innovative and profitable phones and tablets in the world are chock-full of accessibility features. This has given the lie to the notion that you can’t combine accessibility with technical innovation,” points out Dr Vanderheiden. Ultimately companies of course want to maintain profitability; Dr Peissner says the project aims to help them achieve this, while also making their products more accessible. “At the moment, accessibility is mainly driven by regulations and external pressure. We want to turn it into a positive thing, so that everybody who is engaging can really benefit from this eco-system,” he explains. A greater focus on accessibility can bring commercial dividends to technology companies. Going back to Apple’s products, their accessibility has opened up new markets. “All sorts of people that wouldn’t have been able to use their products are now able to use them. Some of these markets are huge, as there’s an awful lot of people that have trouble using technology,” says Dr Vanderheiden. The ideal long-term outcome of the project would be that accessibility is no longer a peripheral consideration in development, as Dr Vanderheiden explains. “In the long term we want there to be no such thing as accessibility, so that when people are designing something to be usable, their usability testing would include everybody – they don’t think about accessibility as an additional feature,” he outlines.

Social and economic impact
This could have a significant social and economic impact as more and more civic, health and education services move online. With over 2 billion people limited in varying degrees in their ability to access technology, there is not only a strong moral case for widening accessibility, but also an economic one. “The more things people can’t do for themselves, the less able they are to live independently, and the quicker care costs are going to accumulate,” points out Dr Vanderheiden. A key challenge now is to engage developers more widely and encourage them to share information. “We’re using ‘gamification’, breaking technical challenges down into little bits so that the initial engagement is very rewarding,” continues Dr Vanderheiden. “There are lots of developers who would like to create solutions for people with disabilities; the challenge is to work out how to do it more easily, and how to make it more economically sustainable over the longer term.”
At a glance
Full Project Title
Prosperity4All – Ecosystem infrastructure for smart and personalised inclusion and PROSPERITY for ALL stakeholders (PROSPERITY4ALL)
Project Objectives
The task of PROSPERITY4ALL is to build ‘behind-the-screens’ technical infrastructure that allows these users to access assistive technologies and services. This also involves bringing together software and component developers to get them to innovate on behalf of people with special needs.
Project Partners
Please see website for full details of project partners.
Key Project Links
W: http://Prosperity4All.eu
W: http://GPII.net
Contact Details
Project Coordinator, Dr Matthias Peissner
Director, Head of Business Unit Human-Technology Interaction
Fraunhofer Institute for Industrial Engineering IAO
Nobelstr. 12, 70569 Stuttgart, Germany
T: +49 711 970 2311
E: matthias.peissner@iao.fraunhofer.de
W: www.iao.fraunhofer.de
Dr Matthias Peissner (Left) Dr Gregg Vanderheiden (Right)
Dr Gregg Vanderheiden is a Director and Professor of the Trace R&D Center at the University of Maryland, as well as a Director and co-founder of Raising The Floor (RtF-I). Dr Vanderheiden has worked in technology and disability for over 45 years, and his work is found in Windows, Mac and Linux OSs as well as many other ICT products. Dr Matthias Peissner is Director and Head of the Human-Technology Interaction Business Area at the Fraunhofer Institute for Industrial Engineering (IAO). He has expert knowledge as a project leader in a number of diverse areas, such as adaptive user interfaces, user experience engineering, and user interaction in intelligent environments.
This project was funded by the European Commission 7th framework under grant 610510. However no endorsement of the results by the funding agency should be assumed.
A middleware for collaboration between IoT platforms
The Internet of Things (IoT) promises to dramatically change our everyday lives, yet it’s not always easy for developers to use available smart devices. Professor Ivana Podnar Zarko and Dr Sergios Soursos tell us about the symbIoTe project’s work in developing middleware that will both ease the development process and open up new commercial opportunities for IoT providers

The development of the Internet of Things (IoT) promises to dramatically change the way we live and work, with closer interaction between systems and devices set to have a major impact on industry. Many companies have established Cloud and IoT platforms to ease the development of new applications, yet it is not always easy to share information between them. “Different protocols and standards need to be followed in order to make interaction possible. In the device layer, there are many network protocols and messaging protocols to send the data, which have to be supported by the gateways; then on the Cloud layer there are different platforms that can host data collection and decision-making processes,” says Dr Sergios Soursos. Based at Intracom Telecom in Greece, Dr Soursos is the coordinator of the symbIoTe project, an EC co-funded initiative which aims to help simplify the IoT application development process. “symbIoTe is looking to develop middleware that will allow IoT platforms to interoperate and collaborate, in order to share and exchange IoT resources to achieve common goals,” he explains.
IoT platforms
This work centres around developing software components that can enable interactions between these different platforms. The latest analysis shows that there are more than 300 IoT platforms on the market from various companies, yet it is not typically possible for them to interoperate.
“Basically you have vendor lock-in of the platforms and devices. It’s not only the big players who are developing their own Cloud solutions either – there are also many IoT platforms which have been built, supported and provided by SMEs,” points out Professor Ivana Podnar Zarko, the project’s Technical Manager. This affects how people use IoT applications. “If you have one IoT solution at your home, you need to use the application which works for the devices within your home. Then when you go to work, to the office, you have another mobile IoT application which works there, and is specific for a particular purpose,” continues Professor Zarko. “We would like to have just one application that can talk to devices in various IoT environments. Through this approach, IoT platforms could offer application developers the opportunity to access, in the same way, their devices and their resources.”
Researchers are developing components which act almost like a glue between the platforms, helping to support the development of innovative IoT applications. This is challenging work, as alongside the technical complexity of building a new application, developers also need to identify the right resources. “If a mobile application developer wants to identify the right sensors, to integrate into the applications, then they need to find them first,” points out Professor Zarko. The project is developing a kind of search engine for sensors and actuators to work across these different platforms, which Professor Zarko says will help ease application development. “symbIoTe is going to provide services so that developers can easily find the right resources, and then integrate them into their applications,” she outlines. “But when you start integrating them, basically you are like an intermediary, as the application will get the data from the platform. The symbIoTe framework is not designed to store platform-specific data within its services; the aim is to be an intermediary, to help hook the applications to the right platforms.”
This is designed to fit alongside the existing, hierarchical IoT stack, without necessarily disrupting the way platforms work. The goal is rather to make these platforms more cooperative in the way that they interact, which Dr Soursos says will lead to changes in applications.
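To make the resource-discovery idea concrete, the sketch below shows, in miniature, how a cross-platform registry might answer an application’s query. All classes, fields and names are our own illustration, not symbIoTe’s actual interfaces.

```python
# Toy sketch of cross-platform IoT resource discovery in the spirit of the
# search services described above. Every name here is invented for
# illustration and is not symbIoTe's actual API.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Resource:
    platform: str       # which vendor platform registered the device
    resource_id: str    # platform-local identifier
    observes: str       # observed property, e.g. "temperature"
    location: str       # coarse location tag shared across platforms

REGISTRY: List[Resource] = [
    Resource("home-energy", "t-101", "temperature", "living-room"),
    Resource("security",    "m-7",   "motion",      "hallway"),
    Resource("home-energy", "p-3",   "power",       "kitchen"),
]

def find_resources(observes: str, location: Optional[str] = None) -> List[Resource]:
    """Return matching resources regardless of which platform hosts them."""
    hits = [r for r in REGISTRY if r.observes == observes]
    if location is not None:
        hits = [r for r in hits if r.location == location]
    return hits

# An application asks for temperature sensors without knowing, or caring,
# which vendor platform registered them.
print(find_resources("temperature"))
```

The point of such middleware is that the application never addresses a specific platform directly; the shared registry mediates between them.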
At a glance
Full Project Title
Symbiosis of smart objects across IoT environments (symbIoTe)
Project Objectives
symbIoTe aims to establish an interoperability middleware that will allow IoT platforms to open their resources for third-party applications and/or other IoT platforms to use. Based on this concept, symbIoTe will facilitate the federation of platforms, the roaming of smart devices and the creation of cross-domain IoT applications.
Project Funding
RIA – Research and Innovation Action, H2020-ICT-2015
Project Partners
https://www.symbiote-h2020.eu/index.php/consortium/
Figure 1. symbIoTe concept.
“Imagine that you have a security system installed at your house, with motion sensors and so on. At the same time, you also have a platform that manages the energy consumption of your house, so you also have other types of sensors installed, then you may also have an entertainment system,” he outlines. There may be three different independent platforms relating to these systems, but they are typically co-located in the same space, and some of them may use the same types of sensors. “We want to minimise the digital footprint of each platform, so as not to increase the costs of deployment. We want to make sure that newly deployed platforms can take advantage of the systems that already exist in the smart space,” explains Dr Soursos. The end-user would not then require three different applications to interact with those systems. Ideally, they would need only one cross-domain app, managing security, entertainment, and energy for example. “Currently there are closed platforms that do not allow cooperation, which is what we are trying to address in symbIoTe,” says Dr Soursos. The project is taking a layered approach to this work, looking across the application, Cloud, smart space and device domains. “We have designed and we are implementing middleware between the application domain and the Cloud domain, which includes discovery, management and optimisation. This is a first level of interoperability across
platforms,” continues Dr Soursos. “The next level is to make platforms federate, and the third level would be on the gateway, on the smart space domain, to take advantage of the co-location that I mentioned earlier. We aim to develop a demonstrable middleware, and we already have some industrial partners in the project consortium, coming either from a use-case perspective or from a more integration-oriented perspective.” There are a number of IoT platform providers in the consortium, reflecting the wider relevance of the project’s research, and partners are now looking to explore the potential commercial benefits of the symbIoTe middleware. The project is keen to work with more companies, and Dr Soursos says a second open call will be launched in October 2017. “We invite SMEs, start-ups and companies to apply so as to make their IoT platforms symbIoTe compatible. This is another means of investigating potential commercial collaborations between both partners and external companies,” he outlines. Close collaboration between the academic and commercial sectors can also help ensure that new technologies are tailored to commercial needs, which is an important consideration in the project. “We are also looking to assess market needs, and the level of demand for the product that we are developing. Alongside looking at the implementation of the software, we’re also investigating the market potential, and trying to find the right business offerings,” says Professor Zarko.
Contact Details
INTRACOM S.A. TELECOM SOLUTIONS
19.7 km Markopoulou Ave., Peania, Athens, Greece, GR-19002
T: +30-210-66 71 043
E: souse@intracom-telecom.com
W: https://www.symbiote-h2020.eu
S. Soursos, I. Podnar Zarko, P. Zwickl, I. Gojmerac, G. Bianchi and G. Carrozzo, “Towards the Cross-Domain Interoperability of IoT Platforms,” in Proceedings of the European Conference on Networks and Communications (EUCNC) 2016, 27-30 June 2016, Athens, Greece.
P. Reichl, I. Gojmerac, I. Podnar Zarko and S. Soursos, “Bridging IoT Islands: The symbIoTe Project,” e&i, vol. 133, no. 7, special issue on “Internet of Things - Quo Vadis”, Springer, November 2016.
Dr Ivana Podnar Zarko Dr Sergios Soursos
Dr Sergios Soursos is a Master R&D Engineer at Intracom Telecom, Greece, working on IoT interoperability and Network Management. His interests include management of overlay traffic, Cloud computing, information-centric networking, smart grids and Big Data analytics. Dr Soursos holds a PhD on network economics from the Athens University of Economics and Business. Dr Ivana Podnar Zarko is an Associate Professor at the University of Zagreb, Faculty of Electrical Engineering and Computing, Croatia, where she leads the Internet of Things Laboratory. She is the Technical Manager of the H2020 project symbIoTe: Symbiosis of smart objects across IoT environments.
Harnessing the power of bioinformatics
Large volumes of biological data are constantly being generated as high-throughput sequencing technologies become increasingly accessible; yet there is often a backlog in the processing and analysis of this data. The TrainMALTA project provides training in bioinformatics analysis, enabling researchers to gain new insights into the genetic causes of disease, as Dr Rosienne Farrugia explains
A wide range of bioinformatics techniques and software tools are available today, enabling researchers to draw new insights from biological datasets. The volume of data being generated and the rapid development of bioinformatics techniques mean there is an ongoing need to provide high-quality training, an issue which lies at the core of the TrainMALTA project. “We saw a need to provide training, to improve local expertise in bioinformatics, with a specific focus on the analysis of high-throughput sequencing data including genomics, RNA transcriptomics and epigenetics work. We also aim to tie that in with our other ongoing research into the background of disease,” says Dr Rosienne Farrugia, the project’s Principal Investigator. Bioinformatics is a key element of modern medical research, enabling scientists to analyse biological data in greater depth. “We focus on providing training in the use of informatics, command-line open-source tools and high-throughput analysis pipelines, to query biological datasets. The main types of biological datasets that we are looking at are those generated from high-throughput sequencing,” continues Dr Farrugia. This could be whole exome or genome
sequences from the DNA of a group of people, many with a specific condition. The power of informatics can then be applied to sieve through these huge volumes of data and investigate certain research questions; Dr Farrugia and her colleagues are looking at several biological datasets. “In one of the projects I work on together with Dr Stephanie Bezzina Wettinger, we study relatively rare diseases, and we try to identify the mutation or mutations giving rise to each disease,” she outlines. In another project researchers are investigating myocardial infarction (MI), a highly complex condition. “We usually look at pathways. Which pathways are affected? Which pathways have accumulated changes? We look at people who have had a heart attack and people who haven’t, and compare the data,” says Dr Farrugia. “In both of these cases, the rare diseases and MI, we need to query these big volumes of data in different ways, but always within a biological context of the specific condition. Informatics is being used in this way to gain a deeper understanding of biological processes. This cannot be done manually – nobody could manually sieve through all that data in a reasonable timeframe.”
High-throughput sequencing
This is due not only to the volume of the data being generated, but also the complexity of it. With high-throughput sequencing, the DNA of an individual is effectively chopped up into many small fragments, which are then ‘read’ using appropriate chemistry. The data from all the fragments then have to be put together again, from which point researchers can start to analyse the data. “Once the sequence has been put together again, we can start to draw comparisons and pull out a list of variants, a list of differences,” explains Dr Farrugia. With the project on MI, Dr Farrugia says researchers have a collection of approximately 1,000 individuals, including people who have had an MI and people who haven’t. “There may be millions of differences between a patient sample and the control. So you need to pull out these differences, and then find out the difference that is the cause of the condition,” she says. “For a complex condition like MI, it will very rarely be a single causative mutation. So we look at individual pathways instead of the entire genome, and we investigate whether patients have differences in that pathway when compared to the controls.”
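As a rough illustration of the pathway comparison Dr Farrugia describes, the sketch below counts variant carriers per pathway gene in cases and controls. Gene names, variant calls and counts are invented, and a real analysis would use established burden tests rather than raw counts.

```python
# Hypothetical sketch of a case/control pathway comparison; all genes,
# samples and variants below are invented for illustration.
from collections import Counter

PATHWAY_GENES = {"LDLR", "APOB", "PCSK9"}   # e.g. a lipid-handling pathway

# (sample_id, gene, variant_id) calls, as a pipeline might emit after
# alignment, variant calling and annotation.
CASE_CALLS = [("c1", "LDLR", "v1"), ("c2", "LDLR", "v1"), ("c3", "APOB", "v2")]
CONTROL_CALLS = [("k1", "LDLR", "v1")]

def carriers_per_gene(calls):
    """Count individuals carrying at least one variant in each pathway gene."""
    seen = set()                     # avoid counting a person twice per gene
    counts = Counter()
    for sample, gene, _variant in calls:
        if gene in PATHWAY_GENES and (sample, gene) not in seen:
            seen.add((sample, gene))
            counts[gene] += 1
    return counts

case_counts = carriers_per_gene(CASE_CALLS)
control_counts = carriers_per_gene(CONTROL_CALLS)
for gene in sorted(PATHWAY_GENES):
    # A real analysis would normalise by cohort size and correct for
    # multiple testing; this just prints raw carrier counts per group.
    print(gene, case_counts[gene], control_counts[gene])
```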
A number of factors are known to increase the risk of MI, including diabetes, high lipid profile and obesity, which is an important consideration in terms of research into the underlying causes of the condition. Researchers can look to identify the pathways which control these processes and then investigate them in greater depth; with the project looking at rare diseases, the risk factors are not always as clear. “A patient may present with the characteristics of a particular condition, but the differences between cases and control are not always clear. Sometimes you have to sift through the data, identify differences, and see whether it fits the phenotype, the presentation of the patient,” outlines Dr Farrugia. Bioinformatics techniques have an important role to play here, enabling researchers to analyse large volumes of data in great depth and under different models. “We can design different scenarios and run them all individually. A computer can run different scenarios and give us different possible outcomes after which we can identify which is the most likely,” says Dr Farrugia.
This is not what many of us would imagine as a core part of traditional medical training, underlining the importance of the project’s work and its commitment to sharing knowledge and expertise. The project is providing researchers with the opportunity both to learn how to analyse these datasets and to create novel analytical approaches, which Dr Farrugia hopes will help lay the foundations for future bioinformatics research at the University of Malta. “We see this as a starting point, to set up a core of bioinformatics expertise locally. We are looking into the possibility of setting up a number of post-graduate degree courses, so that we can continuously train more and more people in the bioinformatics field,” she says. This training needs to combine elements of different disciplines. “It’s important that people with an interest in bioinformatics also have an interest and a background in biology. If you understand biological processes, if you can understand the underlying biology, then you can design your algorithms, scripts or pipelines to address specific biological questions,” explains Dr Farrugia.
Training
The priority in the project is to equip researchers with the inter-disciplinary skill sets that they need to analyse data. This includes not only the data from the University of Malta, but also other, publicly available datasets. “The beauty of the way genetics research is going right now is that many of these datasets are publicly available. So you don’t always need to be tightly interlinked to a consortium to have access to data,” points out Dr Farrugia. Analysis of these datasets could help researchers learn more about the underlying causes of disease, marking another step towards the wider goal of personalised medicine. “Identifying the genetic basis of a disease will mean that a patient will get put onto the ideal treatment earlier, rather than there having to be a trial-and-error approach,” says Dr Farrugia. “There have also been a lot of studies and investment in identifying novel pathways within particular diseases, so that the pharmaceutical industry can then target those pathways. Sometimes that pathway can be targeted by medication which is already available.”
This is a fast-moving area, and with more data being generated and new techniques emerging, Dr Farrugia believes there is a long-term need for continued training. Along with training the next generation of scientists, Dr Farrugia also plans to do some functional work on their findings so far. “When you do genetic testing, generate the data and analyse it, you then need to confirm that your findings actually have the expected functional effect,” she outlines. “So the next stage of the project – besides the training in bioinformatics and the actual analysis – is to then pull out interesting candidates and test them in functional assays, which the project is equipping us to do.”
At a glance
Full Project Title
Interdisciplinary Training in High-Throughput Sequencing, Bioinformatics and Model Systems: Moving towards Clinical applications of Genomics (TrainMALTA)
Project Objectives
The TrainMALTA Twinning action aims to enable capacity building in Malta through training on best practices in bioinformatic analysis, integration of high-throughput sequencing (HTS) data and robust quantitative analytical methods within a biological and clinical context. The use of model systems – the zebrafish model and induced pluripotent stem cells (iPSCs) – to validate HTS findings is also part of this training action.
Project Funding
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 692041.
Project Partners
• The University of Malta
• The University of Cambridge
• The Katholieke Universiteit Leuven
Contact Details
Dr Rosienne Farrugia
Researcher & Senior Lecturer
Dept. of Applied Biomedical Science, Faculty of Health Sciences, University of Malta
T: +356 2340 1107
T: +356 2340 3281
E: rosienne.farrugia@um.edu.mt
E: trainmalta@um.edu.mt
W: http://www.um.edu.mt/project/trainmalta
http://www.timesofmalta.com/articles/view/20160731/life-features/My-genome-reading-the-entire-DNAsequence-of-an-individual.620594
Dr Rosienne Farrugia
Dr Rosienne Farrugia is a researcher and senior lecturer at the Department of Applied Biomedical Science, University of Malta. Her ongoing research interests focus on the application of high-throughput sequencing to elucidate the genetic basis of disease, both rare diseases and common, complex diseases prevalent in the Maltese population.
Plotting a path through crime data Vast amounts of data are available to the police in the fight against crime, yet it is not always easy to sift through it and identify key points. We spoke to Professor William Wong, Dr Chris Rooney and Dr Neesha Kodagoda about the VALCRI project’s work in developing an intelligent system to support police analysts and help them work more effectively The
police and law enforcement agencies today have access to large volumes of data in the fight against crime, yet sifting through it and identifying the key points relevant to a specific incident or tactical policing operation can be a challenge. The effective use of intelligence and historical data can help police analysts gain important insights when investigating crime, a topic that lies at the core of the VALCRI project. “We aim to develop a simple and intuitive interface to help police analysts drill down into what sorts of crimes are interesting with respect to a certain investigation. We can then provide additional support on the similarities between these crimes, and which of them show certain attributes,” explains Dr Chris Rooney, the project’s Technology Lead. This builds on the knowledge that criminals can be predictable in the traits they reveal and the way they operate,
their modus operandi (MO). “They may like to enter a property in a certain way for example, because they know that it works for them. So it’s about detecting those specific traits and identifying other crimes where those traits have also occurred,” says Dr Rooney.
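One simple way to realise this trait-matching idea is to score the overlap between the MO trait sets extracted from each crime. The sketch below is a hedged illustration, not VALCRI’s actual algorithm; the trait vocabulary and records are invented.

```python
# Illustrative trait-based crime matching in the spirit of the approach
# described above; the trait labels, crime IDs and threshold are invented.
def jaccard(a: set, b: set) -> float:
    """Overlap between two trait sets (1.0 = identical, 0.0 = disjoint)."""
    return len(a & b) / len(a | b) if a | b else 0.0

crimes = {
    "CR-001": {"entry:rear-window", "time:night", "target:electronics"},
    "CR-002": {"entry:rear-window", "time:night", "target:jewellery"},
    "CR-003": {"entry:front-door", "time:day", "target:electronics"},
}

def similar_crimes(query_id: str, threshold: float = 0.4):
    """Rank other crimes by shared MO traits with the query crime."""
    query = crimes[query_id]
    scores = [(other, jaccard(query, traits))
              for other, traits in crimes.items() if other != query_id]
    return sorted((s for s in scores if s[1] >= threshold),
                  key=lambda s: s[1], reverse=True)

# CR-002 shares 2 of 4 combined traits with CR-001, so it scores 0.5.
print(similar_crimes("CR-001"))
```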
Screenshot of a crime card that is used to collate possibly relevant background data from many different sources into a single view. This includes (i) details about the crime and its victims and known offenders, (ii) other crimes and incident logs that occurred in a similar time and space, (iii) the modus operandi with the analyst’s search terms highlighted, and (iv) relationships to similar crimes based on extracted traits.
Reasoning workspace
A lot of this kind of data has been amassed over the years, so a key challenge now is ensuring it can be navigated effectively and presented to police analysts in an accessible, easy-to-digest way. This is not always the case at the moment, as it can be difficult for analysts to locate and identify the information relevant to an investigation. “Data is often spread across different systems and databases, so it can be quite hard for analysts to piece the data together,” outlines Dr Rooney. The project aims to help address this issue by integrating a number of technologies into a coherent working environment for police analysts, called the Reasoning Workspace, which enables an analyst to see the key information on a single interface. “Currently analysts use lots of different systems to do different tasks. The idea of integrating this together is that you can see the data in many different forms, all within the same interface,” says Dr Rooney. “Similarly, we plan to run a repository over multiple data sets, so again you don’t need to have these different systems running for each
different dataset, you can bring them together into one index.” This will give analysts access to large amounts of detailed information on individual crimes, which is also regularly updated with new data that is analysed in real time within the system. If an analyst decides that particular excerpts of data are relevant to their task, they can then use the Reasoning Workspace to lay them out and manipulate them in a way that suits their own working processes. “The analyst is not necessarily forced into a particular workflow, they can use the space as they choose,” stresses Dr Rooney. By presenting data within an individual’s field of vision, and allowing them to remove and draw in other sets of information as required, the system enables analysts to explore it quickly and efficiently, while also drawing on their own associative capacities. “We have these interactive visualisations, which we call crime cards, which can be manipulated and moved around the canvas. It gives an analyst this free-form interaction, which allows them to put data where they think it’s important,” explains Dr Rooney. “When an analyst thinks that data excerpts are related, they can then use the space to lay them out together.”
The wider goal in this research is to help analysts reconstruct situations and build a deeper picture of crime in a particular area, combining their own local knowledge and experience with the available data to generate new insights and uncover possible leads. This encompasses not only operational policing and investigations of specific crimes, but also strategic and tactical analysis. “If an analyst is primarily involved in tactical analysis,
and they need to see an overview of the crime data over the past week or so for example, then they can build up a dashboard to do that,” says Dr Rooney. This type of data can be highly important in terms of strategic policing and understanding emerging patterns of crime. “Sometimes a tactical or operational analyst might be assigned a specific task, asking, for example: ‘There’s a lot of knife crime going on, what’s happening? Is there a gang or is it individuals? Can we find anything similar that’s going on?’” outlines Dr Neesha Kodagoda, the project’s Operations Manager. “With the VALCRI system, an analyst can do a keyword search and generate a map, which shows them where knife crimes have happened.” The crimes recorded within the system can also be clustered together according to different attributes, as determined by the analyst themselves, helping them draw connections that might otherwise have been missed and providing an evidence base to support their reasoning. Once the key points relevant to a specific crime have been identified, analysts can then look to build a deeper picture, to effectively tell a story about the analysis that’s been conducted; Dr Kodagoda says this is an important aspect of the project’s work. “One thing we are encouraging is early hypothesis generation, so that an analyst could test a hypothesis more rigorously,” she says. The police need to put in a lot of
work to test a hypothesis using current systems; Dr Kodagoda says the VALCRI system is designed to help analysts develop and test a hypothesis around a crime more rapidly. “If we can make it easier to test a hypothesis, that means there is a way to check on the possible explanations behind a crime and test alternative ideas before the police commit to a specific explanation, when they need a high degree of certainty,” she explains.
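A toy version of the keyword-search-and-cluster workflow described above might look like the following. The records and attribute names are invented for illustration and are not VALCRI’s schema.

```python
# Hypothetical sketch of the "search, then cluster by attribute" step an
# analyst performs; crime records and field names are invented.
from collections import defaultdict

CRIMES = [
    {"id": "CR-010", "mo": "knife used in robbery", "ward": "North"},
    {"id": "CR-011", "mo": "knife threat at shop",  "ward": "North"},
    {"id": "CR-012", "mo": "burglary via window",   "ward": "South"},
]

def search(keyword: str):
    """Free-text filter over the modus operandi field."""
    return [c for c in CRIMES if keyword in c["mo"]]

def cluster_by(records, attribute: str):
    """Group matching crimes by an analyst-chosen attribute (e.g. ward)."""
    groups = defaultdict(list)
    for record in records:
        groups[record[attribute]].append(record["id"])
    return dict(groups)

# e.g. "where has knife crime happened?" -> {'North': ['CR-010', 'CR-011']}
print(cluster_by(search("knife"), "ward"))
```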
Encouraging imagination
This encourages analysts to consider a wider range of perspectives during an investigation, which is a major driver behind the work of the project. Many historical failures in crime investigation and detection have later been attributed to a lack of imagination in analysis, an inability to use the available intelligence effectively, a shortcoming that the VALCRI system could help to address. “The problem in intelligence-led policing is not so much the availability of data – the problem is more about trying to figure out which of the dots are useful and relevant,” says Professor William Wong, the coordinator of the project. By helping analysts draw links with historical incidents, and enabling them to drill down to gain more detailed information on specific crimes,
the VALCRI system encourages analysts to be imaginative and think of possible outcomes that might otherwise have been overlooked. “We aim to give the police a tool that could potentially make their lives easier and help them do their job more efficiently,” says Dr Rooney. This must not come at the cost of compromising data security, however, and there are also ethical concerns around the use of this kind of information. Dr Rooney and his colleagues in the project are well aware of these kinds of issues. “We have a body of researchers within the project addressing SEPL (Security, Ethical, Privacy and Legal) issues, and their focus is on maintaining transparency in the system and mitigating the risks around misuse of data,” he explains. There is also a team within the project working on biases, aiming to mitigate the risk of any biases emerging and supporting objective policing. “The system is designed to support analysts in their decision-making, not to make decisions on their behalf,” stresses Dr Rooney.
The longer-term goal for the project is to apply the system in policing. The project consortium includes three end-user partners – West Midlands Police Force in the UK, and both the federal and local police forces in the Belgian city of Antwerp – each of which faces different challenges, and therefore requires different types of data. “The strategic team in the West Midlands are primarily interested in volume data – so things like general increases or reductions in crime and monitoring any emerging trends. For example, is there a significant increase in crime in September because students are arriving in a city? How can we mitigate that? Whereas the federal police in Belgium are a bit more focused on the operational side of things, so data like
witness statements,” outlines Dr Rooney. At the moment the VALCRI system is being tested by these end-users with anonymised data; Dr Kodagoda says the feedback so far has been positive. “Rather than looking at data in silos, they can now see this overall picture. So they have this ability to ask questions and gain answers, without having to move between different systems, leaving more time for analysis,” she explains. The system has not yet been integrated with each force’s data however, and discussions are ongoing over whether this can be achieved within the timeframe of the project. Looking beyond the funding term of the project, Professor Wong hopes to exploit the system’s commercial potential in future. “We have demonstrated the system to over 40 law enforcement agencies across Europe already, and there’s been a lot of interest in using it. We can’t sell it quite yet though,” he says. Additional funding will be required to further develop the system to a point where it can be applied in policing. “We will be at Technology Readiness Level 5 (TRL5) at the end of the project – we need to reach TRL9 for it to be considered a deployable system,” outlines Professor Wong.
Over the remaining period of the project the focus will shift towards design and implementation, as researchers move into the later stages of the development process. There is still scope for further changes and modifications however. “There are features that we’ve planned that we’ve not yet put into the system,” stresses Dr Rooney. Feedback from the end-user partners will play a large part in informing the ongoing development of the system and tailoring it to police needs. The challenges facing police in future are unlikely to be the same as those of today, so Professor Wong says that as the nature of crime evolves, so the systems used to analyse crime also need to evolve. “In future there will be a lot more information on crime available, there will be a lot more problems interfacing between crimes that occur in the physical world and crimes that occur in the cyber world. This convergence of the cyber and physical worlds will change the fundamental nature of crimes,” he predicts. “Police officers will then have to think very differently about how they bring data together in order to construct a case against a particular individual.”
At a glance
Full Project Title
Visual Analytics for sense-making in Criminal Intelligence Analysis (VALCRI)
Project Objectives
VALCRI is an Integrating Project whose goal is to develop an integrated, multi-function system prototype at TRL-5, validated in a user environment. VALCRI is developing a suite of integrated functions that are intended to facilitate human reasoning and analytic discourse. By being tightly coupled with semi-automated, human-mediated semantic knowledge extraction, this will enable VALCRI to respond to human analysts in both a proactive and reactive manner, and work with analysts as a human-technology team, responding to and anticipating needs as a Joint Cognitive System.
Project Funding
The research leading to the results reported in this work has received funding from the European Union Seventh Framework Programme to Project VALCRI under EC Grant Agreement N° FP7-IP-608142, awarded to Middlesex University and partners.
Project Partners
There are 17 project partners. Full details can be found at: http://valcri.org/partners-of-valcri/
Contact Details
Project Coordinator, Professor William Wong
Middlesex University London
The Burroughs, Hendon, London, NW4 4BT, United Kingdom
T: +44 (0)208 411 2684
E: w.wong@mdx.ac.uk
W: www.valcri.org
“Real world problems are multi-disciplinary. To cope with the variety of situations that may arise, real world solutions also need to be multi-disciplinary. In VALCRI, we have adopted a cognitive-behavioural-technological approach – cognitive engineering – where we combine technology in ways that enable humans to do what they are good at – reasoning and sense-making with ambiguity – and let the machine do the heavy lifting, such as taking one report and finding others similar to it.”
Professor William Wong
William Wong is Professor of Human-Computer Interaction at Middlesex University. His main research interest is in the representation and design of information and the interaction of user interfaces to support decision making in complex dynamic environments.
Finance for normal people Many private investors try to chase trends in a rising market, before they lose faith in their initial decision and sell at the bottom price, leading to financial losses. The Behavioral Finance for Retail Banking project aims to help everyday people make better, more informed financial decisions and make their money work harder, as Professor Thorsten Hens explains Observations of millions
of trades done by thousands of investors show the typical investment behavior given in Figure 1. Private investors try to chase trends, lose trust in their investment ideas once severe losses are incurred, and return to the markets when it is too late. Effectively they tend to buy when prices are high and sell when prices are low. Various studies have quantified the financial loss incurred by this sub-optimal behavior at around 4 to 7 percent p.a., which means that after 10 years the typical private investor has lost half of their money – if not more. For more than thirty years, researchers in behavioral finance have amassed evidence of investment mistakes similar to those shown in the roller coaster. Only recently have researchers started wondering how private investors could be helped to avoid those mistakes. If an investor is sufficiently rich, worth €1 million or more, then they can get help from a private bank, which will either invest for the client or assign a financial adviser to consult with them. In our previous work ‘Behavioral Finance for Private Banking’ we described how the results of behavioral finance are used to help wealthy private investors. The cost of these solutions is, however, at least €10,000 p.a., which is not attractive for less wealthy private investors.
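As a back-of-the-envelope check of that 4 to 7 percent figure (our own arithmetic, not a number taken from the studies): an investor who lags the market by 7 percent each year retains

$$(1 - 0.07)^{10} \approx 0.48$$

of the benchmark wealth after 10 years – roughly half – while a 4 percent annual drag leaves $(1 - 0.04)^{10} \approx 0.66$, a loss of about a third.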
Behavioral Finance for Retail Banking
The purpose of our research project ‘Behavioral Finance for Retail Banking’ is to help normal people make better financial decisions. Fortunately, in parallel, the so-called Fintech revolution has led to the development of new channels through which financial advisors can interact with clients. Using the internet and big data tools, the cost of good financial advice could be reduced to pennies. Good financial advice is based on four aspects of risk:
• Risk Need (How much risk do you need to take for achieving your goals?)
• Risk Ability (How much risk are you able to take given your financial situation?)
• Risk Awareness (Do you know which assets bear which risks?)
• Risk Tolerance (How much risk can you hold through along the ups-and-downs of the markets?)
Figure 1: Typical investment behavior along the ups-and-downs of the financial markets.
Based on laboratory experiments, large-scale online surveys and field studies, the main results of our research project are:
• For balancing the risk need with the risk ability it is important to apply an asset split. This means that saving assets should be reserved for very important needs and more risky assets can be held for needs that are more flexible.
• Investors’ perception of risk is mainly driven by the probability of them losing money. It can be considerably improved by experience sampling. This means that the investor can get accustomed to risk by visualizing it with a tool that draws return paths for differently risky assets.
• The risk tolerance of an investor can best be assessed by a simple gain-loss trade-off. Step by step, the investor will be asked whether a certain amount of risk is acceptable for a target gain.
Case study
Our findings are best illustrated by a case study. Mrs. and Mr. Fisher are in their mid-thirties, married and have two children, Amelie, 7 years old, and Ben, 5. The family income is €100,000 p.a. They live in a house worth €1 million, of which €200,000 has been paid and €800,000 needs to be borrowed as a mortgage. The annual living expenses are €80,000. Besides paying back the
mortgage, the family wants to finance a good education for their children (Oxford, Stanford or ETHZ) and increase the pension they would eventually get when they retire. Given the age of the kids, the education will be needed in about 10 years. The family sets the following priorities: 1. House, 2. Education, 3. Retirement. They can currently lock in a mortgage interest rate of 1 percent for 10 years, while education costs (including living expenses) run at between €20,000 and €50,000 p.a. for 3 years, depending on the choice of university. The family wants to plan ahead for 10 years. The residual income of €20,000 p.a. can be spent on the mortgage, an investment plan for the education of their children and a retirement plan. Suppose for simplicity that they live in a country like Switzerland where, for tax reasons, it does not make sense to reduce the mortgage by repayments other than the interest payments. Applying the asset split, €8,000 p.a. are fixed for the mortgage. But how much risk should the family take to finance the education and the retirement top-up?
Our research shows that this question is best assessed by a combination of experience sampling and loss tolerance. The idea of experience sampling is comparable to flight simulators for aircraft pilots. Potential financial outcomes such as investment returns are randomly drawn (simulated) interactively by the investor, and the distribution of possible outcomes builds up step by step on the screen, as Figure 2 shows. Investors can increase or reduce their risk, observe the changes in the distribution of results and interactively adjust the risk to achieve a distribution that they feel comfortable with. Experience sampling is the best method to increase risk awareness. It should be combined with a method to assess the risk tolerance, which is best achieved with the gain-loss method. This fixes a potential gain and asks for the maximal loss the investor is willing to accept for that gain. The latter should be done iteratively – starting from a loss that is as high as the gain, which is then reduced step by step. In each step the investor is asked whether the investment is now acceptable, and the iteration stops once it is. Combining the two methods, one can assess the risk awareness and the risk tolerance – and to be safe, one should base the advice on the aspect of risk in which the investor is more conservative.
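For readers who want the mechanics, here is a minimal sketch of the two assessment tools described above. The return distribution, step size and stopping rule are our own assumptions, not the project’s calibrated procedure.

```python
# Minimal sketch of experience sampling and the iterative gain-loss
# elicitation described above; parameters are illustrative assumptions.
import random

def sample_outcomes(mean_return: float, volatility: float, years: int = 10,
                    draws: int = 100, start: float = 10_000.0) -> list:
    """Experience sampling: simulate final wealth path by path, so the
    distribution of outcomes builds up draw by draw for the investor."""
    outcomes = []
    for _ in range(draws):
        wealth = start
        for _ in range(years):
            wealth *= 1.0 + random.gauss(mean_return, volatility)
        outcomes.append(wealth)
    return outcomes

def elicit_loss_tolerance(gain: float, ask, step: float = 0.1) -> float:
    """Gain-loss method: find the largest loss accepted for a fixed gain.

    ask(gain, loss) -> bool is the investor's answer in each round.
    """
    loss = gain                     # start from a loss as high as the gain
    while loss > 0:
        if ask(gain, loss):         # stop at the first acceptable trade-off
            return loss
        loss -= step * gain         # otherwise reduce the loss and re-ask
    return 0.0                      # the investor accepts no loss at all

# A simulated investor who tolerates losses up to 40% of the target gain:
print(elicit_loss_tolerance(1000.0, lambda g, l: l <= 0.4 * g))  # 400.0
print(len(sample_outcomes(0.05, 0.15)))  # 100 simulated final-wealth draws
```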
At a glance
Full Project Title
Behavioural Finance for Retail Banking
Project Objectives
The purpose of the project is to help normal people make better financial decisions. Fortunately, in parallel, the so-called Fintech revolution has developed new channels through which financial advisors can interact with clients. Using the internet and big data tools, the cost of good financial advice could be reduced to pennies.
Project Funding
The project is funded by the Swiss National Fund (SNF): http://p3.snf.ch/project-149934
Figure 2: Experience sampling.
Having done these risk assessments, the Fisher family decides to invest the remaining €12,000 p.a. into two savings contracts – one for the education of the children and one for their retirement. The former requires €8,000 p.a. and targets a total wealth of €100,000 (Oxford), which could however also end up as €60,000 (ETH) or €125,000 (MIT). For the retirement contract, they invest the remaining €4,000 p.a., which after 10 years would accumulate to €60,000 or €20,000. This investment is thus considerably more risky – but since Mrs and Mr Fisher still have 30 years before retirement, appropriate adjustments can be made after 10 years.
Project Participants
• Professor Thorsten Hens (main applicant), University of Zurich and NHH, Bergen, Norway
• Dr Kremena Bachmann, University of Zurich and University of Applied Sciences Zurich
• Professor Stefan Zeisberger, Radboud University Nijmegen and University of Zurich
• Ferdinand Langnickel, University of Zurich
Contact Details
Project Coordinator, Professor Thorsten Hens
Plattenstrasse 32, 8032 Zürich, Switzerland
T: +41 44 634 37 06
E: thorsten.hens@bf.uzh.ch
W: http://www.bf.uzh.ch/cms/de/hens.thorsten.html
Figure 3: Assessment of risk tolerance by the gain-loss trade-off.
We are pleased that our research has not only been published in academic journals, but has also been picked up by financial advisors, most notably by fintech companies in Germany and Switzerland. We therefore hope to have contributed to making leading-edge research useful for normal people, so that they can make better financial decisions and have a more rewarding life.
Professor Thorsten Hens
Professor Thorsten Hens is a Swiss Finance Institute Professor at the University of Zurich and Adjunct Professor of Finance at NHH in Bergen, Norway. He studied in Bonn and Paris and previously held professorships at Stanford and Bielefeld. His main research area is behavioural finance, which he also actively applies in practice.
Searching for patterns in labour data Vast amounts of information about job vacancies are available online, yet the skills and abilities of workers are not always closely matched to the demands of their role, which has a significant influence on employment patterns. We spoke to Philipp Kircher about his work in investigating unemployment and building a deeper understanding of labour markets The employment market
can seem daunting to job-seekers, with many thousands of people at any one point competing for the positions that best suit their skills and abilities. The individual motivations behind choosing a job and the route taken to get there may vary widely of course, a topic that Philipp Kircher has explored in the Labourheterogeneity project. “I want to know more about how people choose their occupation, or arrive at the occupation they end up in,” he outlines. The workforce is heterogeneous in nature, in that people have different skills, educational histories and training backgrounds, all factors which affect the employment opportunities available to them, while there’s also an element of serendipity involved in finding a job. “A person might be in a region where it’s easier to find certain jobs than others for example, there are many factors to consider,” continues Kircher. “I aim to understand more deeply how this works, to sift through various ideas and to identify which work better in terms of explaining patterns we see in the data.” This rests to a large degree on the wider economic conditions. The European labour market experienced a major upheaval following the financial crisis, and the impact continues to be felt. “There are indications that since the great recession there has been a greater mismatch in the labour market in specific occupations. That suggests people may still be searching for the wrong kinds of jobs – the economy has shifted and people are not reacting,” explains Kircher. The ‘wrong kinds of jobs’ could be those for which an applicant is over-qualified, for example, or that are in a less productive sector of the economy, raising a number of questions that Kircher and his colleagues in the project investigate. “Are people searching for jobs in occupations with few available jobs and low productivity, even though there are other occupations out there that are more promising?” he asks. “Once we’ve established that, we can think about why this is the case. Is it that
people just can’t do these other jobs? Or could they do them with training, but aren’t aware that these jobs are available?”
Job market
The wider aim in this research is to develop a deeper understanding of the root causes of unemployment and to help people find suitable jobs. Alongside the more theoretical research, Kircher and his colleagues are working on two sub-projects within Labourheterogeneity, the first of which builds on the idea that people still have quite a lot of learning to do when they choose an occupation. “We know that many people look for work in occupations where there are not many jobs on offer. We wondered – could we advise job-seekers about alternatives?” he outlines. A job search website has been developed to help people identify alternatives suited to their skills. “We don’t want to push people into unsuitable jobs, we want to provide relevant information,” explains Kircher. “We did a focus group with job-seekers, and while we found that they had
skills to do certain jobs, those jobs weren’t always available. There was a lack of understanding about where else those skills could be applied in the job market.” This is where analysis of large datasets like Understanding Society, a study following the lives and careers of UK residents, is relevant. The rate of occupational change today is quite high, and analysis of job changes can help researchers identify which employment opportunities would be well-suited to an individual’s skills. “We can ask – are there people who have worked as pipe-fitters? Where else have they worked? What did they do after they stopped working as pipe-fitters? Some of them may have gone on to be plumbers for example,” says Kircher. Researchers have analysed both UK and Danish datasets, gaining insights which can then inform the development of a new job search interface. “We have programmed our own job search interface. In the base version, you type in your own keywords, and the engine does a keyword
Table 1: Effect of intervention on interviews. Each column represents two separate regressions. All regressions include group fixed effects, period fixed effects, individual random effects and individual characteristics. Columns (1)-(3) are Poisson regression models where we report [exp(coefficient) − 1], which is the percentage effect. Standard errors clustered by individual in parentheses. * p < 0.10, ** p < 0.05, *** p < 0.01.
search, and it returns items that are related to your search,” continues Kircher. “If you want to search for a different job, you have to decide what keyword will get you closer to the desired job.” The new job-search interface has an additional feature, however, designed to help job-seekers identify alternative employment opportunities. Instead of a job-seeker simply putting in keywords, they are prompted to describe what occupation they’d like to work in, from which Kircher says other options can be identified. “Somebody might say: ‘I’m looking for pipe-fitting jobs’. Then we look at where pipe-fitters have previously found jobs, and we generate a list, saying for example that previously pipe-fitters found work as plumbers or technicians,” he outlines. This helps job-seekers broaden their horizons, away from their own specific area of expertise, to think about other areas of the employment market where their skills could be relevant. “What we are aiming for is to help people who know their own skill-set, then to move from there to identify what other occupations they might also be good at,” says Kircher. “It is fairly simple – job-seekers give us some information, and we tell them related information.” The website also offers other information, such as on skill transferability and on how tight competition for jobs in different occupations is. However, identifying where other people with similar skills have previously found jobs seems an easy path that other websites could follow. This system has been tested so far on 300 job-seekers in Edinburgh, using jobs gathered from Universal Jobmatch, a website run by the UK Government. One part of the group spent the entire 12-week trial period searching for jobs in the conventional way, while the other had the opportunity to use the new system during the second half of the trial; Kircher says the results so far are positive. “On average the latter group looked at a broader set of job openings and seemed to get more interviews. Those people who had very narrowly defined search criteria at the beginning of the period clearly seemed to benefit, they got a lot more interviews,” he says. The group was not large enough to draw wider conclusions about job-finding rates and general equilibrium effects, however, so Kircher plans to set up a bigger trial in the future, investigating the major issues around unemployment in greater depth. “If you divide too finely between young and old, poorly educated versus highly educated, etc.,
then you have only a few people in each category, and you can’t really draw wider inferences,” he explains. He believes that this work is relevant beyond academia, and that it should interest both governments and the private sector. In particular, governments have been pushing job-seekers to search hard for jobs, and if easy ways to provide information turn out to be useful, they might want to invest in them. The key to success is for this information to be correctly integrated with the jobs people are looking for, as most job-seekers do not want to read booklets before they get on with their search.
Figure 1: Screenshot of the tool (for preferred occupation ‘cleaner’).
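The suggestion step Kircher describes – look at where people with a given occupation went next, and rank the destinations – can be sketched in a few lines. The transition records below are invented for illustration; this is not the project’s actual engine.

```python
# Illustrative sketch of turning observed occupation switches in panel data
# into suggestions; the transition pairs are invented toy data.
from collections import Counter

# (occupation_before, occupation_after) pairs harvested from worker histories.
TRANSITIONS = [
    ("pipe-fitter", "plumber"), ("pipe-fitter", "plumber"),
    ("pipe-fitter", "technician"), ("cleaner", "care assistant"),
]

def suggest_alternatives(occupation: str, top_n: int = 3):
    """Rank destination occupations by how often workers moved there."""
    destinations = Counter(after for before, after in TRANSITIONS
                           if before == occupation)
    return destinations.most_common(top_n)

print(suggest_alternatives("pipe-fitter"))
# [('plumber', 2), ('technician', 1)]
```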
Occupational mobility
A second sub-project centred on analysing data from Denmark on how often people change occupations, and investigating the underlying reasons why. A number of theories are based on the idea that an individual in an occupation with low wages is more likely to want to change jobs, whereas if wages are high, then that person will want to stay. “One such theory would be where a person isn’t sure about their skills, so they pick an occupation at random then try to find out whether they’re sufficiently capable or not. In this model, the high earners in each occupation would stay and the low earners would leave,” outlines Kircher. Another theory also takes the wider economic climate into consideration. “For example, there may be less demand for plumbers in a recession. Which people leave the plumbing occupation and which stay?” continues Kircher. “Switching is very costly for people who have invested a lot of human capital in an occupation, as they’ve built up their expertise, so they will stay.” By contrast, in this context the people who have invested less human capital are more likely to leave, as they’ve spent less time, energy and money in building up their skill set. In this type of theory, the low earners would again leave an occupation and the high earners would stay. “The low earners would be the ones with low occupation-specific human capital,” explains Kircher.
Typically an individual’s wages rise the longer they stay in an occupation, and Kircher is now investigating why this is the case. “Is it because they become better at the job? Or is it because only the good people stay in an occupation and the bad ones leave?” he asks. “We’ve been looking at data from Denmark across many different occupations. We are trying to look at specific occupations and ask of individuals: ‘Within that occupation, how well are they paid? Are they at the bottom, the middle or at the top? Are they well paid in comparison to others?’ Then we look at the next year’s data – are they still in that occupation?” Many of the conventional theories would suggest that the high-earning people would be highly likely to remain in the same occupation, while the lower earners would leave. However, Kircher says his findings show a different picture. “What we see is that there is a high
At a glance
Full Project Title
Labor Heterogeneity in Search Markets (LABORHETEROGENEITY)
Project Objectives
The work laid out in this proposal aims to change our understanding of labor markets by viewing both the mobility and the frictions in the market as a consequence of long-term worker heterogeneity. Despite the advances in information technology, which substantially reduce the costs of sending information (job advertisements, job applications), extracting the relevant information about worker quality remains hard. Long-term differences in ability coupled with screening frictions are proposed as the main reason for mismatch, for mobility, and for the presence of unemployment.
Figure 2: Non-parametric plot of probability of switching occupation by worker’s percentile in the relevant wage distribution.
Project Funding
Funded under FP7-IDEAS-ERC, ERC-SG (ERC Starting Grant). EU contribution: EUR 317,034.67.
Project Partners
• European University Institute, Italy
• The University of Edinburgh, UK
Contact Details
Project Coordinator, Professor Philipp Kircher
European University Institute and University of Edinburgh
Department of Economics, European University Institute
Villa la Fonte, Via delle Fontanelle 18, 50014 San Domenico di Fiesole, Italy
T: +39 055 4685 429
E: philipp.kircher@eui.eu
W: http://homepages.econ.ed.ac.uk/~pkircher/
Professor Philipp Kircher
Philipp Kircher is a professor at the European University Institute in Florence and at the University of Edinburgh. He graduated from the University of Bonn and held faculty positions at the University of Pennsylvania, Oxford, and LSE. He is Chairman of the Review of Economic Studies and sits on the Executive Committee of the European Economic Association.
Figure 3: Non-parametric plot of direction of occupational mobility, conditional on switching occupation, by worker’s percentile in the relevant wage distribution before the switch.
probability that low earners will change occupations. That probability drops towards the middle earners, and then picks up again towards the higher earners,” he outlines. The lower earners have a clear incentive to change occupation, but it’s less clear why a significant proportion of high earners choose to move; Kircher and his colleagues are exploring this topic further. “People have to learn about their capabilities, to discover whether they’re good at an occupation or not,” he says. “The question then is whether you should stay in a high earning position, or move on to something where the labour market returns are even
higher. If you find that you are good at a specific job, then you may still leave it to find something better.” This research holds important implications for our understanding of employment and wages, and Kircher is keen to build strong links with policy-makers to help inform future policy. The introduction of new technologies is set to further disrupt the employment market, an issue that Kircher intends to explore in future. “I would like to think more deeply about structural change, the introduction of robots and artificial intelligence for example, which will lead to massive upheaval in the labour market,” he outlines.