EU Research Autumn 2024



NASA finds cosmic question mark in space

Human longevity in the 21st century

Researchers make robots controlled by mushrooms

The history of water on Mars


Editor’s Note

As a seasoned editor and journalist, Richard Forsyth has been reporting on numerous aspects of European scientific research for over 10 years. He has written for many titles including ERCIM’s publication, CSP Today, Sustainable Development magazine and eStrategies magazine, and remains a prolific contributor to the UK business press. He also works in public relations, helping businesses communicate their services effectively to industry and consumers.

Something interesting happened in my family recently: my wife swam the English Channel to France in a relay as part of a team. It wasn’t long before she was swimming through sea-tossed plastic and discarded rubbish, but the most astonishing part was that the rubbish was very old, certainly not freshly blown from nearby shores. A chocolate bar wrapper that spiralled past her goggles was from the early 1960s; it had, of course, been rolling around the sea as part of a ‘shoal’ of vintage human leftovers for decades, hidden to all but the fish and now a swimmer in an unlikely place. Plastic bags tumbled about the waves in higher numbers than the jellyfish.

Thankfully the EU is now taking this seriously, but the damage has been real; the pollution is a tide that’s not going away. We are having a brutal, lasting impact on nature, due in part to the way we have been using non-degradable and even toxic materials in, and for, our products.

There is one chilling prediction: that by 2050, the sea will contain more plastic than fish by weight. And it’s not just the seas that are our dumping ground; our rubbish is a modern weed everywhere you look. Further, we are now finding out about the far-reaching impact of hidden toxic chemicals like the per- and poly-fluoroalkyl substances known as PFAS, found prolifically in everyday goods like paint, waterproof clothing and kitchenware.

We need to change the way we think about products and materials.

The year 2050 has become something of a deadline for Europe. It’s the date the EU has fixed upon for achieving the kind of transformation that is hard to envisage when we are talking about a bloc made up of 27 countries.

Ursula von der Leyen, President of the European Commission, stated in 2019 that she envisions Europe becoming the first climate-neutral continent in the world by 2050, an ambition turned into practical steps through the European Green Deal. These steps include creating a fully circular economy. “It futureproofs our union,” she said. As part of this, there is a drive to banish the so-called ‘take, make and dispose’ tradition of manufacturing, and instead nurture a culture in industry that develops sustainable and toxic-free manufacturing, with environmentally friendly, degradable, recyclable and sustainable materials and processes.

This is very ambitious and, be in no doubt, a big practical and logistical challenge for many businesses. Nevertheless, it is the kind of thinking and leadership that’s necessary for a cleaner future, and for a Europe and a world that we would want to live in and that we can sustain.

Contents

4 Research News

EU Research takes a closer look at the latest news and technical breakthroughs from across the European research landscape.

10 NORTH

Researchers in the NORTH project are developing hybrid nanoparticles designed to combine temperature-based diagnostics with other functions such as drug delivery, as Professor Anna Kaczmarek explains.

12 SCANnTREAT

The SCANnTREAT project is combining a new imaging modality with X-ray activated photodynamic therapy, which could lead to more efficient treatment of cancer, as Dr Frédéric Lerouge explains.

15 Novel Immunotherapies for Neurodegenerative Diseases

What drives the progression of neurodegenerative diseases like ALS, Alzheimer’s, and MS?

Unravelling the mechanisms of neuroinflammation is at the heart of Professor Bob Harris’s research at the Karolinska Institutet.

18 EarlyLife

We spoke with Prof. Dr. Mathias Hornef about the ERC-funded EarlyLife project, which investigates how early-life infections impact the gut epithelium, microbiome, and immune system development in neonates.

20 FORGETDIABETES

Researchers in the FORGETDIABETES project are working to develop a fully implantable bionic invisible pancreas, which will relieve the burden associated with managing type-1 diabetes, as Professor Claudio Cobelli explains.

23 Longevity in the 21st Century

In the 21st century, scientists are exploring the mechanisms of aging, from telomere degradation to epigenetic clocks and senolytics, to extend both lifespan and healthspan and revolutionize how we age. By Nevena Nikolova

26 INTERLACE

We spoke to McKenna Davis and Benedict Bueb about the work of the transdisciplinary INTERLACE project, which has developed an array of tools, guidance materials, and further resources to build capacities and empower cities and other stakeholders to implement urban nature-based solutions.

28 A novel instrument for the accurate and direct measurement of saturation vapour pressures of low-volatile substances

Henrik Pedersen and Aurelien Dantan tell us about their work in developing new instruments for measuring saturation vapour pressure and its wider relevance to understanding the influence of aerosols on the climate system.

30 Carbon sequestration in Swedish cropland soils

We spoke to Professor Thomas Kätterer about his research into the impact of different management practices on soil carbon sequestration, and its wider relevance in helping meet emissions reduction targets.

32 IRMIDYN

Iron is an important electron acceptor in soil environments for organisms that respire organic carbon to gain energy. Professor Ruben Kretzschmar is investigating the influence of these processes on nutrient and contaminant behaviour.

35 Knowledge, Magic and Horse Medicine in Late Antiquity

The Hippiatrica brings together prescriptions for treating different ailments in horses; now Dr Elisabet Göransson is working to bring them to a wider audience.

36 At the Edge of Language: An Investigation into the Limits of Human Grammar

Researchers at the University of Aarhus are looking at how people process different types of sentence structures and probing the limits of grammar, as Professor Anne Mette Nyvad explains.

38 MarsFirstWater

The team behind the MarsFirstWater project are investigating the characteristics of water on early Mars, research which holds important implications for future space missions to the planet, as Professor Alberto Fairén explains.

42 OLD WOOD NEW LIGHT

We spoke to Professor Dan Hammarlund about his work as the coordinator of a research project dedicated to making dendrochronology data available to researchers.

44 MetalFuel

Professor Philip de Goey is exploring the potential of iron powders as an alternative carrier of energy in the MetalFuel project, part of the goal of moving towards a more sustainable society.

46 IPN RAP

The IPN-RAP project team is creating new knowledge on how to assess the properties of certain steels, ready for their use in artillery products, as Andreas Gaarder and Knut Erik Snilsberg explain.

48 Wide-angle neutron polarisation analysis to study energy and quantum materials

We spoke to Professor Elizabeth Blackburn about how she and her team are using neutron scattering techniques to investigate energy and quantum materials.

50 AUTHLIB: Neoauthoritarianisms in Europe and the liberal democratic response

The AUTHLIB project team is investigating the factors behind the growing appeal of illiberal forces, which can then inform the development of tools to defend liberal democracy, as Professor Zsolt Enyedi explains.

52 The establishment, growth and legacy of a settler colony

We spoke to Professor Erik Green about his research into how the settler economy in the Cape Colony evolved, its impact on indigenous people, and its long-term legacy.

54 MERCATOR

Professor Youssef Cassis and his colleagues in the Mercator project are looking at the way past financial crises are remembered and how memories of them influence the bankers of today.

EDITORIAL

Managing Editor Richard Forsyth info@euresearcher.com

Deputy Editor Patrick Truss patrick@euresearcher.com

Science Writer Nevena Nikolova nikolovan31@gmail.com

Science Writer Ruth Sullivan editor@euresearcher.com

PRODUCTION

Production Manager Jenny O’Neill jenny@euresearcher.com

Production Assistant Tim Smith info@euresearcher.com

Art Director Daniel Hall design@euresearcher.com

Design Manager David Patten design@euresearcher.com

Illustrator Martin Carr mary@twocatsintheyard.co.uk

PUBLISHING

Managing Director Edward Taberner ed@euresearcher.com

Scientific Director Dr Peter Taberner info@euresearcher.com

Office Manager Janis Beazley info@euresearcher.com

Finance Manager Adrian Hawthorne finance@euresearcher.com

Senior Account Manager Louise King louise@euresearcher.com

EU Research

Blazon Publishing and Media Ltd 131 Lydney Road, Bristol, BS10 5JR, United Kingdom

T: +44 (0)207 193 9820

F: +44 (0)117 9244 022

E: info@euresearcher.com www.euresearcher.com

© Blazon Publishing June 2010 ISSN 2752-4736

Cover image designed by Freepik

RESEARCH NEWS

The EU Research team take a look at current events in the scientific news

Iliana Ivanova is leaving the European Commission after just one year

The European Commissioner for Innovation, Research, Culture, Education and Youth is standing down for personal reasons.

The current Bulgarian EU Commissioner cannot stay on in the new European Commission for personal reasons. Iliana Ivanova, who has been responsible for Innovation, Research, Culture, Education and Youth in the first von der Leyen cabinet, announced on her Facebook profile: “I have been immensely honoured and privileged to serve as a commissioner in the last year of the term of the current European Commission!”

Ivanova thanked the Bulgarian authorities and citizens for the trust, the broad support and the opportunity to represent Bulgaria at the highest European level. “During this dynamic period, I have made every effort to discharge the duties assigned to me within a portfolio of key importance for the future, with a considerable budget. I am proud of everything that my colleagues and I have been able to achieve in such a short time,” she said. “I am convinced that Bulgaria will remain well-represented in the coming years in making the most important decisions for the European policies.”

Iliana Ivanova was appointed by the Council of the European Union as the new European Commissioner from Bulgaria on September 19, 2023. She replaced Mariya Gabriel for the remainder of the Commission’s term. Gabriel resigned from the post to become deputy prime minister and foreign minister of Bulgaria, with the prospect of taking over as prime minister under a rotation arrangement. Ivanova had been a member of the European Parliament from 2009 to 2012. From 2013 until being proposed by the Bulgarian government for European Commissioner in 2023, she represented Bulgaria at the European Court of Auditors in Luxembourg.

Research Commissioner Iliana Ivanova © European Union.

Scientists create a mushroom-controlled robot

Biohybrid robot with fungal electronics could help usher in an era of sustainable robotics.

Building a robot takes time, technical skill, the right materials -- and sometimes, a little fungus. In creating a pair of new robots, Cornell University researchers cultivated an unlikely component, one found on the forest floor: fungal mycelia. By harnessing mycelia’s innate electrical signals, the researchers discovered a new way of controlling “biohybrid” robots that can potentially react to their environment better than their purely synthetic counterparts.

The team’s paper was published in Science Robotics. The lead author is Anand Mishra, a research associate in the Organic Robotics Lab led by Rob Shepherd, professor of mechanical and aerospace engineering at Cornell University and the paper’s senior author. “This paper is the first of many that will use the fungal kingdom to provide environmental sensing and command signals to robots to improve their levels of autonomy,” Shepherd said. “By growing mycelium into the electronics of a robot, we were able to allow the biohybrid machine to sense and respond to the environment. In this case we used light as the input, but in the future it will be chemical. The potential for future robots could be to sense soil chemistry in row crops and decide when to add more fertilizer, for example, perhaps mitigating downstream effects of agriculture like harmful algal blooms.”

Mycelia are the underground vegetative part of mushrooms. They have the ability to sense chemical and biological signals and respond to multiple inputs. “Living systems respond to touch, they respond to light, they respond to heat, they respond to even some unknowns, like signals,” Mishra said. “If you wanted to build future robots, how can they work in an unexpected environment? We can leverage these living systems, and any unknown input comes in, the robot will respond to that.”

Two biohybrid robots were built: a soft robot shaped like a spider and a wheeled bot. The robots completed three experiments. In the first, the robots walked and rolled, respectively, as a response to the natural continuous spikes in the mycelia’s signal. Then the researchers stimulated the robots with ultraviolet light, which caused them to change their gaits, demonstrating mycelia’s ability to react to their environment. In the third scenario, the researchers were able to override the mycelia’s native signal entirely. Experts believe that this advance lays the groundwork for building sturdy, sustainable robots. In the future, these hardy, light-activated cyborgs could be deployed to harsh environments on Earth or even on missions outside our planet.
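The control scheme described above (reading the mycelium’s electrical spikes, converting the spike rate into gait commands, and letting light stimulation override the natural signal) can be sketched in a few lines of code. The following is an illustrative toy loop, not the Cornell team’s actual software; the voltage levels, thresholds and actuation interface are all invented for illustration.

```python
# Toy sketch of spike-rate-based gait control, loosely following the
# experiments described above. All values here are invented placeholders.
import random

def read_mycelium_voltage() -> float:
    """Stand-in for an electrode reading from the fungal culture (mV)."""
    baseline = 0.1
    spike = 5.0 if random.random() < 0.05 else 0.0  # sporadic natural spikes
    return baseline + spike

def gait_speed(spike_rate: float, uv_light_on: bool) -> float:
    """Map spike rate to a stride/roll frequency (Hz). UV stimulation
    changes the gait, mirroring the second experiment."""
    if uv_light_on:
        return 2.0                      # altered gait under UV stimulation
    return 0.2 + 0.5 * spike_rate       # natural spikes drive locomotion

# Main loop: count threshold crossings over a 10-sample window, then actuate.
THRESHOLD, WINDOW = 1.0, 10
samples = []
for _ in range(100):
    samples.append(read_mycelium_voltage())
    if len(samples) == WINDOW:
        rate = sum(v > THRESHOLD for v in samples) / WINDOW
        print(f"commanded gait: {gait_speed(rate, uv_light_on=False):.2f} Hz")
        samples.clear()
```

In the third experiment the researchers overrode the fungal signal entirely; in this toy sketch that would correspond to replacing read_mycelium_voltage() with an operator-supplied input.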

Research sector gives Draghi report broad initial welcome

The research and innovation sector has broadly embraced the long-awaited report on EU competitiveness from former Italian prime minister Mario Draghi.

Research and innovation have never been more present in mainstream policy debates: this week former Italian prime minister Mario Draghi placed them at the heart of his recommendations for boosting EU competitiveness. That followed an earlier report by another former Italian prime minister, Enrico Letta, on the future of the single market, in which he called for a ‘fifth freedom’ for research and innovation.

The Draghi report is expected to have a major role in shaping EU policy over the next five years, and the research community is naturally delighted by its call for more public research and innovation funding, including €200 billion for the successor to Horizon Europe to run from 2028-2034, Framework Programme 10 (FP10). Draghi also calls for better coordination of public research and innovation expenditure across EU member states, via a ‘Research and Innovation Union’ and a ‘European Research and Innovation Action Plan.’

This more Europe-centric approach recommended by the report would prioritise collective EU-wide ambitions, rather than national value for money. Mattias Björnmalm, secretary-general of the university association CESAER, hopes this message will be picked up by the expert group report on FP10, due next month. “This outdated mindset of seeing the EU funding programme as primarily a means to redistribute member states’ funds limits Europe’s potential to become a global leader in science and technology,” he said.

Another regularly floated ambition that made it into the Draghi report is the creation of a new funding instrument modelled on the US advanced research projects agencies. It is suggested this would be done by reforming the European Innovation Council’s (EIC) Pathfinder instrument, which supports deep tech projects. Björnmalm notes strong similarities with the European Competitiveness Research Council proposed by CESAER.

The bulk of the full 328-page report is dedicated to detailed policy recommendations for specific sectors including clean technologies, semiconductors, defence, and pharmaceuticals, and addresses issues industry has been complaining about for years, such as burdensome regulations and access to capital. It also includes recommendations to support the development of AI models in several strategic industries.

Cecilia Bonefeld-Dahl, director general of the trade association DigitalEurope, said the report provides “a long list of positive ideas” including simplifying AI regulation, better commercialisation of research, and integrating technology into strength industries. Nathalie Moll, director general of the European Federation of Pharmaceutical Industries and Associations, said the publication “shows ambition at the highest level” to address the issues facing the pharma industry, with Europe struggling to attract R&D investment.

NASA’s James Webb Space Telescope Captures a Cosmic Question Mark

Astronomers have found clues in the form of a cosmic question mark, the result of a rare alignment across light-years of space.

NASA’s James Webb Space Telescope has captured a stunning image of a cosmic ‘question mark’ formed by distant galaxies, offering astronomers a unique glimpse into the universe’s star-forming past. This rare alignment, caused by a phenomenon known as gravitational lensing, provides valuable insights into galaxy formation and evolution 7 billion years ago. The image, taken by Webb’s Near-Infrared Imager and Slitless Spectrograph (NIRISS), reveals a pair of interacting galaxies magnified and distorted by the massive galaxy cluster MACS-J0417.5-1154. This cluster acts as a cosmic magnifying glass, allowing astronomers to see enhanced details of much more distant galaxies behind it.

Gravitational lensing occurs when a massive object, such as a galaxy cluster, warps the fabric of space-time around it. This warping can magnify and distort the light from distant objects behind the cluster, creating optical illusions in space. In this case, the lensing effect has produced a rare configuration called a hyperbolic umbilic gravitational lens. This unusual alignment results in five images of the galaxy pair, four of which trace the top of the question mark shape. The dot of the question mark is formed by an unrelated galaxy that happens to be in the right place from our perspective.

While the galaxy cluster has been observed before by NASA’s Hubble Space Telescope, the dusty red galaxy that forms part of the question-mark shape only became visible with Webb. This is due to Webb’s ability to detect infrared light, which can pass through cosmic dust that blocks the shorter wavelengths of light detected by Hubble. This capability allows Webb to peer further back in time and observe galaxies as they appeared billions of years ago, during the universe’s peak period of star formation. The galaxies in the question mark are seen as they were 7 billion years ago, providing a window into what our own Milky Way might have looked like during its “teenage years.”

The research team used both Hubble’s ultraviolet and Webb’s infrared data to study star formation within these distant galaxies.

Vicente Estrada-Carpenter of Saint Mary’s University explains the significance of their findings: “Both galaxies in the Question Mark Pair show active star formation in several compact regions, likely a result of gas from the two galaxies colliding. However, neither galaxy’s shape appears too disrupted, so we are probably seeing the beginning of their interaction with each other.” This observation provides valuable information about how galaxies evolve through interactions and mergers, a process that has shaped the universe we see today.

Marcin Sawicki, one of the lead researchers, emphasizes the broader implications of this discovery: “These galaxies, seen billions of years ago when star formation was at its peak, are similar to the mass that the Milky Way galaxy would have been at that time. Webb is allowing us to study what the teenage years of our own galaxy would have been like.” As Webb continues to reveal new wonders of the cosmos, each discovery brings us closer to understanding our place in the vast tapestry of the universe.

Why some people itch more from mosquito bites and allergens

Researchers identify mechanism underlying allergic itching, and show it can be blocked.

Why do some people feel itchy after a mosquito bite or exposure to an allergen like dust or pollen, while others do not? A new study has pinpointed the reason for these differences, finding the pathway by which immune and nerve cells interact and lead to itching. The researchers, led by allergy and immunology specialists at Massachusetts General Hospital, a founding member of the Mass General Brigham healthcare system, then blocked this pathway in preclinical studies, suggesting a new treatment approach for allergies. The findings are published in Nature.

“Our research provides one explanation for why, in a world full of allergens, one person may be more likely to develop an allergic response than another,” said senior and corresponding author Caroline Sokol, MD, PhD, an attending physician in the Allergy and Clinical Immunology Unit at MGH, and assistant professor of medicine at Harvard Medical School. “By establishing a pathway that controls allergen responsiveness, we have identified a new cellular and molecular circuit that can be targeted to treat and prevent allergic responses including itching. Our preclinical data suggests this may be a translatable approach for humans.”

When it comes to detecting bacteria and viruses, the immune system takes the lead, spotting pathogens and initiating long-lived immune responses against them. However, for allergens, the immune system takes a backseat to the sensory nervous system. In people who haven’t been exposed to allergens before, the sensory nerves react directly to these allergens, causing itchiness and triggering local immune cells to start an allergic reaction. In those with chronic allergies, the immune system can affect these sensory nerves, leading to persistent itchiness.

Previous research from Sokol and colleagues showed that the skin’s sensory nervous system -- specifically the neurons that lead to itch -- directly detect allergens with protease activity, an enzyme-driven process shared by many allergens. When thinking about why some people are more likely to develop allergies and chronic itch symptoms than others, the researchers hypothesized that innate immune cells might be able to establish a “threshold” in sensory neurons for allergen reactivity, and that the activity of these cells might define which people are more likely to develop allergies.

The researchers performed different cellular analyses and genetic sequencing to try to identify the mechanisms involved. They found that a poorly understood immune cell type in the skin, which they called GD3 cells, produces a molecule called IL-3 in response to environmental triggers, including the microbes that normally live on the skin. IL-3 acts directly on a subset of itch-inducing sensory neurons to prime their responsiveness to even low levels of protease allergens from common sources like house dust mites, environmental molds and mosquitoes. IL-3 makes sensory nerves more reactive to allergens by priming them, without directly causing itchiness. The researchers found that this process involves a signaling pathway that boosts the production of certain molecules, leading to the start of an allergic reaction.

Then, they performed additional experiments in mouse models and found that removing IL-3 or GD3 cells, or blocking IL-3’s downstream signaling pathways, made the mice resistant to the itch and immune-activating ability of allergens. Since the type of immune cells in the mouse model is similar to that in humans, the authors conclude these findings may explain the pathway’s role in human allergies. “Our data suggest that this pathway is also present in humans, which raises the possibility that by targeting the IL-3-mediated signaling pathway, we can generate novel therapeutics for preventing an allergy,” said Sokol. “Even more importantly, if we can determine the specific factors that activate GD3 cells and create this IL-3-mediated circuit, we might be able to intervene in those factors and not only understand allergic sensitization but prevent it.”

Study finds limits to storing CO2 underground to combat climate change

New research has found limits to how quickly we can scale up technology to store gigatons of carbon dioxide under the Earth’s surface.

Underground CO2 storage, a key component of carbon capture and storage (CCS) technology, is often viewed as a vital solution to combat climate change. In light of the urgency to address global warming, many potential methods of carbon capture have been meticulously investigated. While the concept of storing CO2 underground is promising, recent research from Imperial College London highlights significant limitations and challenges associated with scaling up this technology.

Current international scenarios for limiting global warming to less than 1.5 degrees Celsius by 2100 rely heavily on technologies that can remove CO2 from the Earth’s atmosphere at unprecedented rates. These strategies aim to remove between 1 and 30 gigatons of CO2 annually by 2050. However, the estimates for how quickly these technologies can be deployed have been largely speculative. The Imperial study indicates that existing projections are not likely feasible at the current pace of development.

“There are many factors at play in these projections, including the speed at which reservoirs can be filled as well as other geological, geographical, economic, technological, and political issues,” said Yuting Zhang, lead author from Imperial’s Department of Earth Science and Engineering. “However, more accurate models like the ones we have developed will help us understand how uncertainty in storage capacity, variations in institutional capacity across regions, and limitations to development might affect climate plans and targets set by policymakers.” The experts found that storing up to 16 gigatons of CO2 underground annually by 2050 is possible, but would require a substantial increase in storage capacity and scaling that current investment and development levels do not support.

“Although storing between six to 16 gigatons of CO2 per year to tackle climate change is technically possible, these high projections are much more uncertain than lower ones,” noted co-author Dr. Samuel Krevor, also from Imperial’s Department of Earth Science and Engineering. “This is because there are no existing plans from governments or international agreements to support such a large-scale effort. However, five gigatons of carbon going into the ground is still a major contribution to climate change mitigation.”

The team’s analysis suggests a more realistic global benchmark for underground CO2 storage might be in the range of 5-6 gigatons per year by 2050. This projection aligns with growth patterns observed in existing industries, including mining and renewable energy. By applying these historical growth patterns to CO2 storage, the researchers have developed a model that offers a more practical and reliable method for predicting how quickly carbon storage technologies can be scaled up. “Our study is the first to apply growth patterns from established industries to CO2 storage,” explained Dr. Krevor. “By using historical data and trends from other sectors, our new model provides a realistic and practical approach for setting attainable targets for carbon storage, helping policymakers to make informed decisions.”
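To make the modelling idea concrete, here is a minimal sketch assuming a logistic growth curve, the kind of trajectory typically seen when industries such as mining or renewables scale up. The function and every parameter value below are invented for illustration; they are not the Imperial study’s fitted model.

```python
# Minimal sketch: projecting annual CO2 storage under an assumed logistic
# growth curve. Parameters are illustrative only, not the study's values.
import math

def storage_rate_gt(year: float, ceiling_gt: float, growth: float, t_mid: float) -> float:
    """Annual storage (Gt CO2/yr): a logistic curve with a capacity ceiling,
    an early exponential growth rate, and a midpoint year t_mid at which
    half the ceiling is reached."""
    return ceiling_gt / (1.0 + math.exp(-growth * (year - t_mid)))

# Illustrative scenario: 6 Gt/yr ceiling, ~20%/yr early growth, midpoint 2040.
for year in (2025, 2030, 2040, 2050):
    print(year, round(storage_rate_gt(year, ceiling_gt=6.0, growth=0.2, t_mid=2040.0), 2))
```

Under these made-up parameters the curve approaches roughly 5.3 Gt/yr by 2050, in the same ballpark as the 5-6 gigatonne benchmark quoted above; the study’s contribution is to constrain such growth parameters using historical data from analogous industries.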

The research, funded by the Engineering & Physical Sciences Research Council (EPSRC) and the Royal Academy of Engineering, highlights the importance of setting realistic goals in the global effort to combat climate change. While underground carbon storage presents a promising strategy, understanding its potential and limitations is crucial for making the best use of this technology.

New species of Antarctic dragonfish highlights its threatened ecosystem
The species highlights both the unknown biodiversity and fragile state of the Antarctic ecosystem.

A research team at the Virginia Institute of Marine Science at William & Mary has uncovered a new species of Antarctic fish, which could reshape how scientists view biodiversity in the Southern Ocean. The newly discovered species Akarotaxis gouldae, also known as the banded dragonfish, was identified during an investigation of museum-archived larvae samples. While the research team examined the samples, they noticed key differences in some fish, including two distinct dark vertical bands of pigment on their bodies, a shorter snout and jaw, and a longer body depth. These observations led them to suspect the presence of a new species. To confirm their hypothesis, the researchers used mitochondrial DNA analysis and constructed a phylogenetic tree to illustrate the relationship between A. gouldae and other Antarctic dragonfish species.

The banded dragonfish is limited to a small area along the Western Antarctic Peninsula. This region is also targeted by the Antarctic krill fishery, raising concerns about the potential impact of human activities on this rare species. Because the fish produces very few offspring, it is particularly vulnerable to environmental changes.

“The discovery of this species was made possible by a combination of genetic analysis and examination of museum specimens from across the world, demonstrating the importance of both approaches to the determination of new species and of the value of specimen collections. The work also highlights how little we know of the biodiversity of Antarctica in general and the west Antarctic peninsula in particular,” says NSF Program Director Will Ambrose.

Image by Andrew Corso

Over a fifth of the world’s plastic waste is either burned or littered

A new study shows 57 million tons of plastic pollution is pumped out yearly, and most comes from the Global South.

A new study from Leeds University shines a light on the enormous scale of uncollected rubbish and open burning of plastic waste in the first ever global plastics pollution inventory. Researchers used AI to model waste management in more than 50,000 municipalities around the world. This model allowed the team to predict how much waste was generated globally and what happens to it. Their study, published in the journal Nature, calculated that a staggering 52 million tonnes of plastic products entered the environment in 2020 -- which, laid out in a line, would stretch around the world more than 1,500 times.

It also revealed that more than two thirds of the planet’s plastic pollution comes from uncollected rubbish with almost 1.2 billion people -- 15% of the global population -- living without access to waste collection services. The findings further show that in 2020 roughly 30 million tonnes of plastics -- amounting to 57% of all plastic pollution -- was burned without any environmental controls in place, in homes, on streets and in dumpsites. Burning plastic comes with ‘substantial’ threats to human health, including neurodevelopmental, reproductive and birth defects. The researchers also identified new plastic pollution hotspots, revealing India as the biggest contributor -- rather than China as has been suggested in previous models -- followed by Nigeria and Indonesia.

Dr Costas Velis, academic on Resource Efficiency Systems from the School of Civil Engineering at Leeds, led the research. He said: “We need to start focusing much, much more on tackling open burning and uncollected waste before more lives are needlessly impacted by plastic pollution. It cannot be ‘out of sight, out of mind’.” Each year, more than 400 million tonnes of plastic is produced. Many plastic products are single-use, hard to recycle, and can stay in the environment for decades or centuries, often being fragmented into smaller items. Some plastics contain potentially harmful chemical additives which could pose a threat to human health, particularly if they are burned in the open.

According to the paper’s estimated global data for 2020, the worst polluting countries were: India: 9.3 million tonnes -- around a fifth of the total amount; Nigeria: 3.5 million tonnes; and Indonesia: 3.4 million tonnes. China, previously reported to be the worst, is now ranked fourth, with 2.8 million tonnes, as a result of improvements in collecting and processing waste over recent years. The UK was ranked 135th, with around 4,000 tonnes per year, with littering the biggest source. Low- and middle-income countries have much lower plastic waste generation, but a large proportion of it is either uncollected or disposed of in dumpsites. India emerges as the largest contributor because it has a large population, roughly 1.4 billion, and much of its waste isn’t collected.

The contrast between plastic waste emissions from the Global North and the Global South is stark. Despite high plastic consumption, macroplastic pollution -- pollution from plastic objects larger than 5 millimeters -- is a comparatively small issue in the Global North as waste management systems function comprehensively. There, littering is the main cause of macroplastic pollution.

Researchers say this first ever global inventory of plastic pollution provides a baseline -- comparable to those for climate change emissions -- that can be used by policymakers to tackle this looming environmental disaster. They want their work to help policymakers come up with waste management, resource recovery and wider circular economy plans, and want to see a new, ambitious and legally binding, global ‘Plastics Treaty’ aimed at tackling the sources of plastic pollution.

Dr Velis said: “This is an urgent global human health issue -- an ongoing crisis: people whose waste is not collected have no option but to dump or burn it: setting the plastics on fire may seem to make them ‘disappear’, but in fact the open burning of plastic waste can lead to substantial human health damage including neurodevelopmental, reproductive and birth defects; and much wider environmental pollution dispersion.”

New light on temperature diagnostics

Theranostic techniques enable clinicians to diagnose and treat a condition at the same time, while they can also provide rapid feedback on the effectiveness of treatment. Researchers in the NORTH project are developing hybrid nanoparticles designed to combine temperature-based diagnostics with other functions such as drug delivery, as Professor Anna Kaczmarek explains.

The idea of diagnosing medical conditions on the basis of temperature variations is fairly well established, and a variety of techniques are currently used in medicine to monitor temperatures within the body, such as thermocouples. It is known for example that cancer cells have a slightly elevated temperature in comparison to healthy cells, yet the currently available monitoring techniques are highly invasive. “Putting a thermocouple into the human body involves very invasive techniques,” explains Anna Kaczmarek, Professor in the Department of Chemistry at Ghent University in Belgium. “The research community has been looking for improved, non-invasive ways of measuring temperature at the nanoscale, and luminescence thermometry has emerged as a potential route to achieving this.”

NORTH project

This is a topic Professor Kaczmarek is exploring as Principal Investigator of the ERC-backed NORTH project, in which she and her team are working to develop new multi-functional hybrid nanoparticles. This research centres on periodic mesoporous organosilicas (PMOs), a group of very ordered porous materials which Professor Kaczmarek says can be used in various different ways. “There are a lot of possibilities with PMOs to create materials which are biodegradable, biocompatible, and also highly porous,” she outlines. Researchers in the project are now looking to combine these PMOs with a group of chemical elements called lanthanides, some of which are well-suited to luminescence thermometry. “They’re not affected by the environment in which they’re being used, which is beneficial for luminescence thermometry,” explains Professor Kaczmarek.

The team behind the NORTH project is investigating these lanthanides with respect to their potential in luminescence thermometry, as well as in other functions. There are 15 lanthanides, many of which have luminescence properties, and researchers have identified several which are particularly interesting for biological applications. “We’ve narrowed it down and are working with a few of these lanthanides,” says Professor Kaczmarek. Researchers typically use combinations of these elements, as this approach tends to be more sensitive than using a single lanthanide. “We work for example with combinations of holmium and ytterbium. We’re also exploring thulium-based systems, such as thulium combined with ytterbium, or also with neodymium or erbium,” outlines Professor Kaczmarek.

Scanning Transmission Electron Microscopy image with high-angle annular dark-field detector (HAADF-STEM) of hybrid PMO-inorganic thermometers developed in the NORTH project. The inorganic inner cores generate the thermometry properties, whereas the cavities and porous nature of the PMO walls allow them to be loaded with an anti-cancer drug.

A lot of thermometers are currently made using lanthanides, yet these are purely inorganic particles that can’t really be loaded with drugs or used to produce an effective photodynamic therapy (PDT) agent. This is where the PMOs come into play. “The PMO not only makes the hybrid material more biocompatible, but it also adds porosity. You can then load a PDT agent or an anti-cancer drug such as doxorubicin for example,” says Professor Kaczmarek. The project team is now looking to develop effective nanoparticles using these materials; Professor Kaczmarek says size and shape are important considerations. “Larger particles can be toxic, while we also don’t want the nanoparticles to be too small, as that causes some retention problems. Around 100 nanometres would be ideal,” she continues. “We’re looking to develop spherical particles, as we also know that rod shapes can be toxic.”

The idea is that the particle would be activated with light once it reaches a specific location in the body, such as the site of a tumour for example. This would then allow researchers to monitor temperatures, and potentially release an anti-cancer drug. “We aim to use two wavelengths of light simultaneously, one of which could be used to activate the material to show a temperature read-out,” explains Professor Kaczmarek. A lot of progress has been made in this area over the course of the project, with Professor Kaczmarek and her colleagues demonstrating that their hybrid particles can combine thermometry and drug delivery. “We see that there’s some signal interference, which is related to the presence of spectral overlaps. But we also see that we can still use the nanoparticles as a thermometer and as a drug release agent simultaneously without any issues,” she continues.

Researchers are also investigating the possibility of combining thermometry with PDT, work which is still in its early stages. This is one of the main topics currently on the project’s agenda, alongside research into achieving on-demand drug delivery. “We don’t want a drug to be slowly released while the theranostic material is on the way to a specific site like a tumour, we want the drug to be released when it gets there. We’re still trying to optimise that and to implement PDT,” outlines Professor Kaczmarek. The ultimate aim is to use these nanoparticles to diagnose and treat human patients, yet Professor Kaczmarek says there is still much more work to do before they can be applied clinically. “There are still concerns around the biocompatibility and performance of these materials,” she acknowledges. “There is still a long road ahead before any thermometer materials reach clinics.”

Degradability and feedback during treatment

These are issues Professor Kaczmarek plans to address in a follow-up ERC-funded proof-of-concept project called LUMITOOLS, building on the progress that has been made in NORTH. Through this project Professor Kaczmarek aims to overcome the main concerns around these materials, and to move them closer to practical application. “How can we convince the medical community to start using these materials? This in large part comes down to degradability,” she says. If these concerns can be addressed, then these kinds of multi-functional materials could prove extremely useful for the medical community. “There are many potential applications, and not just in diagnostics - thermometry can also be used as a feedback tool,” continues Professor Kaczmarek. “For example, if you want to combine thermometry and photothermal therapy to treat a tumour then you need feedback.”

“The research community has been looking for improved, non-invasive ways of measuring temperature at the nanoscale, and luminescence thermometry has emerged as a potential route to achieving this.”

The cancerous tissue would need to be heated to a high temperature, but it would also be extremely important to avoid over-heating nearby tissue and causing new problems. These nanoparticles could provide rapid feedback to a clinician in these kinds of circumstances, helping guide treatment and tailor it to the needs of individual patients. “The idea is to provide feedback during therapy,” says Professor Kaczmarek. The project team is now working towards this objective, with researchers testing combinations and improving the nanoparticles, with the long-term goal of bringing them to practical application. “We’ve developed some very interesting materials and thermometers in the project,” continues Professor Kaczmarek.

Luminescence thermometry map of newly developed near-infrared thermometers. The blue line represents 20°C, the red line 60°C. The ratio of the two peaks is used to build a calibration curve.
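The calibration-curve idea in the caption above can be made concrete with a short sketch. A common approach in lanthanide luminescence thermometry (not necessarily the exact model used in NORTH) treats the ratio R of two emission peaks from thermally coupled levels as Boltzmann-distributed, R(T) = C·exp(-dE/kB·T); once C and dE have been calibrated at known temperatures, a measured ratio can be inverted to give a temperature. The constants below are invented for illustration.

```python
# Minimal sketch of a ratiometric luminescence-thermometry calibration,
# assuming a Boltzmann-type relation between peak ratio and temperature.
# The constants C and dE below are invented; in practice they are fitted
# to ratios measured at known temperatures (e.g. the 20 degC / 60 degC spectra).
import math

KB = 8.617e-5  # Boltzmann constant, eV/K

def peak_ratio(temp_k: float, C: float, dE_ev: float) -> float:
    """Predicted intensity ratio of the two emission peaks at temp_k."""
    return C * math.exp(-dE_ev / (KB * temp_k))

def temperature_k(ratio: float, C: float, dE_ev: float) -> float:
    """Invert the calibration curve: temperature from a measured ratio."""
    return -dE_ev / (KB * math.log(ratio / C))

C, dE = 8.0, 0.08                         # illustrative calibration constants
r = peak_ratio(310.0, C, dE)              # simulate a reading near body temperature
print(round(temperature_k(r, C, dE), 1))  # recovers 310.0 K (about 37 degC)
```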

NORTH

NanOthermometeRs for THeranostics

Project Objectives

Developing multifunctional nanoplatforms which combine temperature sensing (as a diagnostic tool) with the therapy of disease (drug delivery/photodynamic therapy/photothermal therapy), as proposed in the ERC StG project NORTH, can change the way that certain diseases are treated.

Project Funding

This project is funded by the European Research Council ERC Starting Grant NanOthermometeRs for THeranostics (NORTH) under grant agreement No 945945.

Follow up project: European Research Council ERC Proof of Concept project LUMInescence ThermOmeters fOr cLinicS (LUMITOOLS) under grant agreement No 101137651.

Contact Details

Project Coordinator,

Prof. dr. Anna M. Kaczmarek

Ghent University

Department of Chemistry

Krijgslaan 281-S3, 9000 Ghent

Belgium

T: +32 9 264 48 71

E: anna.kaczmarek@ugent.be

W: https://nanosensing.ugent.be

Prof. Dr Anna M. Kaczmarek studied Chemistry at Adam Mickiewicz University in Poland and acquired a PhD at Ghent University, Belgium in the field of lanthanide materials. After several post-doctoral positions in Belgium and research visits abroad she created the NanoSensing Group (Ghent University), which focuses on developing (hybrid) luminescent nanothermometers for biomedical applications. Privately, she is mom to a 6-month-old daughter, Marie.

Professor Anna M. Kaczmarek
NanoSensing Group photograph.

New imaging modality for improved cancer treatment

Cancer is one of the biggest causes of death in Europe, and researchers continue to work on improving diagnosis and treatment. We spoke to Dr Frédéric Lerouge about the work of the SCANnTREAT project in combining a new imaging modality with X-ray activated photodynamic therapy, which could lead to more efficient treatment of cancer.

An imaging modality called SPCCT (spectral photon counting scanner CT) could provide detailed information about a tumour (when using specific contrast media), including its volume and location, information which can then be used to guide cancer treatment. As Principal Investigator of the SCANnTREAT project, Dr Frédéric Lerouge is looking to combine this imaging modality with X-ray activated photodynamic therapy, which will provide a powerful method of treating cancer. “With photodynamic therapy patients are injected with a molecule called a photo-sensitiser, which then goes all over the body. This molecule is sensitive to light at a certain wavelength, for example red light,” he explains. “The idea is that once a patient has ingested this molecule, doctors can conduct an endoscopy for example, and effectively shine a light inside the patient. This red light activates the molecule, which is then lethal to the cancer.”

Photodynamic therapy

This approach is already used to treat certain types of cancer, notably esophageal cancer, yet it has some significant limitations. In particular, photodynamic therapy is not currently effective on tumours located deep within the body, as light can’t reach the photo-sensitiser molecule itself and activate it, an issue that Dr Lerouge and his colleagues in the project are addressing. “We are developing a new strategy to bring light close to a photo-sensitiser molecule and activate it,” he says. This work involves the development of nanoprobes, tiny probes with diameters of between 9 and 10 nanometres, which generate reactive oxygen species under low energy x-ray irradiation. “When we shine x-rays on the nanoprobes they in turn emit light, which can then be re-absorbed by the photo-sensitiser,” continues Dr Lerouge. “This is related to a mechanism called scintillation.”

The ability to activate the photo-sensitiser molecule in a controlled way, and to turn it on and off at will, opens up the possibility of targeting cancer treatment more precisely than is currently possible. One approach commonly used to treat cancer is radiotherapy, where high-energy x-rays are shone directly on a tumour to destroy it, yet Dr Lerouge says this also leads to side-effects. “Radiotherapy can destroy a tumour, but it may also destroy surrounding healthy tissue, which can lead to various side effects,” he outlines. The new approach to treatment that researchers are developing in the project is designed to cause fewer complications. “Indeed, with the nanoprobes designed during the project it will be possible to image a tumour with the scanner and treat it afterwards; we will therefore be able to reduce side-effects and improve treatment efficiency,” says Dr Lerouge.

“With photodynamic therapy patients are injected with a molecule called a photo-sensitiser, which then goes all over the body. This molecule is sensitive to light at a certain wavelength, for example red light.”

A more precise and targeted method of treating cancer could also reduce the likelihood of a recurrence of the disease, which is a major concern. Even if a cancer is treated successfully with current methods, some cancerous cells may be left in the body, which leaves people vulnerable. “These cancerous cells may effectively lie dormant for a time, but they can lead to the development of a tumour in future. If all of the cancer cells can be eradicated, this could prevent the recurrence of the disease,” explains Dr Lerouge.

Some of the members of the SCANnTREAT project consortium around the spectral scanner.

Photodynamic therapy is already used in cancer treatment; now Dr Lerouge is exploring the possibility of widening its use, particularly in treating pancreatic cancer, which is typically diagnosed at quite a late stage. “The aim would be to provide an effective and efficient treatment method,” he says.

Researchers in the project are currently conducting tests on mice, which are both imaged with a conventional scanner, and also imaged with the SPCCT modality following injection with the nanoprobes. “When you use this specific imaging technique you can choose to see only the areas where the nanoprobes are located, which gives us a very accurate diagnosis,” explains Dr Lerouge.

“When you merge the two images it gives you a more precise picture of where the nanoprobes are, that can then be used to guide treatment. We’re interested not just in improving diagnosis, but also in monitoring the pathology during treatment.”

This will allow clinicians to assess the effectiveness of treatment, and if necessary adapt it to reflect the degree of progress that has been made. If everything is going well, and the patient is responding positively to treatment, then it might be possible to reduce the dose of x-rays to be delivered, for example, or the opposite might be the case. “If we see that things aren’t going our way we can choose to increase the x-rays a little bit. The main consideration is always the wellbeing of the patient,” stresses Dr Lerouge. The information gained in the process can then be very useful in guiding the treatment of other patients in future. “We want to provide this information in databases to help improve treatment as much as possible, from the very early stages when the pathology is initially diagnosed,” continues Dr Lerouge.

Imaging of nanoprobes after subcutaneous injection in mice. A and B: conventional scanner imaging; C: gadolinium-specific K-edge imaging; D: merging of B and C (from reference doi: 10.1039/D3NR03710J).
Overall concept of the SCANnTREAT project: combination of the spectral scanner and nanoprobes for imaging and treatment of cancer.

SCANnTREAT

Photodynamic therapy triggered by spectral scanner CT: an efficient tool for cancer treatment

Project Objectives

The project aims to combine two cutting-edge technologies for the treatment of cancer: spectral photon counting scanner CT, a ground-breaking imaging modality, and a new treatment known as X-ray activated Photodynamic Therapy (X-PDT). The perfect match between these two technologies is ensured by specifically designed probes acting both as contrast media and therapeutic agents.

Project Funding

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 899549.

Project Consortium

• The University Claude Bernard Lyon 1 (UCBL) - Chemistry Laboratory

• The University Claude Bernard Lyon 1 (UCBL) - CREATIS

• Inserm UA7 unit Synchrotron Radiation for Biomedicine (STROBE)

• Inserm UA8 unit “Radiations: Defense, Health and Environment”

• Maastricht University (UM) - D-Lab

• Philips Medical Systems Technologies, Ltd., Israel (PMSTL)

• Lyon Ingénierie Projets
https://www.scanntreat.eu/governance.html

Contact Details

Project Coordinator, Dr Frederic Lerouge

Chemistry Laboratory

UMR 5182 ENS/CNRS/University Claude Bernard Lyon 1

9 Rue du Vercors

69007 Lyon

T: +33 6 14 14 50 86

E: Frederic.lerouge@univ-lyon1.fr

W: https://www.scanntreat.eu/

Dr Frederic Lerouge has been an assistant professor at the University Claude Bernard Lyon 1 since 2007. His main research interests focus on the design of inorganic nanomaterials and their surface modification for applications in health and the environment.

Animal models

A major next step for Dr Lerouge and his colleagues will be to test the technology on a broader range of small animal models, gaining deeper insights into its overall effectiveness and identifying any ways in which it could be improved. This work is still at a relatively early stage, however, and Dr Lerouge plans to establish a successor project beyond the conclusion of SCANnTREAT later this summer, in which he intends to pursue further research. “We need to do some more work and gather more data. We also have ideas about how to develop the technology, which will evolve depending on the results of our research, in terms of the quality of diagnosis and efficiency of treatment,” he continues. “We are looking to improve the technology, and to ensure that we will be able to actually use this system.”

The long-term vision is to use this technology to treat cancer in human patients, and with healthcare budgets under strain, cost-effectiveness is an important consideration. The SCANnTREAT system itself is expensive, the result of dedicated work in the laboratory, yet Dr Lerouge believes it can lead to financial benefits over the long term by reducing the need for further action following treatment, which currently is a major cost in cancer care. “Some patients are treated for cancer, but they remain in a fragile state and they are not restored to full health. They may then need further treatment for other conditions,” he explains. “The idea is that once this treatment has been administered it is over, and the patient can then get back to normal life. This is one of the issues that people who work on cancer treatment look at very closely.”

The project’s research represents an important contribution in this respect, and in future Dr Lerouge plans to build on the progress made in SCANnTREAT, and move the new treatment closer to practical application.

“We are looking to establish a new consortium, to push this technology forward and to test it on different cancer models,” he says. Researchers hope to probe the limits of this technology, and also to look at how it can be improved further. “We intend to look at what we can achieve in terms of the efficiency of treatment, and also at what kinds of cancer we could treat with this approach,” he continues.

The ultimate goal is to develop a new method of treating cancer that can be administered to human patients. “We hope to treat human patients with this technology within the next five years or so,” says Dr Lerouge.

Dr Frederic Lerouge
3D structure of a melanoma cell derived by ion abrasion scanning electron microscopy. Image by National Cancer Institute
Electronic microscopy photo of the nanoprobes used during the SCANnTREAT project, average sizes are below 10 nm.

Revolutionising Neurodegenerative Disease Treatment: Immunotherapy and Advanced Stem Cell Techniques

What drives the progression of neurodegenerative diseases like ALS, Alzheimer’s, and MS?

Unraveling the mechanisms of neuroinflammation is at the heart of Prof. Bob Harris’s research at the Karolinska Institutet. His team is pioneering new therapeutic approaches by targeting microglial activity and harnessing advanced stem cell techniques. Their goal is to transform treatment strategies for these debilitating conditions.

Neurodegenerative diseases like Amyotrophic Lateral Sclerosis (ALS), Alzheimer’s disease (AD), and chronic Multiple Sclerosis (MS) are marked by a common culprit: neuroinflammation. This inflammation is driven by the chronic activation of microglia, the brain’s resident immune cells, and begins as a protective mechanism. However, when prolonged, it is a cause of destructive cytotoxicity and neurodegeneration. We spoke with Prof. Bob Harris, the leader of the research group Applied Immunology and Immunotherapy at the Centre for Molecular Medicine, Karolinska Institutet. He and his research group are focused on understanding the underlying mechanisms of chronic neurodegenerative diseases and translating this knowledge into practical treatment solutions. Given that no effective treatments currently exist for neurodegenerative diseases, their objective is to develop novel therapeutic platforms to address this significant unmet medical need.

A brain on fire

Normally, microglia play a role in maintaining brain health. They release factors that promote neuronal survival and the restoration of neuronal function after injury. But perpetual activation prevents them from executing their physiological and beneficial functions. Chronically activated microglia release harmful cytokines, oxygen radicals, and other molecules that impair neuronal function and threaten cell survival. Additionally, the essential phagocytic function of microglia, their ability to clear out cellular waste and misfolded proteins, is compromised in neurodegenerative diseases. This failure results in the buildup of toxic aggregates in the central nervous system (CNS), which in turn triggers further microglial activation, creating a vicious cycle of inflammation and damage. For example, in ALS, overactive microglia gather around dying motor neurons. Their numbers directly correlate with the extent of neuron damage.

Similarly, in MS, activated microglia are found at sites of demyelination, interacting with T and B cells through factors released behind a closed blood-brain barrier (BBB). Scientific evidence shows that activated microglia and monocytes can have both beneficial and detrimental effects at different disease stages. “Strategies that modulate microglial activity and clearance function are thus promising for treatment of a range of neurodegenerative diseases,” explains Prof. Harris.

Putting the fire out before rebuilding

Immunotherapy, which involves modulating immune cells to reduce or stop inflammatory disease processes, has revolutionized the treatment of certain cancers and autoimmune diseases.

However, currently, there is no effective immunotherapy for neurodegenerative diseases. According to Prof. Harris, before a tissue can be healed, the inflammatory process driving the disease must be halted.

“It is impossible to rebuild a house that is still burning, so first the fire must be put out – only then will rebuilding be efficient. The same principle applies to immunotherapy – but in a perfect world the therapeutic intervention would not only reduce the neuroinflammation but also activate the regenerative processes,” he explains.

Unruly cells can be trained to behave

“If we think about how macrophages and microglia can be activated, it is like a Yin-Yang, with one activation state, represented by white, being helpful, while the other, represented by black, is harmful. In disease situations there is a dominance of the harmful cells over the helpful cells,” explains Bob Harris. His research team has developed a microglia/macrophage cell therapy protocol in which large numbers of beneficial cells are generated by exposure to a specific combination of cytokines, an activation that successfully induces an immunosuppressive M2 phenotype. By transferring M2 microglia and macrophages into mouse models of Type 1 diabetes and Multiple Sclerosis, the team prevented or significantly reduced disease severity. In their MS model, known as experimental autoimmune encephalomyelitis (EAE), the team transferred M2 microglia intranasally into mice. This resulted in a significant reduction in inflammatory responses and less demyelination in the central nervous system (CNS). Similarly, the team transferred M2 macrophages into a mouse model of Type 1 diabetes. Remarkably, a single transfer protected over 80% of treated mice from developing diabetes for at least three months. This was achieved even when the transfer was conducted just before the clinical onset of the disease. The team also found that harmful cells recovered from the blood of patients with multiple sclerosis can be retrained using their activation protocol. Once retrained, these cells were able to modulate the function of other harmful cells in cell assays. “Our activation protocol (IL-10/IL-4/TGFb) has been successfully used by other researchers in other disease settings and to reduce rejection of transplants, clearly demonstrating its widespread applicability,” says Bob Harris.

“Strategies that modulate microglial activity and clearance function are thus promising for treatment of a range of neurodegenerative diseases.”

Folding DNA to create novel parcels for delivery

Though DNA origami nanobiologics are a relatively new technology, they have become a prime focus in biomedical research due to their capability to deliver pharmaceuticals with precision, coupled with the natural biocompatibility of DNA. Additionally, the ability to engineer these structures into complex, combinational biomolecular formations enhances their appeal. The DNA origami technique uses hundreds of short DNA strands to fold a long single-stranded DNA scaffold into specific three-dimensional nanoscale shapes, akin to the art of folding paper into delicate origami figures. The development of scaffold DNA routing algorithms now allows for the precise control of nanoscale structures through an automated design process. Modern DNA origami designs, which use open wireframe structures, offer the most flexibility. This technology can incorporate existing FDA-approved drugs, repurposing them for conditions like ALS.

DNA origami construct.

Prof. Harris and his collaborators developed a DNA origami construct in which the DNA was made into cylindrical rods, loaded with a repurposed cancer drug (topotecan). The surface of the construct was modified to express a carbohydrate molecule that specifically binds to receptors on microglia. “It’s like packing a present in a parcel box, wrapping it in paper, and adding an address and postage. We are able to specifically deliver immunomodulatory drugs to the harmful microglia, forcing them to be less harmful,” says Bob Harris. In their MS model, symptoms were significantly improved after a single treatment. The team demonstrated that topoisomerase 1 (TOP1) inhibitors, like topotecan, reduce inflammatory responses in microglia and reduce neuroinflammation in vivo, providing a promising therapeutic strategy for neuroinflammatory diseases. Their lab continues to develop a range of DNA origami constructs, each loaded with different therapeutic cargoes aimed at modulating specific functions of microglia and macrophages.

Harnessing a mother’s protective instincts

Amniotic epithelial cells (AECs) are a type of stem cell derived from human placenta that exhibit strong immunomodulatory properties, contributing to the safe development of the baby. They have the ability to differentiate into a variety of cell types, which makes them very versatile. The clinical use of human amniotic membranes has been recognized for over a century, with the first application in treating burned and ulcerated skin reported in 1910. Instead of using the entire amniotic membrane, more long-term and enhanced effects have been achieved by using isolated AECs. At Karolinska Institutet, a protocol has been developed to recover AECs from the innermost layer of the amniotic sac in discarded placental tissue, and these cells display potent immunomodulatory and immunosuppressive properties. After AEC transplantation, there is no host rejection in either mice or humans. These cells have shown disease-modifying properties in various conditions, including ischemia, bronchopulmonary dysplasia, and liver diseases, primarily by reducing inflammatory damage. AECs are currently undergoing testing in several clinical trials. In a mouse model of Alzheimer’s disease, intrathecal administration of AECs significantly reduced amyloid plaque burden in the brain, possibly through enhanced microglial phagocytosis.

The Harris lab is currently testing protocols for the adoptive transfer of immunomodulatory AECs in different experimental models of neurodegeneration to stimulate anti-inflammatory and restorative responses in the degenerating CNS. Their unpublished results indicate that a single injection of cells could significantly reduce the severity of an experimental model of multiple sclerosis. “AECs are a special type of stem cell with particular properties that make them excellent candidates for cell therapy in a wide range of neurodegenerative diseases,” says Bob Harris.

Why 3 is better than 1

As neurodegenerative diseases involve a multitude of cell types including both immune cells and CNS cells, multiple immunotherapeutic approaches will likely be necessary to target these different cell types. There is also significant variability in the disease progression among patients with the same neurodegenerative condition. “Having developed these three separate therapeutic principles, we are now testing them in different combinations in our experimental neurodegenerative disease models. There will likely be an optimal combination that allows different targeting of microglia at different phases of the disease process,” explains Bob Harris.

The Harris lab has a clear vision to Make A Difference, not only by increasing scientific knowledge within the field of neurodegenerative diseases but also by improving the life quality of patients with these incurable diseases. “Meeting neurodegenerative disease patients, their carers, and families is a humbling experience, but it gives us such energy to conduct our research,” he concludes.

NOVEL IMMUNOTHERAPIES FOR NEURODEGENERATIVE DISEASES

Project Objectives

There are currently no effective treatments for any neurodegenerative disease, and this remains a major unmet medical need. The origin and progression of many neurodegenerative diseases are still not clearly understood, and basic research is required to provide a platform for the development of novel and effective therapies. The objective of our research programme is to address this unmet need by developing multiple therapeutic platforms.

Project Funding

This project is funded by the Swedish Medical Research Council, Alltid Litt Sterkere, Neurofonden, Ulla-Carin Lindquist Stiftelse for ALS Research, Karolinska Institutet Doctoral funding.

Project Collaborators

• Prof Björn Högberg, Karolinska Institutet

• Assoc Prof Roberto Gramignoli, Karolinska Institutet

Contact Details

Professor Bob Harris

CMM L8:04, Karolinska University Hospital, Visionsgatan 18, S-171 76 Stockholm, Sweden

E: robert.harris@ki.se

W: https://ki.se/en/research/groups/immunotherapy-robert-harris-research-group

W: https://www.cmm.ki.se/research-groupsteams/robert-harris-group-2/

Bob Harris is Professor of Immunotherapy in Neurological Diseases and leads the research group Applied Immunology and Immunotherapy at the Centre for Molecular Medicine, a designated translational medicine centre at Karolinska Institutet. The group conducts a strongly interconnected research programme aimed at translating knowledge gained from basic science projects into clinical applications, with a focus on understanding why chronic neurodegenerative diseases occur, and then devising ways to prevent or treat them.

Bob Harris
Origin of amniotic epithelial cells.

How Early Infections Influence Neonatal Gut Development

We spoke with Prof. Dr. Mathias Hornef about the ERC-funded EarlyLife project, which investigates how early-life infections impact the gut epithelium, microbiome, and immune system development in neonates. Using advanced techniques and mouse models, the project aims to uncover the influence of these infections on long-term health and disease susceptibility.

The immediate postnatal period is critical for the development and maturation of the immune system. A timed succession of non-redundant phases allows the establishment of mucosal host-microbial homeostasis after birth. The enteric microbiota, mucosal immune system, and epithelial barrier are key factors that together facilitate a balanced relationship between the host and microbes in the intestine. The enteric microbiota consists of a diverse community of mainly bacterial microorganisms that colonize the gut. The mucosal immune system represents a defense mechanism that is able to recognize and respond to pathogenic threats while maintaining tolerance to commensal bacteria. The epithelial barrier, formed by a variety of highly specialized cells lining the gut surface, facilitates nutrient digestion and absorption but restricts commensal microorganisms to the gut lumen and prevents enteropathogens from entering sterile host tissues. Early-life infections significantly impact all three components, which may reduce the host’s fitness during the course of the infection and influence its outcome, but may also lead to long-term consequences.

The ERC-funded EarlyLife project investigates the dynamic changes in the gut epithelium of neonates during early-life infections. The study seeks to create a comprehensive map of postnatal epithelial cell type differentiation and analyze the impact of early-life bacterial, viral, and parasitic infections on cell differentiation and function. The main long-term goal is to understand how early-life infections influence the development of the epithelium, the immune system, and the establishment of the microbiome, and thereby the risk for inflammatory, immune-mediated, and metabolic diseases later in life.

Uncovering differences in the gut epithelium of neonates

The gut epithelium forms the luminal surface of the intestine and comes into direct contact with the enteric microbiome and potentially pathogenic microorganisms. It is composed of a variety of different cell types with highly specialized functions and plays a critical role in maintaining a balance between the host and its microbiota. Traditionally, the gut epithelium was considered to largely represent a passive physical barrier between the sterile host tissue and the microbially colonized gut lumen. However, research during recent decades has shown that epithelial cells actively participate in establishing host-microbial homeostasis. Additionally, these cells trigger an early immune response and communicate intensively with the underlying immune system when the host is exposed to enteropathogenic microorganisms. This interaction is particularly significant in neonates, who transition from a sterile in utero environment to a microbiota-rich external world immediately after birth and mature their tissue and immune system during the postnatal period. Upon birth, neonates undergo rapid colonization by environmental and maternal bacteria. This sudden exposure requires a delicate balance between immune tolerance of beneficial commensal bacteria and activation against potential pathogenic threats. The gut epithelium plays a vital role in this process.

Colonization happens very quickly, reaching adult levels in the small intestine within hours to days. The baby’s body must support and tolerate this rapid bacterial growth. Additionally, neonates are most susceptible to infections during early life, so at the same time they have to react to infections and try to defeat pathogens. All this happens while their energy supply is severely limited, because the maternal supply via the umbilical cord is suddenly interrupted and they need to establish uptake of breast milk and enteral feeding. The severity of this energetic bottleneck is illustrated by the fact that neonates often lose weight during the early transition period. This temporary energy deficit may, in turn, explain why neonates react differently to infection.

“Neonates are very different from adults. They are not just small adults; they have unique physiological and immunological needs. Our group works on understanding these differences, particularly in adaptive immune development, infectious diseases, and microbiota establishment, using neonatal infection models in mice. This type of research is difficult in humans because we can’t get tissue samples from healthy newborns,” explains Prof. Hornef. His research team aims to understand how neonates are different from adults in terms of energy constraints, tissue maturation, and the balance between immune tolerance and antimicrobial host responses.

Prof. Hornef and his research team are additionally exploring the innate immune recognition in the gut epithelium. Innate immune receptors in the gut help recognize and respond to microbial threats. When these receptors bind to microbes, they trigger the release of immune mediators that recruit immune cells and produce antimicrobial molecules to eliminate invaders. However, uncontrolled activation can lead to inflammation and tissue damage. In the lab, the researchers are studying the molecular basis and regulation of these receptors in intestinal epithelial cells.

“Infections in early life not only impact immediate health but can also prime the immune system in ways that affect long-term disease susceptibility.”

“We are addressing the link between infection and the gut epithelium. The gut epithelium consists of several cell types, such as goblet cells, Paneth cells, stem cells, tuft cells, M cells, enteroendocrine cells, and enteroabsorptive cells that fulfill different functions. It has a complex structure with crypts and villi and undergoes continuous proliferation. In neonates, many aspects are different. For example, newborn mice lack certain cell types, like Paneth cells, which produce antimicrobial peptides crucial for maintaining the barrier against microbial exposure. Understanding these differences and how infections impact them is critical,” says Prof. Hornef.

The EarlyLife project employs advanced techniques such as single-cell RNA sequencing, spatial transcriptomics, and epigenetic profiling to study the gut epithelium. Using mouse models, Prof. Hornef and his team have discovered that the emergence of M cells, which transfer luminal particulate antigen to the Peyer’s patch, the site of immune activation, determines the maturation of the early adaptive immune system. Interestingly, certain microbial stimuli present during infection can accelerate the emergence of M cells and thus lead to earlier immune maturation and immune responses. This may be advantageous during infection but may still come with unwanted consequences with respect to the control of inappropriate responses. The team also found that the fate of intestinal epithelial cells invaded by the enteropathogen Salmonella differs markedly between the neonate and the adult host. Whereas the adult host releases invaded epithelial cells into the gut lumen in a process called exfoliation to prevent pathogen translocation, infected epithelial cells in the neonate are engulfed by neighboring cells and subjected to lysosomal degradation.

The Long-term Impact of Early Infections

The main question for the EarlyLife project is how early-life infections affect long-term health. Preliminary findings suggest that neonatal infections can have lasting effects on the gut epithelium, microbiome composition, and immune function. This concept is supported by observations in humans: children in developing countries with frequent infections often exhibit stunted growth and altered immune responses. “Infections in early life not only impact immediate health but can also prime the immune system in ways that affect long-term disease susceptibility. Understanding these processes is crucial for developing targeted interventions,” concludes Prof. Hornef. By translating EarlyLife’s findings from mouse models to human health, and identifying key molecular and cellular changes in the neonatal gut epithelium during infections, researchers hope to develop new strategies to prevent and treat infections in newborns and reduce childhood mortality worldwide. This includes potential therapies designed to modulate the gut microbiome and enhance immune responses in vulnerable neonates. Additionally, the project explores the concept of neonatal “priming,” where early-life exposures shape immune and epithelial cell functions. These insights could lead to novel approaches for preventing chronic diseases that originate from dysregulated early immune system maturation.

EarlyLife

Gut epithelial dynamics and function at the nexus of early life infection and long-term health

Project Objectives

Infections of the gastrointestinal tract cause significant childhood mortality and morbidity worldwide. EarlyLife aims to map postnatal intestinal epithelial cell and tissue differentiation. It studies age-dependent differences that explain the enhanced susceptibility of the neonate host to infection and investigates the impact of early life infection on long-term gut health. Using advanced analytical methods and innovative models, it explores how early life infections influence enteric function in the neonate host but also immune development and long-term disease susceptibility, focusing on the gut microbiota, mucosal immune system, and epithelial barrier.

Project Funding

This project has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (Grant agreement No. 101019157).

Project Collaborators

ERC Work Program:
• Martin von Bergen, Helmholtz Centre for Environmental Research, Germany
• Thomas Clavel, RWTH Aachen University, Germany
• Ivan Costa, RWTH Aachen University, Germany

Research Group:
• Ivan Costa
• Thomas Clavel
• Oliver Pabst
• Michael Hensel
• Martin von Bergen
• Geraldine Zimmer-Bensch
• Jochen Hühn
• Lars Küpfer
• Frank Tacke
• Tim Hand
• Jens Walter
• Marc Burian

Contact Details

Dr. rer. nat. Christin Meier

Administrative Coordinator

Universitätsklinikum Aachen (UKA)

Pauwelsstraße 30, D 52074 Aachen

T: +49 241 80-37694

E: cmeier@ukaachen.de

W: https://www.ukaachen.de/

Mathias Hornef studied medicine in Tübingen, Lübeck, New York, and Lausanne. He has held positions at the Max von Pettenkofer Institute in Munich, the Karolinska Institute in Stockholm, the University of Freiburg, and Hannover Medical School. Now director of the Institute of Medical Microbiology at University Hospital RWTH Aachen, he focuses his research on the interactions between bacteria, viruses, and parasites, the intestinal epithelium, and the mucosal immune system in the neonatal host.

Mathias Hornef

Taking the next step in diabetes treatment

Type-1 diabetes is a chronic, incurable condition which still requires careful management and regular injections of insulin. Researchers in the FORGETDIABETES project are working to develop a fully implantable bionic invisible pancreas, which will relieve the burden associated with managing type-1 diabetes, as Professor Claudio Cobelli explains.

An autoimmune condition which leaves the pancreas unable to secrete insulin, type-1 diabetes affects millions of people across the world, and the number is projected to rise further over the coming decades. Managing the disease is quite an onerous task, as patients typically need to monitor their own carbohydrate consumption and inject themselves with insulin from an exogenous source throughout the day. “Managing the condition can impose a heavy burden on patients,” acknowledges Claudio Cobelli, Professor of Bioengineering at the University of Padova in Italy. The situation has improved over the recent past, with technological progress helping to relieve this burden on patients and improve their quality of life.

Concept overview of the FORGETDIABETES project.

The device is implanted in the jejunum, and includes the magnetic system to attract the ingestible capsule and the system to transfer the insulin from (i) the capsule to the reservoir, and from (ii) the reservoir to the body. Electronic components and sensors allow the device to run automatically and communicate with the patient.

“One important development has been the ability to continuously monitor glucose concentration. This has been a big step forward,” outlines Professor Cobelli. “A second revolution has been in the ability to inject insulin subcutaneously using various technologies, such as insulin pens.”

“I’ve been working to develop an intraperitoneal control algorithm, which involves looking at how the glucose signal can be used to predict the amount of insulin to be infused. The control algorithm is tailored to the patient, it’s an adaptive control algorithm.”

Hybrid closed loop systems

A further step forward has been the development of so-called hybrid closed loop systems, in which a sensor monitors an individual’s glucose levels, then an algorithm calculates the amount of insulin that should be subcutaneously injected by a pump. While hybrid closed loop systems are a major advance in the field, Professor Cobelli says they still have some limitations. “Subcutaneous insulin infusion is very practical, but it isn’t entirely optimal, as insulin takes time to get into the blood. So people still have to be careful with meal planning and exercise,” he explains.

The ingestible capsule acts as an insulin carrier, and travels passively along the GI tract up to the implanted device. The capsule is made of soft biocompatible material embedding two metallic rings to facilitate docking.

From more than 150 actions per month to just 4 actions per month (2 x oral insulin refill, 2 x charging the pump): widespread adoption for diabetes treatment in everyday life.

As Principal Investigator of the EU-backed FORGETDIABETES project, Professor Cobelli is part of a team working to develop a bionic invisible pancreas (BIP) designed to address these issues and deliver insulin into patients more effectively. “There are three main components of this artificial pancreas, or hybrid closed loop system. These are the glucose sensor, an algorithm which predicts the amount of insulin needed to maintain blood glucose in the target range, and the actual pump,” he says.

The project brings together several partners from across Europe to develop these components, part of a device designed to deliver insulin into the body via an intraperitoneal route. This approach avoids some of the issues associated with subcutaneous insulin infusion, as it closely resembles the normal physiological route, giving people with type-1 diabetes a greater degree of freedom in their daily lives. “It’s like a normal pancreas – insulin goes in very quickly, and then it also goes out very quickly. The insulin gets to where it needs to go faster,” says Professor Cobelli. The insulin itself comes from a reservoir within the artificial pancreas, which is replenished by ingesting an insulin pill on a weekly basis, a novel aspect of the system developed in the project. “The patient simply ingests a pill of insulin, a smart capsule, which helps to diminish the psycho-social burden of the condition,” continues Professor Cobelli. “This strategy was recognised as a highly novel way of refilling the insulin reservoir by the EU’s Innovation Radar platform.”

A second highly innovative dimension of the project’s work is the development of the control algorithm, which is designed to ensure that a patient with type-1 diabetes receives the appropriate amount of insulin.

Great care needs to be taken here, as insulin is a very potent hormone. “If too little insulin is injected, then glucose levels will go very high (hyperglycaemia), and if too much is injected then glucose levels will go down to below the target range (hypoglycaemia). It’s a classical control problem,” outlines Professor Cobelli. The general consensus is that glucose concentration in the blood should be somewhere between 70-140 mg/dl (milligrams per 100 millilitres) during the night and 70-180 mg/dl during the day, with Professor Cobelli working to help keep patients within this range. “I’ve been working to develop an intraperitoneal control algorithm, which involves looking at how the glucose signal can be used to predict the amount of insulin to be infused,” he explains. “The control algorithm is tailored to the patient, it’s an adaptive control algorithm.”
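The adaptive, patient-tailored algorithm developed in the project is not spelled out in the article, so the sketch below is only a toy illustration of the “classical control problem” Professor Cobelli describes: a minimal proportional-integral rule, in Python, that doses more insulin the further glucose sits above the day- or night-time target band, and suspends infusion near the lower bound to avoid hypoglycaemia. All function names, gains and thresholds are hypothetical.

```python
# Toy illustration of the closed-loop dosing problem described above.
# This is NOT the FORGETDIABETES algorithm: the real controller is adaptive
# and patient-specific. All gains and limits here are invented.

def target_band(hour: int) -> tuple[float, float]:
    """Return the (low, high) glucose target band in mg/dl for the time of day."""
    night = hour < 7 or hour >= 23
    return (70.0, 140.0) if night else (70.0, 180.0)

def insulin_rate(glucose: float, hour: int, accumulated_error: float,
                 kp: float = 0.01, ki: float = 0.001) -> tuple[float, float]:
    """Suggest an insulin infusion rate (U/h) from the current glucose reading.

    Dosing is proportional to how far glucose sits above the upper target,
    plus an integral term that adapts to persistent deviation. Near the lower
    bound, infusion is suspended to avoid hypoglycaemia.
    """
    low, high = target_band(hour)
    if glucose <= low:
        return 0.0, 0.0  # suspend infusion and reset the accumulated error
    error = glucose - high
    accumulated_error = max(0.0, accumulated_error + error)
    rate = max(0.0, kp * error + ki * accumulated_error)
    return rate, accumulated_error

# Example: a reading of 220 mg/dl at 14:00, shortly after a meal.
rate, acc = insulin_rate(220.0, 14, 0.0)
print(f"suggested infusion rate: {rate:.2f} U/h")
```

A real controller must also account for the delay between infusion and glucose response, which is exactly why the intraperitoneal route matters: the faster the insulin acts, the more tractable the control problem becomes.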

FORGETDIABETES

FORGETDIABETES proposes a radically new approach to diabetes treatment

Project Objectives

FORGETDIABETES proposes a radically new therapeutic paradigm resulting from the multidisciplinary combination of innovative technologies (algorithm, miniaturized hardware, smart sensors, experimental surgery). The resulting paradigm has the potential to revolutionize diabetes treatment and to stimulate the emergence of an EU innovation ecosystem.

Project Funding

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 951933.

Project Partners

• University of Padova (UNIPD), Coordinator: Claudio Cobelli

• Scuola superiore di studi universitari e di perfezionamento Sant’Anna (SSSA), PI: Leonardo Ricotti

• Pfützner Science & Health Institute GMBH (PSHI), PI: Andreas Pfützner

• Centre Hospitalier Universitaire de Montpellier (CHUM), PI: Prof. Eric Renard

• Forschungsinstitut der Diabetes-Akademie Bad Mergentheim (FIDAM), PI: Norbert Hermanns

• Lifecare AS, PI: Joacim Holter

• WAVECOMM Srl, PI: Alessio Cucini

https://forgetdiabetes.eu/partners/

Contact Details

Project Coordinator, Claudio Cobelli

Emeritus Professor of Biomedical Engineering, IEEE Fellow, BMES Fellow

Department of Woman and Child’s Health University of Padova

Via N. Giustiniani 3, 35128 Padova

Italy

T: +39-335-6055945

E: cobelli@dei.unipd.it

W: https://forgetdiabetes.eu/

Claudio Cobelli is Emeritus Professor of Bioengineering at the University of Padova. His research is focused on modelling diabetes and developing new technology to monitor and treat the condition. He received the Diabetes Technology Artificial Pancreas Research Award in 2010 and is a fellow of both the IEEE and BMES.

Glucose concentration

The international consensus around glucose concentration in the blood has formed on the basis of metrics from subcutaneous insulin injection, but Professor Cobelli hopes that it will be possible to narrow the range with intraperitoneal injection, as this approach more closely reflects normal physiology. When we eat, our glucose level rapidly increases, as the pancreas secretes insulin very quickly, and Professor Cobelli says the bionic invisible pancreas is designed to work in a similar way. “Intraperitoneal injection will hopefully mimic the normal situation. The pancreas is a very fast-acting organ,” he says. The next step will be to test the system, and plans are in place to assess its overall effectiveness in the final year of the project. “We will test the overall system in pigs that have been rendered diabetic. The pump and the sensor will be implanted and the control algorithm will be embedded,” continues Professor Cobelli. “If the results are convincing then we hope to continue our research in this area, and we are considering the possibility of a follow-up project.”

This would give researchers the opportunity to refine the device further and bring it closer to clinical application in human patients with type-1 diabetes, which is the ultimate long-term goal. A fully implantable, automated device to deliver insulin would bring significant benefits to patients with type-1 diabetes, releasing them from many of the chores involved in managing the condition, and giving them more freedom to pursue their own interests. “With intraperitoneal delivery of insulin, you don’t have to count the carbohydrates in a meal, which is a procedure that is very difficult to get right. Insulin is delivered very rapidly with the device that we are developing, and it quickly gets to where it needs to go. The target range is narrower, and people are able to control their exercise,” says Professor Cobelli.

Professor Claudio Cobelli

Humanity’s Quest for Longevity in the 21st Century: Recent Advancements and Future Implications

In the 21st century, humanity’s quest for longevity is driven by cutting-edge research in genetics, senescence, and cellular rejuvenation. Scientists are exploring the mechanisms of aging, from telomere degradation to epigenetic clocks and senolytics, to extend both lifespan and healthspan and revolutionize how we age.

Throughout history, humanity has continuously pursued ways to extend our lifespan. The desire not only to live longer but to maintain vitality throughout those added years has been a universal ambition, captivating the imaginations of both ancient civilizations and modern societies. In ancient China, emperors sought the elusive elixir of immortality, turning to alchemists and herbalists who promised they could defy death with potions made from various herbs and minerals. Meanwhile, in medieval Europe, explorers embarked on journeys searching for the mythical Fountain of Youth, believed to be a source of eternal life hidden in distant lands. Bizarre and dangerous experiments persisted into the early modern period, when doctors believed they could achieve longevity by consuming gold and mercury. As science and technology advanced, these mystical practices were gradually replaced with more evidence-based approaches. During the 19th century, scientists and doctors aimed to prolong life with early medical interventions, such as blood transfusions and organ transplants. Today, decades of research on the mechanisms behind aging and advances in genetics, nutrition, and biotechnology are deepening our understanding of longevity. Building upon these advances, longevity research integrates epigenetics, cellular senescence and senolytics, metabolic interventions, cellular rejuvenation, regenerative medicine, biological age clocks, gut microbiome optimization, and many other fields. The main goal of longevity research is not only to extend the human lifespan but also to enhance the quality of life throughout those additional years. Anti-aging or longevity research aims to add life to years, not just years to life.

Aging research comes to light

Aging research has seen a lot of “hype” in recent years. Geroscientists, who study age-related chronic diseases, and biogerontologists, who are interested in the biological mechanisms behind aging, say that aging is finally seen as a legitimate area of research. A clear sign of this shift is the surge in funding and the rise of numerous institutions and organizations around the world dedicated to longevity research. With increased financial backing, aging research has matured into a dynamic discipline that is now attracting young researchers. This influx of resources allows scientists to broaden their approaches, for example by using omics approaches: techniques used to study the roles, relationships, and functions of various biological molecules on a large scale. Researchers can now design more rigorous experiments with larger sample sizes, which will ultimately lead to a deeper and more precise investigation into the mechanisms of aging.

Identifying the hallmarks of aging

One of the main questions that has sparked a debate among scientific communities is: How do we age? Can we identify the telltale signs of aging on a cellular and molecular level? It is widely accepted that aging occurs due to damage to genetic material, cells, and tissues, accumulating over time and exceeding the body’s repair mechanisms. This results in the progressive impairment of function and increased risk for age-related diseases, such as cancer, cardiovascular disorders, neurodegenerative diseases, diabetes, and many others.

An area that lacks clarity is the underlying molecular causes behind this damage, and why the body’s repair mechanisms, which are effective in younger organisms, become increasingly ineffective in older ones. To better understand this, scientists have been working to identify and describe the key cellular and molecular features that define aging. These features are known as the hallmarks of aging. Their deterioration contributes to faster aging and health problems, while their improvement enhances our well-being as we age and extends our lifespan and healthspan. Back in 2013, López-Otín et al. identified nine key hallmarks of aging: genomic instability, telomere degradation, epigenetic changes, loss of proteostasis, impaired nutrient sensing, mitochondrial dysfunction, cellular senescence, stem cell exhaustion, and altered intercellular communication. Their work set the stage for a surge in research, with nearly 300,000 related articles emerging over the following decade. This wealth of new information led López-Otín and his team to publish an updated edition in 2023, integrating a decade’s worth of new insights and advancements. The new edition included three additional hallmarks of aging: deteriorated autophagy, chronic inflammation, and imbalance of the intestinal flora, also known as dysbiosis.

Genomic instability is a common denominator of aging. The stability and integrity of our genetic material, our DNA, are constantly challenged by external damaging factors such as chemical, physical, and biological agents, as well as internal factors like reactive oxygen species or DNA replication errors. Genetic lesions caused by these threats include telomere shortening, chromosomal losses or gains, point mutations, and translocations. To combat this, our cells have developed efficient damage detection and repair mechanisms that are largely capable of protecting our DNA. However, these repair mechanisms are not flawless and some DNA damage remains unresolved. The consequences of DNA damage include increased cancer risk, accelerated aging and loss of function, and genetic syndromes such as Bloom syndrome and xeroderma pigmentosum.

Telomere degradation is another unfortunate consequence of accumulated DNA damage and another hallmark of aging. Telomeres are protective caps at the ends of chromosomes, often compared to the tips of shoelaces that prevent fraying. Every time a cell divides, a small portion of the telomere is lost, meaning that as we age and our cells multiply, these chromosome ends become progressively shorter. Once telomeres reach a critical length, cells enter a resting phase and stop dividing, which can lead to cell death or inflammation, thereby accelerating the aging process and increasing the risk of disease. The enzyme telomerase plays a key role in maintaining telomere length, as it can replenish lost segments of the telomeres. However, most cells in the body don’t produce telomerase. This lack of telomerase activity acts as a safeguard against cancer, as cancer cells often exhibit high telomerase levels, allowing them to divide indefinitely.

In Europe, several leading research institutes, like the Max Planck Institute for Biology of Ageing in Germany and the European Research Institute for the Biology of Ageing (ERIBA) in the Netherlands, are studying how telomere shortening contributes to cellular aging and its link to age-related diseases, as well as exploring how DNA damage and telomere maintenance impact longevity. The Francis Crick Institute in London is investigating the repair mechanisms that help preserve DNA integrity over time. The insights gained by the work of these institutes could lead to breakthroughs in therapies aimed at maintaining telomere length, reducing genomic instability, or enhancing DNA repair.
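The shoelace analogy can be made concrete with a back-of-the-envelope simulation. The Python sketch below uses invented but plausibly scaled numbers (none taken from the article) to count how many divisions a cell can undergo before its telomeres drop below a critical length:

```python
# Toy model of telomere shortening: each division trims the telomere, and
# once it falls below a critical length the cell stops dividing. The numbers
# are illustrative assumptions only; real telomere dynamics vary by cell type
# and are modulated by telomerase, which most somatic cells do not produce.

TELOMERE_START_BP = 10_000   # starting telomere length, base pairs
LOSS_PER_DIVISION_BP = 100   # base pairs lost at each division
CRITICAL_LENGTH_BP = 4_000   # below this, the cell becomes senescent

def divisions_until_senescence(start_bp: int = TELOMERE_START_BP) -> int:
    """Count divisions before the telomere reaches the critical length."""
    length, divisions = start_bp, 0
    while length - LOSS_PER_DIVISION_BP >= CRITICAL_LENGTH_BP:
        length -= LOSS_PER_DIVISION_BP
        divisions += 1
    return divisions

print(f"divisions before senescence: {divisions_until_senescence()}")
```

With these assumed values the counter stops at 60 divisions, on the order of the 40-60 division limit that Hayflick observed in cultured human cells (discussed further below).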

Aging clocks: measuring the hour on an epigenetic timescale

Our genome consists of over 3 billion base pairs, but the information it contains is not limited to the DNA sequence alone. Chemical modifications to the DNA and the histone proteins that package it, collectively known as the epigenome, add another layer of control. Epigenetic changes influence gene activity without altering the genetic code itself. The genetic code is stable, but the epigenome is dynamic and adapts to environmental and lifestyle factors like diet, drugs, and stress. By regulating gene expression, epigenetic changes allow cells to respond to environmental signals: DNA methylation and histone modifications act as molecular switches that turn genes on or off, helping cells differentiate into various types while maintaining their original cellular “identity”. Epigenetic changes play a critical role in processes like aging, disease progression, and inheritance. For example, as we age, patterns of DNA methylation shift, which can contribute to the development of age-related diseases. In cancer, certain genes can be silenced or activated due to abnormal epigenetic changes. One of the most promising advancements in longevity research is the development of epigenetic clocks: tools that measure biological age based on these epigenetic changes.

To assess aging, or to understand an individual’s health status and the rate at which they are aging, scientists evaluate both chronological age and biological age. Chronological age is a straightforward measure: it is, simply put, the number of years a person has lived since birth. Chronological age is not a good measure of health status, since it does not reflect the biological changes occurring in the body. Biological age, in contrast, refers to how old your body appears to be at the cellular level. Biological age is influenced by various factors such as genetics, lifestyle, and environmental exposures, making it a better indicator of overall health and aging. The epigenetic clock measures biological age. It was developed by Dr. Steve Horvath, a prominent researcher in the field of epigenetics and aging often called “the father of epigenetic clocks”, who built the first epigenetic clock in 2011. Initially overlooked, his work gained significant attention following the introduction of a multi-tissue clock in 2013, which provided a crucial biomarker for aging. Horvath’s epigenetic clocks are based on DNA methylation patterns, specifically measuring methylation at various sites across the human genome. These patterns are used to estimate biological age by employing regression models and machine learning techniques. Horvath’s clocks draw from extensive public datasets and include species-specific and pan-mammalian versions, reflecting patterns of epigenetic changes across different species and tissues. A later clock, GrimAge, named after the Grim Reaper, is a strong predictor of all-cause mortality, outperforming other epigenetic clocks. It incorporates DNA methylation markers, plasma protein levels, and inflammatory markers associated with aging and age-related diseases. GrimAge has been validated in numerous studies and is recognized for its accuracy and predictive power in evaluating aging-related outcomes.
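As a rough illustration of how such a clock is built: Horvath-style clocks rest on penalized (elastic net) regression over methylation levels at many CpG sites. The Python sketch below trains such a model on a synthetic stand-in dataset; the data, site counts and drift rate are invented, and published clocks involve far more careful preprocessing, CpG selection and age transformations.

```python
# Minimal sketch of training an epigenetic clock on synthetic data.
# Real clocks are trained on measured methylation "beta values" (fraction
# methylated, 0..1) at many thousands of CpG sites; everything below is a
# stand-in for illustration.
import numpy as np
from sklearn.linear_model import ElasticNetCV
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic cohort: 500 individuals x 2,000 CpG sites.
n_samples, n_cpgs = 500, 2000
age = rng.uniform(20, 90, n_samples)
betas = rng.uniform(0, 1, (n_samples, n_cpgs))
# Let a small subset of sites drift with age, as methylation does in vivo.
betas[:, :50] += 0.004 * age[:, None]

X_train, X_test, y_train, y_test = train_test_split(
    betas, age, test_size=0.2, random_state=0)

# Elastic net picks out the small set of age-informative CpGs and weights
# them into a single predicted "biological age".
clock = ElasticNetCV(l1_ratio=0.5, cv=5).fit(X_train, y_train)
predicted_age = clock.predict(X_test)

# "Age acceleration" is the gap between predicted and chronological age;
# a positive gap suggests faster biological aging.
acceleration = predicted_age - y_test
print(f"CpGs used by the clock: {np.sum(clock.coef_ != 0)}")
print(f"mean absolute error: {np.mean(np.abs(acceleration)):.1f} years")
```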

In addition to these, various other types of aging “clocks” are also available. For example, the inflammation clock focuses on changes in glycans, complex sugar molecules attached to proteins like immunoglobulin G (IgG). Scientists are studying how these glycans vary with age and how they relate to diseases. Their work reveals that as we age, certain glycans linked to IgG, particularly those involving galactose and sialic acid, decrease in response to chronic inflammation, a common issue in conditions like cardiovascular disease. This change can serve as a marker or “clock” for accelerated aging. Each clock provides unique insights into the aging process and helps us understand, modify, and potentially prevent the risk factors associated with accelerated aging.

Clearing out the waste: Senolytics remove senescent cells

The remarkable thing is that we are not only on the verge of unveiling the precise mechanisms of aging, but we may also be approaching the possibility of significantly extending human lifespan. In a major recent breakthrough, scientists successfully extended the lifespan of mice by eliminating senescent cells through genetic or pharmacological treatments. Cellular senescence is called a “cell fate”: essentially, it is a state of halted cell division caused by stress or the accumulation of damage over time. Senescent cells permanently stop dividing and undergo significant changes in gene expression. This state can be triggered by various stressors, such as DNA damage, telomere shortening, oncogenic mutations, and metabolic or mitochondrial dysfunction. While senescent cells no longer divide, they remain metabolically active and secrete a range of pro-inflammatory molecules, known as the senescence-associated secretory phenotype (SASP). These molecules can disrupt surrounding tissues, contributing to inflammation, tissue dysfunction, and the progression of age-related diseases. Senescent cells accumulate in multiple tissues as we age and are found at sites of chronic disease, as well as after exposure to radiation or chemotherapy.

The discovery of senescent cells dates back to the early 1960s, when Leonard Hayflick and Paul Moorhead first observed that human cells have a limited capacity to divide, a phenomenon known as the “Hayflick limit.” The concept of senolytics, a class of drugs designed to target and eliminate senescent cells, emerged much later, in 2015, when researchers at the Mayo Clinic, including James Kirkland and colleagues, published a groundbreaking study demonstrating that clearing senescent cells in mice could improve health and extend lifespan. By selectively killing senescent cells using a combination of the leukemia drug dasatinib and the natural plant pigment quercetin, the researchers were able to reduce signs of frailty, improve cardiac function, and delay age-related conditions. Now that we know that senescent cells can be removed, scientists are studying the potential benefit of using senolytic drugs in humans. Several clinical trials are exploring their potential to treat age-related diseases and improve healthspan.

In addition to senolytics, many other therapeutic interventions are currently being explored for their potential to slow aging. Metformin, a drug commonly used to treat type 2 diabetes, has gained a lot of attention recently for its potential anti-aging properties. The TAME (Targeting Aging with Metformin) trial is currently underway to explore its efficacy in delaying age-related diseases in humans. Rapamycin, originally an immunosuppressant, has also emerged as a powerful tool in longevity research. It inhibits the mTOR pathway, a key regulator of cell growth and metabolism, and has been shown to extend lifespan in various animal models. Other compounds, like NAD+ precursors and resveratrol, are being studied for their roles in boosting cellular repair and mitochondrial function. As we move closer to understanding and potentially reversing the aging process, scientists are starting to confront profound ethical and philosophical questions: Should we extend life indefinitely? What would a significantly longer lifespan mean for society, healthcare, and our concepts of mortality? While the future of longevity research holds immense promise, it also challenges us to rethink aging and what it means to live a meaningful life.

“Epigenetic changes influence gene activity without altering the genetic code itself.”

Nature-based solutions to urban challenges

Nature-based solutions can help address many major challenges facing cities around the world. We spoke to McKenna Davis and Benedict Bueb about the work of the transdisciplinary INTERLACE project, which has developed an array of tools, guidance materials, and further resources to build capacities and empower cities and other stakeholders to implement urban nature-based solutions.

Global cities are facing significant pressures from increasing influxes of people moving to urban centres, necessitating that local authorities seek solutions to provide a healthy environment for the population while addressing wider concerns around biodiversity loss, climate change and pollution. Nature-based solutions (NbS), the concept of working with nature through the protection, restoration and sustainable management of ecosystems, are an important part of efforts to combat these and other social, economic and environmental challenges. “NbS are intended to be multi-functional. They might target improved human health as a core objective, but simultaneously also deliver biodiversity benefits and reduce risks caused by extreme rainfall events and urban heat islands,” explains McKenna Davis, Senior Fellow at the Ecologic Institute in Germany. For example, restoring an urban river can reduce flood risk in an area, while also having other positive effects like providing areas for recreation and social interactions, increasing the aesthetic beauty, supporting local biodiversity and attracting businesses and investments.

The INTERLACE project

With the overarching aim of empowering local authorities to restore their urban ecosystems and make cities more resilient, liveable and inclusive through NbS, the INTERLACE project draws on the expertise of 21 research, city network, city, and communication partners from across Europe and Latin America. “The project is designed to develop approaches and tools around urban NbS, which are then tested and improved through the experiences of our partner cities and made available for cities beyond the project,” outlines McKenna Davis, INTERLACE Project Coordinator. The six medium-sized partner cities (Granollers in Spain, Envigado in Colombia, Kraków Metropolis in Poland, Chemnitz in Germany, Portoviejo in Ecuador and five municipalities around San José (CBIMA) in Costa Rica) helped to identify shared challenges, such as mitigating the impacts of natural events and the need to develop supportive policy frameworks. “Part of this is about reducing flood risk, but the project also aims to fill important knowledge and skill gaps around NbS,” says Benedict Bueb, the Project Manager.

“We want to plant a seed and build capacities to enable cities and other stakeholders to tap the full potential of nature-based solutions.”

An effective NbS can represent a more sustainable approach to the targeted challenges than traditional ‘grey’ infrastructure solutions (e.g. concrete dams and dykes to reduce flooding), and may serve as an alternative or complement to them. “There are a variety of hybrid solutions, such as sustainable urban drainage systems, where you have a mix of grey and green components. Or you may integrate green infrastructure on the facades or roofs of buildings,” says Mr Bueb. These solutions of course also need to be maintained and funded in the long term if they are to deliver their full potential benefits and be sustainable. “These systems are inter-connected, with people not independent from nature, and natural systems not independent from people,” stresses Ms Davis. “While the term NbS highlights the natural component of these solutions, another key element is the human or people-side of NbS, as the human-nature and natural-built systems are intricately interconnected.”

This philosophy is reflected in the project’s overarching Nature-Places-People conceptual approach, and in the transdisciplinary composition of the consortium, which brings together researchers, urban planners, experts from city administrations and communication agencies. The project team looks to engage different stakeholders, particularly those working in local governments, with the aim of developing a more coherent and holistic approach to protecting and restoring nature. This is intended to encourage a greater focus on sustainability, in terms of both decision-making and funding structures. “We want to plant a seed in the public consciousness and help build a wider understanding of the potential of these solutions,” says Ms Davis. The project team also aims to generate tools to support the planning, implementation and monitoring of these solutions, in which the public can play a significant role. “People can participate in citizen science programmes and report on the numbers of birds and butterflies they see, for example. They can also be involved in the stewardship of NbS, such as in maintaining urban gardens, restoring rivers, or planting and watering trees,” she explains.

Lasting impact of INTERLACE

A variety of tools and resources have been developed to support the project’s ambitions, focusing on citizen engagement, education, policy and governance, and planning and assessment. Guidance has been created, for example, on different strategies to engage communities, schools and other stakeholder groups, including a module for designing NbS that has been introduced into the Minecraft game and has already been used in schools across the EU and CELAC regions.

Other tools are focused more on the planning, assessment and monitoring of NbS. One major outcome from the project is an assessment framework that has been developed and piloted across the six cities, which can be used across the world. The project team has also developed the Urban Governance Atlas, an interactive online database of 250 policy instruments supporting NbS from across the world, with a focus on the EU and Latin American regions. It serves as a resource for municipalities and the research community to draw inspiration, see what types of policy instruments can support NbS, and learn from what has worked well in practice in other areas.

A number of additional resources have been developed in the project, including guidance on how to co-create policy instruments supporting NbS. Further resources include a communications toolkit, a database of good practice tools and a stakeholder engagement strategy, which will help cities across Europe, Latin America and beyond to restore ecosystems and boost the resilience of their urban centres.

While the project is set to conclude in early 2025, it aims to have a long-term legacy far beyond this, particularly in encouraging continued collaboration among the partners and wider NbS communities in both Europe and Latin America. In particular, an online repository of NbS and NbS-related resources specifically focused on Latin America (naturaleza-transformativa.com) will act as a valuable resource for interested parties. “It will be a centralised portal for actors in Latin America who want to learn more about NbS. They will be able to exchange insights, post their tools, and learn from the experiences, case studies and other resources shared,” Ms Davis outlines.

The relationships that have been forged during INTERLACE will also provide strong foundations for further collaboration and research. “We hope that it will be possible to follow up on this collaboration through further research projects,” continues Ms Davis, “and continue to develop and promote the wealth of resources and insights coming out of INTERLACE to support the global NbS community.”

This will then contribute to the goal of mainstreaming NbS in policy, planning and decision-making, moving towards making these solutions the default rather than secondary to more conventional grey options. INTERLACE’s work has already helped heighten awareness of the concept of NbS in Latin America and Europe, and the project team are looking to continue in this vein. A parallel process of the European Commission, the EU-LAC Policy Dialogue on NbS, has been established, bringing together stakeholders and actors from both European and Latin American countries to collaborate, build capacities, and increase NbS uptake in policy and practice in both regions. These stakeholders include national government officials, the research community, NGOs and other groups. The goal of encouraging collaboration between Latin America and Europe on the topic of NbS is strongly supported by the European Commission and projects like INTERLACE, as part of a wider effort to increase the awareness and application of NbS across the world.

INTERLACE

INTERnational cooperation to restore and connect urban environments in Latin AmeriCa and Europe

Project Objectives

The INTERLACE project aims to empower and equip European and Latin American cities to effectively restore and rehabilitate (peri)urban ecosystems towards more liveable, resilient, and inclusive cities. To this end, INTERLACE enhances transatlantic cooperation and promotes participatory engagement in co-developing tools and guidelines for restorative nature-based solutions (NbS). The project builds on existing knowledge from both regions, increasing local governments’ capacity to implement ecologically sound urban planning. Additionally, it mobilizes city networks to foster sustained knowledge exchange and raises awareness of the benefits of healthy (peri)urban ecosystems for social, cultural, and economic well-being.

Project Funding

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 869324.

Project Partners

For a full list of project partners, please see: www.interlace-project.eu/node/30

Contact Details

McKenna Davis

Project Coordinator

Ecologic Institute

E: mckenna.davis@ecologic.eu

W: https://www.interlace-project.eu/
W: https://interlace-hub.com/

Project outputs: https://www.interlace-project.eu/node/156

McKenna Davis is a Senior Fellow at Ecologic Institute and Coordinator of Nature-based Solutions (NbS). Her work concentrates on the assessment of NbS and their governance frameworks, with a focus on linkages to sustainable urban development, biodiversity conservation, and climate change adaptation, as well as the surrounding co-creation and stakeholder engagement processes. She coordinates the INTERLACE project and has led the development of diverse products to support the uptake and mainstreaming of NbS, such as the Urban Governance Atlas and the CLEVER Cities Guidance.

McKenna Davis

Novel instruments for measuring vapour pressure

Aerosols often act as the seed around which clouds form, yet much remains to be learned about how they grow in the atmosphere. Henrik Pedersen and Aurelien Dantan tell us about their work in developing new instruments for measuring saturation vapour pressure and its wider relevance to understanding the influence of aerosols on the climate system.

The starting point of cloud formation is typically the presence of an aerosol, essentially a collection of particles suspended in the atmosphere, which provides a kind of seed around which a cloud can then form and condense. Primary aerosols are emitted into the atmosphere from a variety of different sources, as a result of both natural processes and human activity. “For example, aerosols are formed by sea spray and by the emission of volcanic ash. Aerosols can also come from human sources, such as factories or cars,” outlines Henrik Pedersen, Associate Professor in the Department of Physics and Astronomy at Aarhus University. Once these aerosols are in the atmosphere, they are then transformed by chemical reactions. “Secondary aerosols are produced from small molecules in the atmosphere. They grow bigger as a result of molecular processes,” continues Professor Pedersen. “The diameter of a particle can grow from the sub-nanometre scale to micrometres.”

A wide variety of different factors can influence the rate of this growth, among the most important of which is a substance’s saturation vapour pressure (SVP). The SVP of a substance can be thought of as the point of equilibrium between the rate of evaporation and the rate of condensation, which is currently measured in a static way. “Current state-of-the-art methods of measuring SVP work by establishing an equilibrium situation, where you have a substance and a vapour at the same temperature and pressure,” says Professor Pedersen. Accurate and reliable measurements of SVP are essential to gaining a deeper understanding of how aerosols grow in the atmosphere and their wider impact on the climate system, for example in affecting the earth’s radiative balance. “If you don’t know the SVP, then you cannot really describe how this condensation works in aerosol formation,” stresses Professor Pedersen.
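As general background (a standard textbook relation, not a result of this project), the steep temperature dependence of saturation vapour pressure is often approximated by the Clausius–Clapeyron relation,

\[
p_{\mathrm{sat}}(T) \approx p_{\mathrm{sat}}(T_{\mathrm{ref}})\,
\exp\!\left[-\frac{\Delta H_{\mathrm{vap}}}{R}\left(\frac{1}{T}-\frac{1}{T_{\mathrm{ref}}}\right)\right],
\]

where \(\Delta H_{\mathrm{vap}}\) is the enthalpy of vaporisation and \(R\) the molar gas constant. For low-volatility substances \(\Delta H_{\mathrm{vap}}\) is large, so the SVP falls by orders of magnitude on modest cooling, one reason measurements in the millipascal range and below are needed.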

New measurement method

As Principal Investigators of a research project based at Aarhus University, Professor Pedersen and Professor Dantan now aim to develop a new, dynamic method for measuring a substance’s SVP. This involves cooling a substance and following the pressure as it relaxes towards equilibrium with its surroundings. “We don’t depend on reaching thermal equilibrium,” explains Professor Pedersen. The project’s agenda also includes developing a means of measuring very low pressures, in a range relevant to aerosol formation. “The best commercially available absolute pressure sensors are relatively limited in range, down to a few millipascals. Many of the substances relevant to aerosol formation in the atmosphere have an SVP that lies below this,” says Professor Dantan. “We are therefore developing our own sensors to achieve better sensitivities.”
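To illustrate the general idea of such a dynamical measurement (a minimal sketch only: the single-exponential model and all numbers below are illustrative assumptions, not the project’s published protocol), one can fit the early part of a pressure-relaxation record and extract the asymptotic value without waiting for full equilibrium:

import numpy as np
from scipy.optimize import curve_fit

def relaxation(t, p_sat, p_start, tau):
    # Assumed single-exponential approach of the measured pressure
    # towards the saturation value p_sat (illustrative model only).
    return p_sat + (p_start - p_sat) * np.exp(-t / tau)

# Synthetic "measurement": true SVP of 2 mPa, noisy gauge readings,
# recorded while the system is still far from equilibrium.
rng = np.random.default_rng(0)
t = np.linspace(0, 120, 200)  # seconds
p = relaxation(t, 2e-3, 1e-2, 60.0) + rng.normal(0.0, 5e-5, t.size)

# Fitting recovers p_sat long before the trace has levelled off.
(p_sat_fit, _, _), _ = curve_fit(relaxation, t, p, p0=[1e-3, 1e-2, 30.0])
print(f"estimated SVP = {p_sat_fit * 1e3:.2f} mPa")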

The goal of helping chemists and atmospheric scientists gain more knowledge about the SVP of different substances is a major motivation here, while researchers are also looking to develop new techniques and methods. This is about pushing beyond current capabilities in search of improved instruments. “We’re trying to beat the best commercially available absolute pressure sensors,” says Professor Dantan.

Researchers in the project are developing a measuring instrument using very small nanostructures which vibrate when they interact with a gas, pushing the boundaries of science to achieve higher sensitivities. Even very tiny vibrations of these structures can be measured effectively using laser interferometry, says Professor Dantan. “We can detect vibrations of these drums as small as 10⁻¹² metres – a fraction of the size of an atom – quite well using a laser. Because we can detect these very small vibrations, we can then detect very small pressure changes,” he outlines. Validating these improvements is a challenge however, as there are no other absolute pressure sensors available in this range, so the project team is working with known gases to build a fuller picture. “We use a commercially available sensor, with a 5 millipascal sensitivity, to check the response of our own pressure sensor,” continues Professor Dantan. “We’re working to measure this sensitivity more precisely.”

“The best commercially available absolute pressure sensors are relatively limited in range, down to somewhere between 5-10 millipascals. Many of the substances relevant to aerosol formation in the atmosphere have a saturation vapour pressure that lies below this.”

The project team are exploring ideas to further improve the performance of these sensors and reach even lower pressures. In one recent paper, the researchers showed that the sensors can measure pressure in the sub-millipascal range, and another paper has recently been published on how the instrument was built, and how SVPs are actually measured. “At the moment we are both measuring the actual pressure of certain substances, and we are developing the method further. So we are trying to probe the limits of the methods and get to even lower pressures,” says Professor Pedersen. The aim is not to reach a specific figure or target, but rather to develop the ability to measure pressure in a lower range, which will help atmospheric scientists build a fuller picture of how aerosols grow. “Measuring vapour pressures for atmospheric substances will be an important step forward in terms of understanding how aerosols grow,” stresses Professor Pedersen.
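As a rough illustration of the calibration step described above (a sketch under simple assumptions: at these low pressures the gas-induced damping of a vibrating membrane grows roughly linearly with pressure, so the mechanical linewidth can serve as a pressure readout; all numbers here are hypothetical, not project data):

import numpy as np

# Hypothetical calibration data: mechanical linewidth (Hz) of the
# membrane resonance, recorded at pressures known from a commercial
# reference sensor (Pa). Model assumed: linewidth = gamma0 + k * p.
p_ref = np.array([0.005, 0.01, 0.02, 0.05, 0.10])     # Pa
linewidth = np.array([1.52, 1.55, 1.61, 1.79, 2.08])  # Hz

k, gamma0 = np.polyfit(p_ref, linewidth, 1)  # slope (Hz/Pa), intercept (Hz)

def pressure_from_linewidth(gamma_hz):
    # Invert the linear calibration to read out an unknown pressure.
    return (gamma_hz - gamma0) / k

p = pressure_from_linewidth(1.58)
print(f"linewidth 1.58 Hz -> pressure = {p * 1e3:.1f} mPa")

The point of the sketch is only the logic of transferring a calibration from a reference gauge to the new sensor; as the researchers note, pinning down the sensitivity itself is part of the ongoing work.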

Very low pressures

This research also holds wider relevance beyond the atmospheric science field, to both the industrial and academic sectors. Measuring very low pressures can be highly challenging, as the gas itself may be difficult to identify, circumstances in which Professor Dantan says absolute pressure sensors can play a valuable role. “An absolute pressure sensor can give you a measure, without needing to guess what gas is there,” he explains. The absolute pressure sensors under development in the project could be an important tool in calibration, in providing a kind of yardstick for other sensors, a topic Professor Dantan plans to explore further. “We are collaborating with a Canadian company. They are very interested in these kinds of sensors, as they could be relevant as a calibration standard for many applications beyond atmospheric science,” he continues.

A NOVEL INSTRUMENT FOR THE ACCURATE AND DIRECT MEASUREMENT OF SATURATION VAPOR PRESSURES OF LOW-VOLATILITY SUBSTANCES

Project Objectives

The project aims at developing an instrument allowing for the direct and accurate determination of saturation vapor pressures of low-volatility substances using novel methods and state-of-the-art optomechanical absolute pressure sensors.

Project Funding

This project is funded by the Independent Research Fund Denmark (DFF).

Contact Details

Assoc. Prof. Aurélien Dantan

Department of Physics and Astronomy, Aarhus University, Denmark

E: dantan@phys.au.dk

Assoc. Prof. Henrik B. Pedersen

Department of Physics and Astronomy, Aarhus University, Denmark

E: hbp@phys.au.dk

W: https://phys.au.dk/forskning/forskningsomraader/atomar-molekylaer-og-optisk-fysik/optomechanics-group/

R. V. Nielsen, M. Salimi, J. E. V. Andersen, J. Elm, A. Dantan and H. B. Pedersen, A new setup for measurements of absolute vapor pressures using a dynamical method: Experimental concept and validation, Rev. Sci. Instrum. 95, 065007 (2024)

M. Salimi, R. V. Nielsen, H. B. Pedersen and A. Dantan, Squeeze film absolute pressure sensors with sub-millipascal sensitivity, Sensors Actuators A 374, 115450 (2024)

Aurélien Dantan received his PhD from Université Pierre et Marie Curie in Paris in 2005. After postdoctoral stays in France and Denmark he has been an associate professor at Aarhus University since 2013. His research interests include quantum optics, cavity quantum electrodynamics, optomechanics and the application of nanomechanical systems for sensing and metrology.

Henrik B. Pedersen obtained his PhD from Aarhus University in 1999 and worked as a postdoc in Israel and Germany. He became an associate professor at Aarhus University in 2007. His research interests include molecular processes, ion trapping, atmospheric physics, non-linear mechanics, and new instrumentation for sensing.

Aurélien Dantan Henrik B. Pedersen

Deep in the roots of carbon sequestration

Soil can play a major role in mitigating the impact of climate change by sequestering carbon from the atmosphere. We spoke to Professor Thomas Kätterer about his research into the impact of different management practices on soil carbon sequestration, and its wider relevance in helping meet emissions reduction targets.

The process of carbon sequestration is the focus of a great deal of attention in research, as countries across the world seek to reduce carbon emissions and mitigate the impact of climate change. Plants take in carbon dioxide (CO2) from the atmosphere and form biomass, part of which then goes into the soil, where it is stored for sustained periods. “This mitigates the effects of climate change,” explains Thomas Kätterer, Professor in ecosystem ecology at the Swedish University of Agricultural Sciences (SLU). Different crop management practices affect the rate at which CO2 is transported from the atmosphere to soil, a topic central to Professor Kätterer’s research. “We’re looking at how different management practices affect this carbon flux. What management practices favour the storage of carbon in the soil?” he asks. “Building up carbon storage also has other positive effects, including on plants, microbes, fungi and animals that live in soil.”

Carbon sequestration

As the Principal Investigator of a project backed by the Swedish government research council (Formas), Professor Kätterer is now looking to build a fuller picture of the rate at which carbon is sequestered under different crop management practices. Certain practices can lead to an increase in soil carbon or organic matter content, which many governments are keen to encourage. “In Sweden for example farmers are subsidised to use cover crops,” outlines Professor Kätterer. These cover crops grow later in the year after the harvest of main crops, leading to the formation of new biomass that increases the level of soil organic matter, alongside other benefits. “When you have these cover crops it also reduces erosion problems, as rains in autumn and winter transport particles to the waterways and degrade the soil by transporting the topsoil within the landscape,” explains Professor Kätterer.

The project team is now investigating the effects of this practice on rates of soil carbon sequestration, as well as assessing the impact of other practices, such as sowing and rotating different crops. Carbon accumulation in soil is a slow process and annual changes are relatively small, so Professor Kätterer says many years’ worth of data are needed to see measurable changes. “A huge amount of carbon is already there, and changes have to accumulate over a long time before we really see a measurable effect, so we need long-term studies,” he acknowledges. Researchers are using data from long-term field experiments (LTEs) conducted at different locations, while soil samples are taken from 2,000 fields in Sweden in a long-term monitoring programme. “Researchers go back to the same spot in a field every ten years or so and measure the changes that have occurred,” says Professor Kätterer.

This provides a rich source of data going back several decades in some cases. One experiment started in the town of Ultuna in 1956 for example, with researchers looking at the impact of different treatments within a field on crop yields, and some marked variations in soil carbon content can now be seen. “We measured almost 4 times more carbon in the soil in the highest treatment compared to the lowest treatment. We have looked at the response of the maize grown over the last 20 years, and have been able to relate maize productivity to changes in soil carbon,” outlines Professor Kätterer. A further important aim in the project is to disentangle the effect of increased nutrient levels from the impact of changes to the soil structure. “If you increase soil organic carbon you also increase levels of nitrogen and phosphorus, which are constituents of organic material in the soil,” explains Professor Kätterer. A higher turnover of soil organic matter leads to the delivery of more nutrients to crops. Researchers have been working to separate the impact of this on maize yields from the physical effects of improved soil structure, which enhances the water-holding capacity of the soil. “We were able to show that the main factor behind increased maize yields was changes in the physical properties of the soil, rather than increased delivery of nutrients,” says Professor Kätterer.

Maize in the Ultuna Frame Experiment, which started in 1956. Photo: Michael Kvick, SLU.

Alongside working with data from LTEs, Professor Kätterer and his colleagues are also conducting flux measurements, measuring the exchange of gases between the surface and the atmosphere. “We have chambers which cover the soil, with or without vegetation, depending on the specific fluxes we are interested in. We can measure the exchange of gases at a small scale,” he explains.

“We’re looking at how different management practices affect this carbon flux. What management practices favour the storage of carbon in the soil?”

Climate models

These measurements can then be used to inform models of the underlying processes behind soil carbon sequestration, which will be invaluable in terms of understanding which practices can mitigate the impact of climate change. There is also wider interest in these models from beyond the agricultural sector, says Professor Kätterer. “There is a lot of interest in effective tools to estimate the impact of different management practices on greenhouse gas emissions and carbon sequestration,” he continues. The project team is now working to improve and refine these models further.

“From the LTEs we parameterise models that we then use at the regional scale, or scale the effect up to the country scale,” says Professor Kätterer. “We use different kinds of models, and are building a platform to run models on different data series.”
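To give a flavour of what such a model can look like (a minimal sketch of a generic two-pool, first-order soil carbon model in the spirit of the ICBM family developed at SLU; every parameter value below is an illustrative placeholder, not a project result):

def soil_carbon_step(young, old, inputs, r=1.0, k1=0.8, k2=0.006, h=0.125):
    """Advance a two-pool, first-order soil carbon model by one year.

    young, old : carbon pools (t C/ha)
    inputs     : annual carbon input from residues and roots (t C/ha/yr)
    r          : climate/management factor scaling decomposition rates
    k1, k2     : decay constants of the young and old pools (1/yr)
    h          : humification fraction routed from young to old pool
    """
    young_next = young + inputs - k1 * r * young
    old_next = old + h * k1 * r * young - k2 * r * old
    return young_next, old_next

# Example: 50 years with an extra 0.5 t C/ha/yr of input from cover crops.
young, old = 0.3, 40.0
for _ in range(50):
    young, old = soil_carbon_step(young, old, inputs=2.0 + 0.5)
print(f"total soil carbon after 50 years = {young + old:.1f} t C/ha")

Parameterising such a model against the LTEs then amounts to choosing the rate constants, humification fraction and climate factor so that the simulated pool changes match the measured time series.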

Researchers are sharing their findings with the agricultural sector, while Professor Kätterer and his colleagues also work together with Sweden’s Environmental Protection Agency. The data they provide on carbon stock changes in Swedish agricultural land is used every year in compiling the national climate report, illustrating the wider importance of this research. “The national climate report is then delivered to the EU, UN and other bodies,” says Professor Kätterer. This work is very much ongoing, and Professor Kätterer plans to pursue further research in this area, aiming to build a deeper picture of the effects of different land management practices.

“From the soil monitoring programmes and the LTEs we can relate changes in soil properties - like soil organic carbon - to how the land was managed,” he outlines.

The project team is also using other sources of information, including satellite data and public databases, to understand the historical background and then relate that to the changes that researchers see in soil samples today. This can then inform management practices and advice to the agricultural sector, helping farmers improve soil quality and mitigate the impact of climate change. “The models we are developing can provide valuable insights for farmers,” stresses Professor Kätterer.

CARBON SEQUESTRATION IN SWEDISH CROPLAND SOILS

Project Objectives

This project aims to generate new applicable knowledge from the information we have gained from studies on Swedish long-term field experiments, national soil monitoring, and other databases, and integrate it into a novel modeling approach to advance our estimates of the carbon sequestration potential in agricultural soils under Swedish conditions.

Project Funding

This project is funded by the Swedish research council Formas under Grant Number 2022-00214_Formas.

https://www.vr.se/english/swecris.html?project=2022-00214_Formas#/

Project Team

Martin Bolinder, Carlos Sierra, Lorenzo Menichetti, Rong Lang, Nick Jarvis, Katharina Meurer, Johanna Wetterlind, Elin Röös.

Contact Details

Project Coordinator:

Professor Thomas Kätterer

Department of Ecology

P.O. Box 7044

750 07 UPPSALA

T: +4618672425

E: thomas.katterer@slu.se

W: https://www.slu.se/en/ew-cv/thomaskatterer2/

https://www.lantbruksforskning.se/projektdatabasen/fang-och-mellangrodors-bidrag-till-kolinlagring-mullens-betydelse-for-skord-och-skordestabilitet/ff80818182a02a440182d5881d4b084c/

https://ejpsoil.eu/soil-research/carboseq/

https://ejpsoil.eu/soil-research/eom4soil/intodialogue/simple

https://www.nibio.no/en/projects/capture-assessment-of-cover-cropping-as-climate-action-in-cereal-production-in-norway?locationfilter=true

Thomas Kätterer is a professor in ecosystem ecology at the Swedish University of Agricultural Sciences and an elected fellow of The Royal Swedish Agricultural Academy. He has a background in agronomy, soil science, ecology, and environmental sciences. His research focuses on carbon and nutrient cycling in agroecosystems, particularly agricultural management practices related to mitigation and adaptation to climate change.

One of the Swedish long-term soil fertility experiments at Högåsa. The different hues of green indicate crop productivity. Photo: Gunnar Börjesson.
Professor Thomas Kätterer

Mineral transformation in action

Oxygen is used up very quickly in wet or flooded soils, after which other types of respiration set in. Iron is a very important electron acceptor in soil environments for organisms that respire organic carbon to gain energy, processes which have an important influence on the behaviour of many nutrients and contaminants, as Professor Ruben Kretzschmar explains.

Many soils are periodically wet or even flooded, leading to poor aeration. When this happens, soil organisms use up the remaining oxygen in the soil very quickly, leading to a lack of oxygen for respiration.

“Under these conditions, other elements can be used by certain microorganisms for respiration, such as manganese, iron and sulphur. Iron is a very important electron acceptor in soils for organisms that respire organic carbon to gain energy, in the absence of oxygen,” explains Ruben Kretzschmar, Professor of Soil Chemistry at ETH Zurich. These processes hold wider importance, as they can trigger recrystallization and transformation processes of iron minerals, which in turn control the behaviour of many nutrients and contaminants in soils. “Elements like phosphorus, arsenic, chromium, nickel, cadmium and zinc strongly bind to certain iron minerals, so mineral dissolution or transformation can drastically change the mobility and bioavailability of these elements,” continues Professor Kretzschmar. “A lot of toxic contaminant elements can undergo oxidation and reduction (redox reactions).”

IRMIDYN project

As Principal Investigator of the ERC-funded IRMIDYN project, Professor Kretzschmar and his team are now looking at these iron mineral transformation processes at several different sites in Europe and Asia, work which holds relevance to understanding how nutrients and contaminants may behave in soils.

Environments in which redox-induced iron mineral transformation processes play an important role include, for example, river floodplains, rice paddies, minerotrophic peatlands and inter-tidal flats, such as the Wadden Sea or tropical mangrove ecosystems. “Soils close to aquatic systems effectively act as sinks for contaminants,” outlines Professor Kretzschmar. “For example, there are many rivers in Europe and other countries where floodplains are contaminated, from mining and other industries located along a river.”

Contaminants may also get into rice paddy fields when they are irrigated using contaminated surface water or groundwater, another topic of interest in the project. “Our group has in the past worked on rice paddies in East and South-East Asia, where arsenic and cadmium pollution is a widespread problem in rice production,” says Professor Kretzschmar.

“We want to look at iron mineral transformations in actual soils and understand if they happen in the same way as we observe in the laboratory.”

A paddy field used for crop cultivation is typically drained and then flooded for an extended period, and it can remain flooded for several months. By contrast soils in the Wadden Sea, which is affected by the tides of the North Sea, are flooded and drained on a more regular basis, says Professor Kretzschmar. “The soils of the inter-tidal flats in the Wadden sea are exposed to oxygen and then flooded again at a much higher frequency than soils in rice paddies,” he explains.

Collaborator Prof. Worachart Wisawapipat (Bangkok) during field work in Thailand.
Rice paddy soil in Thailand, showing iron-related redox features.

The team of researchers in the IRMIDYN project are looking at the different iron mineral transformation processes which occur in these contrasting environments. One important process is the transformation of ferrihydrite, a weakly crystalline iron oxyhydroxide, to more crystalline iron oxyhydroxide minerals. “It’s essentially the first product that forms when iron oxidises and precipitates into a weakly-crystalline iron oxyhydroxide mineral,” outlines Professor Kretzschmar. “There are also some other more crystalline minerals in soils, like goethite or lepidocrocite, a bright orange mineral often found in soils with variable redox conditions.”

The transformation of ferrihydrite into lepidocrocite and goethite has been intensively studied in the laboratory, yet researchers haven’t really been able to study how it happens in natural soil settings, a topic that Professor Kretzschmar and his team are addressing in the project. “We want to look at these transformations in actual soils to understand whether they occur in the same way as we observe in the laboratory,” he says.

A variety of other minerals can also be formed when iron is exposed to different environments. For example, in sulphur-rich environments it forms iron sulphide minerals like mackinawite, which can then be transformed further. “Mackinawite can be transformed to greigite, which may be the first step towards the formation of pyrite. This happens in certain types of sediments and other sulphur-rich reducing environments,” outlines Professor Kretzschmar. In-situ pyrite formation in inter-tidal sediments is another topic of investigation in the project, but preliminary results show that it will require longer field experiments over months to years, rather than weeks. “When pyrite oxidises, iron sulphate minerals like jarosite may form under acidic conditions. Jarosite can also host certain contaminant elements. We are therefore studying the structure and dynamics of jarosites in acid sulphate soils in Thailand,” says Professor Kretzschmar.

Studying iron mineral transformations in-situ in the field is not an easy task, however, as iron minerals form only a relatively small part of soil and they are not easy to detect with X-ray diffraction or other conventional mineralogical analysis techniques. Researchers in the project have developed a method where an isotope of iron – 57Fe – is used as a tracer, allowing Professor Kretzschmar and his team members to monitor any transformations. “57Fe makes up approximately 2 percent of naturally occurring iron. If we produce a mineral that consists of 96 percent 57Fe, and mix it into some soil, then most of the 57Fe will come from this mineral. This allows us to follow its transformation with Mössbauer spectroscopy, as we can selectively detect just 57Fe with this method,” he explains. The project team is also exploring new ways to apply confocal Raman microscopy to study iron minerals and gain spatial information on iron mineral distribution. “We have used Raman spectroscopy to generate high-resolution maps of iron mineral transformation products in soil microcosms,” continues Professor Kretzschmar.

What happens when microorganisms breathe iron?

You can learn about the work of the IRMIDYN project in this video: https://www.youtube.com/watch?v=XFWnumTda5s
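The arithmetic behind the tracer approach described above is worth spelling out (a back-of-the-envelope sketch using the abundances quoted in the text; the iron amounts are hypothetical):

# Fraction of the Moessbauer 57Fe signal contributed by the tracer
# mineral, using the abundances quoted in the text.
NATURAL_57FE = 0.02  # ~2% of naturally occurring iron is 57Fe
TRACER_57FE = 0.96   # enrichment of the synthesised tracer mineral

fe_tracer = 0.5   # g of iron added as enriched mineral (hypothetical)
fe_native = 2.0   # g of native iron in the soil sample (hypothetical)

signal_tracer = fe_tracer * TRACER_57FE
signal_native = fe_native * NATURAL_57FE
fraction = signal_tracer / (signal_tracer + signal_native)
print(f"{fraction:.0%} of the 57Fe signal stems from the tracer mineral")

Even though the tracer here makes up only a fifth of the total iron, it contributes over 90 percent of the 57Fe signal, which is what lets the Mössbauer measurement follow the fate of the added mineral.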

Mineral transformation processes

The IRMIDYN researchers are using these approaches to gain deeper insights into mineral transformation processes at different sites, mainly in Thailand, Germany and Iceland. It is known that some of these processes can occur relatively quickly in a lab setting – for example, ferrihydrite can completely recrystallise to goethite within days, or even hours – yet the expectation is that they would take longer in natural settings. “We have been monitoring the samples over weeks rather than days,” says Professor Kretzschmar. For example, there are diffusion limitations in the field, as dissolved ferrous iron has to diffuse to the places where it reacts with iron mineral surfaces, so Professor Kretzschmar says transformation processes are generally slower than those seen in the lab.

Team member Katrin Schulz and Prof. Worachart Wisawapipat (Bangkok) sampling paddy soils in Thailand.

IRMIDYN

Iron mineral dynamics in soils and sediments: Moving the frontier toward in‐situ studies

Project Objectives

The thermodynamic stability and occurrence of iron minerals in sufficiently stable Earth surface environments is fairly well understood and supported by field observations. However, the kinetics of iron mineral recrystallisation and transformation processes under rapidly changing redox conditions is far less understood, and has to date mostly been studied in mixed reactors with pure minerals or sediment slurries, but rarely in-situ in complex soils and sediments. This project will take a large step toward a better understanding of iron mineral dynamics in redox-affected Earth surface environments, with wide implications in biogeochemistry and other fields including environmental engineering, corrosion sciences, archaeology and cultural heritage sciences, and planetary sciences.

Project Funding

This research is part of a project that has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement No 788009, IRMIDYN-ERC-2017-ADG).

Project Team

Current members: Dr. Andrew Grigg - Dr. Joëlle Kubeneck - Dr. Pierre Lefebvre - Dr. Sara Martinengo

Former members: Dr. Laurel ThomasArrigo - Dr. Luiza Notini - Dr. Katherine Rothwell - Dr. Katrin Schulz - Giulia Fantappiè

https://soilchem.ethz.ch/IRMIDYN/meet-the-team.html

Contact Details

Project Coordinator,

Prof. Dr. Ruben Kretzschmar

Soil Chemistry Group

Institute of Biogeochemistry and Pollutant Dynamics

Department of Environmental Systems Science

ETH Zurich

CHN F23.1, 8092 Zurich

Switzerland

T: +41 44 6336003

E: kretzschmar@env.ethz.ch

W: https://soilchem.ethz.ch/IRMIDYN.html

W: http://soilchem.ethz.ch

Ruben Kretzschmar earned his PhD in Soil Science in 1994 at North Carolina State University, and has been Professor of Soil Chemistry at ETH Zurich since 1999. His recent research focuses on the speciation and bioavailability of trace elements in redox-affected soils and their coupling to major elements such as iron, manganese, sulfur, and carbon.

This research contributes to a deeper understanding of nutrient dynamics and the behaviour of contaminants like cadmium, arsenic and zinc in soils, which is a major issue in rice production. Arsenic becomes mobilised under iron-reducing conditions in soil, and it can then be taken up fairly easily by rice plants, to a point where it may be harmful to human health. “The amount of arsenic in rice grains must not exceed certain limits,” stresses Professor Kretzschmar. A lot of attention is currently focused on controlling arsenic levels in rice grains, for example through controlled irrigation and by making certain changes to soil chemistry. “Our work is going to be interesting to researchers seeking to understand the speciation and uptake of arsenic in these rice paddy soils,” continues Professor Kretzschmar. “The project’s research will also contribute to a fuller understanding of the impact of a flood on water quality in the surrounding area.”

The new approach developed in the project could be applied by other researchers studying iron mineral dynamics, while it also holds wider relevance beyond the soil chemistry field. Indeed, Professor Kretzschmar says there is interest in the project’s research from several other scientific disciplines. “For example, geologists or geochemists studying how iron minerals formed under certain conditions in the early ocean might want to use our approach. It could also be applied to study iron mineral transformations in completely different settings, like corrosion science, or other fields,” he outlines. Beyond the scope of the IRMIDYN project, Professor Kretzschmar plans to conduct field experiments over longer timescales, as well as looking at the effects on nutrient and contaminant behaviour. “We’re also starting some experiments on soils that are partially wet rather than fully saturated or flooded. Most soils in Switzerland are periodically wet during certain parts of the year, and there are also anoxic zones in these soils,” he continues.

Prof. Dr. Ruben Kretzschmar
Team member Katherine Rothwell collecting pore water from an inter-tidal sediment.
IRMIDYN Team at field work.
Team members Joëlle Kubeneck and Laurel ThomasArrigo exploring a potential field site in the salt marshes of the German Wadden Sea.

Horse health in ancient societies

Horses played an important social, cultural and economic role in ancient society, and they were treated with a great deal of care and attention when they fell ill. The Hippiatrica brings together prescriptions for treating different ailments, and Dr Elisabet Göransson is now working to bring them to a wider audience.

The horse played an important role in ancient societies, helping to plough fields, transport goods and supply armies for instance, and their health was a correspondingly major concern. The Hippiatrica, a collection of ancient Greek texts by several different authors, was created around the fifth century and translated into Latin at a relatively early stage, then copied and spread more widely beyond the Mediterranean basin. The collection outlines a number of prescriptions for treating various different ailments in horses. “Most of the prescriptions are very short lists of what materials should be mixed together to treat a horse,” outlines Dr Elisabet Göransson, Associate Professor in Latin at Lund University. As manager of a project funded by the Swedish Research Council, Dr Göransson is now working with her colleagues - the Greek scholar Dr Britt Dahlman, Dr Paul Linjamaa, who researches history of religion, and the IT expert Kenneth Berg - to bring attention to the sources through a relational database. “There has been renewed interest recently in hippiatric processes in early history. However, the Hippiatrica is not yet available in English,” she explains.

Translating texts

This is an issue Dr Göransson aims to address, working alongside her colleague Dahlman to translate some of the texts in the collection from Latin and Greek into English for the first time, which will then be made available - along with transcriptions of manuscripts and earlier editions - through a relational database that is being developed in the project. The project’s research is focused on the work of two authors - the veterinarians Apsyrtos and Pelagonius - and translating their work is no easy task, as the texts are organised in different ways and some of the prescriptions include fairly obscure herbs and substances. “Some of these herbs and plants are mentioned only in these texts,” says Dr Göransson. Such a word is called a hapax legomenon in philology, and Dr Göransson says further research is typically required to translate these terms. “You may have to consult a botanist for example, or carefully look up different kinds of chemical substances, so translating these prescriptions can take time,” she continues. “Sometimes magical elements like spells were also added to a prescription.”

“You may have to consult a botanist for example, or carefully look up different kinds of chemical substances, so translating these prescriptions can take time.”

A number of the prescriptions include substances that were used on human patients, such as anti-inflammatories, while others may seem more surprising or unconventional to a modern reader. However, over time there was a kind of sifting process, and prescriptions became increasingly knowledge-based. “We see in later manuscripts that the more extreme prescriptions, some including magic spells, have not been copied,” outlines Dr Göransson. The work of translating these highly fluid texts is still in progress, and they will be made available through a multi-lingual relational database that will be hosted by the Swedish Institute in Rome. “This database is designed for investigating fluid text relations, which is one of our specialities here at Lund,” says Dr Göransson. “If for example you’re interested in something concerned with a magic spell, or a particular type of herb, then you can just search for it and the results will be returned.”

This work builds on the previous development of the Monastica database (see separate info box), which gives researchers access to certain monastic texts. The Hippiatrica database will be a similarly valuable source of information for students and researchers interested in how horses were treated in late antiquity, believes Dr Göransson. “There is a lot of interest in this project. We are trying to design this database so that it will be easy to access, including for people from outside the academic sector,” she says. The database will be launched in 2025, and it could provide a kind of template for the future development of databases of fluid texts. “We have a lot of documentation which could be very useful for setting up a database of fluid texts. We think that now is the time to make this more widely available,” continues Dr Göransson.

Monastica - https://monastica.ht.lu.se/ - is an open access digital platform built on a complex relational MySQL database, presenting the transmission of the Sayings of the Desert Fathers and other early monastic literature. Fluid textual traditions in ten different languages can be studied through transcribed manuscripts, previous editions, and modern translations.
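To make the idea of a relational database for fluid texts concrete, here is a minimal sketch using SQLite in Python; the table and column names are invented for illustration and do not reflect the actual Monastica or Hippiatrica schema:

import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE witnesses (
    id INTEGER PRIMARY KEY,
    work TEXT,        -- e.g. 'Apsyrtos' or 'Pelagonius'
    source TEXT,      -- manuscript, edition or translation
    language TEXT,    -- e.g. 'Latin', 'Greek', 'English'
    content TEXT      -- transcribed or translated prescription
);
""")
con.executemany(
    "INSERT INTO witnesses (work, source, language, content) VALUES (?,?,?,?)",
    [("Apsyrtos", "MS P", "Greek", "Mix wine and rue against colic ..."),
     ("Apsyrtos", "translation", "English", "Mix wine and rue against colic ..."),
     ("Pelagonius", "MS B", "Latin", "A spell spoken over the wound ...")],
)

# A keyword search across all versions, e.g. for a herb or a spell.
for row in con.execute(
        "SELECT work, source, language FROM witnesses WHERE content LIKE ?",
        ("%spell%",)):
    print(row)

A query like the last one returns every witness, in any language, whose text mentions a spell, which is the kind of fluid-text lookup described above.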

KNOWLEDGE, MAGIC AND HORSE MEDICINE IN LATE ANTIQUITY

Elisabet Göransson

The Joint Faculties of Humanities and Theology
Centre for Theology and Religious Studies
LUX, Lund University
Box 192

SE-221 00 LUND

E: elisabet.goransson@ctr.lu.se

W: https://projekt.ht.lu.se/hippo/

Elisabet Göransson is an Associate Professor in Latin at Lund University, Sweden. She specializes in textual scholarship, with a focus on more complex, “fluid” text traditions and on ways to make such texts better known and studied. She works with early manuscripts and teaches Latin and palaeography, as well as courses on textual criticism and on digital tools and resources for the study of textual scholarship and textual traditions. She has been part of a number of externally funded projects and started a network on textual scholarship.

Paris, BnF, Espagnol 214 folio, 30 v.

At the Edge of Grammar

Researchers at the University of Aarhus are looking at how people process different types of sentence structures and probing the limits of grammar. The focus of this research is the possibility of extracting elements across long distances from embedded clauses to the left edge of main clauses in Danish and English, as Professor Anne Mette Nyvad explains.

Observing the grammatical rules of a language is essential to clear and effective communication, but sometimes it is unclear whether a given structural output is in fact part of a language or the result of a performance error. As Principal Investigator of a research project based at Aarhus University in Denmark, Professor Anne Mette Nyvad is part of a team investigating the limits of human grammar.

The project deals specifically with the borderlands between what is possible and impossible in the grammatical systems of language, focusing on the borders between clauses, more specifically their left edge. Theoretical and experimental syntax are combined in pursuit of the overall goal of investigating how a fundamental property of human language interacts with unique characteristics of Danish and English.

The focus is on one of the defining characteristics of human language, namely the possibility of extracting elements across long distances from embedded clauses to the left edge of main clauses, e.g. Who did you say that John met at the conference? Here, the fronted element who is spelt out at one location (at the left edge of the main clause) and interpreted in another (as the object of met in the embedded clause). The theoretical and experimental investigations of the project all more or less explicitly revolve around issues related to the link between these two realizations of the fronted constituent.

The project team are now comparing the Danish and English languages, specifically with respect to restrictions on the possibility of moving an element out of an embedded clause to the left edge of the matrix clause. Embedded clauses that ban movement from within them are also referred to as syntactic islands, because the elements inside them are effectively ‘marooned’. “For instance, if I was to make the embedded clause in the example above into an embedded question, movement from within it would be unacceptable: Who did you say where you met? However, counterexamples to the proposed universal constraints on long-distance dependencies abound in Danish and the other Mainland Scandinavian languages and therefore these languages have been assumed to be more ‘tolerant’ of these long-distance dependencies than for instance English. In this project, we are investigating whether Danish is in fact more ‘lenient’ when it comes to movement out of embedded clauses than English,” Professor Nyvad outlines.

Embedded clause

This movement of an element out of an embedded clause to the front of a sentence, or its left edge, can theoretically speaking occur in three different ways. One is by effectively reorganising a sentence to turn it into a question. “For example, you might ask the question: Who did John say that I saw? In that case, you’re moving the element out of the embedded clause by way of so-called wh-movement,” says Professor Nyvad. An element can also be moved out of an embedded clause by relativisation, which can produce sentences that may sound a little unnatural to an English speaker. “You might say something like; This is the exercise that I would be surprised if she completed, that’s an example of movement by relativisation,” continues Professor Nyvad. “In Danish we can also move an element out through what’s called topicalisation, where we can have the topic in the first position, like; Him will I get really mad if you talk to, which is not possible in English. We have a wider range of possibilities in Danish when it comes to moving an element out of the embedded clause.”

“We looked at the neural activation patterns associated with the processing of these long-distance dependencies in order to determine which parts of the brain are activated more when it tries to understand these very complex sentence structures.”

The project team is now looking at how people process these different types of sentence structures, work which involves several strands of research. In one part of the project researchers conducted an fMRI experiment, in which 35 native speakers of Danish underwent brain scans, from which Professor Nyvad hopes to gain fresh neurolinguistic insights. “We looked at the brains of people as they processed these long-distance extractions, that is, movement from within the embedded clause. We wanted to see which parts of the brain are activated more when you process these very complex sentence structures,” she outlines. The project’s agenda also encompasses psycholinguistic research, which involves looking at behavioural data. “For instance, we can look at acceptability judgments, how people rate this type of sentence. We can also look at their response times or their error rates,” says Professor Nyvad. “We’ve done around seven psycholinguistic experiments, the fMRI study, and also quite a few corpus searches.”

Researchers are looking at text corpora in the project and assessing what people actually produce, in both English and Danish, which provides a snapshot of how people communicate on a daily basis. This angle of research can be viewed as more ‘clean’ in a sense, as people produce these text corpora naturally. “We know that they are actually part of the language,” says Professor Nyvad. This research is split roughly equally between English and Danish. Bilingual people may also transfer some grammatical structures from their own native tongue to their second language, another topic of interest to Professor Nyvad and her colleagues. “At the outset, we thought we would find that Danish is more tolerant, that it allows more extractions from embedded clauses than English,” she outlines. “We then wanted to see whether native speakers of Danish and English transfer their constraints on long-distance dependencies to a second language. The evidence suggests that this is indeed the case.”

Mainland Scandinavian languages

This tolerance of extraction from embedded clauses has been prevalent in Danish for at least a hundred years, and while Norwegian and Swedish are similarly liberal, it hasn’t really been observed in English until relatively recently. It was previously thought there was a clear divide between English and the Mainland Scandinavian languages in this respect, but Professor Nyvad says evidence suggests this is not the case. “Our most recent experiment shows that English is more similar to the Mainland Scandinavian languages than has previously been assumed. So maybe there is more common ground than the theoretical literature will have you believe,” she explains. Beyond the scope of the current project, Professor Nyvad hopes to look at languages from outside the Indo-European family, which may have different features.

“The Indo-European languages – including English, Danish, Swedish and Norwegian – all have very similar traits. But the non-Indo-European languages may be completely different when it comes to the possibility of island extractions,” she says.

This is a topic Professor Nyvad hopes to explore further in future by examining or comparing ten different languages spoken in Europe, looking for similarities and differences with respect to restrictions on extractions from island structures. This would be based on a similar experimental framework to the current project, yet it would be impractical to conduct fMRI studies, so Professor Nyvad plans to use electroencephalography (EEG) technology to measure activity in the brain and probe the limits of syntactical structures. “An EEG cap can tell you, when you’re listening to particular syntactic structures, whether you are responding to something that is for instance a syntactic or semantic anomaly,” she outlines. “You can get an idea as to whether these complex syntactic structures are perceived by the brain – without us really knowing it –as something that is restricted in the syntax, semantics, or pragmatics of a language.”

AT THE EDGE OF LANGUAGE

At the Edge of Language - An Investigation into the Limits of Human Grammar

Project Objectives

This project deals with the borderlands between what is possible and impossible in the grammatical systems of language. More specifically, it is an interdisciplinary exploration of dependencies across clausal boundaries, focusing on how this fundamental property of human language interacts with special characteristics of Danish and English.

Project Funding

This project is funded by The Danish Council for Independent Research. Grant ID: DFF9062-00047B.

Project Participants

Senior researchers:

• Dr Ken Ramshøj Christensen (Associate Professor, Dept. of English, Aarhus University)

• Dr Douglas Saddy (Professor, CINN, University of Reading)

Junior researchers:

• Postdocs: Dr Christiane Müller and Dr Katrine Rosendal Ehlers

• PhD students: Julie Maria Rohde and Maria Mørch Dahl

Contact Details

Principal Investigator, Anne Mette Nyvad

Associate Professor
Department of English/Scandinavian Studies
School of Communication and Culture
Aarhus University

Jens Chr. Skous Vej 4
8000 Aarhus C

Denmark

T: +45 87163016

E: amn@cc.au.dk

W: https://projects.au.dk/at-the-edge-of-language

W: au.dk/en/amn@cc.au.dk

Anne Mette Nyvad is an Associate Professor at the Department of English, Aarhus University. She has published 2 books and 31 peer-reviewed papers on a range of linguistic topics within the fields of comparative linguistics, neurolinguistics, language processing and second language acquisition.

Professor Anne Mette Nyvad

How liquid water shaped the landscapes of Mars and made the planet habitable

Liquid water is thought to have been abundant on the surface of Mars early in its history, but today it exists only in the form of ice. The team behind the MarsFirstWater project are investigating the characteristics of water on early Mars, research which holds important implications for future space missions to the planet, as Professor Alberto Fairén explains.

The presence of liquid water is essential to life on Earth, and it has played a central role in the evolution of the planetary surface. Beyond our own planet, water is thought to have been present on several different objects in the Solar system at certain points in history, including Mars. “There is ample evidence that liquid water was abundant on the surface of early Mars, both in the geomorphology of ancient terrains, such as lake beds, deltas and stratified sequences, and also in the presence of minerals that can only be formed in the presence of abundant and persistent liquid water,” explains Alberto Fairén, a Research Professor at the Centro de Astrobiología (CAB), an institute supported by the Spanish National Research Council (CSIC). While a reservoir of liquid water has recently been found kilometers deep in the crust of Mars, surface water exists today only in the form of ice, in the polar caps, permafrost, and some isolated underground ice deposits. “Liquid water is not stable on the surface of Mars today,” continues Professor Fairén.

MarsFirstWater project

A cold hydrological cycle on early Mars. Illustration adapted from a digital terrain model of Valles Marineris which was created from 20 individual HRSC orbits, and the colour data were generated from 12 orbit swaths. Credit: ESA/DLR/FU Berlin (G. Neukum)

As Principal Investigator of the ERC-backed MarsFirstWater project, following up on the earlier icyMARS initiative, Professor Fairén is now working to build a deeper picture of the water that existed on the Martian surface during the early part of its geological history, using data from both past and current space missions, as well as terrestrial analogs. This work starts from the hypothesis that ice-rich permafrost characterised most of the Martian subsurface and the subsurface-surface interface during the Noachian period, which is thought to have begun roughly 4 billion years ago. “We have identified key thermal, mechanical and chemical conditions that characterised ice-rich permafrost associated with hydrological processes on early Mars,” says Professor Fairén. “We are using simulation experiments combined with numerical modelling to investigate the role of fluid dynamics on the morphology of the resulting features, with a particular emphasis on the effect of early fluid composition. We have also provided the first identification of rhythmites on Mars.”

This shows that impact events from asteroids and meteorites, which occurred regularly in the Noachian period, were a major source of liquid water on early Mars. Professor Fairén and his colleagues have also documented the history of a specific aqueous episode on early Mars. “This provides the first evidence of powerful storms, torrential rains, megafloods and strong waves in a Martian palaeolake at the Gale crater,” he outlines. The Gale crater is one of the locations on Mars where liquid water is thought to have been present in the past, and NASA’s Curiosity rover is currently located there, looking for evidence of past environmental conditions and whether they could have supported life. “Life requires water, so we are sure that biosignatures on Mars will be hidden in places where water was present in the past,” continues Professor Fairén. “We’re interested in molecular biomarkers, natural products that can be assigned to particular biosynthetic origins.”

The most useful molecular biomarkers are organic compounds that are abundant in terrestrial microorganisms and have a high level of taxonomic specificity, meaning that they originate from a limited number of well-defined sources. These compounds may also be fairly well preserved, as they are resistant to geochemical changes and become concentrated upon sediment diagenesis. “This process is controlled by microbial activity and/ or chemical reactions that are catalysed by mineral surfaces,” explains Professor Fairén. Microbial lipids, in particular alkanoic acids, can help researchers reconstruct past conditions on Mars, and so are of great interest to Professor Fairén. “Alkanoic acids are optimal proxies for the reconstruction of paleoenvironmental scenarios, given their abundance (bacteria and eukaryotes contain 1 to 10 percent alkanoic acids), high taxonomic specificity and resistance against diagenesis,” he says.

These attributes make lipid fraction studies a useful means of elucidating the sources and diagenesis of organic matter, and Professor Fairén is using these methods to investigate paleoenvironmental microbiology. A further strand of research involves analysing data on ancient rocks found on Mars, and comparing them with sediments from several extreme terrestrial environments in which the conditions are thought to resemble those on early Mars. “The terrestrial analogs are places on Earth that are characterised by one or a few environmental, mineralogical, geomorphological, or geochemical conditions that are similar to those observed on present or past Mars,” explains Professor Fairén. “Our investigations are being conducted in several analogs, as there is no single analog on Earth that perfectly reflects the conditions on early Mars. It is likely that early Mars had a diversity of environments in terms of pH, redox conditions, geochemistry, temperature, and so on.”

Terrestrial analogs

The project team is conducting research in five terrestrial environments: the High Arctic, Antarctica, the Atacama desert in Chile, and Rio Tinto and the Tirez lagoon in Spain, from which Professor Fairén hopes to learn more about the conditions on early Mars. The Tirez lagoon is a suitable analog for investigating the changing paleoenvironmental conditions on Mars over the time between the Late Hesperian and the Early Amazonian (around 3 billion years ago) periods, building on data from space missions. “A number of the paleolakes that have been identified on Mars were characterised by episodic inundation by shallow surface waters with varying salinity, evaporation, and full desiccation over time. This process occurred repeatedly until the final disappearance of most surface water after the wet-to-dry transition at the end of the Hesperian,” outlines Professor Fairén. “Similar conditions can be tested over time at the Tirez lagoon, including ecological successions.”

This lagoon saw both wet and dry periods over the course of around 20 years, before it reached a state of complete desiccation in 2015. However, this does not mean that the lagoon is completely uninhabitable. “We have demonstrated that the lagoon was habitable both before and after its complete desiccation, despite the repeated seasonal dryness,” says Professor Fairén. Researchers are using information and insights from these terrestrial field analogs in combination with spacecraft mission-derived datasets to test certain hypotheses on the nature of water on early Mars. “We’re using a range of techniques, including paleogeomorphological reconstructions, computer modelling (geological, geochemical and microbiological models) and laboratory studies,” continues Professor Fairén. “The derived results are producing hard constraints on the physical evolution, geochemical alteration and habitability of surface and near-surface aqueous environments on early Mars.”

The MarsFirstWater Team at the Centro de Astrobiología (CAB). Credit: AGF.
Fieldwork in the Atacama Desert and the Canadian Arctic. Credit: AGF.

The project team is working to build a more comprehensive understanding of the inventory of water during the first billion years of Mars’ history, and to analyse its early evolution on both global and local scales. This understanding must encompass a precise knowledge of the nature of that water, says Professor Fairén.

“We’re looking at whether it was present in liquid or solid form, the duration of its presence, and its distribution across the planet,” he outlines.

The project’s agenda also includes research into the surrounding environment. “We are investigating the weathering rates and patterns of the host rock, as well as the physicochemical parameters defining such interactions, and the specific landforms and mineralogies generated during these periods. We are also examining the ultimate fate of all that water when the Martian surface desiccated,” continues Professor Fairén.

“Our final goal is to determine the implications these processes had on the potential inception of life on Mars, as a separate genesis from that on Earth.”

Habitability of Mars

This research is being conducted against a backdrop of renewed interest in sending manned space missions to Mars, searching for signs of life on the planet, and even establishing human settlements there in the long-term. The project’s work represents an important contribution to this wider goal, as it will help inform the progress and development of new missions to Mars. “We aim to advance a novel perspective on the evolution of planetary habitability, which can inform the development of future missions, payload concepts and instrumentation for exploring both Mars and Earth,” says Professor Fairén. However, Professor Fairén says it’s also important to consider the Martian environment and prevent contamination when planning space missions, an issue he is keen to highlight. “Our research will elucidate the challenging constraints for planetary protection policies in current planetary exploration, both uncrewed and crewed. We are attempting to substantiate the argument for a moratorium on proposed human missions to Mars,” he outlines.

The areas on Mars where life is most likely to be found are precisely those where the strictest restrictions are in place regarding exploration, as set forth by planetary protection offices.

These restrictions are there specifically to avoid disturbing any indigenous life forms that may be present, yet given the prospect of new missions to Mars, Professor Fairén believes the current guidelines need to be revisited. “The issue is that, in light of NASA’s and other agencies’ aspiration to send human missions to Mars in the 2030s, the current planetary protection guidelines applied to today’s uncrewed robots are impractical,” he says. A manned mission to Mars will unavoidably bring so-called microbial hitchhikers along with it, as it is not possible to conduct a bioburden reduction process on humans. “It is inevitable that a high degree of forward contamination will occur as a result of human astronaut exploration, given the impossibility of conducting all human-associated processes and operations within entirely closed systems,” acknowledges Professor Fairén.

Illustrations of rovers, orbiters and landers on Mars. Credit: NASA.
Fieldwork at Rio Tinto and Tirez. Credit: AGF and Nuria Rodríguez.

A human presence on Mars will inevitably result in reduced cleanliness in the area, regardless of how careful those people are and what strategies they use. It is therefore unreasonable to delay further robotic astrobiological exploration of Mars on the grounds of preventing contamination of the planet by microorganisms aboard unmanned spacecraft, believes Professor Fairén. “It is surely more prudent to conclude that human contact with Special Regions is not the optimal means of an initial astrobiological exploration. It is imperative that we ascertain, prior to the arrival of humans, whether extant microbial ecosystems are present at - or near - the surface,” he argues. It’s important in this respect that a few select locations be designated, described, and analysed as soon as possible, with the deployment of rovers and landers. “This will enable us to make significant strides in our astrobiological endeavours, particularly in determining whether there is present-day near-surface life on Mars,” continues Professor Fairén.

“There is ample evidence that liquid water was abundant on the surface of early Mars, both in the geomorphology of ancient terrains, and also in the presence of minerals that can only be formed in the presence of abundant and persistent liquid water.”

This is an essential step before the arrival of manned missions on Mars, and researchers continue to look for signs of life on the planet. A number of locations have been identified, and Professor Fairén says exploring them with robots should be a priority, before the potential future arrival of crewed missions. “Examples include aquifers concealed beneath ice masses, analogous to those purported to exist beneath the south polar cap but situated in regions where subsurface ice sheets have been identified at more accessible latitudes. Alternatively, salt crusts with low eutectic points may warrant consideration, as temperature and relative humidity fluctuations could facilitate transient deliquescence processes and solution formation,” he says. With more missions to Mars planned, Professor Fairén hopes to establish a successor project, building on the progress that has been made in icyMARS and MarsFirstWater and gaining deeper insights. “I plan to continue building my team and furthering our results about the geochemical setting of early Mars and the prospects of finding life,” he says.

MarsFirstWater

Study of the origin and water cycle on Mars during the first billion years of the planet’s geological history, and its astrobiological implications

Project Objectives

We examined the water inventory in the first billion years of Mars’ history, studying its evolution at both global and local scales, the duration and locations of water presence, the rates and patterns of host-rock weathering, the physicochemical parameters influencing such interactions, and the formation of specific landforms and mineralogies, to finally evaluate the implications of these processes for the potential emergence of life on Mars.

Project Funding

The “MarsFirstWater” project is funded by the European Research Council (ERC), Consolidator Grant no. 818602, following the results of the “icyMARS” project, also funded by the ERC, Starting Grant no. 307496.

Project Partners

Centro de Astrobiología (CAB), a research center managed by the Spanish National Research Council (CSIC).

Contact Details

Project Coordinator,

Professor Alberto Fairén
Centro de Astrobiología
Instituto Nacional de Técnica Aeroespacial
M-108, km 4, 28850 Madrid, Spain

E: agfairen@cab.inta-csic.es

W: https://cordis.europa.eu/project/id/818602

Alberto Fairén is a Research Full Professor at the Centro de Astrobiología in Madrid, Spain, and a Visiting Scientist at the Department of Astronomy at Cornell University, New York, USA. He is an interdisciplinary planetary scientist and astrobiologist, specialising in the search for life on Mars and Ocean Worlds.

Professor Alberto Fairén
Credit: Laura García-Descalzo.
The main building at the CAB facilities. Credit: CAB.

New light on old wood

The number of rings within a tree indicates its age, while analysis of those rings can also lead to new insights about wood and its provenance relevant to a wide variety of disciplines. We spoke to Professor Dan Hammarlund about his work as the coordinator of a research project dedicated to making dendrochronology data available to researchers.

The number of annual growth rings within a tree indicates its age, while analysis of those rings can also lead to deeper insights into the conditions and circumstances under which it grew. The width of the rings depends to a large extent on local climate conditions at the time of growth, such as precipitation levels, temperature and water availability. “Trees growing in the same area tend to show similar growth patterns. A harsh year and a good year could be lined up next to each other in a series of tree rings,” outlines Dan Hammarlund, Professor in the Department of Geology at Lund University in Sweden. This provides not just data on past climate conditions, but also other data relevant to a variety of disciplines, which Professor Hammarlund and his colleagues in the Old Wood in a New Light project are now working to bring together and make more widely available. “We want to make this tree ring data freely available to researchers across a wide variety of disciplines. In some cases researchers are well aware that tree-ring data are a good resource in their discipline, while others are not aware of their potential,” he says.

Tree ring data

Tree ring data are useful in reconstructing past climate conditions for example, while Professor Hammarlund says it also holds wider relevance, such as in dating wood used in construction and determining its provenance. The project team, which brings together researchers at four laboratories across Sweden, has deep expertise in analysing tree rings and other wood features. “We bring in cores or pieces of trunks and investigate different radii. We core both living trees and samples in standing constructions, while sometimes we can use a saw to sample dead wood or trunks from archaeological excavations and other interesting sites. We can produce thin little cores, just a few millimetres in diameter, then analyse and measure the rings,” explains Professor Hammarlund. Researchers can also analyse samples of wood that have been used in construction. “Timber constructions contain wood with the exact same structure as living wood, and we can determine their provenance,” continues Professor Hammarlund. “For example, if the origin of a shipwreck isn’t clear, we can take out tree-ring series from the ship and perhaps also from the cargo. Then we can compare those ring series – if they are long enough – with reference series from different parts of Europe.”

“We want to make tree-ring data freely available to researchers across a wide variety of disciplines. In some cases researchers are well aware that tree-ring data are a good resource in their discipline, while others may not be aware of their potential.”

Researchers have been able to show that certain ships sunk along the Swedish coastline originally came from Germany, and that these vessels were carrying wood from southern Poland, from which new insights can be drawn about trading relationships. The aim now for Professor Hammarlund and his colleagues in the project is to develop a database and make this type of data accessible to researchers and the wider public. “We want it to be easy for anyone to go into the database and search for what they are interested in,” he explains. The actual wood samples are available for people who are particularly interested, but the main focus is on bringing together data on samples from different regions. “We try to provide both the environmental data and also the meta-data. People can also come to our labs and ask our team to date a sample. Ideally they provide details about where and when it was collected, and so we can then include the accompanying meta-data,” says Professor Hammarlund.

So far, a lot of effort has been invested in digitising meta-data, such as felling year, tree species, provenance and the function of the analysed timber. All this derived information, which is of great interest to researchers within cultural history, forestry, palaeoclimatology and a range of other disciplines, will be made freely and easily accessible in the database. This is not always the case with other sources of dendrochronology data, some of which are much more science-oriented. While it’s possible to download datasets made available in these systems by researchers, typically they don’t contain the metadata required to conduct more detailed investigations, which is a major priority in the project. “We aim to provide everything here that people require to conduct their research,” says Professor Hammarlund.

This data can then inform research in a variety of different fields, including history and even the fine arts. “One member of the project team has been working with paintings, checking whether they are fakes. Historical paintings are highly valuable, and so people want to make sure they are authentic, and that they really do date from, for example, the 17th century,” outlines Professor Hammarlund. “One way of checking the age of these paintings is through assessing whether the boards really contain wood of the expected age. With tree ring series we can check whether these boards were indeed cut out in the 17th century, or if they are younger.”

Photo documentation of tree rings on a panel painting on oak from the 17th century.
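For readers curious how the ring-pattern matching described above works in practice, here is a minimal Python sketch of the core idea: slide an undated ring-width series along a dated reference chronology and keep the offset with the highest correlation. The series and the best_match helper are invented for illustration; real cross-dating uses standardised ring widths and further statistical safeguards.

```python
# A minimal sketch of cross-dating: find where an undated ring-width
# series best matches a dated reference chronology. Toy data only.
import numpy as np

def best_match(sample: np.ndarray, reference: np.ndarray) -> tuple[int, float]:
    """Return (offset, correlation) of the best alignment of sample in reference."""
    n = len(sample)
    best = (0, -1.0)
    for offset in range(len(reference) - n + 1):
        window = reference[offset:offset + n]
        r = np.corrcoef(sample, window)[0, 1]
        if r > best[1]:
            best = (offset, r)
    return best

# Toy reference chronology (ring widths, mm) starting in year 1500:
rng = np.random.default_rng(42)
reference = rng.normal(1.0, 0.3, 200)
# A timber whose rings copy years 1620-1649 of the reference, plus noise:
sample = reference[120:150] + rng.normal(0, 0.05, 30)

offset, r = best_match(sample, reference)
print(f"best offset: year {1500 + offset}, r = {r:.2f}")  # -> year 1620
```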

Environmental studies

A further potential future application of dendrochronology data is in investigating the environmental background of contaminated sites. Analysis of tree rings can help researchers find out when a site was contaminated, as trees take up different chemical compounds and store them in their rings. “It’s then possible to produce graphs showing how much of the different heavy metals there are for example, and when they were taken up. This can be highly valuable in environmental investigations,” explains Professor Hammarlund. A lot of data have already been entered into the system, yet there is still more work to do, and Professor Hammarlund says there will always be scope for further additions. “We hope to extend the project to improve the database and add more information, but it will never really be complete, there will always be more that we can add,” he continues. “We have put a lot of effort into making the database sustainable, and have put a long-term maintenance plan in place.”

This is central to ensuring that the project has a lasting impact, beyond the conclusion of the funding term, and that researchers can still tap into this data in the future. The project team is working to improve and refine the database, and Professor Hammarlund says it will provide a valuable source of data for researchers. “We have data series going back thousands of years, as we have tree-ring series from logs and stumps that have been preserved in oxygen-free environments, like in peat bogs, and at the bottom of lakes,” he says. “We can pull these tree trunks or stumps up and they look more or less like they were cut down yesterday. We can easily analyse the rings, and they can be amalgamated into long time series, from the present day back many thousands of years.”

OLD WOOD IN A NEW LIGHT

Project Objectives

The Old Wood in a New Light project is a Swedish initiative that aims to make data on wood and wood constructions available to researchers, which can then inform investigations across a wide variety of disciplines. Analysis of trees and tree rings can lead to fresh insights into how the climate has evolved, while determining the provenance of wood can help researchers reconstruct past trading relationships, just two examples of the wider importance of dendrochronology data.

Project Funding

The project Old Wood in a New Light is funded by an infrastructure research grant from Riksbankens Jubileumsfond, an independent foundation supporting research in the humanities and social sciences (grant no. IN20-0026).

Project Partners

• Lund University

• Stockholm University

• University of Gothenburg

• Swedish University of Agricultural Sciences

• Umeå University

Contact Details

Project Coordinator, Dan Hammarlund

Professor of Quaternary Geology

Lund University

Faculty of Science

Department of Geology

Sölvegatan 12, SE-223 62 Lund, Sweden

T: +46 46 222 79 85

E: dan.hammarlund@geol.lu.se

W: www.lunduniversity.lu.se

W: www.geology.lu.se

The data are being made available through the Strategic Environmental Archaeology Database (SEAD: www.sead.se).

Old Wood in a New Light: An Online Dendrochronological Database, International Journal of Wood Culture, Volume 3, Issue 1-3 (2023), brill.com.

Dan Hammarlund is Professor in the Department of Geology at Lund University, and has held research positions in Denmark and Canada. His research focusses on various aspects of environmental change based on chemical and biological analyses of lake sediments, peat sequences and tree-ring series.

Professor Dan Hammarlund
Construction of a chronology anchored to the present via the living tree and extended backwards in time with tree-ring series from historical wood.
Sampling of 7000-year-old pine trees excavated from a peat bog.

The power behind a sustainable future

A large proportion of global energy demand is still met through burning fossil fuels such as oil and natural gas. Researchers in the MetalFuel project are exploring the potential of iron powders as an alternative carrier of energy, part of the wider goal of moving towards a more sustainable society, as Professor Philip de Goey explains.

Around 80 percent of global energy demand is currently met through burning fossil fuels, and replacing them with more sustainable alternatives is a complex task. While solar and wind farms are part of the overall energy mix in many countries, they don’t provide the reliable, on-demand supply that the industrial sector requires. “Industry needs a huge amount of energy during winter, but of course you can’t guarantee that the sun will shine in January. It’s also difficult to store wind and solar power, at least not for very long times or in big quantities,” acknowledges Philip de Goey, Professor of Mechanical Engineering at the Technical University of Eindhoven. What is required is a reliable energy carrier, and Professor de Goey believes metal powders hold great potential in this respect. “Metal powders are compact, safe and relatively easy to store. They can then be used in winter at those times and in those places where they are really needed,” he outlines.

MetalFuel project

This is a topic Professor de Goey is investigating as Principal Investigator of the ERC-funded MetalFuel project, with the primary goal of understanding how iron burns. This is crucial to ensuring that the energy in a metal powder is released in a controlled and safe way. “We aim to understand how this combustion process works,” says Professor de Goey. This starts from investigating how a single particle of iron powder burns. “What processes take place there? What temperatures are reached and how long does it take? We are conducting numerical and experimental studies on how single particles of different sizes burn,” continues Professor de Goey. “Then when a single burning particle approaches a cold particle at close enough range, it will heat up and ignite that particle. When you have a flame, you essentially have a front of particles which ignite each other: a front of particles is ignited, and then they ignite the next layer of non-ignited particles.”

Iron-air flame stabilised on a jet-in-hot-coflow burner.

The speed of this propagating front of particles is an important consideration in terms of the combustion of metal fuels and harnessing the released energy. If it moves too slowly, then it’s very difficult to get the energy out, while if it moves too fast it’s very difficult to control the release of energy. “The typical velocity of such a front of particles in a gas flame is around one metre a second, while it’s much faster in a hydrogen flame, at 10 metres per second,” explains Professor de Goey. The particles in an iron flame move much slower, at somewhere between 1-10 centimetres a second depending on their size, which Professor de Goey says raises some important issues. “It is more difficult to get a stable flame from burning iron than with hydrocarbons or hydrogen,” he says. “The mixture of particles and air needs more time to burn, but we also don’t want the iron to vaporise, so we need to carefully control the temperature.”
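To put these numbers side by side, here is a minimal Python sketch comparing the quoted front speeds with the time each front would need to traverse a combustion zone; the 10 cm zone length and the specific speeds assigned to coarse and fine powder are illustrative assumptions, not project measurements.

```python
# A minimal illustration of why the slow iron flame front is hard to
# stabilise: slower fronts need a correspondingly long residence time
# in the combustion zone. Front speeds are the rough figures quoted in
# the article; the zone length is an assumed round number.
front_speeds = {          # laminar flame-front speeds, m/s
    "hydrocarbon gas": 1.0,
    "hydrogen": 10.0,
    "iron powder (coarse)": 0.01,   # assumed: coarse particles, ~1 cm/s
    "iron powder (fine)": 0.10,     # assumed: fine particles, ~10 cm/s
}
zone_length = 0.10        # assumed combustion-zone length, m

for fuel, speed in front_speeds.items():
    # Time for the front to traverse the zone at its own speed:
    print(f"{fuel:>22}: front {speed * 100:6.1f} cm/s, "
          f"traversal time {zone_length / speed:7.2f} s")
```

The output makes the contrast concrete: the iron front needs tens of seconds to cross a zone that a hydrogen flame crosses in a hundredth of a second, which is why flame stabilisation dominates the design problem.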

Researchers are using a variety of techniques in the project to investigate this and other issues, aiming to build a deeper understanding of metal combustion. As part of this work, Professor de Goey and his team have created a system to monitor the burning of a single iron particle. “We use a very thin needle to inject single particles into a hot co-flow and depending on the temperature of this co-flow they are ignited or not. Then we can see the particle burning,” he outlines. The time that it takes for an iron particle to burn is an important consideration, while researchers are using a variety of techniques to also look at other issues. “When we have isolated a single particle, we can use cameras and other techniques to study it and its surroundings. That’s very novel,” continues Professor de Goey. “We can look at different wavelengths of radiation, study the temperature of the particle, and many other things.”

This contributes to a deeper understanding of how a single particle burns, from which researchers can then look to investigate how particles ignite each other and create a flame. The project team is conducting experiments using several different types of burners, with the aim of assessing the burning velocity of different flame fronts. “We essentially measure how fast a flame front burns,” explains Professor de Goey. The MetalFuel project forms one part of a wider undertaking, in which researchers aim to develop a complete cycle, enabling the sustainable, ongoing reuse of metal fuels. “When you have burned the iron and captured it in the form of iron oxide, we then want to reduce it back to iron, so that it can be burnt again and again,” says Professor de Goey. “A non-profit innovation centre called Metalot has been established to try to bring this idea to the market, and a consortium of parties is working on very large-scale systems.”

Sustainability

A system capable of generating 1 megawatt of energy through burning iron has been developed for example, and the iron oxide can then be reused following a reduction reaction, which represents a more sustainable way of meeting demand than conventional alternatives. The emissions associated with this process are also extremely low, a point Professor de Goey is keen to highlight. “Nitrogen oxide (NOx) emissions – which are a major problem here in the Netherlands – are significantly lower with burning iron powder than in the case of natural gas flames,” he says. Different start-ups are now working towards a 20 megawatt system, and Professor de Goey and his team are supporting this research, which he sees as an important contribution to the wider push towards a more sustainable society. “We need to find alternatives to fossil fuels if we are to decarbonise our economy,” he stresses.

“We are conducting numerical and experimental studies on how iron particles burn. What processes take place? What temperatures are reached and how long does it take?”

The project team is now exploring different funding options to support further investigation, with Professor de Goey looking to translate research advances into practical applications. One area of interest is the steel industry, and researchers are exploring new ways of operating, without completely dismantling existing infrastructure. “We are looking to use what is already there in future value chains. With a coal or a biomass plant, we’re interested in replacing the burners and using iron powder,” says Professor de Goey. While a lot of attention has focused on hydrogen and electrification as ways of reducing our dependence on fossil fuels, iron powders could play an important role in providing carbon-free energy in future, believes Professor de Goey. “Iron powder is very interesting, particularly for high-temperature processes in industry, as it is easier to store and transport in a compact and safe way than hydrogen for instance,” he stresses.

MetalFuel

Towards a full multi-scale understanding of zero-carbon metal fuel combustion

Project Objectives

The aim in the MetalFuel project is to develop an experimentally validated theoretical/numerical multi-scale framework for the propagation, stabilisation and structure of dense metal dust flames. Metal fuels hold rich potential as dense energy carriers, and the project’s research will advance knowledge in this field.

Project Funding

This project is funded by the European Research Council (ERC), Grant agreement ID: 884916.

Project Partners

• Group De Goey

• Group Van Oijen

• EIRES

• Mechanical Engineering

Contact Details

Project Coordinator,

Prof. Dr. LPH de Goey

Full Professor, Department of Mechanical Engineering, Eindhoven University of Technology

De Zaale

Eindhoven

E: L.P.H.d.Goey@tue.nl

W: https://www.tue.nl/en/research/research-groups/power-flow/towards-a-full-multi-scale-understanding-of-zero-carbon-metal-fuel-combustion

W: https://research.tue.nl/en/persons/lph-philip-de-goey

Philip de Goey is Professor of Combustion Technology at the Technical University of Eindhoven. His areas of expertise include combustion, innovation and technology, and energy. He is highly active in conducting numerical and experimental research on laminar flames and related fields.

SEM image of a number of burnt iron-oxide particles.
Prof. Dr. Philip de Goey
Single particle model with black flow lines and oxygen levels shown in colour: white indicating high values, blue intermediate and black low levels.
Ignition percentage of iron particles (45-53 micron) at different coflow temperatures.

Steeling artillery for the future

The next generation of artillery products will have a longer range than those currently available, so the steels and processes used within them will need to meet extremely demanding requirements. The team behind the IPN-RAP project is creating new knowledge on how to assess the properties of certain steels, ready for their use in the products of tomorrow, as Andreas Gaarder and Knut Erik Snilsberg explain.

The modern battlefield is characterised by fire-and-move tactics, as troops seek to target their enemy with powerful artillery before rapidly moving on to evade returning fire. In the ongoing conflict in Ukraine, for example, troops may have just a few minutes after firing artillery before the enemy is able to pinpoint where the shots are coming from with radar. “Then the enemy can counter with their own artillery fire. So troops need to be able to fire, then quickly move to another position to get out of harm’s way,” outlines Andreas Gaarder, Program Director R&D at Nammo. The ability to fire artillery over longer distances is a major strategic advantage in armed conflict, an issue at the heart of the project. “It’s important to be able to fire over longer distances, because then you can take out strategic targets further into enemy territory. We see in recent conflicts that fighting forces have had to pull some types of assets further back from the frontline, to stay out of artillery range. Then it becomes harder to conduct operations,” explains Gaarder.

IPN-RAP project

The team behind the IPN-RAP project is now working to develop steel and manufacturing processes for the next generation of ammunition, designed to be propelled over significantly longer distances than is currently possible. The standard NATO range at the moment is 20-30 kilometres, and Gaarder says that can be increased to 40 kilometres using the base-bleed drag-reducing technology which Nammo currently uses in its artillery products. “This is where you have some energetics in the base of the projectile, or the shell, which extends the range to 40 kilometres,” he explains. With new rocket-assisted technology, the range of artillery can be extended up to 70 kilometres, and potentially even beyond that, in the same weapon platforms as standard ammunition. “Everybody wants longer range artillery, and rocket-assisted technology is a very cost-efficient way of getting projectiles further, while keeping the number of different weapon platforms low,” continues Gaarder. “In the IPN-RAP project a team of highly skilled researchers is looking to understand the material characteristics of different steel types, with a view to using them in rocket-assisted artillery products. We’re really looking to support the ongoing development of the rocket-assisted products.”

The rocket-assisted projectiles will be exposed to quite extreme loads during launch, when the close to 50 kg artillery shell will be accelerated to more than 1000 m/s inside the barrel, experiencing more than 15,000 G in spin force. The steel used in producing it needs to be very strong, with enhanced mechanical properties, while at the same time the manufacturers also aim to get as much propellant as possible into the projectile to extend the burn time of the rocket motor, which is where the IPN-RAP project comes in. “This raises the question of how thin you can make the steel walls within the projectile, while ensuring they are still capable of withstanding the pressure and forces they will be exposed to,” says Knut Erik Snilsberg, manager of the project.

Reducing the wall thickness by just fractions of millimetres can create more space for propellant, increasing the range of the projectile, so Snilsberg says it’s extremely important for the wall to be as thin as possible. “The steel wall simply provides structural support for the rest of the grenade. It doesn’t create any terminal effects, it’s just there to hold the propellant in place, so a lot of attention is focused on getting that as thin as possible,” he outlines.

“Everybody wants longer range artillery, and rocket-assisted technology is a very cost-efficient way of getting projectiles further.”

There are steels currently available which will do this job effectively, but they are extremely expensive, so aren’t cost-effective in terms of artillery production. The steels typically used in producing artillery are quite cheap, as projectiles are a high-volume product, so now Snilsberg and his colleagues in the project are looking at other possibilities. “We are trying to figure out the material properties of certain steels, and are looking at their capacity to withstand stress and strain. We’re essentially looking at the structural integrity of the selected steels,” he explains.
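To give a feel for the scale of these loads, the back-of-the-envelope Python sketch below estimates the mean axial acceleration from the quoted muzzle velocity and an assumed barrel length, plus the spin-induced centripetal acceleration at the shell wall from an assumed rifling twist. Barrel length, twist and radius are illustrative round numbers, so the results only agree with the article’s figures in order of magnitude.

```python
# Illustrative back-of-the-envelope estimate of launch loads on an
# artillery shell. Barrel length, twist rate and shell radius are
# assumed round numbers, not project data.
import math

G = 9.81            # standard gravity, m/s^2
v_muzzle = 1000.0   # muzzle velocity from the article, m/s
barrel_len = 6.0    # assumed effective barrel length, m
twist_len = 3.1     # assumed rifling twist: one turn per 3.1 m of travel
radius = 0.0775     # assumed shell wall radius (155 mm calibre / 2), m

# Mean axial acceleration, assuming constant acceleration down the barrel:
# v^2 = 2*a*L  =>  a = v^2 / (2*L)
a_axial = v_muzzle**2 / (2 * barrel_len)

# Spin rate at the muzzle and the resulting centripetal acceleration
# at the shell wall: a = omega^2 * r
omega = 2 * math.pi * v_muzzle / twist_len   # rad/s
a_spin = omega**2 * radius

print(f"axial: {a_axial / G:,.0f} g, spin (at wall): {a_spin / G:,.0f} g")
```

With these assumed inputs the sketch gives roughly 8,500 g axially and some tens of thousands of g at the wall from spin, consistent in magnitude with the “more than 15,000 G” quoted above.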

Microscopic picture of a fracture surface.

The process of producing artillery projectiles involves heating the steel up and then cooling it down, and researchers are modifying some parameters as they seek to maximise mechanical properties. “You can heat the projectiles up at certain speeds and temperatures, and then cool them down – in fluids – at different speeds and temperatures. Then you do your annealing and final machining to achieve the required material properties, so that the steel is able to withstand all the forces that it will be subjected to,” says Snilsberg. “The metal ‘remembers’ all manufacturing steps, and understanding the impact on the crystals and particles inside the metal due to each process step is the key to optimising product performance.”

Steel properties

The focus of attention in the project is the steel within these artillery products, with researchers hoping to achieve the steel properties required for application in rocket-assisted technology. This will support the potential future development of a rocket-assisted artillery product. “If we achieve what we hope to then this will significantly reduce the risk around the development of the actual product. Then we would know that we have the properties and material characteristics that we need,” outlines Gaarder. The properties of the steels can be observed during firing tests, when a flight follower is used to follow the path of a projectile. “This is essentially a high-speed camera that is able to follow the path of a projectile, which travels at almost 1,000 metres a second,” continues Gaarder. “If the steel wasn’t mechanically strong enough then the projectile would not come out in one piece. You would see a failure in the structure, it would break up into different pieces and just start tumbling.”

A spin tester has also been developed by researchers in the project, in which the projectile experiences the same degree of spin that it would be subjected to during flight. This can provide an insight into how the propellant behaves when the projectile spins under certain circumstances, yet Gaarder says certain aspects can only be assessed by actual live firing. “You’re only able to get certain forces when you actually fire the gun,” he stresses. This research is being conducted against a backdrop of rising geopolitical tensions, with pressure to increase defence budgets and invest in the next generation of artillery products. “Rocket-assisted technology is quite an affordable way of getting very long range artillery products,” says Gaarder.

IPN RAP

IPN RAP - Rocket Assisted Product

Project Objectives

The RAP project will develop steel and processes for next-generation artillery ammunition with twice the range of Nammo’s products today. To achieve this goal, one possibility is to combine traditional ammunition with a rocket engine. For the product to carry enough fuel, the steel grenade must be made with walls as thin as possible. This is very demanding, as the product is exposed to very large loads during launch, hence the need for enhanced mechanical properties in the material. The project will develop relevant tests for material properties so that the right materials and processes can be used.

Project Funding

This project is funded by the Research Council of Norway (RCN).

Contact Details

Andreas Gaarder

Program Director R&D

Nammo Raufoss AS – P.O. Box 162 – NO-2831 Raufoss

W: www.nammo.com

Nammo is an international aerospace and defense company headquartered in Norway. With more than 3 100 employees, 27 production sites and a presence in 11 countries, Nammo is today one of the world’s leading providers of specialty ammunition and rocket motors for both military and civilian customers.

Nammo was formed in 1998 through a merger of the ammunition businesses in the three Nordic countries, Norway, Sweden and Finland. Today Nammo is owned by the Norwegian Ministry of Trade, Industry and Fisheries and the Finnish Aerospace & Defense company Patria Oyj, each owning 50% of the shares in the company.

Andreas Gaarder Knut Erik Snilsberg
Microscopic picture of a fracture surface.
Forging of artillery shells.

Neutron scattering opens up window on novel materials

Neutron sources have an important role to play in the development of new materials for storing sustainable energy, allowing researchers to gain deeper insights into their functionality. We spoke to Professor Elizabeth Blackburn about how she and her team are using neutron scattering techniques to investigate energy and quantum materials.

The properties of a material may not always be fully evident from surface analysis, as the internal structure often has a major influence on how it functions. Neutron sources play an important role in this respect, enabling researchers to study the interaction between neutrons and different materials in scattering experiments, from which deeper insights can then be drawn. “Neutrons interact with the nuclei of atoms, rather than the electrons around them. Neutrons therefore scatter in a different way to x-rays from a synchrotron, so that sometimes you can see different phenomena more clearly,” explains Elizabeth Blackburn, Professor in the Department of Physics at Lund University in Sweden.

As the Principal Investigator of a project backed by the Swedish Research Council (Vetenskapsrådet), Professor Blackburn is part of a team of researchers using neutron scattering techniques to investigate the potential of certain materials in energy storage, a topic central to the wider goal of moving towards a more sustainable society. One of the main advantages of neutrons in this respect is that the particle has a magnetic moment in and of itself. “We can use the magnetic moment of the neutron to interact directly with magnetic fields, including those from nuclei like hydrogen,” says Professor Blackburn. “We’ve been conducting experiments on the motion of lithium deep inside some materials that could be used as lithium batteries. We’re also investigating quantum materials, which are essentially materials in which quantum mechanics is necessary to explain their fundamental behaviour. This typically means that electrons show strange magnetic effects. With wide angle scattering techniques, we are able to look at energy materials right down to the atomic scale, and build a fuller picture of their functionality.”

Polarised neutrons

This research involves using what are called polarised neutrons, where the neutrons in a beam are aligned, with their dipoles pointing in the same direction. There are several different ways of aligning neutrons; researchers at Lund have now developed a polarising supermirror for this purpose. “A polarising supermirror can be thought of as a bit like an optical fibre. If you put neutrons in at the beginning of a waveguide coated with these supermirrors, you get most of the neutrons at the end. So you hardly lose any, however far you transmit them. If you choose the right material, the supermirror will absorb all of the neutrons that have the polarisation direction that you don’t want, and only keep the ones that are aligned in the way that you do,” says Professor Blackburn. These polarising supermirrors are a fairly well established concept, but now Professor Blackburn and her team have developed a novel design. “We have a kind of array of these supermirrors with a particular shape, a logarithmic spiral. We have designed and tested this with various simulations, and a device has been constructed to our specifications,” she outlines.
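Once a polariser like this is in place, the quality of the beam is usually quantified with the standard polarisation and flipping-ratio definitions from neutron polarimetry. The short Python sketch below shows these textbook formulas; the detector counts are invented for the example.

```python
# Standard textbook measures of beam polarisation, computed from counts
# in the two neutron spin states. The count numbers are invented.
def polarisation(n_up: float, n_down: float) -> float:
    """Beam polarisation P from counts in the two spin states."""
    return (n_up - n_down) / (n_up + n_down)

def flipping_ratio(n_up: float, n_down: float) -> float:
    """Flipping ratio F = N+/N-; related to P by P = (F - 1) / (F + 1)."""
    return n_up / n_down

n_up, n_down = 9_700, 300   # example detector counts after the polariser
print(f"P = {polarisation(n_up, n_down):.3f}, "
      f"F = {flipping_ratio(n_up, n_down):.1f}")   # P = 0.940, F = 32.3
```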

The plan is to test this device at the ISIS neutron and muon source in the UK later this year, the results of which can then inform the design of another for use on an extreme environment spectrometer at the European Spallation Source (ESS) called BIFROST. The ESS is still under construction at Lund, but when finished it promises to open up new investigative opportunities for researchers across a wide variety of disciplines, and the polarising supermirrors will be an important component. “They will be available for use at both ISIS and the ESS by whoever wants to use them,” says Professor Blackburn.

Top-down schematic of the design of the transmission wide angle polarization analyser. Credit: Martin Månsson.
A special experimental setup built to allow us to apply electric fields to the sample at the same time as high magnetic fields. This is just before installation on the instrument ZOOM at the ISIS Neutron and Muon Source. Credit: Elizabeth Blackburn.

As an experimentalist, Professor Blackburn is herself keen to make full use of these kinds of instruments, which she says can play a highly valuable role in research. “For example, people studying the motion of hydrogen – in water or in materials that absorb a lot of hydrogen – have found that one of the best ways to analyse the data is to use polarised neutrons,” she explains. “The polarisation capability gives you a greater degree of certainty when analysing the contributions from hydrogen. Certain materials can absorb a lot of hydrogen.” These materials can be very porous, and the behaviour by which hydrogen is absorbed in the interior may be very different to that seen at the surface. A neutron source helps researchers look at what’s happening deep within a material, and Professor Blackburn is using neutron scattering to build a fuller picture in this respect. “On the energy materials side, we’ve been looking at the motion of lithium in two compounds, for which we developed simulation tools in collaboration with a researcher at ISIS. This work has been progressing well,” she outlines.

“We’re investigating quantum materials, essentially materials in which quantum mechanics is necessary to explain the fundamental behaviour. This typically means that electrons show strange magnetic effects.”

On the quantum materials side, attention in the project has largely focused on the magnetoelectric effect, where an electric field can be used to affect magnetism and vice versa. “We have been conducting detailed experiments at the ISIS facility, and this work has also been going well. However, it’s very complex, and it’s taking us some time to unravel what’s going on,” continues Professor Blackburn. “This side of the project is slightly more exploratory, in that we are trying to find new types of behaviour that haven’t really been observed before.”

Instruments

The instrumentation side of the project is also well advanced, with plans to test the polarising supermirror in the next few months, in good time to build a second one to be used at the ESS. The aim is to provide a polarisation capability to instruments at ISIS and ESS, which will benefit not just researchers in physics or nanoscience, but also scientists from a wide range of other fields. “The instrumentation we are building will aid in the development of magnetic materials that will eventually be used in making better transformers. These instruments will also be of use to people looking at polymer dynamics for example, or in studying the behaviour of cell membranes,” says Professor Blackburn.

WIDE-ANGLE NEUTRON POLARISATION ANALYSIS TO STUDY ENERGY AND QUANTUM MATERIALS

Project Objectives

To provide Europe’s neutron facilities with new, powerful polarisation analysis hardware, covering wide angles to get as much data as possible out of the neutron instruments. To resolve hard materials questions using this equipment, with a focus on energy and quantum materials.

Project Funding

This project was funded by the Swedish Research Council. The total amount of funding was 15,700,000 SEK.

Project Partners

• KTH: Martin Månsson

• European Spallation Source: Wai-Tung (Hal) Lee, Pascale Deen, Rasmus Toft-Petersen

• ISIS Neutron and Muon Source: Gøran Nilsen, Pascal Manuel

Contact Details

Project Coordinator,

Professor Elizabeth Blackburn

Synchrotron Radiation Research

Lund University

Box 118, Lund, Sweden

T: +46 46 2227152

E: elizabeth.blackburn@sljus.lu.se

W: https://portal.research.lu.se/en/persons/elizabeth-blackburn

W: http://www.sljus.lu.se/research-fields/magnetism-and-superconductivity/

https://doi.org/10.1051/epjconf/202328603004, https://doi.org/10.1051/epjconf/202328606002

Elizabeth Blackburn is a Professor of Physics at Lund University, a position she has held since 2018. She gained her degree at the University of Cambridge, and has worked in research at institutions in Europe and America. In her role at Lund she has helped to develop a hub of research activity on magnetic materials and the interesting physics brought about by the behaviour of electrons in solids.

Prof. Elizabeth Blackburn
The transmission wide angle polarization analyser for use at the ISIS Neutron and Muon Source manufactured by SwissNeutronics. Photo credit: SwissNeutronics (https://www.swissneutronics.ch/).

Defending liberal democracy

Political parties that challenge some of the most fundamental tenets of liberal democracy are on the rise across Europe. The team behind the AUTHLIB project is investigating the factors behind the growing appeal of illiberal forces, which can then inform the development of tools to defend liberal democracy against the challenges it faces, as Professor Zsolt Enyedi explains.

The ideal of liberal democracy is underpinned by certain fundamental principles, like constraints on power, accountability, transparency, and tolerance of different social attitudes and worldviews. However, over recent decades illiberal governments have been elected in Hungary, Poland, and Italy, while parties that question some fundamental tenets of liberal democracy are on the rise in other parts of Europe. “Parties that can be considered to be of the radical right are among the principal contenders for power in long-established democracies such as France and Austria for example, while some liberal democratic norms are being widely questioned,” says Zsolt Enyedi, Professor in the Department of Political Science at the Central European University (CEU) in Vienna. As a Principal Investigator in the AUTHLIB (Neo-authoritarianisms in Europe and the Liberal Democratic Response) project, an initiative which brings together researchers from eight universities and think tanks across Europe, Professor Enyedi is part of a team looking at the root causes of this shift in Europe in general and in seven countries in particular. “We look at attitudes among the wider population and at the ideologies produced by intellectuals. We also try to put them into historical and psychological contexts,” he outlines.

The AUTHLIB project

This research is inter-disciplinary in scope, with the project team using a variety of methods to investigate mass attitudes, political ideologies and their associated variations, and how changing attitudes are being translated into policy. The project’s agenda includes conducting surveys, lab experiments and quantitative text analysis, alongside other strands of research. “We have also analysed legislative texts and party manifestos in search of policy changes under illiberal governments. This methodological diversity is a unique aspect of the project,” says Bálint Mikola, a researcher at the CEU Democracy Institute who is also working on AUTHLIB. The wider aim in the project is to build a fuller understanding of why the liberal consensus has broken down. “What is it that makes illiberal initiatives increasingly appealing, across many different European countries?” asks Professor Enyedi. “As a final stage, we plan to come up with recommendations to support liberal democracy, building on empirical investigation into the challenges that it faces. We want to build our recommendations on a very detailed investigation of the data.”

Part of this work involves looking through speeches, party programmes, social media conversations and other materials to investigate the language of illiberalism and probe its appeal. This will help researchers differentiate between different types of illiberal forces, which Professor Enyedi says is a major priority in the project. “Some illiberal actors are motivated primarily by nationalist, nativist resentment for example, while others have a more religious fundamentalist character, and again others are inspired by leftist ideas,” he explains. These parties may also take very different positions on certain areas of policy. “Some parties and movements are fighting culture wars on all fronts, particularly on women’s issues and the treatment of sexual minorities. Other parties place less emphasis on these issues, and even accept some emancipatory ideas concerning gender equality and the treatment of sexual minorities,” continues Professor Enyedi. “We see some interesting differences between these parties. Some of them are very egalitarian and aim to help the poor. Others are more traditionally right-wing, in the sense that they focus not on redistribution, but more on keeping taxes low. Overall, these challengers have been able to show not only that there are alternatives to the liberal democratic mainstream, but also to innovate in terms of management and competence.”

The reasons behind growing popular support for these parties are correspondingly complex and varied, but Professor Enyedi says there is a general shift in the overall structure of the politics of advanced capitalist societies. Whereas in the past centre-left parties garnered a lot of their support from the working class, this group is increasingly supportive of illiberal forces, for a number of reasons. “One is that they simply don’t agree with the worldview and agenda of their more educated compatriots. They are less cosmopolitan, less progressive on many issues, and are more worried about the security and survival of their native culture,” outlines Professor Enyedi. Economic factors are also at play, with those who have lost out due to globalisation more likely to support illiberal parties than those who benefit from post-industrial market structures. “This polarisation goes a long way to explaining why people join illiberal movements,” says Professor Enyedi. “The urban/rural divide is also a factor in many countries, along with a general loss of optimism about the future, which was less common with previous generations.”

“What is it that makes illiberal initiatives increasingly appealing, across many different European countries?”

This is manifested in the growing belief that the young people of today will have a lower standard of living than their parents, and many blame liberal cosmopolitan elites for allowing this situation to develop. This may then lead people to turn to illiberal parties or those seen as being outside the political mainstream, many of which have attracted significant support in recent elections across Europe, which Professor Enyedi and his colleagues have been keeping a close eye on. “We are closely studying election results and reflecting on them. We ran surveys in seven countries following the 2024 elections to the European Parliament, and we will consider the impact of these elections on basic value orientations and how people think about the functioning of democracy,” he explains. The final stage of the project’s research centres around developing a toolkit to defend liberal democracy, based on a detailed understanding of the challenges it faces. “Our data will show the differences between the various groups of illiberals. From that point, policy makers can distinguish among their concerns and can develop better tailored responses,” explains Professor Enyedi.

Sharing results beyond academia

An e-learning platform is currently under development, which will represent one of the principal outputs from the project. There are also other public-facing aspects of the project’s work, as Professor Enyedi explains. “We are working for example on allowing any interested individual to play with the data we have gathered about ideological, political spaces in Europe. They will then be able to position themselves with respect to the positions of the parties and movements that we have mapped,” he says. Researchers also plan to write a number of policy papers to shed new light on the diversity of the illiberal landscape and how some of the illiberal challengers in Europe operate.

“We seek to share our findings and results about such developments beyond academia, with policy-makers and the wider public, as well,” continues Professor Enyedi. “Concurrently, we are also working on a number of academic articles and books, which will be published in different journals and volumes.”

AUTHLIB

Neo-Authoritarianisms in Europe and the Liberal Democratic Response

Project Objectives

The AUTHLIB project is investigating the root causes of the shift away from liberal democratic norms that has been observed across many European countries in recent years. Researchers from eight universities and think tanks across Europe are investigating the different varieties of illiberalism that can be observed across Europe, with the aim of developing a toolkit to defend and strengthen liberal democracy.

Project Funding

Funded by the European Union (Horizon Europe, Grant Nr.: 101060899) and UK Research and Innovation under the UK government’s Horizon Europe funding guarantee (Grant Nr. 10041578).

Project Consortium

Central European University • Charles University • Sciences Po • Scuola Normale Superiore • SWPS University • Transatlantic Foundation / The German Marshall Fund of the United States • University of Oxford • University of Vienna

Contact Details

Zsolt Enyedi

Professor

Central European University

E: enyedizs@ceu.edu

E: info_authlib@ceu.edu

W: https://www.authlib.eu

X: https://x.com/AuthlibEU

YouTube: https://www.youtube.com/@AuthlibEU

Facebook: https://www.facebook.com/AuthlibEU

Zsolt Enyedi is a Professor of Political Science at the Central European University in Vienna. The focus of his research interests is on party politics, comparative government, church and state relations, and political psychology, especially authoritarianism, prejudices, and political tolerance.

Professor Zsolt Enyedi

Probing the legacy of the Cape Colony

The Cape Colony was established by the Dutch East India Company in 1652 with the initial aim of providing a refuelling station for passing ships, but over time the colony expanded and the European population grew. We spoke to Professor Erik Green about his research into how the settler economy evolved, its impact on indigenous people, and its long-term legacy.

The Cape Colony on the southern tip of Africa was established by the Dutch East India Company (VOC) in 1652, and the early pioneers were subsequently followed by further European migrants, mainly from the Netherlands. The initial intention was to establish a refuelling station for ships passing around the Cape of Good Hope, but over time more settlements were established further inland from Cape Town. “Around 1660/1670 the VOC allowed small groups of Europeans to settle inland of Cape Town in Stellenbosch, then later on in Franschhoek and Drakenstein. The VOC needed these settlers to supply wheat and meat to the passing ships, as they still had problems trading with the indigenous Khoesan people,” explains Erik Green, Associate Professor in the Department of Economic History at Lund University. The plan was not to expand the colony, but once Europeans had been allowed to settle further inland, it started off a kind of self-perpetuating process. “Settlements just expanded and expanded, in search of new land,” says Professor Green.

Economic impact

As Principal Investigator of a research project based at Lund University, Professor Green is now looking at how the Cape Colony was initially established and then grew over the period between 1660 and 1840, as well as its wider economic impact and legacy. This research involves analysing several sources of data, including slave records and annual tax censuses, which Professor Green says provide very detailed, household-level information on people from a European background. “This includes how many cattle they own, their livestock, wagons and amount of crops produced. There’s also data on the assets they hold, right down to the numbers of candles. This allows for a detailed understanding of wealth accumulation in this part of the population,” he outlines. The project team also has access to information on the indigenous Khoesan population. “We have information from the early part of the 19th century, roughly 1800-1840. Most of the information comes from missionary stations, where the Khoesan lived in freedom,” continues Professor Green.

Researchers in the project are also analysing observations on wages for the Khoesan working for European farmers over the period 1800-1820. This information is pretty scattered, and covers a relatively short period, yet Professor Green says it still holds a lot of interest. “We’re able to compare the wages paid to Khoesan farm workers with those of Europeans who were being employed as servants to the company,” he outlines. While the data shows that on average Khoesan people earned much less than their European counterparts, there were exceptions. “A few Khoesan people seem to have earned more, and some were in fact fairly wealthy. We have identified around 40 or 50 Khoesan that were as wealthy as the average European farmer,” says Professor Green. “The VOC was a very weak authority – it ran a deficit every year throughout its presence in the Cape – and found it difficult to enforce rules, especially in the frontier regions. So decisions about wages and contracts were really taken by the individual Khoesan and the individual European farmer.”

Painting of indigenous Khoesan people and European settlers in southwestern Cape, c. mid-18th century.

The only data available on these more prosperous Khoesan people is their name and their wealth, so it’s very difficult to explain why they managed to do as well as they did. It’s also difficult to draw clear inferences about the wealth of individual Europeans, as the data presents quite a complex overall picture. “If you look at the annual wealth of the European farmers, we see that the numbers are extremely volatile. If we group them in percentiles according to their wealth, we see that they move across these boundaries on a very significant scale,” says Professor Green. A good year could be followed by quite a challenging year; Professor Green is investigating the importance of weather patterns in this respect. “We have annual rainfall data, and we’re working together with geographers in Johannesburg to see the extent to which annual volatility in output can be explained by rainfall patterns,” he continues. “Crops can be quite vulnerable in bad weather, not only to rainfall, but also to very strong winds.”

“Some of the research in the project is about using the Cape Colony and the rich source material that we have on it to contribute to wider debates about questions around colonialism, inequality and the rule of law.”
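As a rough illustration of the percentile analysis Professor Green describes, the Python sketch below assigns farmers to wealth quartiles in two consecutive census years and measures how many cross a quartile boundary. The data frame is invented toy data, not the project’s census records.

```python
# A minimal sketch of a wealth-mobility analysis: assign farmers to
# wealth quartiles in consecutive census years and count how many
# change quartile. All data here is made up for illustration.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
farmers = pd.DataFrame({
    "farmer": list(range(200)) * 2,
    "year": [1750] * 200 + [1751] * 200,
    "wealth": np.concatenate([
        rng.lognormal(3, 1, 200),   # toy wealth distribution, year 1
        rng.lognormal(3, 1, 200),   # toy wealth distribution, year 2
    ]),
})

# Quartile of each farmer within each census year:
farmers["quartile"] = farmers.groupby("year")["wealth"].transform(
    lambda w: pd.qcut(w, 4, labels=False)
)

# Share of farmers whose quartile changed between the two years:
wide = farmers.pivot(index="farmer", columns="year", values="quartile")
moved = (wide[1750] != wide[1751]).mean()
print(f"share changing wealth quartile year-on-year: {moved:.0%}")
```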

This work will help Professor Green and his colleagues in the project build a highly detailed picture of the Cape Colony, and reconstruct how relations between colonisers and indigenous people developed over time. Researchers are currently working on a number of papers, with some focused on the history of the Cape Colony, while others are wider in scope. “Some of the research in the project is about using the Cape Colony and the rich source material that we have on it to contribute to wider debates about questions around colonialism, inequality and the rule of law,” outlines Professor Green. The history of the Cape Colony can also tell researchers much about how colonies were established across the new world, and how labour was divided and rewarded to support economic development, a topic that Professor Green is exploring in the project. “I’m working on papers where we try to convince scholars that the history of the Cape Colony holds wider relevance,” he continues.

History of the Khoesan

The project’s research will also shed new light on the history of the Khoesan, a group that is still largely neglected in contemporary public debate around issues like redistribution of resources and land reform. While the African National Congress (ANC) – which forms part of the ruling coalition in modern South Africa following recent elections – has stated its commitment to land redistribution, the Khoesan are still marginalised to some extent. “Land dispossession that happened in the 18th century is not considered a matter for contemporary politics. Khoesan claims to land are still something of a no-go zone in South African politics,” explains Professor Green. By analysing material from the missionary stations and the data on Khoesan farm labourers, Professor Green will help make the history of the Khoesan people more visible. “The Khoesan people have lived in South Africa for a very long time. We want to shine a light on this history,” he says.

This work is very much ongoing, and new sources of information on the Khoesan people have come to light, which Professor Green hopes to explore in future. Alongside searching for more sources of information, Professor Green also hopes to incorporate the data that has been gathered into a longer timescale, building a fuller picture of the history of the Khoesan right up to the early part of the 20th century. “The only problem is that we have a gap between roughly 1840 and 1880, where we don’t have individual-level information. We need to find ways to deal with that gap,” he says.

THE ESTABLISHMENT, GROWTH AND LEGACY OF A SETTLER COLONY

The establishment, growth and legacy of a settler colony: Quantitative panel studies of the political economy of Cape Colony

Project Objectives

This research program aims to contribute to the large literature on colonialism and global inequality by – for the first time – using big data and machine learning techniques to analyze the roots and gradual development of a settler economy - the Cape Colony - its institutions and their impact on growth, inequality, and welfare over time and space.

Project Funding

Funded by Riksbankens Jubileumsfond (2021-2026, 29 425 000 SEK, Dnr: M20-0041)

Project Partners

Stellenbosch University • Utrecht University • Leiden University • Massachusetts Institute of Technology • University of Colorado • University of California, Davis

Contact Details

Project Coordinator, Erik Green

Senior Lecturer, Associate Professor

Department of Economic History, Lund University

P.O. Box 7083, 220 07 Lund, Sweden

T: +46 736 816900

E: erik.green@ekh.lu.se

W: https://portal.research.lu.se/en/projects/the-establishment-growth-and-legacy-of-a-settler-colony-quantitat

Erik Green is an associate professor in economic history. His area of expertise is the economic history of colonial Sub-Saharan Africa, with a focus on agrarian change, labour relations, inequality and long-term structural change. He has published extensively on issues related to agrarian change, rural labour relations, slavery, and economic and sectoral change in southern, eastern and western Africa.

Professor Erik Green

Learning from the history of financial crises

The last hundred years have been marked by a number of financial crises, from the Great Depression of the 1930s to the global financial crisis of 2008. Researchers in the Mercator project are looking at the way these events are remembered and how memories of them influence the bankers of today, as Professor Youssef Cassis explains.

The 2008 financial crisis had a dramatic impact, leading to a sustained recession across many of the world’s most developed economies. The roots of the crisis lay, among other things, in excessive lending and reckless behaviour by some bankers at major financial institutions, a topic that Professor Youssef Cassis and his team are investigating in the ERC-backed Mercator project. “We are interested in what led up to the 2008 crisis,” he outlines. The project’s research centres around probing commercial and investment bankers’ memories and knowledge of previous financial crises, to assess whether - and how - those recollections have influenced their approach to their work. “The project is about memory. We are interviewing people who were in senior executive positions at top American and European banks during the 2008 crisis (chairs, CEOs, CFOs, CROs), who had a global vision of what happened and made strategic decisions. What experience, memory, knowledge of financial crises did they have when the sub-prime crisis started? Had they themselves gone through previous crises?” explains Professor Cassis.

This research has uncovered an extremely diverse range of experiences. These include the junk bond crisis of 1990, when the New York-based bank Drexel Burnham collapsed, and the European real estate crisis of the early ‘90s, while Professor Cassis says the events of 9/11 also influenced many bankers. “The 9/11 terrorist attacks led to a geopolitical crisis, with consequences across the Middle East, but they also caused severe financial issues,” he says. “What also comes out is a difference between the macro level, relating to the financial system as a whole or the national economy of a country, and the micro level, which is about what happens at the level of the company. Not surprisingly, however, bank executives’ experiences of financial crises have been at the micro level.”

Financial crises differ in character, and the criteria by which their severity should be assessed are still the subject of debate; one important issue is the risk of contagion into other parts of the financial system. “The level of systemic risk is important, as the inter-links and relationships between institutions can lead to a domino effect,” continues Professor Cassis. “Some banks are characterised as systemic banks, and they are subject to stress tests every so often by the monetary authorities, in order to see what will happen if they fail.”

The project is also interested in the memory of the bankers in positions of responsibility at major financial institutions today, more than 15 years after the collapse of Lehman Brothers, which marked a key point in the development of the crisis. For this reason Professor Cassis is also conducting interviews with people from this group, looking at how their experience and recollections of the 2008 crisis affect the way they work. Many of those interviewed have clear memories of what happened in 2008, as they were already in fairly senior positions at the time, but their insights and experience may be lost when they retire. “What will happen in 15 years’ time, when many of these people will have retired? Will any memories remain?” asks Professor Cassis. “If the memory of the Great Depression is anything to go by, it is doubtful that the memory of the Global Financial Crisis of 2008 will survive until the late 2030s.”

Crowd gathering at the intersection of Wall Street and Broad Street after the 1929 crash.

The cultural memory of financial crises

The events of 2008 will eventually become part of the cultural memory of financial crises, and will be subject to re-interpretation and analysis by subsequent generations, like the other major events Professor Cassis and his team are considering. Alongside the 2008 crisis, researchers are also looking at three other major financial events: the Great Depression of the ‘30s, the international debt crisis of 1982 and the Asian financial crisis of 1997. “We’re analysing the way that these four financial crises are remembered,” explains Professor Cassis. While it’s of course not possible to interview people directly involved in the Wall Street Crash, there is no shortage of material in films, books, newspapers and other forms of media. Every generation reinterprets the Great Depression in the light of the concerns of the time and wider trends in economic thought. “After the Great Depression we moved towards a more regulated economy, then from the 1970s onwards in a more liberal direction,” continues Professor Cassis. “The narrative of events can change over time. We’re looking at whether the answers individual bankers provide to our questions are compatible with the narrative of the day.”

Researchers are also looking at how changes in memory and narratives have influenced the process of regulation and, especially in the last forty years, deregulation. Two particularly interesting cases stand out here amongst the regulatory decisions that Professor Cassis and his team are examining. “The Glass-Steagall Act of 1933, which separated commercial banking and investment banking in the United States, was abrogated in 1999. This was probably the most potent symbol of the regulation that marked the post-Second World War period,” outlines Professor Cassis. “We’re also looking at the decision not to regulate over-the-counter derivatives by the Commodity Futures Modernization Act (CFMA) in the United States in 2000.”

“What experience, memory, knowledge of financial crises did senior executives have when the sub-prime crisis started in 2008? Had they themselves gone through previous crises?”

An individual’s outlook is typically shaped by the economic circumstances in which they grow up and the prevailing narrative of the time, for example the general trend towards deregulation and market freedom in the US and the UK in the ‘80s under Ronald Reagan and Margaret Thatcher. Many of those in charge of financial institutions in 2008 entered the workforce during this period, often bringing with them a very different outlook from their predecessors, another topic Professor Cassis and his team are addressing. “An important question addressed by the project is whether a new financial elite emerged in the late twentieth and early twenty-first century. A prosopography (collective biography) of more than 500 leading bankers, looking in particular at their studies and careers, reveals some limits to the internationalisation of the financial elite, whether in terms of board composition or education and training,” he says. “The growing number of MBAs and science degrees should not obscure the persistence of national traditions in higher education; and national peculiarities continued to permeate career patterns.”

World map showing GDP real growth rates for 2009, based on CIA World Factbook estimates as of April 2010. Image: Wikimedia Commons, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=10058473

MERCATOR

The Memory of Financial Crises: Financial Actors and Global Risk

Project Objectives

The MERCATOR project explores the extent to which the memory, or absence of memory, of previous financial crises can explain certain practices within the financial system. The project stems from two simple questions: do financial actors have any knowledge, memory and understanding of previous financial crises? And, more generally, how far are they aware of the inherent instability of the financial system?

Project Funding

This project has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement No 884910).

Project Team

Mr Bruno Pacchiotti, Research Associate; Dr Alice Pearson, Research Fellow (financial crises and the teaching of economics); Dr Tobias Pforr, Research Fellow (memory and the process of financial regulation and deregulation); Dr Giuseppe Telesca, Research Fellow (the cultural memory of financial crises); Dr Niccolò Valmori (prosopography of the financial elites, 1980s-2000s).

Contact Details

Project Coordinator, Professor Youssef Cassis

Villa Schifanoia, via Boccaccio 121, IT-50133 Florence T: +39 055 4685 820

E: youssef.cassis@eui.eu

W: https://mercator.eui.eu/

Youssef Cassis is Professor at the European University Institute, Florence, and Visiting Professor at the London School of Economics and Political Science. He has published widely on business, banking and financial history, including Capitals of Capital. A History of International Financial Centres, 1780–2005 (Cambridge University Press, 2006, 2nd ed. 2010); Crises and Opportunities. The Shaping of Modern Finance (Oxford University Press, 2011), and The Oxford Handbook of Banking and Financial History (Oxford University Press, 2025).

Wider changes also took place in how economics was taught. “There was an increased ‘mathematisation’ of the discipline from the ‘70s and ‘80s onwards. Where previously there was a lot of space for the teaching of history and political economy, with essay-based exams, from the ‘70s onwards there was a general shift towards an increased emphasis on mathematics and economic modelling,” says Professor Cassis. “This reflects the rise of neoclassical economics and the theory of rational expectations. The turning point is around the ‘70s/’80s, and this has important implications for the way in which we deal with financial crises.”

Financial stability

Most of those who cut their professional teeth during this period weren’t directly confronted with the instability of the financial system in the early stages of their careers, and began to take up positions of responsibility in the years leading up to 2008. This generation had not really experienced a major financial crisis before 2008, and to some extent had perhaps forgotten that they could occur. “There was a belief that risk could be mastered, that technology enables better control of financial markets,” outlines Professor Cassis. Some degree of instability is however inherent to the financial system, he believes. “Banks take short-term deposits and loans from the money market, and transform them into long-term loans or investments, and there is an inherent instability there,” he continues. “It may not be possible to prevent all financial crises, but what should be avoided are the very serious financial events, like that of 2008, or the Great Depression. There may be more minor crises, which are localised or limited to a single institution.”
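The mismatch Professor Cassis describes can be illustrated with a toy balance sheet. The sketch below is purely illustrative and the figures are invented; it simply shows why a bank that funds long-term loans with short-term deposits is exposed if enough depositors withdraw at once.

```python
# Toy illustration of maturity transformation; all figures are invented.
deposits = 100.0        # short-term liabilities, withdrawable on demand
liquid_reserves = 10.0  # cash the bank can pay out immediately
# The remaining 90.0 sits in long-term loans that cannot be called in quickly.

for withdrawal_share in (0.05, 0.10, 0.25):
    demanded = deposits * withdrawal_share
    shortfall = max(0.0, demanded - liquid_reserves)
    if shortfall == 0:
        print(f"{withdrawal_share:.0%} of deposits withdrawn -> bank meets withdrawals")
    else:
        print(f"{withdrawal_share:.0%} of deposits withdrawn -> shortfall of {shortfall:.0f}")
```

Once withdrawals exceed the liquid buffer, the bank cannot pay without selling illiquid assets at a loss; this is the inherent fragility the quote points to.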

The nature of a future financial crisis is very difficult to predict and prepare for, but there are some measures which banks can take, such as maintaining an adequate level of capital and liquidity. More broadly, remembering previous financial crises will remind bankers themselves that these types of events can happen, which may then influence their approach to their work. “Even when everything seems rosy, the market is booming and the economy is doing well, things can still go wrong,” says Professor Cassis. The team is planning to publish a book bringing together the project’s results, which Professor Cassis says will present a very detailed, nuanced picture of events.

“The book will show the complexity of events. It will try to show that memory is important, but also quite an elusive concept,” he outlines. “We hope to attract the attention not only of regulators, but also of commercial and investment bankers. We know very little about their views, unlike those of regulators, central bankers and policymakers, who have attracted far greater attention, as their role involves thinking about the economy as a whole, not only their own institution.”

A financial crisis typically has many causes, but the behaviour of bankers can be a contributory factor. Some level of risk is inherent in the financial system, yet an awareness of history can also benefit bankers and help limit the possibility of a new financial crisis developing. “Memory is important, because it gives you landmarks, and it reminds you that there is an inherent instability in the system,” says Professor Cassis. “Bankers who keep that in mind will tend to be more cautious.”

Professor Youssef Cassis
