
Procedural characters interacting with the environment. Work of ESR 3, Rafael Andrés Blanco Guerra (UPC).

Social activity in Saint-Malo during the 2nd training workshop in Rennes, France.

Photo of the ESRs’ presentations during the 3rd training workshop in Barcelona, Spain.

Virtual characters for realistic scenarios

A variety of skills are required to develop realistic virtual characters. The aim of the CLIPE project is to train the next generation of researchers in virtual humans and to create more realistic virtual characters capable of interacting naturally with humans, as Yiorgos Chrysanthou, Nuria Pelechano, Nefeli Andreou and Rafael Blanco explain.

The market for augmented reality (AR) and virtual reality (VR) technologies continues to grow, with people across the world playing games that bridge the real and virtual worlds, a trend which demands the development of sophisticated and realistic virtual characters. There are also many applications in education, architecture, medicine and engineering that require immersive collaborative VR experiences populated with virtual characters. This issue lies at the heart of the EU-funded CLIPE project, an initiative bringing together academic and commercial partners from across Europe to provide training to early stage researchers (ESRs). “We’re creating the technology to generate digital characters. We’re trying to make them more realistic and to improve the way that they interact with humans,” says Nuria Pelechano, an Associate Professor at UPC in Barcelona and a member of the CLIPE team. The primary focus here is on the behaviour and animation of the virtual characters, rather than their visual appearance. “We’re interested in developing virtual characters that can populate virtual environments or move around us, within an environment populated by physical agents, in a way that is as human-like as possible,” explains Professor Pelechano.

CLIPE project

Making the characters behave in a more realistic way is central to establishing trust between a human and a virtual character. If virtual characters don’t have natural-looking facial expressions then a human might struggle to trust them, which hinders interaction. “Then it’s hard work to establish communication between a physical agent and a virtual character,” points out Professor Pelechano. Making these characters more realistic is a correspondingly important aim in the project. “We develop algorithms, evaluate them, and identify what is missing – we look at what works and what doesn’t. Then we can go back and improve the algorithm,” says Yiorgos Chrysanthou, Professor of Computer Science at the University of Cyprus and coordinator of CLIPE. “We are working to improve the animations of different characters when immersed in virtual reality worlds. It may be that there are trade-offs, where we have to assess what’s more important in terms of the realism of the virtual character and interactivity, for example between how much to use pre-captured movements or computer-generated ones.”

The ideal scenario would of course be to fix all of the different aspects at the same time, but typically there are trade-offs involved and decisions have to be made. Rather than simply deciding which specific features to focus on, Professor Chrysanthou says perceptual data is used to identify priorities. “It might be that we particularly want to get the facial expressions right,” he outlines.

Beyond making the virtual characters look more realistic and conveying facial expressions and emotions, Professor Chrysanthou and his colleagues in the project also work on the ease of defining certain things. “For example, if you want to populate a large environment with a large number of humans, how can you do that more easily, so that you don’t need to spend lots of time minutely specifying each person and each aspect of their behaviour? So we are also looking at procedural authoring of crowds, for example,” he says. “The state-of-the-art in virtual humans is quite advanced, but this is one of the areas that could be improved further.”

A major challenge here is that the behaviour of a group of individuals can be difficult to simulate. While somebody may have left their home with a clear agenda, those plans are subject to change at any point. “An individual might meet somebody in the street and start talking about the major topic of the day, or maybe they’ve forgotten something at home and decide to turn around abruptly. That kind of behaviour can be difficult to incorporate in a virtual crowd,” explains Professor Pelechano.

The movement of a crowd also varies according to the situation, an issue which Professor Pelechano is taking into account in her research. “We have sophisticated methods to learn about the movement of a crowd and to copy it,” she says. “A crowd in a busy subway station, for example, doesn’t behave in the same way as a crowd in a shopping centre, or a crowd in a pub. We can try to build stronger foundations so that we can then extrapolate to different situations.” This would make it much easier to populate an environment with virtual characters,

EU Research

giving the appearance of a real crowd without the need to laboriously create each individual within it. This would benefit many different application domains, helping accelerate development. “It would help the game industry and the movie industry, as well as the simulation and training industries,” says Professor Pelechano. This would also make virtual characters more accessible to people who may not have a technical background, or limitless resources to spend on animation. “We are trying to combine different techniques that can make the process of populating different environments easier for the general public,” continues Professor Pelechano. “This is also important with the concept of the metaverse. People are working to create normal surroundings for the metaverse, like cities, houses and environmental features, but you will also want to see people in there.”

Photo taken during the visit to the Immersia lab, as part of the 2nd training workshop activities in Rennes, France.

Training

The ESRs in the project gain a grounding in a wide range of different techniques, equipping them with the skills required to develop realistic virtual characters, for which demand is growing. There are several industrial partners involved in the project, testament to wider interest. “There is the special-effects industry for movies, the games industry and also the online retail industry, where there is interest in using virtual characters for trying on clothes. In that latter case, appearance might be more important than behaviour,” outlines Professor Pelechano.

There are 15 ESRs in CLIPE working on different research projects, around the core aim of improving the simulation and animation of virtual characters. “The focus is on enhancing the realism of virtual characters in urban, populated environments,” says Professor Pelechano. “The project is not about building one specific application or developing a specific research idea. Rather, it’s about training researchers who could be leaders in the industry in the future.” This is not just about technical knowledge, but also ‘soft’ skills like grant writing. The aim is to equip the students with the broad range of skills they will need to keep pushing forward development as their careers progress. “We aim to help prepare students for their future careers, whether that’s in academia or industry,” says Professor Pelechano.

With many of the students entering the final year of their PhDs, Professor Chrysanthou hopes to see many more research papers published over the coming years, which could then lead on to commercial development. “It may be that some of the ideas have a lot of commercial potential. We’re also going to hold some further training workshops,” he continues.

Photo of the STAR presentation at the Eurographics 2022 conference in Reims, France, where some of our CLIPE students had the opportunity to present their work.



Project Objectives

Project Funding

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 860768.

Project Partners

• University of Cyprus
• Universitat Politecnica de Catalunya
• INRIA
• University College London
• Trinity College Dublin
• Max Planck Institute for Intelligent Systems
• KTH Royal Institute of Technology, Stockholm
• Ecole Polytechnique
• Silversky3d

Industrial partners:
• Treedy’s SPRL
• British Broadcasting Corporation
• Golaem
• Ubisoft

Contact Details

Yiorgos Chrysanthou
T: +357 99 582812
E:
Nuria Pelechano
E:
Marios Kyriakou
E:
Olia Tsivitanidou
E:
W:

Left to right: Nefeli Andreou, Rafael Blanco, Nuria Pelechano, and Yiorgos Chrysanthou

CLIPE Project ESRs

There are fifteen ESRs studying for a PhD in the CLIPE project, involving research into several different issues around the simulation and animation of virtual characters. We spoke to two ESRs, Nefeli Andreou and Rafael Blanco, about their projects and their plans for the future.

Motion capture data

EU Researcher (EUR): What is the main focus of your research?

Nefeli Andreou: My particular focus is to take the 3D motion capture data acquired by us or other research groups, and try to build generic models of human motion. As a second step, we aim to edit the generated motions. So it’s not only about learning patterns from the capture data, but also modifying aspects of the data, such as their style or expression.

EUR: How do you use the motion capture data?

NA: In skeletal animation, we transform the raw data (markers on the body) obtained from the motion capture system into joint rotations and displacements in space. These features are then used as input to our deep learning models. There are several ways of parametrizing 3D joint rotations, and recent work shows that this choice has a real impact on performance. In our recent work, we conducted experiments to examine the performance of these representations and proposed a novel formulation based on dual quaternions which is better suited to a deep learning framework.
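To give a flavour of the parametrization being discussed, the sketch below packs a joint’s rotation (a unit quaternion) and its translation into a single dual quaternion. This is a generic, minimal illustration of the representation, not the authors’ actual code; the function names are ours.

```python
# Minimal sketch (not the project's code): packing a joint's rotation and
# translation into one dual quaternion, the parametrization mentioned above.

def qmul(a, b):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def to_dual_quaternion(rotation, translation):
    """Combine a unit rotation quaternion and a translation vector into
    a dual quaternion, returned as (real part, dual part)."""
    tx, ty, tz = translation
    # Dual part = 0.5 * t * q, with t treated as a pure quaternion (0, tx, ty, tz).
    dual = tuple(0.5 * c for c in qmul((0.0, tx, ty, tz), rotation))
    return rotation, dual

# Identity rotation with a translation of (1, 2, 3):
real, dual = to_dual_quaternion((1.0, 0.0, 0.0, 0.0), (1.0, 2.0, 3.0))
print(real)  # (1.0, 0.0, 0.0, 0.0)
print(dual)  # (0.0, 0.5, 1.0, 1.5)
```

One appeal of this form for learning is that rotation and displacement live in a single algebraic object, so a network can predict both jointly rather than as separate streams.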

EUR: What is the result of that in terms of the simulation and animation of a virtual character?

NA: It’s less jittery and more stable. We also found that our models converged faster, so we would get a stable motion with less training time.

EUR: Have you had the opportunity to travel and spend time at other institutions during the project?

NA: Yes, and it’s really been a great experience to be able to interact with different partners, as each of them offers a different perspective. I spent some time at the Max Planck Institute for Intelligent Systems last year, where I got a different viewpoint, from the computer vision aspect. Currently I’m in Paris at the Ecole Polytechnique, where I’m looking into more creative ways to control the motion.

Nefeli Andreou is a PhD student at the University of Cyprus. She holds an MSc in Data Science and a BSc in Mathematics.

Rafael Blanco is a PhD student at the Polytechnic University of Catalonia. His research interests include procedural modeling and authoring tools.

Nuria Pelechano is an Associate Professor at the Polytechnic University of Catalonia. Her research interests include simulation and animation of virtual characters and crowds for VR.

Yiorgos Chrysanthou is a Professor in the Computer Science Department at the University of Cyprus, where he heads the Computer Graphics lab.

EUR: Have you also been able to collaborate with other ESRs in CLIPE and attend events?


NA: I presented my work at a training workshop in Barcelona recently, where I was able to chat with most of the other ESRs and network with speakers from industry and academia. In the future, I would like to work at the intersection of computer graphics and computer vision.

Crowd simulation

EUR: What is the main focus of your research?

Rafael Blanco: I am trying to build a process for crowd simulation. We want to link a city and an agent using an agenda. I aim to provide an agenda for individual agents; the group of agents then creates the crowd. Building an agenda is essentially like scheduling the activities that the agent will carry out during the day.
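The agenda idea above can be sketched very simply: each agent carries an ordered list of timed activities, and the crowd emerges from many agents following their own agendas. This is a hypothetical illustration with made-up names, not the tool being developed in the project.

```python
# Minimal sketch (hypothetical, not the project's tool): an agent's daily
# agenda is an ordered list of timed tasks; a crowd is just many such agents.
from dataclasses import dataclass, field

@dataclass
class Task:
    start_hour: int   # hour of day when the activity begins
    place: str        # semantic location in the city, e.g. "office"
    activity: str     # what the agent does there

@dataclass
class Agent:
    name: str
    agenda: list = field(default_factory=list)

    def current_task(self, hour):
        """Return the most recently started task at the given hour, if any."""
        active = [t for t in self.agenda if t.start_hour <= hour]
        return max(active, key=lambda t: t.start_hour) if active else None

alice = Agent("alice", [Task(9, "office", "work"),
                        Task(13, "restaurant", "lunch"),
                        Task(14, "office", "work")])
print(alice.current_task(13).place)  # restaurant
```

A simulator would then, at each tick, look up every agent’s current task and steer them toward its location; plan changes of the kind Professor Pelechano describes amount to editing the agenda mid-simulation.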

EUR: What tools are you using in your research?

RB: The software I was previously using is very similar to that used in Pokémon Go, but it is designed for the exterior of a simulation, while I am interested in working with the interior. So in the end I had to create an implementation from scratch. There are several improvements that can be made, for instance adding several different semantic places, like offices and other locations in a city. I am working towards this.

EUR: And the behaviour of the agents varies in these different locations?

RB: An important point in my research is that I am helping the agents learn how they should behave inside certain objects. For example, in some fast food restaurants you go to a panel and ask for your order, while in other restaurants you first wait for a staff member to arrive and tell you where your table is.

EUR: What do you hope will be the outcome of your research?

RB: I aim to develop a tool for procedurally generating crowds in a city, which until now has been difficult. This tool will open up the possibility for people like artists and designers to use this procedural generation to populate interactive environments.


Tairan (ESR 6) at the Motion Capture Hall, Max Planck Institute for Intelligent Systems (MPIIS).

The primary objective of CLIPE is to train a generation of innovators and researchers in the field of virtual character simulation and animation. Advances in technology are pushing towards making virtual worlds a daily experience. While virtual characters are an important component of these worlds, bringing them to life and giving them interaction and communication abilities requires highly specialized programming combined with artistic skills, and considerable investment. The research objective of CLIPE is to design the next generation of VR-ready characters that are more controllable and behave and interact more naturally.
