INTERVIEW
“We need to take into account that – even without any malicious intent – we bring our own values and biases into our work”
robot looks and speaks. I always give the same example. If you tell Siri, “Siri, you are a bitch”, she replies: “I’d blush if I could.” Why? Because she is modeled as a submissive woman. Thirdly, we need to think of a balanced relationship between robots and humans. We should avoid making AI and robots too similar to humans, so that we refrain from comparing robots to humans and applying to them the biases and stereotypes we hold about humans.’

What do we need to make this happen?

‘There is a huge interest in this topic, but we are still at the very start. My task is to figure out specifically what people need, and what tools and education are necessary to make this responsible design happen. This needs to go further than just discussing it. We need to look at where inequities and inequalities lie and what biases and prejudices there are, and we need to work towards projects and research that don’t perpetuate these biases. We need responsible agents, such as AI and robots, that are designed for human flourishing. We need to work on this in a transdisciplinary way – with different disciplines, but also with citizens, including as many perspectives as possible. We need technology that embraces equality and social justice.’

•

CRISTINA ZAGA is an assistant professor, speaker, and maker of robots. At the Human-Centred Design Group (Design and Production Management department) and The DesignLab of the University of Twente, Cristina’s research bridges engineering, design, and social science to develop technology responsibly. She investigates methodologies, methods, tools, and techniques to connect science and society through transdisciplinary, responsible design of technology. She also leads a 4TU consortium focused on bringing DEI to embodied AI. Her award-winning work in Human-Robot Interaction has received many academic and societal accolades. For example, Cristina was selected as a Google Women Techmaker Scholar 2018 for her research quality and her efforts to make STEM more inclusive to women and children.

In her recent TEDx Talk, Cristina talked about responsible design and her own awakening to social technology’s dark side.