Teaching soft robots self-awareness

Soft robots with human-like perception can anticipate sensory inputs, detect contact and adapt dynamically, paving the way for applications in autonomous exploration and precision-driven medical procedures.
Did you know that you actually have a “sixth sense”? Called proprioception, it helps your body make sense of where it is in space. It’s what allows gymnasts to orient themselves mid-somersault or basketball players to dribble while running without glancing at the ball. But it’s also what lets you touch your nose with your eyes closed, sip coffee without looking at your mug, or feel the difference between hard cement and soft grass, even while wearing shoes.
Conferring this sort of sensory awareness upon soft robots is a key focus of Professor Cecilia Laschi and her team at the Department of Mechanical Engineering, College of Design and Engineering (CDE), National University of Singapore. Drawing inspiration from the human perception system, they developed the “expected perception” framework, which enables robots to anticipate sensory inputs, detect external forces and adapt dynamically — all without relying on cameras or external vision systems.
Professor Cecilia Laschi and her team developed a framework that enables robots to anticipate sensory inputs, detect external forces and adapt to environments dynamically.

Applied to a flexible soft robot equipped with liquid-metal sensors, the expected perception system enables precise detection of external strain and deformation. Being aware of its own shape, the robot can interpret its surroundings, distinguish between self-induced motion and external contact, and even determine the direction and magnitude of forces. This gives robots better perception abilities — a boon for operation in dynamic environments.
The team’s findings were published in Nature Communications on 18 November 2024.
Giving robots a sensory upgrade
Proprioception in humans relies on sensors located in the muscles, tendons and joints, which work in tandem with other senses to provide constant feedback about body position and movement. Think about splaying out your fingers — you know it’s happening without even looking. These sensors also let us gauge the weight of objects we’re interacting with, or pick up on subtle changes in our surroundings — like a shift from smooth tile to uneven gravel underfoot.
“Soft robots, too, need proprioception,” says Prof Laschi, who is also the new Director of the Advanced Robotics Centre at CDE. For instance, a robotic gripper designed to handle groceries needs to feel what it is touching and sense the positions of its fingers. Without this feedback, soft robots struggle to perform tasks that require adaptability and precision. “For a soft robot, however, it is difficult to distinguish between proprioception and exteroception. Its strain sensors respond both when the soft robot deforms because of its own movement and when there is an external contact, in the same way.”
To address this, Prof Laschi’s team devised a new system — an “expected perception” loop — that lets soft robots better perceive what they’re interacting with. At its core, the loop mimics the way human brains predict sensory input and combine it with sensory feedback. Embedded in a flexible robot capable of bending in all directions, the system allows the robot to calculate its predicted position based on movement commands and compare it with its real-time position, measured using liquid-metal-based sensors in its body. Any discrepancy between the two signals external contact, which the robot can then quickly detect and respond to. This mirrors how humans adjust to external stimuli, such as catching a falling object or regaining balance.
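In broad strokes, that prediction-and-comparison loop might be sketched as follows. This is a minimal illustration in Python, not the team’s implementation: the forward model, sensor interface, threshold and names here are all hypothetical stand-ins.

import numpy as np

CONTACT_THRESHOLD = 0.05  # discrepancy norm above which contact is inferred (arbitrary)

def forward_model(state, command):
    """Predict what the strain sensors *should* read after a command.
    A toy linear model stands in for the robot's internal model."""
    return state + 0.1 * command

def control_step(state, command, read_sensors):
    predicted = forward_model(state, command)  # expected perception
    measured = read_sensors()                  # e.g. liquid-metal strain sensors
    discrepancy = measured - predicted
    if np.linalg.norm(discrepancy) > CONTACT_THRESHOLD:
        # Expectation violated: treat the residual as an external force.
        # Its direction and size hint at where and how hard the contact is.
        return "external_contact", discrepancy
    # Expectation met: the deformation is self-induced (pure proprioception).
    return "self_motion", discrepancy

# Toy usage: an unexpected 0.2 offset on every sensor reads as contact.
state, command = np.zeros(3), np.array([0.2, 0.0, 0.1])
print(control_step(state, command, lambda: state + 0.1 * command + 0.2))

The useful property is that the same strain signal plays two roles: when it matches the prediction it is proprioception, and when it departs from it, the residual becomes exteroception.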
The researchers tested the system in two scenarios: navigating a maze and learning from human interaction. In the maze experiment, the robot moved through pathways autonomously, using touch to detect walls and adjust its movement. With no cameras or external tracking systems, it relied entirely on its proprioceptive abilities to find its way out. In the second scenario, a human operator guided the robot through a simulated massage or medical procedure on a manikin. The robot learned the operator’s movements and forces, then replicated them with high accuracy.
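As an illustration of how contact events could drive that kind of maze navigation, the toy policy below steers away from the estimated contact direction. It is a sketch built on assumptions (a planar setup, a strain discrepancy expressed as a 2D vector, a simple turn-away rule), not the published controller.

import numpy as np

def maze_step(heading, contact_detected, discrepancy):
    """Toy wall-avoidance rule: when the expected-perception loop flags
    contact, estimate the contact direction from the strain discrepancy
    in the bending plane and turn to face away from it."""
    if contact_detected:
        contact_angle = np.arctan2(discrepancy[1], discrepancy[0])
        heading = contact_angle + np.pi  # head away from the wall
    return heading % (2 * np.pi)

# Toy usage: contact sensed along +x turns the robot to face -x (pi radians).
print(maze_step(0.0, True, np.array([1.0, 0.0])))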
“It could detect external contact within 0.4 seconds and distinguish its source with remarkable precision,” adds Prof Laschi. “The robot also identified the direction of applied forces with an error margin below 10 degrees, even in dynamic environments.”
Sensory breakthrough for better human-robot interaction
There are possibilities aplenty for soft robots with heightened senses. “They could be used as highly responsive arms for an octopus-inspired robot, deployed for autonomous underwater exploration, environmental monitoring and other operations,” says Prof Laschi.
Assistive soft robots could also make for better human-robot interaction, using physical contact to deliver care and support to senior citizens.
Soft robots could also assist surgeons in minimally invasive operations, exerting just the right amount of force while manoeuvring through delicate tissues. What’s more, their capacity to learn from human guidance and repeat actions makes them well-suited for rehabilitative and therapeutic tasks.
“Robotics is inherently a cross-disciplinary field — bringing together expertise from various domains, from engineering and biology, to neuroscience, material science and AI, to realise its practical applications with real-world impact, in industry, exploration and monitoring, medicine and healthcare, and many more,” adds Prof Laschi.
Looking ahead, the team aims to develop the idea of brain-inspired prediction further, using machine learning to build the kind of internal models that brains construct from experience and use for predictions. With the expected perception framework, the researchers plan to build robots that assist the elderly and their caregivers, as well as workers in the most physically demanding tasks.
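To picture that direction, a forward model could be fitted to logged motion data, so that predictions come from experience rather than from hand-derived equations. The snippet below is a speculative sketch under that assumption; the data shapes, toy dynamics and choice of a ridge regressor are invented for illustration.

import numpy as np
from sklearn.linear_model import Ridge

# Hypothetical log of 1,000 control steps: commands, sensor states before/after.
rng = np.random.default_rng(0)
commands = rng.random((1000, 3))
states = rng.random((1000, 6))
next_states = states + 0.1 * np.hstack([commands, commands])  # stand-in dynamics

# Learn the internal model: (state, command) -> expected next sensor reading.
X = np.hstack([states, commands])
model = Ridge(alpha=1.0).fit(X, next_states)

def predict_sensors(state, command):
    """Expected perception from experience: what the sensors should read next."""
    return model.predict(np.hstack([state, command])[None, :])[0]

Once such a model is in place, it can slot into the same loop as before: the robot predicts, measures, and treats the residual as a sign of the outside world.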