The robot expresses joy when gently stroked./Courtesy of Ulsan National Institute of Science and Technology

If someone suddenly taps you on the shoulder, your eyes might widen or you might flinch. People respond instantly to such stimuli, and when the same stimulus is repeated, their reactions gradually fade or change. Researchers in Korea have developed a robot that mimics this 'flow of emotional changes.'

Professor Lee Hee-seung of the Ulsan National Institute of Science and Technology (UNIST) and his research team announced on the 8th that they had developed an adaptive robot that expresses emotions through its eyes and movements, with responses that change over time.

The robot expresses a total of six emotions through combinations of eye shape, color, and movement. Stimuli are given by petting or tapping the robot's head: petting is recognized as a positive stimulus, tapping as a negative one. For example, if the robot is suddenly tapped, its eyes widen and turn blue, and it leans back to express surprise. When the same stimulus is repeated, the expressed emotion is determined by the cumulative value of previous emotional states and stimuli, rather than by simply replaying the same response.

When negative stimuli are repeatedly applied by tapping the robot, its emotions shift naturally from surprise to anger to disgust./Courtesy of Ulsan National Institute of Science and Technology

These adaptive expressions recreate an emotional flow similar to that of real people. In user evaluations, many participants noted that the slightly different responses to the same stimulus, depending on the situation, felt impressive and unlike simple mechanical reactions. More than 80% of participants rated the emotional expressions as 'natural and vibrant.'

The research team modeled emotion not as a fixed state but as a vector, a physical quantity that changes over time, and built this into the robot's control model. Strong stimuli rapidly increase the magnitude of the emotional vector, while weak stimuli shift the response gradually.
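The idea of a decaying, accumulating emotion vector can be sketched as a simple leaky integrator. Everything below, the emotion set, the decay rate, the habituation thresholds, the method names, is an illustrative assumption for this sketch, not the research team's actual model.

```python
# A minimal sketch of the "emotion as a time-varying vector" idea.
# All names, thresholds, and decay rates here are assumptions.

EMOTIONS = ["joy", "surprise", "anger", "disgust", "sadness", "fear"]  # assumed set

class EmotionState:
    def __init__(self, decay=0.7):
        # The emotional state is a vector whose components fade between stimuli.
        self.v = [0.0] * len(EMOTIONS)
        self.decay = decay
        self.negative_exposure = 0.0  # cumulative value of past negative stimuli

    def update(self, stimulus, intensity=1.0):
        """Accumulate a stimulus into the emotion vector; return the expressed emotion."""
        if stimulus == "pet":                  # positive stimulus
            target = "joy"
            self.negative_exposure *= 0.5      # assumed: positive contact calms habituation
        elif stimulus == "tap":                # negative stimulus
            self.negative_exposure += intensity
            # Assumed habituation: repeated taps shift the response from
            # surprise toward anger and then disgust, as in the article's demo.
            if self.negative_exposure <= 1.0:
                target = "surprise"
            elif self.negative_exposure <= 3.0:
                target = "anger"
            else:
                target = "disgust"
        else:
            raise ValueError(f"unknown stimulus: {stimulus}")
        # Previous state decays, the new stimulus adds in: strong stimuli grow
        # the vector quickly, weak stimuli shift the response gradually.
        self.v = [self.decay * x for x in self.v]
        self.v[EMOTIONS.index(target)] += intensity
        return self.dominant()

    def dominant(self):
        # The expressed emotion is the largest component of the vector.
        return EMOTIONS[self.v.index(max(self.v))]
```

With these assumed parameters, a fresh state responds to a pet with joy, while a run of repeated taps traces the surprise, anger, disgust sequence described in the article, because the cumulative state, not the stimulus alone, selects the expression.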

Professor Lee Hee-seung said, 'Existing robots were limited to showing fixed emotional responses to stimuli, but this model implements a flow of emotional change, so users feel as if they are interacting with a living being.' He added, 'It could be applied across various human-centered robot fields, such as companion robots and emotional support technology.'

This research was accepted to ICRA (the International Conference on Robotics and Automation), a leading international conference in robotics, and was presented on May 21 at ICRA 2025 in Atlanta, U.S.