News

Making Robots More Human

Making Robots More Human

From joy to sadness, facial expressions could soon be decipherable to robots.

by American Chemical Society

Graphic courtesy of the American Chemical Society

Most people are naturally adept at reading facial expressions — from smiling and frowning to brow-furrowing and eye-rolling — to tell what others are feeling. Now scientists have developed ultra-sensitive, wearable sensors that can do the same thing. Their technology, reported in the American Chemical Society journal ACS Nano, could help robot developers make their machines more human.

Nae-Eung Lee and colleagues note that one way to make interactions between people and robots more intuitive would be to endow machines with the ability to read their users' emotions and respond with a computer version of empathy. Most current efforts toward this goal analyze a person's feelings using visual sensors that can tell a smile from a frown, for example. But such systems are expensive and complex, and they don't pick up on subtle eye movements, which are important in human expression. Lee's team wanted to make simple, low-cost sensors that detect facial movements, including slight changes in gaze.

The researchers created a stretchable and transparent sensor by layering a carbon nanotube film on two different kinds of electrically conductive elastomers. They found it could tell whether subjects were laughing or crying and where they were looking. In addition to applications in robotics, the sensors could be used to monitor heartbeats, breathing, dysphagia (difficulty swallowing) and other health-related cues.
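To give a rough sense of how strain signals from a wearable sensor could be turned into an expression label, here is a minimal, purely illustrative Python sketch. It assumes the sensor produces a resistance trace whose amplitude and oscillation rate differ between expressions; the feature choices, thresholds, and function names are invented for illustration and are not the method reported in ACS Nano.

```python
import numpy as np

# Illustrative only: a simulated resistance trace from a hypothetical stretchable
# strain sensor placed near the corner of the mouth. Feature choices and
# thresholds below are assumptions, not the authors' published approach.

def extract_features(trace, fs=100):
    """Compute simple features from a 1-D resistance trace sampled at fs Hz."""
    baseline = np.median(trace)
    amplitude = np.ptp(trace)                  # peak-to-peak deflection
    centered = trace - trace.mean()
    zero_crossings = np.sum(np.diff(np.sign(centered)) != 0)
    duration_s = len(trace) / fs
    rate_hz = zero_crossings / (2 * duration_s)   # rough oscillation rate
    return amplitude / baseline, rate_hz

def classify_expression(trace, fs=100):
    """Toy rule: laughing gives large, fast strain oscillations; crying gives
    smaller, slower deflections. Thresholds are made up for illustration."""
    rel_amplitude, rate_hz = extract_features(trace, fs)
    if rel_amplitude > 0.05 and rate_hz > 2.0:
        return "laughing"
    if rel_amplitude > 0.01:
        return "crying"
    return "neutral"

# Example: a fake 2-second trace with 4 Hz oscillations on a 1 kilo-ohm baseline
t = np.linspace(0, 2, 200)
fake_trace = 1000 * (1 + 0.08 * np.sin(2 * np.pi * 4 * t))
print(classify_expression(fake_trace))   # -> "laughing"
```

In practice, a system like the one described would likely rely on calibrated signal processing or learned models rather than fixed thresholds; the sketch only shows the general idea of mapping sensor readings to expression categories.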


The authors acknowledge funding from the National Research Foundation of Korea.