Tech News

The breakthrough that makes robot faces feel less creepy


When people talk face to face, nearly half of their attention is drawn to the movement of the lips. Despite this, robots still have great difficulty moving their mouths in a convincing way. Even the most advanced humanoid machines often rely on stiff, exaggerated mouth motions that resemble a puppet's, assuming they have a face at all.

Humans place enormous importance on facial expression, especially subtle movements of the lips. While awkward walking or clumsy hand gestures can be forgiven, even small mistakes in facial motion tend to stand out immediately. This sensitivity contributes to what scientists call the "Uncanny Valley," a phenomenon where robots appear unsettling rather than lifelike. Poor lip movement is a major reason robots can seem eerie or emotionally flat, but researchers say that may soon change.

A Robot That Learns to Move Its Lips

On January 15, a team from Columbia Engineering announced a major advance in humanoid robotics. For the first time, researchers have built a robot that can learn facial lip movements for speaking and singing. Their findings, published in Science Robotics, show the robot forming words in multiple languages and even performing a song from its AI-generated debut album "hello world_."

Rather than relying on preset rules, the robot learned through observation. It began by discovering how to control its own face using 26 separate facial motors. To do this, it watched its reflection in a mirror, then later studied hours of human speech and singing videos on YouTube to understand how people move their lips.
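The self-observation stage described above can be pictured as a simple two-step loop: the robot "babbles" with its motors while watching the result in a mirror, then inverts that experience to find the commands that produce a desired lip shape. The sketch below is a minimal, hypothetical illustration of that idea, not the team's actual method; the fake `observe_lip_shape` camera, the feature names, and the nearest-neighbor lookup are all assumptions made so the example runs.

```python
import random

# Hypothetical stand-in for the robot's 26 facial motors and its camera.
# In the real system the motor-to-face mapping is observed via a mirror;
# here we fake it with a simple averaging function so the sketch runs.
N_MOTORS = 26

def observe_lip_shape(motor_cmds):
    """Fake camera: reduce 26 motor positions to two lip features (open, wide)."""
    openness = sum(motor_cmds[:13]) / 13
    width = sum(motor_cmds[13:]) / 13
    return (openness, width)

# Step 1 -- motor babbling: try random commands, record what the "mirror" shows.
random.seed(0)
experience = []
for _ in range(500):
    cmds = [random.random() for _ in range(N_MOTORS)]
    experience.append((observe_lip_shape(cmds), cmds))

# Step 2 -- inverse model: given a desired lip shape, retrieve the commands
# whose observed result was closest (nearest-neighbor lookup).
def commands_for(target):
    def dist(entry):
        shape, _ = entry
        return (shape[0] - target[0]) ** 2 + (shape[1] - target[1]) ** 2
    return min(experience, key=dist)[1]

# Ask for a moderately open, narrow mouth and check what those commands produce.
target = (0.6, 0.4)
achieved = observe_lip_shape(commands_for(target))
print(achieved)  # should land close to the target shape
```

In the published system the same principle scales up: the "camera" is real, the inverse mapping is learned rather than looked up, and the targets come from lip motion observed in human speech videos.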

"The more it interacts with humans, the better it will get," said Hod Lipson, James and Sally Scapa Professor of Innovation in the Department of Mechanical Engineering and director of Columbia's Creative Machines Lab, where the research took place.

See link to "Lip Syncing Robot" video below.

Robot Watches Itself Talking

Creating natural-looking lip motion in robots is especially difficult for two main reasons. First, it requires advanced hardware, including flexible facial material and many small motors that must operate quietly and in perfect coordination. Second, lip movement is closely tied to speech sounds, which change rapidly and depend on complex sequences of phonemes.

Human faces are controlled by dozens of muscles located beneath soft skin, allowing movements to flow naturally with speech. Most humanoid robots, however, have rigid faces with limited motion. Their lip movements are typically dictated by fixed rules, which leads to mechanical, unnatural expressions that feel unsettling.
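The fixed-rule approach the article contrasts against is often a lookup table: each speech sound maps to one preset mouth pose (a "viseme"), so the face snaps between a handful of shapes instead of flowing with the sound. The phoneme symbols and pose values below are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of rule-based lip control: one preset pose per phoneme,
# with no blending between poses and no co-articulation -- which is why
# the result tends to look mechanical.
VISEME_TABLE = {
    "AA":   {"open": 0.9, "wide": 0.4},  # as in "father"
    "IY":   {"open": 0.2, "wide": 0.9},  # as in "see"
    "UW":   {"open": 0.3, "wide": 0.1},  # as in "boot"
    "M":    {"open": 0.0, "wide": 0.5},  # lips fully closed
    "REST": {"open": 0.1, "wide": 0.5},
}

def lip_poses(phonemes):
    """Return one preset pose per phoneme; unknown sounds fall back to REST."""
    return [VISEME_TABLE.get(p, VISEME_TABLE["REST"]) for p in phonemes]

# "me" -> M, IY: the mouth jumps from closed to a wide grin in a single step.
print(lip_poses(["M", "IY"]))
```

A learned system instead produces continuous motor trajectories conditioned on the surrounding sounds, which is what lets lip motion flow the way human speech does.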
