Robots like Atlas, Spot, and Stretch have amazed people with natural, life-like agility and balance. What they lacked, though, was a way to quickly connect that natural movement to perception—the robotic equivalent of the reflexes that let you catch a ball or duck in an instant to avoid getting hit.
So, a team of scientists at ETH Zürich got busy fixing this problem. “I wanted to fuse perception and body movement,” said Yuntao Ma, a roboticist who led a team developing an AI-powered, badminton-playing robot.
Chasing shuttlecocks
The robot built by Ma’s team was called ANYmal and resembled a miniature giraffe that plays badminton by holding a racket in its teeth. It was a quadruped platform developed by ANYbotics, an ETH Zürich spinoff company that mainly builds robots for the oil and gas industries. “It was an industry-grade robot,” Ma said. The robot had elastic actuators in its legs, weighed roughly 50 kilograms, and was half a meter wide and under a meter long.
On top of the robot, Ma’s team fitted an arm with several degrees of freedom, produced by another ETH Zürich spinoff called Duatic. This is what would hold and swing a badminton racket. Tracking the shuttlecock and sensing the environment were handled by a stereoscopic camera. “We’ve been working to integrate the hardware for five years,” Ma said.
Along with the hardware, his team was also working on the robot’s brain. State-of-the-art robots usually use model-based control optimization, a time-consuming, sophisticated approach that relies on a mathematical model of the robot’s dynamics and environment. “In recent years, though, the approach based on reinforcement learning algorithms became more popular,” Ma told Ars. “Instead of building advanced models, we simulated the robot in a simulated world and let it learn to move on its own.”
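The learning-in-simulation idea Ma describes can be sketched in miniature. The toy below is purely illustrative and not the ETH Zürich team’s actual pipeline: a one-dimensional “simulator” stands in for the physics engine, a single gain parameter stands in for the policy network, and simple hill climbing stands in for the reinforcement learning algorithm. All names and numbers here are assumptions chosen for clarity.

```python
import random

# Toy stand-in for a physics simulator: the "robot" must learn a gain
# that drives its position toward a moving target (a crude proxy for
# tracking a shuttlecock). Generic illustration only.
def rollout(gain, steps=50, seed=0):
    rng = random.Random(seed)
    pos, reward = 0.0, 0.0
    for _ in range(steps):
        target = rng.uniform(-1.0, 1.0)  # simulated target position
        pos += gain * (target - pos)     # policy: proportional correction
        reward -= abs(target - pos)      # penalty for tracking error
    return reward

# Minimal "training" loop: perturb the policy parameter at random and
# keep the change whenever the simulated reward improves.
def train(episodes=200, seed=1):
    rng = random.Random(seed)
    gain, best = 0.0, rollout(0.0)
    for _ in range(episodes):
        candidate = gain + rng.gauss(0.0, 0.1)
        score = rollout(candidate)
        if score > best:
            gain, best = candidate, score
    return gain, best
```

The key point the example mirrors is that no mathematical model of the dynamics is ever written down: the policy improves purely by trial and error against the simulator, which is what lets this family of methods sidestep the modeling work that classical model-based control requires.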