Forward-looking: A new chapter in human-computer interaction is unfolding at Meta, where researchers are exploring how the muscles in our arms could soon take the place of traditional keyboards, mice, and touchscreens. At its Reality Labs division, scientists have developed an experimental wristband that reads the electrical signals produced when a person intends to move their fingers, allowing users to control digital devices with only subtle hand and wrist gestures.
This technology draws on the field of electromyography, or EMG, which measures muscle activity by detecting the electrical signals generated as the brain sends commands to muscle fibers.
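Meta has not published the details of its signal pipeline, but a classic EMG front end gives a feel for what a wristband's sensors capture: the raw signal is band-pass filtered, rectified, and smoothed into an activation envelope. The sketch below illustrates that textbook recipe in Python; the sampling rate, filter band, and synthetic data are illustrative assumptions, not Meta's actual parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def emg_envelope(raw, fs=2000.0, band=(20.0, 450.0), smooth_hz=5.0):
    """Return a smoothed activation envelope from a raw EMG trace.

    Surface EMG is conventionally band-pass filtered (roughly 20-450 Hz),
    rectified, and low-pass filtered to estimate muscle activation.
    """
    nyq = fs / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    filtered = filtfilt(b, a, raw)          # isolate the EMG frequency band
    rectified = np.abs(filtered)            # full-wave rectification
    b_lp, a_lp = butter(4, smooth_hz / nyq, btype="low")
    return filtfilt(b_lp, a_lp, rectified)  # smooth into an envelope

# Example: one second of synthetic "EMG" with a brief muscle burst
fs = 2000.0
raw = 0.02 * np.random.randn(int(fs))
raw[800:1200] += 0.5 * np.random.randn(400)  # simulated burst of activity
envelope = emg_envelope(raw, fs=fs)
print(f"Peak activation: {envelope.max():.3f}")
```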
These signals have typically been used in medical settings, chiefly to enable amputees to control prosthetic limbs. Meta's work, however, revives decades-old ideas by using artificial intelligence to turn EMG into an intuitive interface for everyday computing.
"You don't have to actually move," Thomas Reardon, Meta's vice president of research heading the project, says. "You just have to intend the move."
The prototype wristband has come a long way from its clunky early iterations. By detecting faint electrical pulses before any visible movement occurs, the device now supports actions such as moving a laptop cursor, opening apps, or even writing in the air and seeing the text appear instantly on a screen.
"You don't have to actually move," Thomas Reardon, Meta's vice president of research heading the project, told The New York Times. "You just have to intend the move." This distinction – where intent is enough to trigger action – a leap forward that could outpace the very fingers and hands we've long relied upon.
Unlike some emerging neurotechnology approaches, Meta's solution is entirely noninvasive. Startups such as Neuralink and Synchron are pursuing devices implanted in the brain or threaded through blood vessels to read neural signals directly, which complicates adoption and raises risks. Meta sidesteps surgery: anyone can strap on the wristband and begin using it, thanks to advanced AI models trained on the muscle signals of thousands of volunteers.
"The breakthrough here is that Meta has used artificial intelligence to analyze very large amounts of data, from thousands of individuals, and make this technology robust. It now performs at a level it has never reached before," Dario Farina, a bioengineering professor at Imperial College, London, who has tested the system, told The New York Times.
With EMG data from 10,000 people, Meta engineered deep learning models capable of decoding intentions and gestures without user-specific training.
Patrick Kaifosh, director of research science at Reality Labs and a co-founder of Ctrl Labs, which was acquired by Meta in 2019, explained that "out of the box, it can work with a new user it has never seen data for." The same AI foundations that support this wristband also drive progress in handwriting recognition and gesture control, freeing users from physical keyboards and screens.
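Meta has not released the architecture behind that out-of-the-box generalization. As a rough, hypothetical illustration of the idea, a decoder can map short multi-channel EMG windows to gesture classes; training it on windows pooled from many users, rather than fine-tuning it per person, is what lets it handle people it has never seen. Every shape, layer, and class count below is an assumption made for illustration, not a description of Meta's model.

```python
import torch
import torch.nn as nn

# Hypothetical shapes: 16 electrode channels, 100-sample windows, 10 gestures.
N_CHANNELS, WINDOW, N_GESTURES = 16, 100, 10

class GestureDecoder(nn.Module):
    """Toy convolutional decoder mapping an EMG window to a gesture class.

    Training on windows pooled from many users (rather than one) is the
    principle that lets a model like this generalize to unseen users.
    """
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(N_CHANNELS, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # average features over time
            nn.Flatten(),
            nn.Linear(64, N_GESTURES),
        )

    def forward(self, x):              # x: (batch, channels, time)
        return self.net(x)

model = GestureDecoder()
window = torch.randn(1, N_CHANNELS, WINDOW)   # one fake EMG window
logits = model(window)
print("Predicted gesture:", logits.argmax(dim=1).item())
```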