One of the more persistent and long-standing Apple rumors has been the launch of new AirPods models with built-in cameras. Another leaker added their support for this idea just yesterday.
Exactly what role these cameras would play has been the subject of much speculation, with some suggesting they will be used to support Apple Intelligence visual features. While that is certainly possible, I can’t help wondering whether the reports point to support for hand gestures similar to those used with Vision Pro …
AirPods cameras
The first report of AirPods with cameras dates back almost two years now. Both Bloomberg’s Mark Gurman and Apple analyst Ming-Chi Kuo have said that the feature is on the way, likely sometime this year. What hasn’t been clear, however, is the reason for this.
So far, speculation has centred on two different possibilities.
First, that the cameras are internal, used in a similar way to the Apple Watch sensors to provide health data. An infrared camera should be able to read heart-rate data, allowing the AirPods (or a connected iPhone) to perform analysis similar to the Apple Watch.
Second, that the cameras face outward, providing a view of the user’s surroundings in order to enable visual Apple Intelligence features. Gurman has said this is his expectation.
Apple is working on a new version of the AirPods Pro that uses external cameras and artificial intelligence to understand the outside world and provide information to the user. This would essentially be the smart glasses path — but without actual glasses.
(A third suggestion has been that the cameras could help adapt spatial audio while wearing Vision Pro, but since the headset already has all of the cameras and sensors needed to orientate itself in space, I can’t really see an argument for this.)
Vision Pro-style hand gestures