
Qualcomm's Latest Chip Could Lead a New Wave of Camera-Equipped AI Watches and Wearables


I've been steeling myself for a coming wave of AI-infused wearables that could be worn all over the body, based on reports of gadget plans at Meta, Google and Apple -- a halo of connected tech with cameras onboard, streaming to AI services. Qualcomm's latest chip, announced Monday at Mobile World Congress in Barcelona, is built for exactly that, and the first devices using it are coming this summer. Samsung, Google and Motorola are already building hardware with it.

I sat down with John Kehrli, senior director of product management for Qualcomm, to discuss the company's newest wearable chip push, which caught my attention on several levels. The reason you should care is that this is a clear preview of tech products to come: Qualcomm's chips power almost all of the non-Apple watches, VR headsets and smart glasses out there.

While Qualcomm has had separate chip lines for smartwatches and for smart glasses and VR headsets, the new Snapdragon Wear Elite chip aims to bridge those categories. It's a higher-powered watch chip with a range of wireless connectivity options, but it's also built to support video input and streaming for AI, and even 1080p video output to displays. That could include AI-infused smart glasses.

"It's not just the watch: for sure that's a focus for us, but the portfolio [of devices] has expanded dramatically," Kehrli says.

Here's the news about Snapdragon Wear Elite that stood out for me.

Qualcomm's new chip design is meant to be flexible in form. It could end up in many places.

A lot more onboard processing for offline AI

A big part of Qualcomm's push on these chips is to do more generative AI and LLM work on device, a trend I expect to grow. The Snapdragon Wear Elite looks a lot more powerful than previous Qualcomm watch chips. Some of the offline, on-device functions could be voice-based AI, for fitness or, according to Qualcomm, for "life logging."

I'm not sure I need life logging, but I'd be interested in having more AI-based controls for wearables. The extra power also looks able to drive video on displays and run onboard cameras, including video streaming. The whole idea behind next-wave multimodal AI is for AI services to be aware of what you're doing -- and that will mostly happen via camera access.

Kehrli says the processing cores for the neural processing unit on the Snapdragon Wear Elite could support AI models of up to 2 billion parameters on device, processing at about 10 tokens per second. He sees that being good enough for a lot of offline needs, with cloud-connected AI kicking in when more is required.
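To put those two figures in perspective, here's a rough back-of-envelope sketch (my own, not Qualcomm's, beyond the 2-billion-parameter and 10-tokens-per-second numbers cited above). The quantization bit-widths are assumed typical values for on-device models, not anything Qualcomm has specified:

```python
PARAMS = 2_000_000_000    # 2 billion parameters (from the article)
TOKENS_PER_SECOND = 10    # throughput Qualcomm cites

# Bytes per parameter at common precisions (assumed, typical values
# for on-device deployment -- not Qualcomm's stated configuration).
PRECISIONS = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

for name, bytes_per_param in PRECISIONS.items():
    gb = PARAMS * bytes_per_param / 1e9
    print(f"{name}: ~{gb:.1f} GB of weights")

# A short voice-assistant reply of ~50 tokens at 10 tokens/second:
reply_tokens = 50
seconds = reply_tokens / TOKENS_PER_SECOND
print(f"~{seconds:.0f} s to generate a {reply_tokens}-token reply")
```

The takeaway: a 2B-parameter model squeezed to 4-bit weights fits in roughly a gigabyte, which is plausible for watch-class hardware, and a 50-token spoken answer at 10 tokens per second arrives in about 5 seconds.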
