I think it’s safe to say that AI gadgets are having a rough time right now. Humane and its Ai Pin are, at this point, completely in the rearview, with the company dissolved and its assets sold to HP. Humane’s counterpart, Rabbit, is still kicking (it even introduced a big software update to its R1 device recently), but that’s about all I could say in the positive about the orange gadget. What’s worse is that the would-be saviors of the AI gadget category, Sam Altman and Jony Ive, who back in May announced plans to create the Next Big Device via a new company called IO, seem to be floundering as well. A new report from the Financial Times just this week suggests Altman and Ive are struggling to make their gadget (a “palm-sized” device) useful, private, or even fully power the thing with AI from the cloud. Not great.

That’s a lot of tumult, but despite all those woes, there is one device with “AI” in the name that doesn’t appear to be on the downslope, and it doesn’t live in the palm of your hand or on your shirt; it lives on your face. I’m talking about smart glasses, specifically Meta’s “AI glasses.”

Smart glasses are having a moment right now, and Meta is at the center, thanks in large part to the Ray-Ban branding. It might seem weird to group Meta’s smart glasses in with devices like the R1 or Ai Pin, but Meta would disagree; AI is at the center of its smart glasses, especially its version without a screen.

If you’re not familiar with the Ray-Ban Meta AI glasses, the only thing you need to know is that “Hey Meta” is the key to making them feel smart. There’s an integrated voice assistant that you can use to play music, take pictures and videos, call folks with WhatsApp or Instagram, and do typical voice assistant stuff like check the weather or your battery life.

That’s just where “Hey Meta” begins, though. In addition to being a voice assistant, the Ray-Ban Meta AI glasses also lean heavily on computer vision. Thanks to the cameras and microphones on the AI glasses, you can (in theory) use Meta AI to do lots of stuff that a regular voice assistant can’t, like translate text and speech, give you context on a piece of art or a product at a store, or describe things you’re looking at, which is a pretty handy ability from an accessibility standpoint.

What separates Meta’s AI glasses from Humane’s Ai Pin or Rabbit’s R1, though, is that smart glasses are a form factor that people actually want to use, if sales are any indication. Meta has sold 2 million pairs since the AI glasses’ release in 2023, which isn’t iPhone levels, but is definitely not a bad start for a relatively new device category. Compare those numbers to the 10,000 Ai Pins sold by Humane, and the success looks even more promising.

Don’t get me wrong, people are buying smart glasses for a lot of reasons, surely almost none of them AI-related, but just because AI isn’t top-of-mind doesn’t mean people aren’t going to use it, maybe even more than they’d use similar tools on a phone. While companies like Google have leaned heavily into AI on Pixel devices by way of Gemini, adoption has been tepid; the features are new and awareness is still low. There’s also more friction in using computer vision on a phone in particular, since you have to whip out your phone, navigate to a feature, and then do what you came to do. With smart glasses? Not so many barriers. Computer vision is a more natural part of the way you use them, especially because of the emphasis on voice commands in lieu of a touch-based UI.
With smart glasses, your device is always out, and it’s always pointed at the thing you want. It’s the form factor (and in some ways the restrictions of that form factor) that gives smart glasses the edge over pins and card-sized computers and even the omnipresent glass slab that is your phone.

Now, whether those computer vision-centric features are sticky is another question entirely. Having used the Ray-Ban Meta Gen 1 and Gen 2 extensively, I can tell you right now that the more advanced AI commands on the smart glasses are hit or miss at best, even when you genuinely want to use them.

I’ll give you an example. I’m at the beach with my mom, hunting for sharks’ teeth in the surf, and I come across a shell that looks a lot like it could be a tooth. It looks like a shark’s tooth, but how can one really be sure? With the Ray-Ban Meta AI glasses already on my face, I look at the shell (or tooth) in my hand and ask, “Hey Meta, how can I tell if this is a shark’s tooth?” The answer? Yes, you’re holding a shark’s tooth. Great! The only thing is, every other black shell I picked up that clearly wasn’t a tooth was also a tooth, according to the glasses. Not so great. On one hand, my request is a tall order (there’s probably a lot of esoteric knowledge that goes into answering a question like that), but on the other, these are the moments where Meta AI should shine.

But whether or not my answer was satisfactory, the fact that I even bothered to use Meta AI says a lot, and it’s more than most AI devices can claim. Familiarizing people with AI features is a major part of the battle for AI-centric devices, and training, or God forbid retraining, people to use devices is tough. For Meta and its increasingly crowded array of smart glasses, there’s a long way to go before Meta AI actually becomes useful, but in the pantheon of AI gadget failure, smart glasses might actually be failing forward.