Last December, I wore Google's glasses in several forms while they were still under development. Soon you'll be able to get your hands on the final versions. When, exactly, and for how much? We may find out more in just a handful of days.
While Meta has been the biggest tech company aiming for a place on your face in glasses form, it's far from the only one. Google is about to enter the race with a whole range of smart glasses, the company's first everyday face-worn tech since Google Glass in 2013.
This time, the focus is almost entirely on AI. Gemini will be the reason to buy Google's Android XR glasses and their biggest function, but they'll come in a wide range of designs: Warby Parker, Gentle Monster, Kering Eyewear and Samsung are all expected to have their own models. Xreal, a maker of display glasses, will also have a plug-in mixed reality device called Project Aura.
This year's Google I/O developer conference is just around the corner on May 19, and we should hear a lot more about Google's smart glasses strategy. But we already know a lot, since Google talked about and demoed these glasses last year. Now that we're in 2026, all these glasses should finally arrive, and if you've been even half-thinking about getting a pair of smart glasses, you'll want to see what they're all about.
All about Gemini
Google, Samsung and Qualcomm have been collaborating on Android XR, a new OS for a whole range of mixed reality headsets, AI glasses, display-enabled glasses and eventually augmented reality glasses. The first product of this collaboration, Samsung Galaxy XR, arrived last fall.
Galaxy XR is very much a VR headset, but also a mixed reality computer, similar to the Apple Vision Pro and the Meta Quest 3. It runs Android apps via its Android XR OS, and it has a Gemini AI assistant that can respond to voice and run live, seeing anything on your device's screen and in the real world via its external cameras.
That always-available Gemini assistant will be the key app for the next wave of smart glasses. Much like Meta's Ray-Ban and Oakley glasses, which use Meta AI, Google's glasses will use Gemini and related Gemini apps like Nano Banana and NotebookLM.
Pop-up information on the display-enabled glasses will offer contextual details, like live map data.