The first time I saw someone wearing Meta's new AR glasses, I knew they looked different. Thicker than regular Meta Ray-Bans, they reminded me of something filmmaker Martin Scorsese would wear. They were stylish, with a sort of translucent brown frame. I also noticed their sidekick, the ribbed fabric wristband, but only because I was looking for it.
Meta's new Ray-Ban Display Glasses are very real. They go on sale Sept. 30 for $799.
And they're wild to use. When wearing a pair myself, I saw a display projected on the lens of my right eye. I navigated apps with little flicks and pinches of my right hand sporting the neural band, which is part of the set.
The current Meta Ray-Bans are different: They don't have a display or a wristband, and they cost a lot less. They're also the best smart glasses right now. I'm not sure how many people will find Meta's new Display Glasses essential or affordable. Also, they won't work with all prescriptions.
For example, I have a -8.00 prescription, and Meta's newest glasses won't work for me. They're only designed to fit a +4.00 to -4.00 prescription range for now (though Meta provided some chunky lens inserts so I could actually see the demo). That missing piece says a lot about where Meta is with designing smart eyewear for everyone.
The Display Glasses are the most expensive and ambitious device announced at Meta Connect 2025, but they're not alone. The company also debuted the newest generation of standard Meta Ray-Bans, available now, with better cameras and battery life, as well as the fitness-focused curved Oakley Vanguard glasses, coming Oct. 21.
Meta's new glasses come packaged with the neural band: they're sold as a set. Meta
It's all in the wristband
While Meta's Ray-Bans have become a success as wearable smart tech, the next steps on the road to augmented reality glasses are more challenging. A year ago, I tried Meta's moonshot prototype for AR glasses, Project Orion, which had large 3D displays and eye tracking, and worked with a neural wristband. The promise was to make glasses almost an augmented extension of the body, a way to bring AI to your wrist and eyes.
As I expected, that neural wristband is back and finally debuting with these new Ray-Ban Display Glasses. It's part of the $800 package and easily the most interesting element. But while these new glasses miraculously exist as an actual product just a year after Orion, they lack a lot of Orion's futuristic extras. There's no eye tracking, 3D or spatial awareness, and there's only a small color display in one eye.
Still, my roughly 40-minute demo with Ray-Ban Display Glasses at Meta's campus showed me that the future of glasses is coming, alongside wearable wrist interfaces. The future of AI on glasses is also evolving, and it's fascinating.
During my demo, I felt myself becoming augmented, sometimes in odd ways. I still have major questions about how it'll all work in the real world without feeling distracting or distancing.
Meta's head of wearables, Alex Himel, wearing the sand-colored Meta Ray-Ban Display glasses. You can't even see the display in his glasses, but he can. Scott Stein/CNET
You can hardly see what I'm doing
One of the most amazing things about Meta's Ray-Ban Display Glasses is that you can't see a display projected in the right lens. Like, at all.
My colleagues and I looked at the glasses worn by Alex Himel, Meta's vice president of wearables. The lenses appeared totally clear, with no rainbow-smudge-type patch usually seen on AR glasses. (That smudgy look normally comes from waveguides, which channel light into the lens from display engines mounted at the side.) The Ray-Ban Displays use LCOS, or liquid crystal on silicon, projection tech -- and the waveguides are remarkably invisible.
This makes the idea of wearing these glasses in the real world intriguing. Maybe I could privately see things displayed and not weird anyone out. But what would people think when I suddenly start swiping my fingers in odd ways while staring off in the distance?
Meta's illustration of the display in the Ray-Ban Display glasses is a pretty good representation of what I saw. The display is only visible in one eye. Meta
The single 600x600-pixel color display on the Ray-Ban Display Glasses isn't very large. The field of view is only about 20 degrees, a fraction of what most AR glasses offer. But at a density of 42 pixels per degree, it looks crisp enough: I was able to make out really small text on Spotify album cover art, for example.
It's odd to have a display in just one eye, though. The sort of "half-there" experience makes the display feel a little hard to focus on at times. It looks projected a couple of feet in front of me.
To make the display visible outdoors, Meta Ray-Ban Displays come with transition lenses by default. I stepped outside, watched my lenses dim, and could still make out the pop-up display in front of me even while facing the sun.
Meta promises a surprising 6 hours of mixed-use battery life for these Ray-Bans, which is more than I expected. However, take that with a grain of salt. I would assume that heavy use, especially with the display on, eats into that time.
The Meta Neural Band next to the glasses I wore (with their chunkier lens inserts put in just for me to see in this demo). Scott Stein/CNET
Neural wristband feels like the start of a new paradigm
I've been using hand gestures on my devices for a while, whether it's hand tracking on Meta Quest or Apple Vision Pro, or the growing number of gestures that Apple Watches support. Meta's snug little neural band, however, seems capable of doing a lot more. It's tech I've seen Meta evolving over the years. Mark Zuckerberg demoed it to me back in 2022.
The snug fabric band fit me the same as when I tried it last year. It's meant to sit a bit past your wristbone, riding a little higher than most watches. Inside, electrodes rest at various spots around your wrist, measuring the electrical signals given off by motor neurons, even during very subtle hand movements. Meta has trained the band's algorithms to detect forefinger and middle finger pinches, wrist turns and thumb swipes, while a closed fist "scrolls" back and forth or up and down.
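Meta doesn't detail how the band's software turns those electrical signals into commands, but a rough mental model is a classic signal-processing pipeline: sample the electrode channels, cut the stream into short windows, extract simple features and run a trained classifier. Here's a minimal, purely hypothetical sketch in Python (every name, number and gesture label is invented for illustration; this is not Meta's actual pipeline):

```python
# Hypothetical sketch of a surface-EMG gesture pipeline (not Meta's code).
# Electrodes around the wrist sample muscle electrical activity; software
# extracts features per short window and a trained classifier maps those
# features to gesture labels like "pinch" or "swipe".
import numpy as np

SAMPLE_RATE = 1000          # assumed samples/sec per electrode channel
WINDOW = 200                # 200-sample (~200 ms) analysis window
GESTURES = ["rest", "index_pinch", "middle_pinch", "thumb_swipe", "wrist_turn"]

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square energy per electrode channel -- a classic,
    cheap feature for muscle-activity signals."""
    return np.sqrt(np.mean(window ** 2, axis=0))

def classify(features: np.ndarray, weights: np.ndarray, bias: np.ndarray) -> str:
    """A tiny linear classifier standing in for whatever trained model
    the band actually runs on-device."""
    scores = features @ weights + bias
    return GESTURES[int(np.argmax(scores))]

# Demo with random stand-in data: 16 channels, one 200-sample window.
rng = np.random.default_rng(0)
emg_window = rng.normal(scale=0.05, size=(WINDOW, 16))   # fake electrode data
weights = rng.normal(size=(16, len(GESTURES)))           # fake trained weights
bias = np.zeros(len(GESTURES))
print(classify(rms_features(emg_window), weights, bias))
```

The real band presumably runs a far more sophisticated model trained across many different wrists, which is what "Meta has trained the band's algorithms" implies. But the basic shape of the problem (signals in, gesture labels out) is the same.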
The pinches, taps and swipes really do work, even when my hand is resting out of sight. But the particular language of taps and swipes was sometimes hard to remember. Unlike Meta's Orion glasses, these Ray-Bans have no eye tracking onboard. As a result, swiping over to app icons or parts of the display sometimes takes more effort than simply glancing and tapping, as on Orion or Apple Vision Pro.
Meta's neural band has electrodes on the inside, and fits snugly on my wrist. Meta
I played music via Spotify, then held my fingers together and turned them like a dial to raise and lower the volume. I used the same twist-pinch gesture to digitally zoom in on the display while taking photos. I tapped to answer an incoming call from a Meta employee down the hall, which came in as a video chat, and I could share my POV camera view at the same time.
These Display Ray-Bans still have a touchpad and a capture button on the side of the glasses, and they work with voice commands just like other smart glasses. The neural band, however, is a necessary navigation tool for moving through menus on the display and finding apps in the OS. I practiced waking the display with middle-finger pinches.
The idea of suddenly having a way to do spontaneous gestures on the fly feels like a magic trick. If this band can work with more devices, it could feel like a universal interface. That's a big if, though. Meta's main hardware products are VR headsets and glasses, and it's hard to imagine how something like this could ever interface with a phone OS or a computer without a major tech platform partnership. Right now, that isn't on the table.
But Meta promises the neural wristband could evolve into more uses. In a surprising demo, another team member wearing the band handwrote messages with her finger on her pants leg. That feature is expected to arrive later. I remember Meta's Reality Labs chief scientist, Michael Abrash, discussing the possibilities of this tech with me years ago.
In the realm of accessibility, it's possible that neural input tech could even assist people with little to no motor function or loss of limbs. But right now, this neural band is designed to fit on a wrist. At Meta's campus, I spoke with athlete Jon White, a Paralympic trainee and triple amputee military vet who's been testing Orion and Meta's new glasses and neural band. He shared thoughts about how useful the glasses have been while kayaking, and also wondered about how this tech could extend further.
I'm curious, though, whether I'd even remember to wear the neural band. One more device to put on along with smart glasses feels like a hassle. Meta doesn't have its own smartwatch, but a band like this would make sense as part of a watch. The screenless neural band fits comfortably enough, almost like a Whoop band.
Meta says the band's battery lasts 18 hours on a charge. It is also IPX7 water resistant, so it could withstand a splash or brief submersion. Still, I imagine myself wearing a watch on one wrist and this neural band on the other, and that feels like a lot.
Live captions on the Ray-Ban Display glasses can isolate voices in a crowd. Meta
Apps: an unknown assortment, largely Meta-focused
The big pitch for Ray-Ban Display Glasses is doing a whole lot more in-glasses, possibly without even looking at your phone. For the most part, the apps I saw demos of are based on Meta's platform. Communicating happens through WhatsApp and Messenger, while Meta AI provides assistance for live captioning and heads-up informational searches. In another demo, I was sent an Instagram video link about my beloved New York Jets. I watched a clip from a Jets-Pats playoff game circa 2011, something I remember also doing in Meta's Orion glasses.
Smart glasses are only as good as the services they can smartly work with. I already find Meta Ray-Bans surprisingly good, but also quite limited. They can't see my emails, or my notes, or lots of things I do on my phone. They run their own AI that isn't the same as other AI tools or platforms I might be using from Google or Apple.
Meta's renders of things I demoed in-display aren't exactly what you see: the visuals appear in only one eye, which sometimes feels a bit weird. Meta
Will Meta gather enough app developers to make these Ray-Ban Display Glasses feel rich enough that I truly open my phone less? I don't know.
Some ideas wowed me. Video chats in-glasses can share my POV, opening up some interesting possibilities for telepresence or assistance. An impressive live captioning mode showed me captions from people in the room in real time, focusing on the conversation I was looking at and filtering out the others.
Meta has maps on these glasses, too, with real-time navigation that can show me directions. I saw maps on Google's prototype display glasses last year, so this idea isn't new, though Meta's glasses are the first to arrive. But, again, I use Google Maps as a search engine. Will Meta's maps feel as connected to my search interests, or will it even know where my home or friends' homes are?
Some AI functions are also just weird. I was invited to have Meta AI generate a new version of a photo of Alex Himel on the fly, in a pop art style. Inexplicably, Meta AI then suggested I add a "sugar vortex," and then pastries, a giant mushroom and a frog waiter. I kept adding stuff. I made a bizarre photo, but why?
Meta AI can do some helpful contextual things, though, which can sometimes save time. I asked about Justin Fields' stats for the Jets, and the little button suggestions below brought up options for his career stats and his stats as a Bears quarterback, just in case I wanted to feel a little sadder about my fan tendencies.
Is the augmented-body future already here?
You can probably tell I have a lot of questions about these glasses. That's because the further Meta pushes into utility and true wearable computing, the more it faces an uphill challenge integrating its glasses' vision with the phones we're already carrying in our pockets. The Meta Ray-Ban Display glasses connect to phones using Bluetooth 5.3 and Wi-Fi 6, but likely work much the way existing Ray-Bans do: with a limited range of direct hook-ins to the phone apps we use regularly.
These glasses look and feel a lot more impressive than I was expecting, but they're also a functional step down from the Orion dream of last year, a vision that Meta says it's still working on.
In the meantime, what will wearing these glasses mean for everyday life? Will I swipe, tap and glance as I walk into a supermarket or drop my kid off at school? Will all the things I can do or invoke feel like possibilities or like distractions?
It seems like the start of augmenting our bodies with tech in a whole new, more expansive way. Meta won't be the only one to do this. The conversations that began back in the Google Glass days are starting up again.
And neural tech, still in its infancy, is going to grow. After the wrist, what else will be possible? Will Mark Zuckerberg's plans for making these glasses cognitive extensions really ever work? And even if they do, how will that impact all of us?
Personally, I'm also a bit sad that I can't wear them as everyday glasses yet. The lack of deeper prescription support means I may have to wear contacts to test these Ray-Bans in a few weeks. For my demo, Meta provided prescription inserts that added to the lenses' thickness and generally worked fine, but those inserts aren't available for anyone to order.
It's just another sign that the Ray-Ban Display Glasses, currently a US-only product until early 2026, are also very much a work in progress.
For most people, the more affordable everyday Ray-Ban and Oakley smart glasses, now upgraded with better cameras and up to 8 hours of battery life, are the way to go.