Kerry Wan/ZDNET

I got to wear the new Meta Ray-Ban Display glasses at Meta Connect 2025 on Wednesday. And while they are a long way from replacing your smartphone, they are good enough to make it clear that smart glasses are a lot better when they include a heads-up display. From what I experienced on Wednesday, we're definitely taking a step toward a world where we spend less time with our heads buried in our smartphones.

Also: Meta Connect 2025 live updates: Ray-Ban Display, Oakley Vanguard smart glasses, more

In the Meta Connect keynote on Wednesday evening at the company's headquarters, Meta CEO Mark Zuckerberg said we've lost a little something as a society with the proliferation of smartphones. Meta's hope is that smart glasses can help restore some of what we've lost by allowing people to stay in the moment more often. Meta is attempting to do that by releasing its next-generation glasses with a bright, full-color display and a smart wristband that reads subtle gestures from your hand, enabling several things not possible in a pair of smart glasses until now.

Zuckerberg was clearly more giddy about the launch of the Meta Ray-Ban Display glasses than anything else announced at Meta Connect 2025. "We've been working on glasses for over 10 years at Meta, and this is one of those special moments," Zuckerberg said. "I'm really proud of this, and I'm really proud of our team for achieving this."

Mark Zuckerberg strode triumphantly onto the Meta Connect 2025 stage wearing the Meta Ray-Ban Display smart glasses. Jason Hiner/ZDNET

Also: I tried the Meta Oakley Vanguard smart glasses, and they're cooler than they look

The glasses will retail for $799, which includes both the glasses and the wristband. They come in two colors, shiny black and a transparent light brown, and all pairs have transition lenses.
They will support prescription lenses from -4.00 to +4.00, but it's unclear how much prescription lenses will add to the cost. Meta says they get 18 hours of battery life, with six hours of mixed use, and they are also water-resistant. They arrive in stores on September 30, when you'll be able to get a demo at Best Buy, Sunglass Hut, LensCrafters, and Ray-Ban stores. They will eventually be available at some Verizon stores as well.

What it's like to wear the Meta Ray-Ban Display glasses

The Meta Ray-Ban Display glasses weigh 69 grams, so they are a little larger than the audio-only Meta Ray-Bans, which weigh 49 grams. However, I didn't find them heavy when wearing them, and I barely noticed any difference from the regular Meta Ray-Bans, which I wear almost daily.

The Meta Ray-Ban Display glasses have a full-color screen in the right eye, and the display sits slightly off to the side so that it doesn't distract you. It outputs up to 5,000 nits of brightness -- brighter than any smartphone, or even the Apple Watch Ultra -- so you can see the display whether you're inside or outside. Meta used a liquid crystal on silicon (LCOS) display, and one of the most remarkable things about it is that the person you're looking at can't see it when they are looking at you, so it doesn't become a distraction.

The EMG neural wristband on the Meta Ray-Ban Display glasses. Kerry Wan/ZDNET

The glasses have a full operating system that takes advantage of the display by allowing you to capture photos and videos, listen to music, invoke live captions during a conversation, get visual feedback for questions you ask AI, send WhatsApp messages, and use AI to capture notes on important conversations.

Also: I tried smart glasses with a built-in display, and they beat my Meta Ray-Bans in key ways

The primary way you interact with the new smart glasses is an EMG neural wristband that is as much of a breakthrough as the display itself.
Within 5-10 minutes of using the wristband, I was pinching with my thumb and middle finger to go back, swiping across screens with my thumb, pinching with my forefinger and thumb to select things, and pinching with my forefinger and thumb while twisting my wrist to turn a setting such as volume up or down. Zuckerberg called the wristband "the world's first mainstream neural interface."

The wristband itself is made of a sturdy cloth material and uses a combination of a metal cinch and magnets to fit snugly around your wrist. I found it to be pretty comfortable, and within a few minutes I didn't think about it being on my wrist.

Jason Hiner/ZDNET

The display itself is fairly clear and easy to read, although at times I found myself closing my left eye in order to see the screen more clearly through my right eye. But I especially loved being able to frame photos and videos with the screen before shooting, or to adjust the framing during a video recording. You can even use the pinch-and-wrist-twist gesture to zoom up to 3x. You can then preview the photos or videos on the glasses and text them to someone or post them on social media without having to use your phone.

Also: Meta Ray-Bans vs. Oakley: I tested both smart glasses, and there's a clear winner

One of the smartest things I saw the glasses do in my time trying them was live captioning: I identified the person standing right in front of me as the one I wanted the glasses to focus on, and they transcribed only that person's words while ignoring the conversations of other people in the same room. You could easily imagine having an important conversation in a noisy place and having the device transcribe only the words of the person you were talking to, filtering out the voices around you, so that you could catch everything they were saying.
This is related to a new feature called Conversation Focus, which is also coming to the audio-only smart glasses. It amplifies the audio of someone you're speaking with directly.

Meta CEO Mark Zuckerberg and CTO Andrew Bosworth demo the Meta Ray-Ban Display glasses at Meta Connect 2025 on September 17, 2025. Jason Hiner/ZDNET

Similarly, the Meta Ray-Ban Display glasses seemed to be better at acting only on my voice when I used voice commands, without being distracted by the voices of people talking around me. That is a much smarter feature than the standard Meta Ray-Bans and Meta Oakley glasses are currently capable of.

Another smart thing I saw the glasses do came when prompting the AI. I asked for a recipe for banana bread, and the AI went out to the internet, found one, and then organized it into a series of cards that I could swipe through to view step-by-step. This would be a lot nicer to use hands-free in the kitchen than getting food on my phone, tablet, or laptop.

Something else that looked interesting, but that I didn't get to try, was the ability to respond to a short text message by using the hand with the wristband to write the letters and letting the device turn them into words in a message. It's like using an invisible stylus, and I'm definitely looking forward to testing it.

Other things the glasses can do include video calling your friends so they can see what you see, and providing turn-by-turn walking directions (in 28 cities to start, because Meta is doing its own mapping).

Also: Are smart glasses with built-in hearing aids viable? My verdict after months of testing

Lastly, another feature that Meta talked about on stage but that I didn't get to try was Live AI. The idea with this feature is that the glasses can see what you see and hear what you hear.
So you can start a Live AI session when someone is about to give you important information that you need to remember, like instructions, explanations, an important idea, or a message to pass along to someone else. Then it can take notes, make a summary, record steps to remember, and generally make sure you get it right without missing important details.

I'm looking forward to trying the Meta Ray-Ban Display glasses in the weeks and months ahead and reporting back on what I learn. You can expect to see a lot more hands-on perspectives, reviews, and feature breakdowns here on ZDNET. And keep an eye out, because we're also expecting a lot more smart glasses to be released over the next 6-12 months, including Google's re-entry into the space with new glasses powered by Android XR.