ZDNET's key takeaways

- The Meta Ray-Ban Displays are the company's most advanced smart glasses.
- The smart glasses pair an in-lens color display with a neural wristband for $799.
- Both the feel of the glasses and the demo left me impressed.

AI has evolved beyond chatbots and into smart glasses, with the goal of giving language models real-world context from your point of view. The Meta Ray-Ban Displays take that assistance to the next level with an in-lens color display and a neural wristband, letting you do most of what you can on your phone right from your eyes.

Also: I tried the Meta Ray-Ban Display glasses, and they got me excited for the post-smartphone era

The smart glasses are so daring and promising that, since their launch at Meta Connect last month, it has become nearly impossible to book the demo required before purchasing a pair. However, I finally had the chance to try them and came away impressed by the form factor, the multimodal AI assistance, and a bonus unreleased feature. Here's what I learned from my hands-on demo of the smart glasses -- including what to expect and whether they're worth the $799 price and the effort of booking a session.

A smartphone for your face?

A true marker of whether a wearable is worth carrying is whether it can replace a device you already own, such as your smartphone. To my surprise, the Meta Ray-Ban Displays passed this test, at least in my initial demo. I was able to use the glasses for most tasks I'd typically reach for my phone to do, including playing music, answering texts, taking photos, browsing social media, and even taking a video call. In every case, the experience was fairly seamless thanks to the combination of the in-lens display and the neural band.

Although the glasses look like ordinary frames from the outside, the right lens houses a color display. In my experience, it produces crisp, sharp visuals, even when watching video such as a video call or an Instagram reel. Unlike the glasses with attached optical modules I have tested plenty of, the in-lens display is seamless to look at, and it's also pretty difficult for anyone else to tell that you are doing so.

Also: I tried smart glasses with a built-in display, and they beat my Meta Ray-Bans in key ways

The neural band, which wraps snugly around your wrist, lets you interact with the display using the subtlest finger movements. For example, a pinch between your index finger and thumb selects, while a pinch between your middle finger and thumb returns you to the home screen. It takes a second to get used to, but it's pretty intuitive, especially because the gestures can be so subtle. At one point, I was making the gestures with my hand inside my sleeve, and with slight taps I could still interact seamlessly with the glasses.

Pair those vivid visuals with the neural band, and you can access nearly everything you would on your phone, with the convenience of no one around you knowing. For example, while sitting in a meeting or class, I could technically answer texts or scroll through Instagram without anyone being aware of it. To further enhance this assistance, Meta is introducing a handwriting gesture feature that lets you type text by mimicking a writing motion with your fingers, essentially air-drawing the words.
I got to demo it and was impressed by how accurate the text was, whether I wrote in script or print.

CNET: Meta's Bosworth hints that Neural Band could eventually evolve into a watch

Getting used to this motion was even easier than the other gestures, since we're already accustomed to making it when physically writing with a pen or pencil, and because you aren't worried about how your handwriting looks, you can afford to be sloppy.

Leveled up AI assistance

The original Meta Ray-Bans already offered Meta AI assistance from within the smart glasses for tasks such as telling you more about what you're looking at, live translating conversations, or answering whatever questions you may have. Even after reviewing those, I was fairly satisfied with the AI, as it had practical and helpful applications. Now, the Displays do the same thing but with the added layer of a visual component, which makes a world of difference.

For example, if you're cooking and want Meta AI to give you a recipe, in the past you could have asked and listened to the answer. Now you can take it a step further, viewing the visual cards it creates with the ingredients and step-by-step instructions while staying hands-free, because the neural band gestures let you swipe through them.

Also: Meta gives advertisers new AI personalization tools - while using your chats to target content

In one instance, I asked Meta AI to identify a flower I was looking at. The assistant not only spoke the answer aloud but also displayed the flower's name and image in my viewfinder. Later, I asked it to generate an image of a woman with the same flower on her head. The AI produced the image instantly, visible right in the viewfinder. While it's hard to see an immediate practical use for on-the-spot image generation, it was a compelling demonstration of the technology.

The double thumb tap gesture activates Meta AI.

One of the more practical uses of Meta AI is its ability to display real-time captions of the world around you. The feature could be especially useful for people who are hard of hearing, but it also has everyday benefits -- like following a conversation in a noisy restaurant or crowded room. In my testing, the transcriptions were largely accurate, though the environment was relatively quiet.

Do they pass the everyday wear test? Maybe.

Let's talk aesthetics. With a wearable as personal as smart glasses, it's more important than ever for them to look good. I was wary that the Meta Ray-Ban Displays would be extremely clunky, since they have to pack a lot of tech into a small form factor. While there's no denying they look less like regular glasses than the standard Meta Ray-Bans, they can still pass as regular glasses, albeit statement glasses on the chunkier side. Perhaps it's because I already wear black frames daily, or simply because larger frames suit me, but I found the design surprisingly flattering and wearable. The Sand color is a subtle neutral tone that helps soften the look by blending more naturally with the skin.

Ideally, smart glasses should be comfortable enough to wear all day; even if the battery runs out, those with prescription lenses will still need to wear them. Meta claims up to six hours of mixed use per charge, so comfort becomes just as important as performance.
Also: 5 reasons I use local AI on my desktop - instead of ChatGPT, Gemini, or Claude

In my brief time with them, I was pleasantly surprised by how comfortable they felt. The fit and weight were similar to my thicker everyday glasses, so if you already wear frames regularly, you're unlikely to notice much discomfort. For context, the glasses weigh 69 grams -- a bit heavier than the Meta Ray-Ban Gen 2's 52 grams.

The neural wristband felt exactly as you'd expect -- like a snug, flexible band. It wasn't tight enough to be irritating, though my demo lasted under an hour. A longer wear test would be needed to properly gauge the comfort of both devices.

Bottom line (for now)

The Meta Ray-Ban Display smart glasses feel like a well-developed product that can genuinely enhance users' everyday lives, offering style, comfort, and promising use cases. However, at the $799 price point, they're not meant for everyone just yet. Rather, they should appeal most to early tech adopters and those interested in experiencing what may be the future of AI wearables. The company makes a convincing case for that.