Victoria Song is a senior reporter and author of the Optimizer newsletter. She has more than 13 years of experience reporting on wearables, health tech, and more. Before coming to The Verge, she worked for Gizmodo and PC Magazine.
This is Optimizer, a weekly newsletter sent every Friday from Verge senior reviewer Victoria Song that dissects and discusses the latest phones, smartwatches, apps, and other gizmos that swear they’re going to change your life. Optimizer arrives in our subscribers’ inboxes at 10AM ET. Opt in for Optimizer here.
There’s a hard conversation to be had about smart glasses in the coming weeks and months. At Meta Connect 2025, I got a first glimpse of the Meta Ray-Ban Display, the company’s first pair of smart glasses with a built-in monocular display. There’s no beating around the bush. The demos I got were nothing short of impressive. But something about having an invisible display, and the ability to appear present while secretly doing something else under the table, is eerie. I’ll dive into the questions these glasses raise in the coming weeks, but today I want to focus on one way that Meta’s glasses are genuinely making life better: accessibility.
“For me, missing both my legs means that obviously walking is just a bit more difficult and more hazardous than other people,” Jon White, an inspirational speaker and Paralympic trainee who became a triple amputee after serving as a British Royal Marine, tells me in an interview at Meta’s headquarters ahead of the announcement. “Anything that means I’m not looking at my phone [so] I’ve got my head up, looking around me is much better.”
The Meta Ray-Ban Display glasses add live captioning, which will be a huge help to the hard of hearing.
White says that with only one arm, the ability to respond to messages without needing to pick up a phone in his remaining hand is crucial. Likewise, when White posts on Instagram about his engineering projects, the glasses’ camera allows him to showcase his point of view without having to adjust how his phone is positioned for the best angle. In our chat, White relays a story about how, when giving a speech, he was given a clicker for his slides and then handed a handheld mic. “I was like, ‘What do you want me to do with this?’”
And that’s just through one lens. Even some of the glasses’ features that seem far-fetched could be game changers for people with visual or hearing impairments. Take Meta’s Live AI feature. In my first impressions of the feature, I questioned the point of AI describing things you can already see. After publication, I was quickly served a piece of humble pie when several members of the low-vision and blind community reached out to tell me how these gadgets enabled them to live more independently. (I invited one to come share their experience on a recent Vergecast episode, which you can listen to here.) One anecdote that stuck with me involved reading menus in restaurants. Most eateries don’t carry Braille menus, and even if they did, reading Braille isn’t a skill every visually impaired person has. Live AI on the glasses can read menu items aloud, eliminating the need to rely on a sighted person.
It was also sobering to learn that, as a mass market product, the Meta glasses are significantly more affordable than similar gadgets created specifically for the visually impaired community. The glasses cost roughly $300-$400, whereas similar tools like OrCam readers can range from $1,990 to $4,250, with limited options for insurance.
The neural band helps you control the Display glasses with gestures, freeing up your hand.
With the new Meta Ray-Ban Display glasses, I was also struck by a demo of a live captioning feature and how it might help folks who are deaf or hard of hearing. (Supposedly, it can provide translated subtitles in real time, too, but I’ll reserve judgment until I get to see that for myself.) Not only were the captions near-instantaneous, but they were also pretty accurate and unaffected by cross-talk. Because of the directional microphones, only the person you were directly looking at was captioned. And it’s not just the caption feature.