In a lot of ways, Meta hasn't changed much with its second-gen Ray-Ban glasses. The latest model has the same design and largely the same specs as the originals, with two important upgrades: longer battery life and improved video quality. At the same time, the Ray-Ban Meta glasses have a lot of features that didn't exist when I first reviewed them two years ago, largely thanks to AI. And with the release of its second-generation frames, there's still a lot to look forward to, like new camera features and AI-powered audio. The good news is that Meta isn't limiting these updates to its newest frames, so if you have an older pair you'll still see the new features. But if you've been on the fence about getting a pair, there's never been a better time to jump in.

Ray-Ban Meta (second generation): Engadget Expert Score 87/100
Meta's second-generation smart glasses are becoming a genuinely useful accessory.
Pros: Noticeably better battery life; YouTuber-friendly 3K video; Meta AI translations are a game-changer for travel.
Cons: Framing POV photos and video is still a challenge; pricey lens upgrades.
$379 at Meta

Same look, (slightly) better specs

Meta and EssilorLuxottica haven't strayed too far from the playbook they've used for the last two years. The second-generation Ray-Ban Meta glasses come in a handful of frame styles with a number of color and lens variations that start at $379. I tried out a pair of Wayfarer frames in the new "shiny cosmic blue" color with clear transition lenses.

I personally prefer the look of the slightly narrower Headliner frames, but the second-gen glasses still look very much like traditional Wayfarer glasses. I've never been a fan of transition lenses for my own prescription eyewear, but I'm starting to come around on them for smart glasses. As Meta has improved its cameras and made its AI assistant more useful, I've found more reasons to wear the glasses indoors.

The second-generation Ray-Ban Meta glasses come with clear lenses, with polarized and transition lenses available as an upgrade. (Karissa Bell for Engadget)

Also, if you're going to be paying $300 or more for a pair, you might as well be able to use them wherever you are. It also helps that the transition lenses on the second-gen Ray-Ban Meta glasses get a bit darker than those on my first-gen Wayfarers. Upgrading from the standard clear lenses will cost you, though: frames with polarized lenses start at $409, transitions start at $459 and prescription lenses can run significantly more.

As with the recent Oakley Meta HSTN glasses, the second-gen Ray-Bans come with longer battery life and a better camera. Meta says the battery can last up to eight hours on a single charge with "typical use." I was able to squeeze out a little more than five and a half hours of continuous music playback. That's a noticeable step up from the battery on my original pair, which, after two years, is starting to show its age. The glasses also now support higher-resolution 3K video recording, but the 12MP wide-angle lens shoots the same 3,024 x 4,032 pixel portrait photos as earlier models.

The second-gen glasses have the same design as the first-gen, with a capture button on the right side of the frames. The charging case provides an additional 48 hours of battery life. (Karissa Bell for Engadget)

For video, there's a noticeable quality boost, but the extra resolution probably isn't necessary for most people, especially if you're primarily sharing your clips on social media.
It does make the glasses more appealing for creators, though, and judging by the number of them in attendance at Connect, I suspect Meta sees creators as a significant part of its user base. I'm also looking forward to Meta adding the ability to record Hyperlapse and slow-motion videos, as I think those may be more interesting than standard POV footage for everyday activities.

Meta AI + what's coming

Two years ago, I was fairly skeptical of Meta's AI assistant. But since then, Meta has steadily added new capabilities. Of those, the glasses' translation abilities have been my favorite. On a recent trip to Argentina, I used live translation to follow along with a walking tour of the famous Recoleta cemetery. It wasn't perfect (the feature is meant more for back-and-forth conversations than extended monologues), but it allowed me to participate in a tour I would have otherwise had to skip. A word of warning: using live translation for an extended period of time is a major battery killer.

Meta AI can provide context and translations in other scenarios, too. I spent some time in Germany while testing the second-gen Ray-Ban glasses and found myself repeatedly asking Meta to translate signs and notices. For example, here's how Meta AI summarized this collection of signs.

Meta AI was able to translate these signs (left) when I asked it "what do these signs say?" (Karissa Bell for Engadget)

As I wrote in my review of the Oakley Meta HSTN glasses, I still haven't found much use for Live AI, which lets you interact with the assistant in real time and ask questions about your surroundings. It still feels like more of a novelty, but it makes for a fun demo to show off to friends who have never tried "AI glasses." There are also some very interesting accessibility use cases that take advantage of the glasses' cameras and AI capabilities. Features like "detailed responses" and support for Be My Eyes show how smart glasses can be particularly impactful for people who are blind or have low vision.

One AI-powered feature I haven't tried yet is Conversation Focus, which can adjust the volume of the person you're speaking to while dampening background noise. Meta teased the feature at Connect but hasn't said exactly when it will be available. If it works as intended, I could see it being useful in a lot of scenarios.

I'm also particularly intrigued by Meta's Connect announcement that it will finally allow third-party developers to create their own integrations for its smart glasses. A handful of partners, like Twitch and Disney, are already finding ways to take advantage of the glasses' camera and AI features. Up to now, Meta AI's multimodal tools have shown some promise, but I haven't been able to find many ways to use them in my day-to-day life. Allowing app makers onto the platform could change that. Disney has previewed a smart glasses integration for its parks that would allow visitors to get real-time info about rides, attractions and other amenities as they walk around. Golf app 18Birdies has shown off an integration that delivers stats and other info while you're on the course.

Should you buy these? And what about privacy?

When the Ray-Ban Meta glasses came out two years ago, this was a pretty straightforward question to answer. If the idea of smart glasses with a good camera and open-ear speakers appealed to you, then buying a pair was a no-brainer.
Now, it's a bit more complicated. Meta is still updating its first-gen Ray-Ban glasses with significant new features, like Conversation Focus, new camera modes and third-party app integrations. So if you already have a pair, you won't be missing out on a ton if you don't upgrade. (And with a starting price of $299, the first-gen glasses are still solid if you want a more budget-friendly option.)

There are also other options to consider. The upcoming Oakley Meta Vanguard glasses come with more substantial hardware upgrades and other unique features that will appeal to athletes and anyone who spends a lot of time outdoors. And on the higher end, the $799 Meta Ray-Ban Display glasses blend AR elements with the existing feature set in an intriguing way.

Meta has already previewed several new features, like new camera modes and Conversation Focus. (Karissa Bell for Engadget)

I also have many of the same concerns about privacy as I did when I reviewed Meta's first Ray-Ban-branded glasses back in 2021. I'm well aware Meta already collects an extraordinary amount of data about us through its apps, but glasses just feel like they provide much more personal, and potentially invasive, access to our lives.

Meta has also made some notable changes to the privacy policy for its glasses in recent months. It no longer allows users in the United States to opt out of storing voice recordings in its cloud, though it's still possible to manually delete recordings in the Meta AI app. The company says it won't use the contents of the photos and videos you capture to train its AI models or serve ads. However, images of your surroundings processed for the glasses' multimodal features, like Live AI, can be used for training purposes (these images aren't saved to your device's camera roll). Meta's privacy policy also states that it uses audio captured via voice commands for training. And it should go without saying, but anyone using Meta's glasses should be very careful about sharing their interactions with its AI app, as a bunch of users have already, seemingly inadvertently, shared a ton of highly personal interactions with the world.

If any of that makes you uncomfortable, I'm not here to convince you otherwise! We're still grappling with the long-term privacy implications of generative AI, let alone generative AI on camera-enabled wearables. At the same time, as someone who has been wearing Meta's smart glasses on and off for more than four years, I can say that Meta has been able to turn something that once felt gimmicky into a genuinely useful accessory.