Tech News

Cooking with glasses


I've been thinking about the new Meta Ray-Ban augmented reality glasses. Not because they failed onstage, which they absolutely did, or because shortly after they received rave reviews from Victoria Song at The Verge and MKBHD, two of the most influential tech reviewers. My impression is that the hardware has improved but the software is still pretty bad.

Mostly I keep thinking about the cooking demo. Yeah, it bombed. But what if it had worked? What if Meta releases the third iteration of this hardware next year and it works? This post is just questions.

The demos were deeply weird: both Mark Zuckerberg and Jack Mancuso (the celebrity chef) had their AR glasses in a particular demo mode that broadcast the audio they were hearing and the video they were seeing to the audience and to the live feed of the event.

Of course they need to square the circle of AR glasses being "invisible technology" while still showing people what they do. According to MKBHD's review, one of the major breakthroughs of the new edition is that other people can't see the glasses' built-in display when they look at you.

I should credit Song for mentioning the privacy issues of the glasses in her review for The Verge. MKBHD briefly talks about his concern that people will use the glasses to be even more distracted during face-to-face interactions. It's not like everyone's totally ignoring the implications.

But the implications. Like for example: I don't know, sometimes I'm cooking for my partner. Do they hear a one-sided conversation between me and the "Live AI" about what to do first, how to combine the ingredients? Without the staged demo's video-casting, this would have been the demo: a celebrity chef talking to himself like a lunatic, asking how to start.

Or what if we're cooking together – do we both have the glasses, and maybe the Live AI responds to… both of us? Do we hear each other's audio, or is it a mystery what the AI is telling the sous-chef? Does it broadcast to a Bluetooth speaker, maybe?

Maybe the glasses are able to do joint attention and know who else is in your personal space and share with them in a permissioned way? Does this lock everyone into a specific brand of the glasses or is there some open protocol? It's been a long, long time since I've seen a new open protocol.

Marques's video shows a display that is impressively invisible to others, but surely there's some reflection on people's retinas: could you scan that and read people's screens off of their eyeballs?

I hate to preempt critiques, but the obvious counterpoint is that "the glasses will be good for people with accessibility needs." Maybe the AI is unnecessary for cooking, and a lot of its applications will be creepy or fascist, but being able to translate or caption speech in real time will be really useful to people with hearing issues.
