I dream of a gadget that can do it all. Instead, when I leave for the office, I pack one or two phones, a portable battery bank, a laptop, a Kindle, a new product I’m testing, and at least one pair of earbuds. In my backpack, there’s a pouch full of cords and adapters. On my body, I usually sport between two and four wearable devices. I know mine is a “gadget maximalist” life. But, surely, one day, the powers that be will convene and society will decide on the Next Must-Have Gizmo — one all-powerful, do-everything device that will replace the phone.
Google doesn’t seem to think so. At least, not based on what I witnessed at last week’s Made by Google event.
At a studio in Brooklyn Navy Yard, Google showed off four phones, a smartwatch, and a pair of earbuds. That’s fairly typical for a product launch, but something about this year’s updates was different. It wasn’t just the odd keynote format, or the latent anxiety of Gemini getting stuffed into every single corner of every product. It was the uncanny feeling that AI won’t be the thing that tears down walled gardens. It’ll strengthen them. Instead of streamlining the number of gadgets we carry around, it might make them multiply.
The word “vibe” gets overused these days, but in that weird Brooklyn TV studio, I felt a palpable vibe shift in mobile computing. Especially with wearables. Where think pieces and online discourse used to argue that wearables were dead, the category is now being positioned as a vanguard for AI.
“The first 15 years of wearables were very much, ‘gather data, the quantified self.’ That’s where Fitbit started,” explains Sandeep Waraich, Google’s product lead for Pixel wearables. “It was episodic data and that’s fast moving to continuous insights because data only goes so far. It’s moving from highly generic to something very personal.”
You can see where this is all going. Wearables generate a ton of data, but turning that data into actionable insights — ones that are digestible and keep people engaged long-term — is a lot harder than it sounds. It’s the sort of task that AI would theoretically be good at, which is why every fitness tracker and app on the market is hopping on the bandwagon.
At the same time, as Waraich describes it, wearables are the “only one device in our computing lives that is guaranteed on-body presence.” Your phone may seem like it’s glued to your hand, but even it might be left behind on a table, stashed in a purse, or turned off at a show. If you want the most personalized, always-available AI assistant, it has to know absolutely everything there is to know about you. Is there a better way to do that than to be on you?
The problem with AI hardware is that we’re in the spaghetti stage. No one knows what the winning formula is, so every idea under the sun is going to get thrown at the wall until something sticks. You have your always-listening life recorders that purport to be your second memory. Meta’s hypothesis is that multimodal smart glasses are the platonic ideal gateway to AI. Jony Ive and Sam Altman can afford to be hyper-vague about whatever project they’re working on, because anything they say at this point could be correct.
But to hear Google tell it, no one form factor is going to reign supreme.