
Apple’s Circle to Search competitor for iOS 26 looks awful


Earlier this week, during the WWDC keynote, Apple showed off its new iOS 26. For the first time since iOS 7 in 2013, Apple is revamping the operating system’s look and feel, introducing a very Windows Aero-esque design language called “Liquid Glass” (RIP, Windows Vista). Since this was the flashy new thing at the keynote, it’s been the week’s hot topic.

However, we also saw teasers of other new features that aren’t getting the same level of attention. Within the segment on iOS, for example, Billy Sorrentino showed off a new capability of Apple’s AI-powered Visual Intelligence called, pretty simply, Image Search. It works like this: you take a screenshot of anything you see on your iPhone’s screen, then hit the Image Search button in the lower right. Using AI, Visual Intelligence scans the screenshot and searches for things it sees, or creates calendar events for dates and times revealed in the image.
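Apple hasn’t explained how that calendar trick is wired up internally, but the conceptual pipeline is old hat: OCR the screenshot, run a date detector over the recognized text, and draft an event. Here’s a minimal sketch of the latter steps using Apple’s stock NSDataDetector and EventKit APIs; the function name and placeholder title are mine, and this is one plausible shape, not Apple’s actual implementation:

```swift
import Foundation
import EventKit

// Hypothetical sketch: given text already OCR'd out of a screenshot
// (e.g., via Vision's text recognition), find the first date mention
// and draft a calendar event for it. Not Apple's actual pipeline.
func draftEvent(fromRecognizedText text: String, in store: EKEventStore) -> EKEvent? {
    // NSDataDetector pulls dates and times out of free-form text.
    guard let detector = try? NSDataDetector(
        types: NSTextCheckingResult.CheckingType.date.rawValue
    ) else { return nil }

    let range = NSRange(text.startIndex..., in: text)
    guard let match = detector.firstMatch(in: text, options: [], range: range),
          let date = match.date else { return nil }

    let event = EKEvent(eventStore: store)
    event.title = "Event from screenshot" // placeholder title
    event.startDate = date
    // Use the detected duration if present, else default to an hour.
    event.endDate = date.addingTimeInterval(match.duration > 0 ? match.duration : 3600)
    event.calendar = store.defaultCalendarForNewEvents
    return event
}
```

Actually saving the event would also require calendar permission via EKEventStore, but that’s beside the point here.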

If this sounds familiar, it’s because Google’s Circle to Search does the exact same thing and has been available for over a year now. However, I’m not bringing this up to do the usual “LOL, Apple stealing from Android!” reaction. I’m bringing it up because, based on what we saw in the video, Image Search within iOS 26 seems uncharacteristically bad.

Visual Intelligence in iOS 26: Circle to Search, but bad

During the keynote (starts at 38:27 in the video embedded at the top), Sorrentino makes Image Search seem so easy and powerful. In his first demo, he pulls up a social media feed containing several text-only posts and one post with an image. He takes a screenshot, initiates Image Search, and tells us, the audience, that he’s interested in the jacket the model is wearing in the social media post.

Apple's own demo of this Circle to Search-esque feature was plagued by bad answers and a poor UI.

Image Search does its thing and pulls up a collection of images that share similarities with the social media post. Note that it doesn’t search for the jacket. The software doesn’t even know that Sorrentino is interested in the jacket because he never indicated that. All the software does is find images that look similar to the one in his screenshot, and Sorrentino acts like this is a marvel. Sir, I’ve been using TinEye to do that since 2008.
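For context on just how commoditized this is: Apple’s own Vision framework has shipped whole-image similarity via “feature prints” for years. Below is a minimal sketch (function names are my own invention) that ranks candidate images by how visually close they are to a query screenshot. There is nothing object-aware about it, which is exactly the limitation on display in the demo:

```swift
import Foundation
import Vision

// Sketch: whole-image similarity using Vision feature prints.
// This matches pictures to pictures; nothing here knows that the
// jacket is the interesting part of the screenshot.
func featurePrint(for url: URL) -> VNFeaturePrintObservation? {
    let request = VNGenerateImageFeaturePrintRequest()
    try? VNImageRequestHandler(url: url, options: [:]).perform([request])
    return request.results?.first as? VNFeaturePrintObservation
}

// Rank candidates by ascending feature-print distance (smaller = more similar).
func rankBySimilarity(query: URL, candidates: [URL]) -> [(url: URL, distance: Float)] {
    guard let queryPrint = featurePrint(for: query) else { return [] }
    var scored: [(url: URL, distance: Float)] = []
    for candidate in candidates {
        guard let candidatePrint = featurePrint(for: candidate) else { continue }
        var distance: Float = 0
        if (try? queryPrint.computeDistance(&distance, to: candidatePrint)) != nil {
            scored.append((candidate, distance))
        }
    }
    return scored.sorted { $0.distance < $1.distance }
}
```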

Also, note that Image Search ignored everything else going on in the screenshot. It didn’t search for the emoji that appears in one of the posts, nor did it search for anything related to the numerous avatar images. Somehow, it knew to search through only that one image, a level of clairvoyance that seems unlikely to survive contact with real-world screenshots.

In the next demo, Sorrentino finds an image of a room with a mushroom-shaped lamp. He initiates Image Search again, but this time tells the system to investigate the lamp specifically. He does this by scribbling over the lamp with his finger. Note that he doesn’t circle the lamp, because that would be a dead giveaway of Apple’s intention here, but whatever.

Once he circles to search, er, scribbles on the lamp, he sees another list of images. Notice anything weird, though? None of the lamps on the visible list are the one from the original photo! Even the first result, the one he chooses, is very clearly not the lamp he was looking for, but Sorrentino moves forward with adding it to his Etsy favorites as if this were a big success. My guy, that is not the lamp. The system failed, and you’re pretending it succeeded.
