You can trick Google's AI Overviews into explaining made-up idioms
Published on: 2025-08-13 08:28:16
As Big Tech pours countless dollars and resources into AI, preaching the gospel of its utopia-creating brilliance, here's a reminder that algorithms can screw up. Big time. The latest evidence: You can trick Google's AI Overviews (the automated answers at the top of your search results) into explaining fictional, nonsensical idioms as if they were real.
According to Google's AI Overview (via @gregjenner on Bluesky), "You can't lick a badger twice" means you can't trick or deceive someone a second time after they've been tricked once.
That sounds like a logical attempt to explain the idiom — if only it weren't poppycock. Google's Gemini-powered failure came in assuming the question referred to an established phrase rather than absurd mumbo jumbo designed to trick it. In other words, AI hallucinations are still alive and well.
We plugged some silliness into it ourselves and found similar results.
Google's answer claimed that "You can't g