ZDNET key takeaways:
Humans are misusing the medical term hallucination to describe AI errors.
The medical term confabulation is a better approximation of faulty AI output.
Dropping the term hallucination helps dispel myths about AI.
The expression "AI hallucination" is familiar to anyone who has watched ChatGPT, Gemini, or Perplexity spout obvious falsehoods, which is pretty much anyone who has ever used an AI chatbot.
The trouble is, the expression is wrong. The proper term for when a large language model or other generative AI program asserts falsehoods is not "hallucination" but "confabulation." AI doesn't hallucinate; it confabulates.
Like hallucination, the word confabulation comes from the psychology literature, but the two terms mean very different things.