
ChatGPT Is Still a Bullshit Machine


How many U.S. states include the letter “R” in their name? You may not know the answer off the top of your head, but any literate adult could figure it out with a list of the states and minimal effort. Unfortunately, OpenAI’s ChatGPT doesn’t know the answer, not even with its new GPT-5 model. That’s because generative AI is still a deeply flawed tool, despite being sold by its creators as a god-like entity that can answer literally anything.

I was inspired to ask the question after seeing a Bluesky post on Thursday, not long after GPT-5 was released earlier that day. The user said they asked GPT-5 how many American states used the letter R, and it included Indiana, Illinois, and Texas on the list. I tried it myself and ChatGPT replied that there were 21 states (which is accurate), but with a list that included Illinois, Massachusetts, and Minnesota, none of which contain the letter R. I decided to dig deeper.
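For reference, this is exactly the kind of task a few lines of deterministic code settle instantly. Here’s a minimal sketch in Python (the hard-coded state list and the states_containing helper are mine, purely for illustration, not anything ChatGPT produced), and it lands on 21:

```python
# All 50 U.S. state names, typed out by hand for this illustration.
US_STATES = [
    "Alabama", "Alaska", "Arizona", "Arkansas", "California", "Colorado",
    "Connecticut", "Delaware", "Florida", "Georgia", "Hawaii", "Idaho",
    "Illinois", "Indiana", "Iowa", "Kansas", "Kentucky", "Louisiana",
    "Maine", "Maryland", "Massachusetts", "Michigan", "Minnesota",
    "Mississippi", "Missouri", "Montana", "Nebraska", "Nevada",
    "New Hampshire", "New Jersey", "New Mexico", "New York",
    "North Carolina", "North Dakota", "Ohio", "Oklahoma", "Oregon",
    "Pennsylvania", "Rhode Island", "South Carolina", "South Dakota",
    "Tennessee", "Texas", "Utah", "Vermont", "Virginia", "Washington",
    "West Virginia", "Wisconsin", "Wyoming",
]

def states_containing(letter: str) -> list[str]:
    """Return every state whose name contains the letter, case-insensitively."""
    letter = letter.lower()
    return [state for state in US_STATES if letter in state.lower()]

matches = states_containing("r")
print(len(matches))  # 21
print(matches)       # Arizona, Arkansas, California, Colorado, ...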

GPT-5 correcting mistakes

The bot ended its response to my question with “If you want, I can also show you a neat trick for counting them without missing any.” I replied “ok” and it responded with these tips, all of which were normal and reasonable:

1. Write out all 50 states in alphabetical order (mentally or on paper).
2. Scan once for your target letter — in this case R.
3. Put a tick mark or highlight each match as you go.
4. Double-check tricky spots — states with “North,” “South,” “New,” or “West” often hide extra matches.

Everything about that advice makes sense. That’s exactly what you’d do if you were thinking through it logically, like a human. But I asked, “Why did you include Minnesota on the list?” since Minnesota was one of the states it had listed, even though it clearly doesn’t contain an R.

The response from ChatGPT:

You caught me — Minnesota absolutely does not have an “R” in its name.

That was a straight-up goof on my part. The correct count without that mistake is 20 states, not 21.

Again, a reasonable response. But I decided to mess with it.

... continue reading