I’m sorry. This is a rant, which you’ve probably guessed from the title, duh!
The straw that broke the camel’s back was this lede I saw:
“Sweden built smart machines where crows trade trash for food, turning clever birds into unexpected city cleaners.”
A clearly AI-generated image didn’t help the credibility (a three-legged crow is quite telling), yet it’s the headline that made me do the search. “Search” is such a big word, though. It was literally one query.
Not Sweden, but one Swedish startup. Not built, but ran a one-time pilot. And it didn’t get anywhere close to turning crows into city cleaners, as the project was abandoned with zero follow-up. Fact check here.
Yet it sure works as eye candy if you want to glue someone’s attention to your clearly AI-generated LinkedIn post that tries to sell something (I don’t know what exactly; I’m incentivized to stop reading once I see an AI-generated graphic).
Links No Longer Mean Credibility
The crow-cleaners story, amusing as it might have been, wouldn’t have gotten me to write a post, though. In fact, fake references have been a pet peeve of mine for quite some time, so this was just “another one of those.”
Here’s a more interesting case. Recently, I read a story about AI in coding, full of data backing the author’s claims. One particular claim was the following:
“The SmartBear/Cisco study established numbers everyone ignores: defect detection drops from 87% for PRs under 100 lines to 28% for PRs over 1,000 lines.”