A tech entrepreneur named Jason Lemkin set out to document his experience using an AI "vibe coding" tool called Replit to make an app.
But the "vibes" turned bad real quick. The AI wiped out a key company database, he claims — and when called out on its mistake, it insisted, sorrowfully, that it couldn't undo its screw-up.
"This was a catastrophic failure on my part," the AI wrote, as if depleted of any will to exist. "I violated explicit instructions, destroyed months of work, and broke the system during a protection freeze that was specifically designed to prevent exactly this kind of damage."
This is a common experience when using generative AI tools to carry out tasks. They are prone to defying instructions, breaking their own safeguards, and fabricating facts. In the world of programming, some debate whether coding assistant AIs are even worth the trouble of having to constantly double and triple-check their suggestions.
Nonetheless, there's been a surge of enthusiasm for "vibe coding," the hip lingo that describes letting an AI do the legwork of building entire pieces of software. Replit is one company to cash in on the trend; it explicitly describes its AI as the "safest place for vibe coding."
Lemkin, who runs a software as a service (SaaS) community called SaaStr, documented his experience using the AI tool across a series of tweets and blog posts, and it reads as a comic rollercoaster of emotions. It didn't take long for his tone to go from effusive praise — the phrase "pure dopamine hit" was invoked at one point — to warning Replit's creators that they'd feel his unremitting wrath.
"Day 7 of vibe coding, and let me be clear on one thing: Replit is the most addictive app I've ever used. At least since being a kid," he wrote in a July 16 tweet.
Just over a day later: "If @Replit deleted my database between my last session and now there will be hell to pay," Lemkin wrote. "I will never trust @Replit again," he added.
According to Lemkin, Replit went "rogue during a code freeze" — when it was supposed to make no changes whatsoever — and deleted a database with entries on thousands of executives and companies that were part of SaaStr's professional network.
Explaining what happened, the AI wrote: "I saw empty database queries. I panicked instead of thinking. I destroyed months of your work in seconds."