Five days. That's all it took for Clawdbot -- an open-source AI assistant that promises to actually do things on your computer, not just chat -- to go viral, implode, rebrand (twice!) and emerge as OpenClaw, bruised but still breathing as a beloved crustacean.
If you blinked over the past few days, you may have missed crypto scammers hijacking X accounts, a panicked founder accidentally giving away his personal GitHub handle to bots and a lobster mascot that briefly sprouted a disturbingly handsome human face. Oh, and somewhere in the chaos, the AI company Anthropic sent a polite email asking them to please, for the love of trademarks, change the name.
Welcome to OpenClaw. Formerly Clawdbot and briefly known as Moltbot, it's the same AI assistant under a newer, sturdier shell. And boy, does this lobster have lore.
What even is OpenClaw? And why should you care?
Here's the pitch that had tech X (the platform formerly known as Twitter) losing its mind: Imagine an AI assistant that doesn't just chat; it does stuff. Real stuff. On your computer. Through the apps you use.
OpenClaw lives where you actually communicate, like WhatsApp, Telegram, iMessage, Slack, Discord, Signal -- you name it. You text it like you'd text a friend, and it remembers your conversations from weeks ago and can send you proactive reminders. And if you give it permission, it can automate tasks, run commands and basically act like a digital personal assistant that never sleeps. Unlike its founder.
Created by Peter Steinberger, an Austrian developer who sold his company PSPDFKit for around $119 million and then got bored enough to build this, OpenClaw represents what a lot of people thought Siri should have been all along. Not a voice-activated party trick, but an actual assistant that learns, remembers and gets things done. (CNET reached out to Steinberger for comment on this story.)
OpenClaw doesn't require any specific hardware to run, though the Mac mini seems popular. The core idea is that OpenClaw itself mostly routes messages to AI companies' servers and calls their APIs; the heavy AI work happens on whichever large language model you select: Claude, ChatGPT or Gemini.
Hardware only becomes a bigger consideration if you want to run large local models or do heavy automation. That's where powerful machines like the Mac mini come in, but they're not a requirement.
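For the technically curious, that routing model is roughly the same pattern any chatbot bridge uses: take an incoming message, forward it to a hosted model's API and relay the reply. Below is a minimal TypeScript sketch of that idea against OpenAI's public chat completions endpoint. It's an illustration of the general pattern, not OpenClaw's actual code, and the model name and message are placeholders.

```ts
// Minimal illustration of the "route message to a hosted LLM" pattern.
// NOT OpenClaw's actual code. The endpoint is OpenAI's public chat
// completions API; the model name is a placeholder.
// Requires Node 18+ (built-in fetch) and an OPENAI_API_KEY env variable.

type ChatMessage = { role: "user" | "assistant" | "system"; content: string };

async function relayToModel(history: ChatMessage[]): Promise<string> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini", // placeholder; swap in whichever model you use
      messages: history,
    }),
  });
  if (!res.ok) throw new Error(`LLM API error: ${res.status}`);
  const data = await res.json();
  // The assistant's reply text lives in the first choice.
  return data.choices[0].message.content;
}

// Example: an incoming chat message gets forwarded and the reply logged.
relayToModel([{ role: "user", content: "Remind me to molt on Friday." }])
  .then((reply) => console.log(reply))
  .catch(console.error);
```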
The project launched about three weeks ago and hit 9,000 GitHub stars in 24 hours. By the time the dust settled late last week, it had rocketed past 60,000 stars, with everyone from AI researcher Andrej Karpathy to investor (and White House AI and crypto czar) David Sacks singing its praises. MacStories called it "the future of personal AI assistants."