For more than two years, an app called ClothOff has been terrorizing young women online, and it has been maddeningly difficult to stop. The app has been taken down from the two major app stores and is banned from most social platforms, but it remains available on the web and through a Telegram bot. In October, a clinic at Yale Law School filed a lawsuit seeking to take the app down entirely, forcing its owners to delete all images and cease operations. But simply finding the defendants has been a challenge.
“It’s incorporated in the British Virgin Islands,” explains Professor John Langford, a co-lead counsel in the lawsuit, “but we believe it’s run by a brother and sister in Belarus. It may even be part of a larger network around the world.”
It’s a bitter lesson in the wake of the recent flood of non-consensual pornography generated by Elon Musk’s xAI, much of it depicting underage victims. Child sexual abuse material is the most legally toxic content on the internet: illegal to produce, transmit, or store, and routinely scanned for on every major cloud service. But despite those intense legal prohibitions, there are still few ways to deal with image generators like ClothOff, as Langford’s case demonstrates. Individual users can be prosecuted, but platforms like ClothOff and Grok are far more difficult to police, leaving few options for victims hoping to find justice in court.
The clinic’s complaint, which is available online, paints an alarming picture. The plaintiff is an anonymous high school student in New Jersey, whose classmates used ClothOff to alter her Instagram photos. She was 14 years old when the original Instagram photos were taken, which means the AI-modified versions are legally classified as child abuse imagery. But even though the modified images are straightforwardly illegal, local authorities declined to prosecute the case, citing the difficulty of obtaining evidence from suspects’ devices.
“Neither the school nor law enforcement ever established how broadly the CSAM of Jane Doe and other girls was distributed,” the complaint reads.
Still, the court case has moved slowly. The complaint was filed in October, and in the months since, Langford and his colleagues have been working to serve notice to the defendants, a difficult task given the global nature of the enterprise. Once the defendants have been served, the clinic can push for a court appearance and, eventually, a judgment, but in the meantime the legal system has given little comfort to ClothOff’s victims.
The Grok case might seem like a simpler problem to fix. Elon Musk’s xAI isn’t hiding, and there’s plenty of money on the table for lawyers who can win a claim. But Grok is a general-purpose tool, which makes it much harder to hold the company accountable in court.
“ClothOff is designed and marketed specifically as a deepfake pornography image and video generator,” Langford told me. “When you’re suing a general system that users can query for all sorts of things, it gets a lot more complicated.”
A number of US laws already ban deepfake pornography, most notably the Take It Down Act. But while specific users are clearly breaking those laws, it’s much harder to hold an entire platform accountable. Existing laws require clear evidence of an intent to harm, which would mean proving that xAI knew its tool would be used to produce non-consensual pornography. Without that evidence, xAI’s basic First Amendment rights would provide significant legal protection.