
Payment processors were against CSAM until Grok started making it




For many years, credit card companies and other payment providers aggressively policed child sexual abuse material (CSAM). Then Elon Musk’s Grok started undressing children on X.

The Center for Countering Digital Hate found 101 sexualized images of children in its sample of 20,000 images made by Grok from December 29th to January 8th. Extrapolating from that sample, the group estimated that 23,000 sexualized images of children were produced in that time frame, or roughly one every 41 seconds over the 11-day period. Not all of the sexualized images Grok has produced appear to be illegal, but reports indicate at least some likely cross the line.

There is tremendous confusion about what is actually true of Grok at any given moment. Grok itself has offered misleading answers, claiming at one point, for instance, that image generation was restricted to paying X subscribers even as free users could still generate images directly on X. Though Musk has claimed that new guardrails prevent Grok from undressing people, our testing showed that isn’t necessarily true. Using a free Grok account, The Verge was able to generate deepfake images of real people in skimpy clothing and in sexually suggestive positions after the new rules were supposedly in effect. As of this writing, some egregious prompts appear to have been blocked, but people are remarkably clever at getting around rules-based bans.

In the past, payment providers have been aggressive about cutting access to websites thought to have a significant presence of CSAM.

X does, however, seem to have at least partially restricted Grok’s image editing features to paid subscribers, which makes it very likely that money is actually changing hands for at least some of these objectionable images. You can purchase a subscription to X on Stripe or through the Apple and Google app stores using your credit card. Musk has also suggested through his posts that he doesn’t think undressing people is a problem. This isn’t X’s first brush with AI porn, either: it has repeatedly had a problem moderating nude deepfakes of Taylor Swift, whether or not they are generated by Grok.

But Musk’s boutique revenge porn and CSAM generator is, apparently, just fine.

It’s a striking reversal. “The industry is no longer willing to self-regulate for something as universally agreed on as the most abhorrent thing out there,” which is CSAM, says Lana Swartz, the author of New Money: How Payment Became Social Media, of the inaction by Stripe and the credit card companies.
