
X blames users for Grok-generated CSAM; no fixes announced


Rather than updating Grok to prevent it from outputting sexualized images of minors, X apparently plans to purge users who generate content that the platform deems illegal, including Grok-generated child sexual abuse material (CSAM).

On Saturday, X Safety finally posted an official response after nearly a week of backlash over Grok outputs that sexualized real people without consent. Offering no apology for Grok’s functionality, X Safety blamed users for prompting Grok to produce CSAM while reminding them that such prompts can trigger account suspensions and possible legal consequences.

“We take action against illegal content on X, including Child Sexual Abuse Material (CSAM), by removing it, permanently suspending accounts, and working with local governments and law enforcement as necessary,” X Safety said. “Anyone using or prompting Grok to make illegal content will suffer the same consequences as if they upload illegal content.”

X Safety’s post boosted a reply from another thread on the platform in which X owner Elon Musk reiterated the consequences users face for inappropriate prompting. Musk’s reply responded to a post from an X user, DogeDesigner, who suggested that Grok can’t be blamed for “creating inappropriate images,” even though Grok determines its own outputs.

“That’s like blaming a pen for writing something bad,” DogeDesigner opined. “A pen doesn’t decide what gets written. The person holding it does. Grok works the same way. What you get depends a lot on what you put in.”

But unlike a pen, image generators like Grok aren’t forced to output exactly what the user wants. One of the reasons the Copyright Office won’t register AI-generated works is the lack of human agency in determining what AI image generators spit out. Chatbots are similarly non-deterministic, often producing different outputs for the same prompt.
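To illustrate that non-determinism, here is a minimal sketch in Python, not tied to Grok or any real model API, and using made-up token probabilities: generative models typically sample each output token from a probability distribution, so the same prompt can produce different results on every run rather than a faithful transcription of the user’s intent.

```python
import random

# Hypothetical next-token probabilities a model might assign after a prompt.
# Real models compute these with a neural network; these values are made up.
next_token_probs = {
    "a": 0.40,
    "the": 0.30,
    "one": 0.20,
    "some": 0.10,
}

def sample_next_token(probs):
    """Sample one token according to its probability (standard sampling)."""
    tokens, weights = zip(*probs.items())
    return random.choices(tokens, weights=weights, k=1)[0]

# The same "prompt" yields different continuations across runs,
# because the output is drawn from a distribution the model controls.
for run in range(3):
    print(f"run {run}: {sample_next_token(next_token_probs)}")
```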

That’s why, to the many users questioning why X won’t filter CSAM out of Grok’s generations, X’s response, which holds only users responsible for outputs, seems to stop well short of fixing the problem.