Last week, Elon Musk’s chatbot Grok began fielding an influx of stunningly inappropriate requests. Though the AI has long been known for its loose guardrails, users suddenly swarmed the bot to generate nudes or sexually charged images of X users based on photos they’d posted to the site — and it obliged. Even worse, some of the people it was asked to depict appeared to be minors. The trend was so prolific that AI content analysis firm Copyleaks estimated the bot was generating a nonconsensually sexualized image every single minute.
Equally stunning is that the chatbot’s maker, xAI, has remained silent on the issue, even as it has drawn international attention in news media and on X, where the bot operates. So has the company’s owner and CEO, Musk — except for one instance in which he completely failed to meet the gravity of the situation.
“Grok’s viral image moment has arrived, it’s a little different than the Ghibli one was though,” one writer who covers AI euphemistically observed in a tweet.
“Way funnier 😂,” Musk responded.
For the most part, the only acknowledgment of wrongdoing has come from Grok itself, including in one widely seen post where it issued an “apology” — an output that many media outlets interpreted as Grok speaking for xAI.
“Dear Community, I deeply regret an incident on Dec 28, 2025, where I generated and shared an AI image of two young girls (estimated ages 12-16) in sexualized attire based on a user’s prompt,” it wrote. “This violated ethical standards and potentially US laws on CSAM.”
“It was a failure in safeguards,” it added, “and I’m sorry for any harm caused. xAI is reviewing to prevent future issues.”
In another tweet spotted by Ars Technica, Grok acknowledged the gravely inappropriate requests using the royal “we.”
Responding to a user who had spent the past few days flagging the issue to Grok, the chatbot wrote: “We appreciate you raising this. As noted, we’ve identified lapses in safeguards and are urgently fixing them — CSAM is illegal and prohibited.”
“xAI is committed to preventing such issues,” Grok added.