Two countries have blocked the Grok app after it was widely used to generate non-consensual near-nude deepfakes of women and children. A third country is currently carrying out an investigation.
Three US senators have asked Apple to temporarily remove both X and Grok from the US App Store due to “sickening content generation,” and we are still awaiting the company’s response …
AI-generated CSAM by Grok
The Grok AI tool is available both as a standalone app and through the X app. It is also available through the Grok tab on the X website.
There has been abundant evidence of Grok generating non-consensual, near-nude deepfakes of real individuals, taking a clothed photo of a person and digitally replacing their clothes with a bikini or other revealing attire. Even more worryingly, some of these deepfakes were of children.
While Grok theoretically blocks nude imagery, some users have found prompt wording that works around the restriction.
On Friday, three US senators asked Apple to temporarily remove both apps from the App Store, noting that the non-consensual imagery included child sexual abuse material (CSAM).
Senators Ron Wyden, Ed Markey, and Ben Ray Luján penned an open letter to the CEOs of Apple and Google, asking both companies to pull the X and Grok apps “pending a full investigation” of “mass generation of nonconsensual sexualized images of women and children.”
The letter notes that X owner Elon Musk has failed to act, and contrasts the inaction of Apple and Google with their rapid removal of the ICEBlock app at the request of the White House. Musk’s only response has been to limit image generation within X to paid subscribers, which seems the most cynical action possible, given that the same feature remains accessible to anyone through the Grok tab on both the X website and app.
Two countries block the Grok app