Elon Musk’s X is currently under investigation in the United Kingdom after failing to stop the platform’s chatbot, Grok, from generating thousands of sexualized images of women and children.
On Monday, UK media regulator Ofcom confirmed that X may have violated the UK’s Online Safety Act, which requires platforms to block illegal content. The proliferation of “undressed images of people” by X users may amount to intimate image abuse, pornography, and child sexual abuse material (CSAM), the regulator said. And X may also have neglected its duty to stop kids from seeing porn.
“Reports of Grok being used to create and share illegal non-consensual intimate images and child sexual abuse material on X have been deeply concerning,” an Ofcom spokesperson said. “Platforms must protect people in the UK from content that’s illegal in the UK, and we won’t hesitate to investigate where we suspect companies are failing in their duties, especially where there’s a risk of harm to children.”
X risks fines, Grok block
X is cooperating with the probe, Ofcom said, noting that X met a “firm” deadline last week to explain what steps it’s taking to comply with the UK law. Ofcom declined Ars’ request to share more details about possible changes X has already made to either limit Grok in the UK or more broadly, since the investigation is “live.”
Grok has already been blocked in Indonesia and Malaysia, as the chatbot's image generation remains unchecked. The UK could be next to block Grok if X fails to comply with the Online Safety Act. Additionally, X could face fines of up to 10 percent of its global revenue.