Woman felt 'dehumanised' after Musk's Grok AI used to digitally remove her clothes
Ms Smith shared a post on X about her image being altered, which was met with comments from others who said the same had happened to them - before some users asked Grok to generate more images of her.
xAI, the company behind Grok, did not respond to a request for comment, other than with an automatically generated reply stating "legacy media lies".
The BBC has seen several examples on the social media platform X of people asking the chatbot to digitally undress women without their consent - making them appear in bikinis or placing them in sexual situations.
A woman has told the BBC she felt "dehumanised and reduced into a sexual stereotype" after Elon Musk's AI Grok was used to digitally remove her clothing.
"Women are not consenting to this," she said.
"While it wasn't me that was in states of undress, it looked like me and it felt like me and it felt as violating as if someone had actually posted a nude or a bikini picture of me."
A Home Office spokesperson said it was legislating to ban nudification tools, and under a new criminal offence, anyone who supplied such tech would "face a prison sentence and substantial fines".
The regulator Ofcom said tech firms must "assess the risk" of people in the UK viewing illegal content on their platforms, but did not confirm whether it was currently investigating X or Grok in relation to AI images.
Grok is a free AI assistant - with some paid-for premium features - which responds to X users' prompts when they tag it in a post.