Meta sues app-maker as it cracks down on 'nudifying'
Tom Gerken, Technology reporter
Meta has taken legal action against a company which ran ads on its platforms promoting so-called "nudify" apps, which typically use artificial intelligence (AI) to create fake nude images of people without their consent.

It has sued the firm behind the CrushAI apps to stop it posting ads altogether, following a months-long cat-and-mouse battle to remove them. In January, the blog FakedUp found 8,010 instances of ads from CrushAI promoting nudifying apps on Meta's Facebook and Instagram platforms.

"This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it," Meta said in a blog post.
"We'll continue to take the necessary steps - which could include legal action - against those who abuse our platforms like this."

The growth of generative AI has led to a surge in "nudifying" apps in recent years. It has become such a pervasive issue that in April the Children's Commissioner for England called on the government to introduce legislation to ban them altogether. It is already illegal to create or possess AI-generated sexual content featuring children.
Deepfakes and scams