UK to ban deepfake AI 'nudification' apps
Liv McMahon, Technology reporter
The UK government says it will ban so-called "nudification" apps as part of efforts to tackle misogyny online.

New laws - announced on Thursday as part of a wider strategy to halve violence against women and girls - will make it illegal to create and supply AI tools letting users edit images to seemingly remove someone's clothing.

The new offences would build on existing rules around sexually explicit deepfakes and intimate image abuse, the government said.

"Women and girls deserve to be safe online as well as offline," said Technology Secretary Liz Kendall.
"We will not stand by while technology is weaponised to abuse, humiliate and exploit them through the creation of non-consensual sexually explicit deepfakes." Creating deepfake explicit images of someone without their consent is already a criminal offence under the Online Safety Act. Ms Kendall said the new offence - which makes it illegal to create or distribute nudifying apps - would mean "those who profit from them or enable their use will feel the full force of the law".
Nudification or "de-clothing" apps use generative AI to realistically make it look like a person has been stripped of their clothing in an image or video. Experts have issued warnings about the rise of such apps and the potential for fake nude imagery to inflict serious harm on victims - particularly when used to create child sexual abuse material (CSAM). In April, the Children's Commissioner for England Dame Rachel de Souza called for a total ban on nudification apps. "The act of making such an image is rightly illegal – the technology enabling it should also be," she said in a report. The government said on Thursday it would "join forces with tech companies" to develop methods to combat intimate image abuse. This would include continuing its work with UK safety tech firm SafeToNet, it said. The UK company developed AI software it claimed could identify and block sexual content, as well as block cameras when they detect sexual content is being captured. Such tech builds on existing filters implemented by platforms such as Meta to detect and flag potential nudity in imagery, often with the aim of stopping children taking or sharing intimate images of themselves.
'No reason to exist'