
The Deepfake Nudes Crisis in Schools Is Much Worse Than You Thought

Why This Matters

The rise of deepfake technology in schools has led to a troubling increase in non-consensual, sexually explicit images of minors being created and shared, exposing students to severe emotional harm and legal risks. This widespread issue highlights the urgent need for improved digital literacy, stronger policies, and technological safeguards to protect vulnerable populations and prevent abuse. As deepfake capabilities become more accessible, the education and law enforcement sectors must adapt quickly to address this emerging threat.


It usually starts with a photo downloaded from social media.

Around the world, teenage boys are saving Instagram and Snapchat images of girls they know from school and using harmful “nudify” apps to create fake nude photos or videos of them. These deepfakes can quickly be shared across whole schools, leaving victims feeling humiliated, violated, hopeless, and scared the images will haunt them forever.

The deepfake crisis hitting schools started slowly a couple of years ago, but it has since grown considerably as the technology used to create the explicit imagery has become more accessible. Deepfake sexual abuse incidents have hit around 90 schools globally and have impacted more than 600 pupils, according to a review of publicly reported incidents by WIRED and Indicator, a publication focusing on digital deception and misinformation.

The findings show that since 2023, schoolchildren—most often boys in high schools—in at least 28 countries have been accused of using generative AI to target their classmates with sexualized deepfakes. Because it depicts minors, the explicit imagery is considered child sexual abuse material (CSAM). This analysis is believed to be the first to review real-world cases of AI deepfake abuse taking place at schools globally.

Taken as a whole, the analysis shows the worldwide reach of harmful AI nudification technology, which can earn its creators millions of dollars per year, and reveals that schools and law enforcement officials are often unprepared to respond to these serious sexual abuse incidents.

Across North America, there have been nearly 30 reported deepfake sexual abuse cases since 2023—including one with more than 60 alleged victims, one where the victim was temporarily expelled from school, and others where pupils at multiple schools have allegedly been targeted simultaneously. More than 10 cases have been publicly reported in South America, more than 20 across Europe, and another dozen in Australia and East Asia combined.

[Map caption: The data collection and analysis for this map were produced in partnership between WIRED and Indicator.]

The true scale of deepfake sexual abuse taking place in schools is likely much higher. One survey by the United Nations children’s agency Unicef estimates that 1.2 million children had sexual deepfakes created of them last year. One in five young people in Spain told Save the Children researchers that deepfake nudes had been created of them. Child protection group Thorn found that one in eight teens knows someone who has been targeted, and in 2024, 15 percent of students surveyed by the Center for Democracy and Technology said they knew of AI-generated deepfakes linked to their school.

“I think you’d be hard-pressed to find a school that has not been affected by this,” says Lloyd Richardson, director of technology at the Canadian Centre for Child Protection. “The most important thing is how we’re able to help the victims when this happens, because the effects of this can be massive.”

WIRED and Indicator’s analysis looked at incidents that have been publicly reported with specific details, such as the locations of schools and potential victim counts. Most of these come from English-language reporting, and data is lacking for many countries. Many incidents are never reported in the press at all, those that are may lack specific details, and many are instead handled privately by schools and law enforcement officials.