The Deepfake Nudes Crisis in Schools Is Much Worse Than You Thought
Briefly

"The deepfake crisis hitting schools started slowly a couple of years ago, but it has since grown considerably as the technology used to create the explicit imagery has become more accessible."
"The findings show that since 2023, schoolchildren, most often boys in high schools, in at least 28 countries have been accused of using generative AI to target their classmates with sexualized deepfakes."
"As a whole, the analysis shows the worldwide reach of harmful AI nudification technology, which can earn its creators millions of dollars per year."
"Across North America, there have been nearly 30 reported deepfake sexual abuse cases since 2023, including one with more than 60 alleged victims."
AI-generated deepfake nude images are increasingly affecting schools worldwide, with nearly 90 institutions and more than 600 students impacted. Teenage boys are using "nudify" apps to create fake nude images from girls' social media photos. The trend has escalated as the technology has become more accessible, with incidents reported in at least 28 countries. The explicit imagery is classified as child sexual abuse material, yet many schools and law enforcement agencies are unprepared to handle these serious cases.
Read at WIRED