
"The Metropolitan Police received 120 percent more complaints about non-consensual intimate image abuse (NCII), according to FOI data obtained by online safety provider Verifymy."
"Some 1,766 NCII complaints were made last year in Greater London, a 17 percent rise from 1,523 the year before, and more than double the 805 recorded in 2020."
"The Internet Watch Foundation said just last week that in 2025 there had been a more than 260-fold increase in AI-generated child sexual abuse videos from the year before."
"Social media platforms will have to remove any non-consensual images reported within 48 hours under the new Crime and Policing Bill, which is currently in the final stages of legislation."
Complaints about non-consensual intimate image abuse have increased by 120% in five years, with 1,766 cases reported last year in Greater London. The rise is attributed to AI-powered nudification tools that facilitate the creation of explicit images without consent. The Internet Watch Foundation reported a significant increase in AI-generated child sexual abuse videos. New legislation mandates social media platforms to remove non-consensual images within 48 hours and bans nudification tools. Victims will have three years to report these crimes, an extension from the previous six months.
Read at www.independent.co.uk