
AI image generators giving rise to child sex abuse material - BBC Newsnight
The BBC’s been investigating the rise in child sex abuse material resulting from the rapid proliferation of open-source AI image generators.
Published: Mon 6 Jun 2022
Written by: politics.co.uk
A million of the very worst images of child sexual abuse the police have ever seized have been turned into digital fingerprints in an “unprecedented” drive to prevent criminal material being shared online.
One year after it was set up, the Internet Watch Foundation's dedicated hashing taskforce has analysed a million images from the Government's Child Abuse Image Database (CAID), including some of the most severe content on the internet, and turned them into unique hashes.
These hashes act as digital fingerprints – unique digital codes – which are shared with law enforcement and service providers to flag up and block any attempts to share or distribute this material online.
Read more at politics.co.uk