
AI image generators giving rise to child sex abuse material - BBC Newsnight
The BBC’s been investigating the rise in child sex abuse material resulting from the rapid proliferation of open-source AI image generators.
Published: Wed 10 Nov 2021
Written by: Cambridgeshire Live
A specialised taskforce aims to stop the spread of child sexual abuse images by taking 'digital fingerprints' of each picture.
The team, set up by the Internet Watch Foundation, based in Histon, will assess and grade millions of the most severe images of child sexual abuse and ‘hash’ them, or create a unique digital fingerprint.
The fingerprints are then shared with tech companies globally, allowing the images to be blocked or removed if anyone attempts to share them.
Read more at Cambridgeshire Live.
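
The 'hashing' the IWF team performs amounts to deriving a compact fingerprint from each assessed image and distributing that fingerprint, rather than the image itself, so platforms can match uploads against a blocklist. The sketch below is a minimal illustration of that fingerprint-and-blocklist idea, assuming a plain SHA-256 digest over the image bytes; the names and blocklist here are illustrative, not the IWF's actual tooling, and production systems typically use perceptual hashes (such as Microsoft's PhotoDNA) so that resized or slightly altered copies still match.

import hashlib

# Hypothetical blocklist of fingerprints; in practice such lists are compiled
# by organisations like the Internet Watch Foundation and shared with
# participating platforms.
KNOWN_FINGERPRINTS: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    # Create a 'digital fingerprint' of an image; here, a SHA-256 hex digest.
    return hashlib.sha256(image_bytes).hexdigest()

def should_block(image_bytes: bytes) -> bool:
    # An upload is blocked if its fingerprint matches a known image.
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS

# Example: register one image's fingerprint, then test a re-upload.
original = b"...image bytes..."
KNOWN_FINGERPRINTS.add(fingerprint(original))
assert should_block(original)            # an identical copy is caught
assert not should_block(b"other image")  # different bytes are not

A cryptographic digest like the one above only catches bit-identical copies, which is why deployed systems favour perceptual hashing; the key mechanism the article describes, sharing fingerprints rather than the images themselves, is the same in either case.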