
AI image generators giving rise to child sex abuse material - BBC Newsnight
The BBC’s been investigating the rise in child sex abuse material resulting from the rapid proliferation of open-source AI image generators.
Published: Mon 21 Jun 2021
Written by: ITV News
In a "world first", children worried about nude images and videos ending up online against their will are able to report the material to help prevent it from being uploaded in the future.
Young people will be able to flag images and videos with the Internet Watch Foundation (IWF) charity via a new tool on the NSPCC’s Childline website before they have appeared online.
The Report Remove tool will allow under-18s to flag the material anonymously. IWF analysts will then review the content and create a unique digital fingerprint, known as a hash, which will be shared with tech companies to help prevent the material from being uploaded and shared.
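The article does not specify which hashing scheme the IWF uses; production systems typically rely on perceptual hashes (such as PhotoDNA) that survive resizing and re-encoding. As a simplified sketch of the matching idea only, a cryptographic hash can stand in: the platform fingerprints each upload and blocks it if the fingerprint appears in a shared list of known hashes. All names here are illustrative, not the IWF's actual API.

```python
import hashlib


def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest acting as the image's digital fingerprint.

    Real deployments use perceptual hashing, which matches visually
    similar images; SHA-256 here only matches byte-identical files.
    """
    return hashlib.sha256(image_bytes).hexdigest()


# Hypothetical shared blocklist of fingerprints from reported material.
known_hashes = {fingerprint(b"reported-image-bytes")}


def should_block(upload: bytes) -> bool:
    """True if the upload's fingerprint matches a known reported image."""
    return fingerprint(upload) in known_hashes


print(should_block(b"reported-image-bytes"))  # True: matches the blocklist
print(should_block(b"some-other-image"))      # False: unknown fingerprint
```

Because a cryptographic hash changes completely if even one byte differs, this sketch would miss re-compressed copies; that limitation is exactly why real hash-sharing schemes use perceptual fingerprints instead.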
Read more at ITV News.