
AI image generators giving rise to child sex abuse material - BBC Newsnight
The BBC’s been investigating the rise in child sex abuse material resulting from the rapid proliferation of open-source AI image generators.
Published: Wed 10 Nov 2021
Written by: Mail Online
Children who are worried that nude pictures and videos may end up online will be able to report the material to help prevent it from being uploaded in the future.
For the first time, young people will be able to flag images and videos with the Internet Watch Foundation (IWF) charity via a tool on the NSPCC’s Childline website before they have appeared online.
Under-18s will be able to flag the material via the Report Remove tool, described by the IWF as a “world first”.
Read more at Mail Online.