Explore how the IWF identifies and addresses non-photographic child sexual abuse imagery, including drawings and CGI, under UK legislation.
Britain will make it illegal to use artificial intelligence tools that create child sexual abuse images.
Huw Edwards’ offences highlight how WhatsApp can be abused by predators sharing criminal imagery of children, IWF warns. Dan Sexton, Chief Technology Officer at the IWF, appeared on BBC Breakfast this week (September 17) to warn that Meta is not taking adequate steps to proactively prevent the sharing of child sexual abuse material on the platform.
IWF analysts use CAID and victim reports to verify teenage abuse victims, helping to remove illegal imagery that might otherwise be missed.
Innovations in detecting and removing child sexual abuse material have been made possible by a grant from Nominet.
Each day, a team of analysts faces a seemingly endless mountain of horrors. The team of 21, who work at the Internet Watch Foundation’s office in Cambridgeshire, spend hours trawling through images and videos containing child sexual abuse.
A day in the life of the IWF’s child abuse image taskforce: "They know they are about to witness some of the most upsetting things ever uploaded onto the internet."