A New System Is Helping Crack Down on Child Sex Abuse Images

Published: Thu 7 Oct 2021

Written by: WIRED

Each day, a team of analysts in the UK faces a seemingly endless mountain of horrors. The team of 21, who work at the Internet Watch Foundation’s office in Cambridgeshire, spend hours trawling through images and videos containing child sexual abuse. And each time they find a photo or piece of footage, it needs to be assessed and labeled. Last year alone, the team identified 153,383 web pages with links to child sexual abuse imagery. This creates a vast database that can then be shared internationally in an attempt to stem the flow of abuse. The problem? Different countries have different ways of categorizing images and videos.

Read more at WIRED.

AI image generators giving rise to child sex abuse material - BBC Newsnight

The BBC has been investigating the rise in child sex abuse material resulting from the rapid proliferation of open-source AI image generators.

17 July 2023 | IWF In The News
Charity wants AI summit to address child sexual abuse imagery

A leading children's charity is calling on Prime Minister Rishi Sunak to tackle AI-generated child sexual abuse imagery when the UK hosts the first global summit on AI safety this autumn.

17 July 2023 | IWF In The News
Webpages containing the most extreme child abuse have doubled since 2020

Images of children as young as seven being abused online have risen by almost two thirds.

25 April 2023 | IWF In The News