The net is closing on child sexual abuse images

Published:  Thu 8 Jul 2021

Written by:  WIRED

Each day, a team of analysts faces a seemingly endless mountain of horrors. The team of 21, who work at the Internet Watch Foundation’s office in Cambridgeshire, spend hours trawling through images and videos containing child sexual abuse. Each time they find a photo or piece of footage, it needs to be assessed and labelled. Last year alone the team identified 153,383 webpages with links to child sexual abuse imagery. This creates a vast database of abuse which can then be shared internationally in an attempt to stem the flow of abuse. The problem? Different countries have different ways of categorising images and videos.

Read more at WIRED

“AI-generated images of child sexual abuse victims prolong the victims’ suffering”

The director of the Internet Watch Foundation, Derek Ray-Hill, warns in an op-ed in Le Monde about the production of child sexual abuse images using artificial intelligence, and about the need to criminalise them.

31 July 2025 IWF In The News
Major increase in child sexual abuse videos made with AI

Exclusive data shared with Channel 4 News shows that 1,300 videos have been found globally so far in 2025.

11 July 2025 IWF In The News
Number of AI videos of child sexual abuse rising sharply

Child sexual abuse offenders are increasingly using artificial intelligence to create abuse material. Investigators warn that, as a result, they may be able to stop fewer cases of real abuse.

11 July 2025 IWF In The News