One in five child abuse images found online last year were category A – report

Published: Tue 25 Apr 2023

Written by: Dan Milmo

The most extreme form of child sexual abuse material accounted for a fifth of such content found online last year, according to a report.

Category A abuse represented 20% of illegal images discovered online last year by the Internet Watch Foundation, a UK-based body that monitors the distribution of child sexual abuse material (CSAM). It found more than 51,000 instances of such content, a category covering the most severe imagery, including rape, sadism and bestiality.

The IWF annual report said the 2022 total for category A imagery was double the figure in 2020 and the increase was partly due to criminal sites selling videos and images of such abuse.

Read more at The Guardian

'AI-generated images of child victims of sexual abuse prolong the victims' suffering'

The director of the Internet Watch Foundation, Derek Ray-Hill, warns in an op-ed in Le Monde about the production of child sexual abuse imagery using artificial intelligence and the need to criminalise it.

31 July 2025 IWF In The News
Major increase in child sexual abuse videos made with AI

Exclusive data shared with Channel 4 News shows that 1,300 videos have been found globally so far in 2025.

11 July 2025 IWF In The News
Number of AI videos of child sexual abuse rising sharply

Child sex offenders are increasingly using artificial intelligence to create recordings of child abuse. Investigators warn that, as a result, they may be able to stop fewer cases of real abuse.

11 July 2025 IWF In The News