One in five child abuse images found online last year were category A – report

Published: Tue 25 Apr 2023

Written by: Dan Milmo

The most extreme form of child sexual abuse material accounted for a fifth of such content found online last year, according to a report.

Category A abuse represented 20% of illegal images discovered online last year by the Internet Watch Foundation, a UK-based body that monitors the distribution of child sexual abuse material (CSAM). The foundation found more than 51,000 instances of such content, which covers the most severe imagery, including rape, sadism and bestiality.

The IWF's annual report said the 2022 total for category A imagery was double the 2020 figure, an increase it partly attributed to criminal sites selling videos and images of such abuse.

Read more at The Guardian

AI advances could lead to more child sexual abuse videos, watchdog warns (22 July 2024)

IWF warns of more AI-made child sexual abuse videos as tools behind them get more widespread and easier to use.

AI being used to generate deepfake child sex abuse images based on real victims, report finds (22 July 2024)

The tools used to create the images remain legal in the UK, the Internet Watch Foundation says, even though AI child sexual abuse images are illegal.

More AI-generated child sex abuse material is being posted online (22 July 2024)

In a review of material posted on the dark web, the Internet Watch Foundation found that deepfakes featuring children were becoming more extreme.