The net is closing on child sexual abuse images

Published:  Thu 8 Jul 2021

Written by:  WIRED

Each day, a team of analysts faces a seemingly endless mountain of horrors. The team of 21, who work at the Internet Watch Foundation’s office in Cambridgeshire, spend hours trawling through images and videos containing child sexual abuse. Each time they find a photo or piece of footage, it must be assessed and labelled. Last year alone the team identified 153,383 webpages with links to child sexual abuse imagery. This creates a vast database of abuse imagery which can then be shared internationally in an attempt to stem the flow of abuse. The problem? Different countries have different ways of categorising images and videos.

Read more at WIRED
