The net is closing on child sexual abuse images

Published:  Thu 8 Jul 2021

Written by:  WIRED

Each day, a team of analysts faces a seemingly endless mountain of horrors. The team of 21, based at the Internet Watch Foundation’s office in Cambridgeshire, spends hours trawling through images and videos containing child sexual abuse. Each time they find a photo or piece of footage, it must be assessed and labelled. Last year alone, the team identified 153,383 webpages with links to child sexual abuse imagery. This work creates a vast database that can then be shared internationally in an attempt to stem the flow of abuse. The problem? Different countries categorise images and videos in different ways.

Read more at WIRED

AI tools have put child sexual abuse ‘on steroids’, Home Secretary warns

The Home Office said fake images are being used to blackmail children and force them to livestream further abuse.

2 February 2025 IWF In The News
UK makes use of AI tools to create child abuse material a crime

Britain will make it illegal to use artificial intelligence tools that create child sexual abuse images.

1 February 2025 IWF In The News
Charity finds more than 500,000 child abuse victims

An analyst who removes child sexual abuse content from the internet says she is always trying to stay "one step ahead" of the "bad guys".

8 December 2024 IWF In The News