The Internet Watch Foundation (IWF) has hashed more than a million images in a ‘major boost’ to internet safety.
Cambridgeshire mum Lillian* has one of the most unusual and, sometimes, harrowing jobs in the world.
Digital fingerprints of a million images of child sexual abuse have been created, the Internet Watch Foundation (IWF) has said.
More than nine in ten people in the UK say they are concerned about how images and videos of children being sexually abused are shared through end-to-end encrypted (E2EE) messaging services.
We have a powerful sense of mission, with clarity, focus and purpose in our work. Our single task – above all else – is the elimination of child sexual abuse material online.
Explore how ICAP sites use pyramid-style schemes to distribute child sexual abuse material, increasing public exposure and aiding criminal profits.
In a new podcast released by the Internet Watch Foundation, the charity says introducing end-to-end encryption to messaging apps could hinder the detection and removal of child sexual abuse material from the internet.
AI tools used to generate deepfake images of child sexual abuse rely on photos of real victims as reference material, a report has found.
A leading children's charity is calling on Prime Minister Rishi Sunak to tackle AI-generated child sexual abuse imagery when the UK hosts the first global summit on AI safety this autumn.