Cambridgeshire mum Lillian* has one of the most unusual and, sometimes, harrowing jobs in the world.
Digital fingerprints of a million images of child sexual abuse have been created, the Internet Watch Foundation (IWF) has said.
Explore how invitation child abuse pyramid (ICAP) sites distribute child sexual abuse material, increasing public exposure to it and fuelling criminal profits.
In a new podcast released by the Internet Watch Foundation, the charity says introducing end-to-end encryption to messaging apps could hinder the detection and removal of child sexual abuse material from the internet.
AI used to generate deepfake images of child sexual abuse uses photos of real victims as reference material, a report has found.
A leading children's charity is calling on Prime Minister Rishi Sunak to tackle AI-generated child sexual abuse imagery when the UK hosts the first global summit on AI safety this autumn.
IWF supports the Online Safety Act by helping adult sites detect, remove, and prevent child sexual abuse imagery online.
Protect your platform with Image Intercept – the IWF’s hash-matching tool for small businesses. Detect known child sexual abuse content on your platform.
A list of ‘digital fingerprints’ of known child sexual abuse imagery, allowing you to stop it spreading on your networks, platforms and apps.
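The mechanism behind such a hash list can be illustrated with a minimal sketch. This example uses a plain cryptographic hash (SHA-256) purely for illustration; production systems such as the IWF's typically rely on perceptual hashing (e.g. Microsoft PhotoDNA), which also matches re-encoded or slightly altered copies, and the function names and sample data here are hypothetical.

```python
import hashlib


def fingerprint(data: bytes) -> str:
    """Return a 'digital fingerprint' of an image's raw bytes.

    Illustrative only: SHA-256 matches exact copies, whereas real
    deployments use perceptual hashes robust to resizing/re-encoding.
    """
    return hashlib.sha256(data).hexdigest()


def is_known(data: bytes, hash_list: set[str]) -> bool:
    """Check an uploaded file against a list of known-image hashes."""
    return fingerprint(data) in hash_list


# Hypothetical hash list; real lists are distributed securely to members.
known_hashes = {fingerprint(b"example-known-image-bytes")}

print(is_known(b"example-known-image-bytes", known_hashes))  # exact match found
print(is_known(b"some-other-upload", known_hashes))          # no match
```

Because only hashes are shared, platforms can block known imagery at upload time without the list itself ever containing the illegal material.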
The National Crime Agency estimates that between 550,000 and 850,000 people in the UK pose varying degrees of sexual risk to children.
A unique safety tech tool which uses machine learning in real time to detect child sexual abuse images and videos is to be developed by a collaboration of EU and UK experts.
A specialised taskforce will stop the spread of child sexual abuse images by taking ‘digital fingerprints’ of each picture.