We have a powerful sense of mission, bringing clarity, focus and purpose to our work. Our single task – beyond all else – is the elimination of child sexual abuse material online.
AI tools used to generate deepfake images of child sexual abuse draw on photos of real victims as reference material, a report has found.
Cambridgeshire mum Lillian* has one of the most unusual and, sometimes, harrowing jobs in the world.
A leading children's charity is calling on Prime Minister Rishi Sunak to tackle AI-generated child sexual abuse imagery when the UK hosts the first global summit on AI safety this autumn.
Protect your platform with Image Intercept – the IWF’s hash-matching tool for small businesses. Detect known child sexual abuse content on your platform.
A unique safety tech tool which uses machine learning in real time to detect child sexual abuse images and videos is to be developed by a collaboration of EU and UK experts.
A list of ‘digital fingerprints’ of known child sexual abuse imagery allowing you to stop it on your networks, platforms and apps.
The National Crime Agency estimates there to be between 550,000 and 850,000 people in the UK who pose varying forms of sexual risk to children.
A specialised taskforce will stop the spread of child sexual abuse images by taking ‘digital fingerprints’ of each picture.
Discover the latest trends and data in the fight against online child sexual abuse imagery in the 2023 Annual Report from the Internet Watch Foundation (IWF).