IWF supports the Online Safety Act by helping adult sites detect, remove, and prevent child sexual abuse imagery online.
A leading children's charity is calling on Prime Minister Rishi Sunak to tackle AI-generated child sexual abuse imagery when the UK hosts the first global summit on AI safety this autumn.
The National Crime Agency estimates that between 550,000 and 850,000 people in the UK pose varying forms of sexual risk to children.
EU and UK experts are collaborating to develop a unique safety tech tool that uses machine learning to detect child sexual abuse images and videos in real time.
Explore the IWF 2026 AI CSAM Report. Discover why AI-generated child abuse videos increased by 26,385% in 2025 and the emerging risks of agentic AI and LoRAs.
A list of ‘digital fingerprints’ of known child sexual abuse imagery allowing you to stop it on your networks, platforms and apps.
Protect your platform with Image Intercept – the IWF’s hash-matching tool for small businesses. Detect known child sexual abuse content on your platform.
A specialised taskforce will stop the spread of child sexual abuse images by taking ‘digital fingerprints’ of each picture.
Discover the latest trends and data in the fight against online child sexual abuse imagery in the Internet Watch Foundation's (IWF) 2023 Annual Report.
IWF wants to help young people stay safe online by making sure you know what to do if you accidentally see sexual images or videos of someone you think might be under 18.
IWF reveals 2024 as the worst year on record for online child sexual abuse imagery, urging the Prime Minister to strengthen the Online Safety Act and close critical loopholes.