Protect your platform with Image Intercept – the IWF’s hash-matching tool that helps small businesses detect known child sexual abuse content.
IWF and Black Forest Labs join forces to combat harmful AI-generated content. The partnership grants the frontier AI lab access to safety tech tools.
Explore the IWF 2026 AI CSAM Report. Discover why AI-generated child abuse videos increased by 26,385% in 2025 and the emerging risks of agentic AI and LoRAs.
Download essential guides for professionals on understanding, identifying and responding to AI-generated Child Sexual Abuse Material (CSAM). Developed by IWF & NCA.
A chilling excerpt from a new IWF report examining what analysts at the child protection charity are currently seeing in synthetic, AI-generated child sexual abuse imagery.
Explore still image trends in our 2025 Annual Data & Insights Report. We analyse the volume, age range and severity of static abuse material.
New IWF partnership strengthens Bluesky’s ability to tackle child sexual abuse imagery.
New Image Intercept tool offers smaller platforms free protection from criminal content, as teens face crisis of online sexual exploitation.
Explore how IWF confronts the rise of AI-generated child sexual abuse material, highlighting emerging threats and efforts to protect children online.
Understand the rise of AI-generated CSAM in our 2025 Annual Data & Insights Report. We analyse the evolving threats and challenges posed by AI tools.