Alongside ICMEC, we are building capacity in child online protection internationally through capacity-building events and the launch of IWF reporting portals.
We've partnered with CHI to build capacity amongst international helpline staff to deal with online child sexual exploitation and abuse.
IWF is campaigning for an end to the use of the phrase ‘child pornography’. There’s #NoSuchThing. It’s child sexual abuse imagery and videos.
Major IWF campaign to help boost child welfare and internet safety in Uganda and Zambia.
The European Parliament is taking a decisive stand against the rise of AI-generated child sexual abuse material (AI-CSAM), co-hosting a high-level briefing with the Internet Watch Foundation (IWF) to address this urgent threat. With a 380% increase in AI-CSAM reports in 2024, the Parliament is pushing for robust legal reforms through the proposed Child Sexual Abuse Directive. Key priorities include criminalising all forms of AI-generated CSAM, removing legal loopholes such as the “personal use” exemption, and enhancing cross-border enforcement. The IWF and the European Child Sexual Abuse Legislation Advocacy Group (ECLAG) urge the Council of the EU to align with Parliament’s strong stance to protect children and support survivors. This article highlights the scale of the threat, the evolving technology behind synthetic abuse imagery, and the critical need for updated EU legislation.
IWF wants to help young people stay safe online by making sure you know what to do if you accidentally see sexual images or videos of someone you think might be under 18.
AI-generated child sexual abuse videos have surged 400% in 2025, with experts warning of increasingly realistic, extreme content and the urgent need for regulation to prevent full-length synthetic abuse films.
New data reveals AI-generated child sexual abuse material continues to spread online as criminals create more realistic, and more extreme, imagery.