AI-generated child sexual abuse videos have surged 400% in 2025, with experts warning of increasingly realistic, extreme content and the urgent need for regulation to prevent full-length synthetic abuse films.
New data reveals AI child sexual abuse continues to spread online as criminals create more realistic, and more extreme, imagery.
The capacity for horrific images of AI-generated child sexual abuse to be reproduced at scale was underlined by the IWF in the lead-up to the UK government’s AI Safety Summit.
Wednesday’s hearing brings into sharp focus the problems that organisations like ours, the Internet Watch Foundation, are dealing with every day.
The world’s leading independent open-source generative AI company, Stability AI, has partnered with the Internet Watch Foundation to tackle the creation of AI-generated child sexual abuse imagery online.
New member Safehire.ai says it is proud to join the Internet Watch Foundation (IWF), strengthening a shared mission to protect children from online harm.
IWF and Black Forest Labs join forces to combat harmful AI-generated content. The partnership grants the frontier AI lab access to safety tech tools.
Professionals working with children and young people are being equipped with vital new guidance, developed by the IWF and the National Crime Agency, to combat the growing threat of AI-generated child sexual abuse material.
The European Parliament is taking a decisive stand against the rise of AI-generated child sexual abuse material (AI-CSAM), co-hosting a high-level briefing with the Internet Watch Foundation (IWF) to address this urgent threat. With a 380% increase in AI-CSAM reports in 2024, the Parliament is pushing for robust legal reforms through the proposed Child Sexual Abuse Directive. Key priorities include criminalising all forms of AI-generated CSAM, removing legal loopholes such as the “personal use” exemption, and enhancing cross-border enforcement. The IWF and the European Child Sexual Abuse Legislation Advocacy Group (ECLAG) urge the Council of the EU to align with Parliament’s strong stance to protect children and support survivors. This article highlights the scale of the threat, the evolving technology behind synthetic abuse imagery, and the critical need for updated EU legislation.
Download essential guides for professionals on understanding, identifying and responding to AI-generated Child Sexual Abuse Material (CSAM). Developed by IWF & NCA.
AI giving offenders ‘DIY child sexual abuse’ tool, as dozens of child victims used in AI models, IWF warns MPs. The IWF has welcomed upcoming new legislation while giving evidence in Parliament this week.