CSA partners from around the world join forces to say No to Nudify Apps
On Safer Internet Day 2026, the IWF and child protection partners worldwide unite to call for a global ban on AI nudify apps and tools.
Published: Tue 5 Dec 2023
In Conversation With Thorn’s Head of Data Science Rebecca Portnoff and IWF Chief Technology Officer Dan Sexton.
Incredibly realistic child sexual abuse imagery can be generated easily, swiftly and in vast amounts by offenders using Artificial Intelligence (AI) technology. Recently, Internet Watch Foundation (IWF) Analysts found more than 10,000 AI-generated images shared on just one dark web child abuse forum. Almost 2,600 of these images depicted child sexual abuse that was indistinguishable from real abuse imagery. You can read our latest report on how AI is being abused to create child sexual abuse imagery here.
This episode explores what needs to be done to try to control this explosion in harmful imagery online, and how other AI or machine-learning tools could be used to counter the phenomenon. From responsible technology development and deployment to the content on which AI models are trained, all this and more is discussed by Thorn’s Rebecca Portnoff, who has dedicated her career to building technology and driving initiatives to defend children from sexual abuse, and Dan Sexton, who leads the development of world-leading technology to support the work of the IWF Hotline.
Listen to the full episode here.