IWF research into how artificial intelligence (AI) is increasingly being used to create child sexual abuse imagery online
The Independent Inquiry into Child Sexual Abuse (IICSA) today (12 March) published its report into the growing problem of “online-facilitated child sexual abuse”.
An IWF analyst’s instincts told him he could act quickly to intervene after receiving an anonymous tip-off.
Our #HomeTruths (TALK) and Gurls Out Loud ‘self-generated’ child sexual abuse prevention campaigns.
AI-generated child sexual abuse videos surged by 400% in 2025, with experts warning of increasingly realistic, extreme content and the urgent need for regulation to prevent full-length synthetic abuse films.
UK business websites targeted to host child sexual abuse images and videos.
Internet Watch Foundation calls for partnership ahead of landmark Vatican conference.
The IWF Reporting Portal in Tunisia shows the importance of working with multiple partners to fight child sexual abuse material efficiently.
Thousands of images and videos of child sexual abuse could be going undetected because internet analysts’ time is being taken up dealing with “false reports”, experts warn.
The ‘shocking’ images of children can involve penetrative sexual activity, sexual activity with an animal, and sadism.