IWF and NSPCC's Report Remove tool can support a young person in reporting sexual images shared online and enable them to get the image removed if it is illegal.
IWF research into how artificial intelligence (AI) is increasingly being used to create child sexual abuse imagery online.
A new report from the IWF shows that the pace of AI development has not slowed, with offenders using better, faster and more accessible tools to generate new criminal images and videos.
IWF analysts have worked through the coronavirus lockdown to make sure children are kept safe.
The UK’s Internet Watch Foundation (IWF) and the USA’s National Center for Missing & Exploited Children (NCMEC) announce a landmark agreement to better protect children whose sexual abuse images are shared and traded on the internet.
Protect your generative AI model from the devastating harm caused by online child sexual abuse through corporate membership with the Internet Watch Foundation.
A chilling excerpt from a new IWF report that delves into what analysts at the child protection charity currently see regarding synthetic or AI-generated imagery of child sexual abuse.
Research report by PIER at Anglia Ruskin University, providing insight into girls' and their parents' understanding of self-generated CSAM.