IWF working with the adult sector is vital if we’re serious about tackling child sexual abuse imagery online

Published:  Tue 8 Jul 2025

Written by:  IWF Interim CEO, Derek Ray-Hill

If IWF is serious about supporting efforts to eliminate child sexual abuse imagery online around the world, we have to be bold enough to take big decisions that prioritise impact.

As of this summer, the UK’s Online Safety Act requires certain in-scope user-to-user services that allow pornography for UK audiences to proactively detect and remove known child sexual abuse material. Specifically, these measures include deploying hash-matching technology as well as detecting and removing content matching listed CSAM URLs. And all services that allow pornography must deploy highly effective age assurance to prevent children from accessing such content.

We are committed to ensuring a safer online experience for children. The law is clear. And IWF’s tools and services that prevent the upload, storage, and sharing of child sexual abuse material are now available to companies that specialise in publishing adult content through our membership model.

Additionally, we were supported through 2023 and 2024 by an independent advisory board of experts who created a Standard of Good Practice for the adult sector. This is an important guide to support companies in preventing the upload and circulation of online child sexual abuse imagery. It recommends a layered approach to trust and safety mechanisms, and all adult content providers that work with IWF will be encouraged to follow it.

We are led by our mission, and where there is illegal content that will continuously re-victimise children, we will do everything in our power to remove it.

We are increasingly seeing cases of sexual abuse imagery of teenage children. Verifying the age of adolescents can be challenging, but over the past year, through our child reporting services and our unique connection to the Child Abuse Image Database (CAID), owned by the UK Government’s Home Office, we have been able to verify more young people in imagery as under 18. That has allowed us to take steps to remove the imagery and to create digital fingerprints of it (called hashes), which we contribute to our growing hash list.
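Conceptually, hash-matching against such a list works like the minimal Python sketch below. Note the simplifications: real deployments match industry hash formats, including perceptual hashes such as PhotoDNA that still match visually altered copies, whereas a plain cryptographic hash like SHA-256 (used here for illustration) only matches byte-identical files. The digest value and function names are illustrative, not IWF's actual implementation.

```python
import hashlib

# Hash list of known child sexual abuse imagery: each entry is the
# SHA-256 digest of a previously verified image (value is illustrative).
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 'digital fingerprint' of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_csam(path: str) -> bool:
    """Return True if the file's hash appears on the known-imagery list,
    so the service can block the upload before the content is stored."""
    return sha256_of_file(path) in KNOWN_HASHES
```

A service deploying this check would run it at upload time and refuse to store or serve any file whose fingerprint is on the list.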


This is important work and here are two real-life examples:

CASE 1:

Our expert hotline analysts were regularly seeing sexual imagery of a female who could easily be mistaken for being in her 20s – and therefore an adult in the eyes of the law. The imagery of her was appearing on a wide range of websites, including adult entertainment sites.  

Thanks to the trust placed in us by the Home Office, our connection to the Child Abuse Image Database enabled us to check whether she was a verified victim, and therefore a child. On one particular day early in 2025, the person in the imagery appeared in the database. She is a child.

We had seen images of her multiple times on revenge porn sites and extortion sites over months. Now that we knew she was a police-confirmed UK child victim, we immediately began to hash the images and get them taken offline with confidence.

CASE 2:

In a separate case, a woman self-reported to IWF to confirm that she had been a child when sexualised images were taken of her and uploaded online. This was also confirmed by her local police force. It meant that the team at IWF could then work to get the images taken down and, of course, hashed.


We have a valuable and trusted connection to CAID and good relationships with law enforcement. Moderators employed by platforms do not have these connections, which help prevent imagery of teenagers being uploaded and exploited online.

The reason for working with the adult sector is clear. Even without the Online Safety Act mandating porn providers to step up and do what’s right, IWF has a responsibility to work on behalf of all the victims and survivors that we encounter to get their imagery removed, regardless of where it appears. 
