How Upload Prevention Protects Children Online

Published: Thursday 9 October 2025

Today the IWF has published a paper explaining how technology can be used to prevent the spread of child sexual abuse material while upholding privacy – a method known as upload prevention. 

The rollout of end-to-end encryption (E2EE) messaging without any safeguards means services lose the ability to detect and remove child sexual abuse images and videos. In this blind spot, offenders thrive, while victims and survivors live with the constant threat of their abuse resurfacing. Upload prevention closes this gap.

This explainer breaks down the technical concepts behind pre-encryption checks (upload prevention) in a clear and simple format, while addressing existing misconceptions about the issue. Services must implement upload prevention on end-to-end encrypted platforms to ensure that known CSAM is detected and blocked before it can be shared within an end-to-end encrypted messaging service. We also call for further investment and innovation in privacy-preserving technologies that can detect and take down CSAM without compromising message confidentiality.
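The pre-encryption check described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration, not the IWF's or any platform's implementation: the blocklist, function names, and use of SHA-256 are assumptions made for clarity, and real deployments use robust perceptual hashing (e.g. PhotoDNA-style) so that visually similar images match, not just byte-identical files.

```python
import hashlib

# Hypothetical blocklist of hashes of known CSAM, distributed to client
# devices. SHA-256 is used here purely for illustration; production systems
# rely on perceptual hashes that survive resizing and re-compression.
KNOWN_HASHES = {
    "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad",
}

def check_before_encrypt(data: bytes) -> bool:
    """Return True if the content may be encrypted and sent.

    The check runs on the sender's device, before encryption, so the
    service never sees message contents and the encryption itself is
    not weakened or bypassed.
    """
    digest = hashlib.sha256(data).hexdigest()
    return digest not in KNOWN_HASHES

def send(data: bytes, encrypt, upload) -> bool:
    """Encrypt and upload content only if it passes the pre-encryption check."""
    if not check_before_encrypt(data):
        # Known CSAM: block the upload; nothing leaves the device.
        return False
    upload(encrypt(data))
    return True
```

The key design point, matching the diagrams below, is that the match happens on-device before encryption: content not on the blocklist is encrypted and delivered exactly as in ordinary E2EE, while a match simply stops the upload.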


[Illustration: upload prevention model operating on E2EE]
[Diagram: upload prevention process when content is not known CSAM]
[Diagram: upload prevention process when content is known CSAM]


When it comes to action by policymakers, the European Union must adopt the Child Sexual Abuse Regulation without further delay. We are calling on Member States to push for progress on the Regulation, which would provide a permanent legal basis for voluntary and mandatory detection of CSAM across the EU. Protections must apply consistently, including in end-to-end encrypted environments.

As the Online Safety Act is implemented, the UK regulator Ofcom must regularly reassess its position on technical feasibility to ensure platforms do not use it as a get-out clause to evade compliance. The UK government should also introduce additional legislation to ensure that private messaging platforms, including those that are end-to-end encrypted, take the necessary steps to detect and block CSAM.

Kerry Smith, Chief Executive of the IWF, said: “Our new paper shows exactly how child sexual abuse imagery can be prevented from being uploaded into even the most heavily encrypted environments without breaking encryption or compromising user privacy. It’s time for policymakers to build these protections into everyday life. There is no time to lose.”

Kerry Smith, IWF CEO