How Upload Prevention Protects Children Online

Published: Thu 9 Oct 2025

Today the IWF has published a paper explaining how technology can be used to prevent the spread of child sexual abuse material while upholding privacy – a method known as upload prevention. 

The rollout of end-to-end encryption (E2EE) messaging without any safeguards means services lose the ability to detect and remove child sexual abuse images and videos. In this blind spot, offenders thrive, while victims and survivors live with the constant threat of their abuse resurfacing. Upload prevention closes this gap.

This explainer breaks down the technical concepts behind pre-encryption checks (upload prevention) in a clear and simple format, while addressing existing misconceptions on the issue. Services must implement upload prevention on end-to-end encrypted platforms so that known CSAM is detected and blocked before it can be shared. We also call for further investment and innovation in privacy-preserving technologies that can detect and take down CSAM without compromising message confidentiality.

 

[Illustration: upload prevention model operating on E2EE]

[Diagram: the upload prevention process when content is not known CSAM]

[Diagram: the upload prevention process when content is known CSAM]
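The flow in the diagrams above can be sketched in code. The snippet below is an illustrative sketch only: real deployments match perceptual hashes (such as PhotoDNA-style signatures supplied by bodies like the IWF) rather than exact cryptographic hashes, and the function names and hash list here are hypothetical.

```python
import hashlib

# Hypothetical list of known-CSAM hash digests. In practice this would be
# a perceptual-hash list maintained by a recognised body, not SHA-256.
KNOWN_HASHES = {hashlib.sha256(b"example-known-item").hexdigest()}

def may_encrypt_and_send(data: bytes, known_hashes: set[str]) -> bool:
    """Pre-encryption check: return True if the content does not match a
    known hash and may proceed to encryption, False if it must be blocked."""
    digest = hashlib.sha256(data).hexdigest()
    return digest not in known_hashes

def send_message(data: bytes) -> str:
    # The check runs on the sender's device, before encryption, so the
    # service never sees message content and encryption is not broken.
    if may_encrypt_and_send(data, KNOWN_HASHES):
        # ciphertext = e2ee_encrypt(data)  # normal E2EE send path
        return "sent"
    return "blocked"
```

The key design point, as the paper argues, is that matching happens client-side before encryption: unmatched content is encrypted and delivered as normal, and only content matching a known hash is stopped.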

 

When it comes to action by policymakers, the European Union must adopt the Child Sexual Abuse Regulation without further delay. We are calling on Member States to push for progress on the Regulation to provide a permanent legal basis for voluntary and mandatory detection of CSAM across the EU. Protections must apply consistently, including when it comes to end-to-end encrypted environments.

As the Online Safety Act is implemented, the UK regulator Ofcom must regularly reassess its position on technical feasibility to ensure platforms do not use it as a get-out clause to evade compliance. The UK government should also introduce additional legislation to ensure private messaging platforms, including those that are end-to-end encrypted, take the necessary steps to detect and block CSAM.

Kerry Smith, Chief Executive of the IWF, said: “Our new paper shows exactly how child sexual abuse imagery can be prevented from being uploaded into even the most heavily encrypted environments without breaking encryption or compromising user privacy. It’s time for policymakers to build these protections into everyday life. There is no time to lose.”

 

Kerry Smith, IWF CEO