High public concern at spread of child sexual abuse images and videos in end-to-end encrypted spaces

Published: Mon 10 Nov 2025

  • Survey shows 92% of people in the UK are concerned about the sharing of child sexual abuse imagery on end-to-end encrypted (E2EE) messaging services

  • Just under nine in ten (88%) adults agree that the UK government should require tech companies to use the ‘upload prevention’ method in E2EE environments to detect images and videos of child sexual abuse
  • IWF says the safety method is a technically feasible way to detect child sexual abuse in E2EE spaces that preserves privacy and upholds the rights of victims and survivors

More than nine in ten people1 in the UK say they are concerned about how images and videos of children being sexually abused are shared through end-to-end encrypted (E2EE) messaging services, and believe that companies should invest in safety measures to prevent this type of criminal content from being shared in this way.

Survey findings have revealed the depth of feeling among the UK public about the spread of child sexual abuse in E2EE environments and their support for the UK government to require companies to use ‘upload prevention’ to tackle the problem.

The survey results2 show that there is a ‘public demand’ for tech companies to proactively find and stop the spread of child sexual abuse images within E2EE messaging spaces, said the Internet Watch Foundation (IWF), the UK charity on the front line in the fight to stop child sexual abuse on the internet.

Safeguarding Minister Jess Phillips said it is clear the British public wants greater protections for children online and urged tech companies to put children’s safety first.

The IWF is calling for the upload prevention method to be used as a technically feasible way to detect known child sexual abuse images and videos in E2EE environments.

End-to-end encryption is a tool to make messages private so they cannot be looked at by anyone else. When a message is encrypted, the information is scrambled into an unreadable format. Only the sender and intended recipient hold the keys needed to ‘unlock’ or decrypt a message. Without the key, the contents remain scrambled and concealed.

The IWF has recently published a new paper that explains how E2EE environments make it harder to find criminal content and outlines how messaging platforms can prevent the spread of known child sexual abuse images and videos without breaking encryption or trespassing on user privacy.

The paper highlights how E2EE messaging services have become a major channel for the distribution of child sexual abuse content in the absence of technical safeguards to protect children and details how upload prevention is designed to stop the sharing of known child sexual abuse material in E2EE environments before it is sent.

Now, 88% of UK adults surveyed3 agree that the UK government should require companies to use the upload prevention method to detect images and videos of children being sexually abused before they are sent through E2EE services.

Upload prevention works by creating “digital fingerprints” of files, known as hashes. Each hash is compared against a secure database of hashes of material that has already been confirmed as child sexual abuse material (a “hash list”). If there is a match, the image is blocked from being uploaded. A trusted body such as the IWF is responsible for the maintenance and governance of the hash list.
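As a rough illustration of the matching step described above, the logic can be sketched as follows. This is a simplified sketch only: the function names and the example hash value are illustrative, and production systems typically use perceptual hashes (which still match after minor alterations to an image) rather than the plain cryptographic hash shown here.

```python
import hashlib

# Hypothetical hash list. In practice this is maintained and governed by a
# trusted body such as the IWF, and would contain perceptual hashes rather
# than SHA-256 digests.
KNOWN_HASH_LIST = {
    # SHA-256 digest of the bytes b"blocked-example" (illustrative entry)
    hashlib.sha256(b"blocked-example").hexdigest(),
}

def should_block_upload(file_bytes: bytes) -> bool:
    """Return True if the file's fingerprint matches the hash list."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_HASH_LIST
```

The key design point is that the check runs on the sender's device before the message is encrypted, so the platform never needs to read message content and the encryption itself is untouched.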


Kerry Smith, IWF CEO

Internet Watch Foundation CEO Kerry Smith said: “End-to-end encrypted platforms cannot be safe havens for criminals to revictimise survivors and share images and videos of child sexual abuse.

“Last year was the worst on record for the number of reports of child sexual abuse identified by the IWF, and we know that offenders are increasingly turning to everyday messaging platforms that offer E2EE because they feel they can do so without discovery. Failure to look for known images and videos of child sexual abuse only serves to protect offenders.

“The survey shows a clear public demand for tech companies to ensure that the rights and privacy of the children who use their platforms are paramount, and to take all necessary steps to ensure they are protected online. Upload prevention is a proven safety measure that protects privacy, as it acts to identify and block criminal content before encryption.

“Tech companies have a duty to protect all their users online and this safety tool provides them with the means to do so in a way that upholds both the security of private communications and the fundamental rights of victims and survivors.”

Jess Phillips, Minister for Safeguarding and Violence Against Women and Girls

Minister for Safeguarding and Violence Against Women and Girls, Jess Phillips, said:

“It is clear that the British public want greater protections for children online and we are working with technology companies so more can be done to keep children safer.

“The design choices of platforms cannot be an excuse for failing to respond to the most horrific crimes, which in some circumstances have led to children taking their own lives. If companies don’t comply with the Online Safety Act they will face enforcement from the regulator.

“Through our action we now have an opportunity to make the online world safer for children, and I urge all technology companies to invest in safeguards so that children’s safety comes first.”

Helen Rance, Deputy Director of Child Sexual Abuse and Modern Slavery & Human Trafficking at the National Crime Agency, said: “While encryption offers significant benefits for user security, the rapid and widespread adoption of end-to-end encryption by major tech companies is occurring without adequate consideration for public safety – ultimately placing users at risk.

“Tech giants must not use end-to-end encryption as a shield to avoid responsibility for preventing illegal activity on their platforms, particularly in spaces accessed by children.

“The broad implementation of privacy-enhancing technologies, without a balanced approach to user safety, undermines platforms’ ability to detect and prevent harm and hampers law enforcement efforts to investigate child sexual abuse.”

Rani Govender, Policy Manager for Child Safety Online at the NSPCC, said: “The rapid roll-out of end-to-end encryption in its current form has made it near impossible to detect and remove child sexual abuse material from private messaging platforms.

“Children are paying the price with images of them, many showing horrendous child abuse, being shared at scale online amongst communities of criminals.

“Tech firms should be responding to this public outcry and investing in solutions that protect both safety and privacy, not dragging their feet while harm spreads behind closed doors.”

As part of the implementation of the Online Safety Act, the UK regulator Ofcom has stated that measures to prevent the upload of child sexual abuse material or to take down illegal content apply only to platforms where it is ‘technically feasible’. The IWF is calling for Ofcom to regularly reassess its position on technical feasibility to ensure platforms do not use this as a get-out clause to evade compliance.

Kerry continued: “It is crucial that Ofcom conducts a regular review of technology to ensure ‘technical feasibility’ does not provide services with a loophole to avoid their safety duties. When it is technically feasible for a service to implement a safety measure, Ofcom must amend the codes to reflect the change.”

The survey was run by polling company Savanta in October 2025 and included 1,796 UK adults.

More than 290,000 webpages of child sexual abuse, or links to that content, were confirmed by IWF analysts in 2024. Each page can contain hundreds, if not thousands, of indecent images of children. This is the highest number of child sexual abuse webpages the IWF has ever discovered in its 29-year history, and a five per cent increase on the 275,650 webpages identified in 2023.

1 Over nine in ten (92%) UK adults say they are either very concerned (82%) or somewhat concerned (10%) about images and videos of children being sexually abused being shared through end-to-end encrypted messaging; and over nine in ten (91%) UK adults say that companies should invest in safety measures to prevent images and videos of children being sexually abused being sent through end-to-end encrypted messaging.

2 Savanta surveyed 1,796 UK adults online between 10th and 13th October 2025 using their public omnibus. Data were weighted to be representative of the UK by age, gender, region and social grade. Participants were allowed to not answer some questions owing to their sensitive nature. As a result, base sizes vary across questions. Percentages are reported based on a rebased sample, meaning only respondents who opted to answer a particular question are included in those calculations. Data on these questions have still been weighted to be nationally representative by age, gender, region and social grade.

Just under nine in ten (88%) UK adults agree that the UK government should require companies to use the upload prevention method to detect images and videos of children being sexually abused before they are sent through end-to-end encrypted services. 66% strongly agree and 22% agree. Just 2% of UK adults disagree that the UK government should impose this requirement.

Europe in ‘last chance saloon’ as new paper shows child sexual abuse can be blocked before being shared in E2EE services

IWF paper sets out how end-to-end encrypted messaging can be protected from child sexual abuse without breaking encryption.

9 October 2025 News
How Upload Prevention Protects Children Online

Using privacy-preserving technology to stop child sexual abuse material.

9 October 2025 Blog