Minister for Safeguarding and Violence Against Women and Girls, Jess Phillips, said:
“It is clear that the British public want greater protections for children online and we are working with technology companies so more can be done to keep children safer.
“The design choices of platforms cannot be an excuse for failing to respond to the most horrific crimes, which in some circumstances have led to children taking their own lives. If companies don’t comply with the Online Safety Act they will face enforcement from the regulator.
"Through our action we now have an opportunity to make the online world safer for children, and I urge all technology companies to invest in safeguards so that children’s safety comes first.”
Helen Rance, Deputy Director of Child Sexual Abuse and Modern Slavery & Human Trafficking at the National Crime Agency, said: “While encryption offers significant benefits for user security, the rapid and widespread adoption of end-to-end encryption by major tech companies is occurring without adequate consideration for public safety – ultimately placing users at risk.
“Tech giants must not use end-to-end encryption as a shield to avoid responsibility for preventing illegal activity on their platforms, particularly in spaces accessed by children.
“The broad implementation of privacy-enhancing technologies, without a balanced approach to user safety, undermines platforms’ ability to detect and prevent harm and hampers law enforcement efforts to investigate child sexual abuse.”
Rani Govender, Policy Manager for Child Safety Online at the NSPCC, said: “The rapid roll-out of end-to-end encryption in its current form has made it near impossible to detect and remove child sexual abuse material from private messaging platforms.
“Children are paying the price with images of them, many showing horrendous child abuse, being shared at scale online amongst communities of criminals.
“Tech firms should be responding to this public outcry and investing in solutions that protect both safety and privacy, not dragging their feet while harm spreads behind closed doors.”
As part of the implementation of the Online Safety Act, the UK regulator Ofcom has stated that measures to prevent the upload of child sexual abuse material, or to take down illegal content, apply only to platforms where it is ‘technically feasible’. The IWF is calling for Ofcom to regularly reassess its position on technical feasibility to ensure platforms do not use this as a get-out clause to evade compliance.
Kerry continued: “It is crucial that Ofcom conducts a regular review of technology to ensure ‘technical feasibility’ does not provide services with a loophole to avoid their safety duties. When it is technically feasible for a service to implement a safety measure, Ofcom must amend the codes to reflect the change.”
The survey was run by polling company Savanta in October 2025 and included 1,796 UK adults.
More than 290,000 webpages of child sexual abuse, or links to that content, were confirmed by IWF analysts in 2024. Each page can contain hundreds, if not thousands, of indecent images of children. This is the highest number of child sexual abuse webpages the IWF has ever discovered in its 29-year history, and a five per cent increase on the 275,650 webpages identified in 2023.