The debate on the EU’s proposed Child Sexual Abuse Regulation (CSAR) has been dominated by one loud slogan. A slogan that may have dire consequences for the safety and wellbeing of millions of children worldwide.
'Chat Control.'
The phrase has flooded social media feeds and inboxes across Brussels, framing the proposal as an attack on democracy – conflating reasonable and trusted safety measures with state overreach and surveillance.
But this rhetoric obscures a far more uncomfortable truth – that the EU is hamstrung when it comes to confronting one of the most serious crimes of our time: the sexual abuse of children online. A crime which the continent desperately needs to wake up to.
Make no mistake – if the EU does not find a solution soon, it will make itself increasingly attractive to criminals.
Every day, thousands of images and videos of children suffering sexual abuse circulate across digital platforms. This is a scandal. The internet is allowing a global trade in children’s suffering and misery to proliferate like never before.
And last year, 62% of the child sexual abuse material (CSAM) found by analysts at the Internet Watch Foundation (IWF) was hosted on European servers.
These are not abstract numbers, and this is not a theoretical issue – these figures represent real children, real crimes, and real suffering that continues long after the physical abuse has ended.
The solution to the spread of much of this imagery exists. It’s safe and trusted. It’s being used in many forms already to keep whole swaths of the internet safe. It has as much “control” over your “chat” as your spam filter or virus guard, and it could make a massive difference to the safety of the most vulnerable children overnight.
Why wouldn’t we use it?
Currently, the systems that could help detect and remove such content are voluntary. Some platforms act responsibly; others do not.
The Danish Presidency’s latest proposal would extend and make permanent this voluntary framework, preventing a legal vacuum when the temporary ePrivacy exemption expires in April 2026. This is an important step to ensure companies can continue detecting and removing CSAM – but voluntary measures alone are not enough.
The claim that this adds up to surveillance reveals a misunderstanding of the technology at hand. We must not allow a poor grasp of how this tech works to torpedo genuine debate over how best to protect children online.
Our new paper clearly demonstrates how child sexual abuse imagery can be prevented from being uploaded into end-to-end encrypted spaces in the first place. Users can be protected from the spread of this content without any need for anyone to ever be snooping on chats.
Offenders should not be given anywhere to hide online. Services that use end-to-end encryption can uphold both the privacy of communications and the fundamental rights of victims and survivors by adopting upload prevention. There does not need to be a compromise.
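In outline, upload prevention works much like the spam filter comparison above: before a file ever enters the encrypted channel, its digest is checked on-device against a list of hashes of known, verified abuse imagery. The sketch below is purely illustrative and is not the system described in the paper – real deployments use perceptual hashing (such as PhotoDNA) and vetted hash lists supplied by bodies like the IWF, not the simplified exact-match SHA-256 check shown here; all names and data in it are hypothetical.

```python
import hashlib

# Hypothetical stand-in for a vetted hash list of known illegal imagery.
# In practice the list comes from an authorised body and uses perceptual
# hashes, so near-duplicates are caught as well as exact copies.
SAMPLE_KNOWN_FILE = b"stand-in bytes for a known illegal image"
KNOWN_HASHES = {hashlib.sha256(SAMPLE_KNOWN_FILE).hexdigest()}

def may_upload(file_bytes: bytes) -> bool:
    """Return True if the file may pass into the encrypted channel.

    The check runs on-device, before encryption: no message content is
    read or reported; only the file's digest is compared to the list.
    """
    return hashlib.sha256(file_bytes).hexdigest() not in KNOWN_HASHES

print(may_upload(b"ordinary holiday photo"))  # True: upload proceeds
print(may_upload(SAMPLE_KNOWN_FILE))          # False: blocked before encryption
```

The point the sketch illustrates is the one made above: the encrypted conversation itself is never opened or scanned; only a narrow, pre-defined category of known material is stopped at the point of upload.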
The principles of safeguards, oversight mechanisms, and transparency requirements must remain central as negotiations continue – alongside a clear commitment to review whether stronger, mandatory measures are needed in future.
Privacy is vital, as is accountability. We are not advocating that anyone “break” or weaken end-to-end encryption – only that a narrow category of illegal, harmful material that destroys lives be blocked. The choice is not between privacy and protection, but between indifference and compassion.
Children, and their privacy, are currently missing from the debate. Their human rights are violated first when they are abused, and second, when that abuse is recorded, shared, and perpetuated online. These images and videos show them at their most vulnerable – we should intervene to prevent them ever existing or being shared. The real outrage is failing to act when we could.
Protecting children is not in opposition to supporting their freedom, but an essential part of ensuring their rights in the digital age.
So, ‘Stop Chat Control’ may be catchy, but it offers no solutions to the children who are being exploited every day, whose images are being traded in all spaces across the internet, and who rely on us – legislators, platforms, and citizens – to act.
Evidence, expertise, and responsibility should guide action. Europe can lead the way in protecting privacy and stopping the circulation of child sexual abuse material.