Why the EU’s temporary law allowing companies to detect child sexual abuse online must be extended

Published: Mon 9 Mar 2026

All parliamentarians must back the extension to the temporary derogation, retaining its full scope. Europe should be asking how to do more to protect children online. Not how to do less. 

The internet plays a significant role in the distribution of child sexual abuse material (CSAM) and in enabling offenders to contact and groom children. For many years, technology companies have proactively used tools to detect this abuse on their platforms and report it to authorities. 

In 2020, changes to European Union privacy rules created uncertainty about whether some of these voluntary detection practices could continue for certain types of online services. To address this, the EU adopted a temporary legal measure known as the temporary or interim derogation to the ePrivacy Directive. This allows companies to continue using technologies – such as IWF’s Hash List – to detect and report child sexual abuse material while remaining fully compliant with EU privacy and data protection law. 

The measure was designed as a temporary bridge until the EU adopts a permanent framework to tackle child sexual abuse online. However, it is due to expire in April 2026, and must now be extended again to avoid a gap in the legal framework. 

This legal cover is sometimes nicknamed “chat control 1.0” by those who oppose companies detecting and removing child sexual abuse material online. Many of the claims circulating online reflect a profound misunderstanding of the technologies used and their purpose. Read more about this misinformation campaign, and why we must counter it, here.


What is the temporary derogation? 

Under EU privacy law, services are generally not allowed to scan the content of electronic communications. At the same time, many companies already use technologies that help identify child sexual abuse material and detect attempts to groom children online. Without a specific legal exemption, these safety tools could be interpreted as being in conflict with privacy rules. 

The temporary derogation resolves this potential conflict by explicitly stating that online services can use their detection tools on a voluntary basis while remaining compliant with EU law. It does not require companies to ‘scan’ messages; it simply provides legal certainty for those that choose to deploy safety measures to detect illegal material. 


What does the law allow companies to do? 

The temporary framework allows online platforms to continue using established, highly targeted technologies to detect child sexual abuse. These include tools that can: 

  • Identify known child sexual abuse images and videos using digital fingerprints (“hash matching”) 
  • Detect previously unidentified abuse material 
  • Identify patterns that may indicate online grooming 
  • Report confirmed cases to law enforcement or specialised organisations that process abuse reports 

These technologies are highly privacy-preserving and widely used across the internet. They run quietly in the background on platforms we all use every single day. They play a vital role in identifying victims and stopping the spread of material that revictimises survivors with every share, view and click. 
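The hash-matching approach described above can be sketched in a few lines. This is a minimal illustration only: it uses a simple cryptographic hash (SHA-256) for clarity, whereas real deployments such as IWF’s Hash List typically rely on perceptual hashing (e.g. PhotoDNA-style systems) so that visually similar copies match even after resizing or re-encoding. All names and data below are hypothetical.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Compute a digital fingerprint of a file's bytes.
    # (Illustrative only: production systems use perceptual hashes,
    # which survive re-encoding and cropping; SHA-256 does not.)
    return hashlib.sha256(data).hexdigest()

def is_known_material(data: bytes, hash_list: set[str]) -> bool:
    # Membership test against a list of fingerprints of material
    # previously verified by an authorised body. The platform holds
    # only the fingerprints, never the abusive images themselves.
    return fingerprint(data) in hash_list

# Hypothetical hash list containing one known fingerprint.
hash_list = {fingerprint(b"example-known-file")}

print(is_known_material(b"example-known-file", hash_list))  # True
print(is_known_material(b"harmless-photo", hash_list))      # False
```

The key privacy property is visible in the sketch: the check is a one-way comparison against a fixed list of fingerprints, so nothing about non-matching content is learned or retained.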

 

Why is the law important? 

The temporary derogation is essential because it provides legal certainty for companies working to detect and report child sexual abuse on their platforms. Without it, some companies may feel they cannot safely continue these efforts due to the risk of conflicting with EU privacy rules. 

We have already seen the real-world consequences of legal uncertainty. In late 2020, when companies were unsure whether detecting abuse was permitted under EU law, reports of child sexual abuse material from EU-based accounts to the National Center for Missing and Exploited Children (NCMEC) dropped by 58% in just 18 weeks. 

Databases of known child sexual abuse material do not appear spontaneously. They are built because previously unknown material is first detected, verified and confirmed by authorised bodies such as the Internet Watch Foundation. This process depends on technology companies deploying a layered set of tools, including hash-matching, AI-based classifiers, and human moderation, that work together to identify both known and previously unknown CSAM.  

Fewer reports translate into fewer opportunities for authorities to identify victims and intervene. Legal certainty is not just a technicality; it is critical to the proactive work done by companies to protect children online. 

Since the EU Parliament last debated these issues, the digital landscape has changed dramatically. Generative AI technologies now make it possible to create child sexual abuse material that has never existed before, content that is inherently unknown and cannot be detected through traditional hash-matching systems alone. Unlike previously circulating material, this content cannot simply be flagged by comparing it to existing databases. 

This shift fundamentally changes the detection challenge. Companies now face the urgent need to identify even greater volumes of previously unseen abuse. Legal uncertainty risks slowing or halting these efforts. 

Asking companies to deactivate systems developed over more than a decade to keep users safe would mean that child sexual abuse continues unchecked and children remain unseen. A drop in reports would not reflect a drop in abuse. 

Last year was a record year for the number of URLs actioned by the IWF. The prevalence and volume of CSAM online is enormous. 

This is a highly dangerous moment for the EU to require technology companies to dismantle their safety systems. 


Why does the law need to be extended? 

The temporary derogation was always intended as a short-term solution while the EU negotiates a permanent law to combat child sexual abuse online. This proposal for a permanent law is called the Child Sexual Abuse Regulation.  

That legislation is still under negotiation and is unlikely to be finalised before the current rules expire in April 2026. Without an extension, companies could once again face legal uncertainty. It is important that co-legislators are able to continue negotiations without the pressure of a legal gap.


What happens next in the European Parliament? 

EU lawmakers are currently deciding whether to extend the temporary rules.  

The European Commission has proposed extending the measure until 2028 to provide enough time for negotiations on the permanent law to conclude. EU Member States have already agreed that they would like the legal cover to continue until 2028 with its current scope.  

Neither of these positions would imply asking companies to do anything new or different to detect CSAM on their platforms. This is about maintaining the current baseline of protection. 

The situation in the European Parliament has become more complicated. On Monday 2 March, the Parliament’s Civil Liberties Committee (LIBE), which is responsible for the file, voted on the proposal. Although some individual amendments were adopted, the committee failed to approve the report as a whole. 

Under the Parliament’s rules, the file must now be referred to the full European Parliament, along with a recommendation to reject the proposal. 

Before the plenary vote, all Members of the European Parliament (MEPs) had the opportunity to submit new amendments.  

During the vote on Wednesday 11 March, MEPs will first decide whether to accept the committee’s recommendation to reject the extension. If that rejection is not supported, they will then vote on amendments and on the proposal itself. 

Because the current framework expires in April 2026, lawmakers have only a short window to agree an extension.  


Call from the IWF 

We are calling on all parliamentarians to back the extension to the temporary derogation, retaining its full scope.

There is no time to hold a parallel debate on issues that are already being addressed in the ongoing negotiations on the Child Sexual Abuse Regulation. To ensure that detection and reporting of child sexual abuse material continue uninterrupted, the status quo must be maintained until permanent rules are adopted. 

Europe should be asking how to do more to protect children online. Not how to do less. 

IWF urges EU leaders to act now on child sexual abuse as 109 organisations demand robust CSAR

IWF joins 108 civil society groups urging EU leaders to pass strong laws now to tackle the growing crisis of child sexual abuse online.

9 February 2026 News
Strong public support for EU child sexual abuse legislation as abuse imagery rockets

IWF calls on lawmakers in Germany, Italy and Poland to respect the will of their electorates and pass laws allowing tech companies to detect child sexual abuse images and videos.

16 January 2026 News
“AI child sexual abuse imagery is not a future risk – it is a current and accelerating crisis”

IWF CEO Kerry Smith calls for complete EU ban of AI abuse content at high-level meeting of global experts in Rome.

20 November 2025 News