Why is the law important?
The temporary derogation is essential because it provides legal certainty for companies working to detect and report child sexual abuse on their platforms. Without it, some companies may feel they cannot safely continue these efforts due to the risk of conflicting with EU privacy rules.
We have already seen the real-world consequences of legal uncertainty. In late 2020, when companies were unsure whether detecting abuse was permitted under EU law, reports of child sexual abuse material from EU-based accounts to the National Center for Missing and Exploited Children (NCMEC) dropped by 58% in just 18 weeks.
Databases of known child sexual abuse material (CSAM) do not appear spontaneously. They are built because previously unknown material is first detected, verified and confirmed by authorised bodies such as the Internet Watch Foundation (IWF). This process depends on technology companies deploying a layered set of tools: hash-matching, AI-based classifiers and human moderation, working together to identify both known and previously unknown CSAM.
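To make the layered approach concrete, here is a minimal, illustrative sketch of how such a pipeline can be structured: a hash lookup against a database of confirmed material, followed by a classifier pass on anything that does not match, with positive results escalated to human review. The hash set, the classifier stub and the threshold are placeholders, and real deployments rely on perceptual hashing (such as PhotoDNA) so that resized or re-encoded copies still match, rather than the exact cryptographic hash used here to keep the sketch self-contained.

```python
import hashlib
from pathlib import Path

# Placeholder set of hashes of known material; in practice these lists are
# supplied by authorised bodies such as the IWF or NCMEC. SHA-256 is used
# here purely for illustration, not as a real-world matching technique.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def classifier_score(image_bytes: bytes) -> float:
    """Stand-in for an AI classifier that flags previously unseen material.
    A real system would run a trained image model here."""
    return 0.0  # placeholder: never flags anything

def triage_image(path: Path) -> str:
    """Hash-match first; fall back to the classifier; escalate hits to humans."""
    data = path.read_bytes()
    digest = hashlib.sha256(data).hexdigest()
    if digest in KNOWN_HASHES:
        return "known-match"    # matched the database of confirmed material
    if classifier_score(data) > 0.9:
        return "suspected-new"  # previously unseen; send to human moderation
    return "no-action"
```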
Fewer reports translate into fewer opportunities for authorities to identify victims and intervene. Legal certainty is not just a technicality; it is critical to the proactive work done by companies to protect children online.
Since the European Parliament last debated these issues, the digital landscape has changed dramatically. Generative AI now makes it possible to create child sexual abuse material that has never existed before. Because this content appears in no database of known material, it cannot simply be flagged by comparison against existing records and cannot be detected through traditional hash-matching alone.
This shift fundamentally changes the detection challenge. Companies now face the urgent need to identify even greater volumes of previously unseen abuse. Legal uncertainty risks slowing or halting these efforts.
Asking companies to deactivate systems developed over more than a decade to keep users safe would allow child sexual abuse to continue unchecked and leave children unseen. A drop in reports would not reflect a drop in abuse.
Last year the IWF actioned a record number of URLs. The prevalence and volume of CSAM online are enormous.