IMCO’s draft opinion weakens online child protection in the European Union

Published: Wed 1 Mar 2023

IWF joins a coalition of child protection charities in calling for the European Parliament’s influential Internal Market and Consumer Protection Committee (IMCO) to amend a draft opinion that could reverse years of progress in detecting child sexual abuse material online.

The letter, signed by 42 organisations including Missing Children Europe, Terre des Hommes and ECPAT International, calls on EU policymakers to ensure that the detection of new child sexual abuse material and grooming is covered by the proposed new regulation to prevent and combat child sexual abuse.

The letter also criticises the removal of age verification requirements and argues that detection technology can meet high privacy standards, explaining that the new legislation adds further safeguards to measures that are already effective in preventing the spread of this material online.

The IMCO draft report seeks a vastly reduced scope for the Regulation, which the letter’s signatories say prioritises the rights of perpetrators of abuse over those of victims and survivors of sexual abuse.

The full letter reads: 

European Parliament IMCO Committee draft report threatens children’s safety and fails to understand and respond to child sexual abuse online.  

On 8 February 2023, the European Parliament’s Committee on the Internal Market and Consumer Protection (IMCO) published its draft report on the European Commission’s proposal to prevent and combat child sexual abuse. The draft report seeks a vastly reduced scope for the Regulation, demonstrating a misguided interpretation of the Commission’s proposal, and a flawed understanding of the complex dynamics of child sexual abuse. It prioritises the anonymity of perpetrators of abuse over the rights of victims and survivors of sexual abuse and seeks to reverse progress made in keeping children safe as they navigate or are harmed in digital environments that were not built with their safety in mind. 

We, a civil society coalition united in defence of children’s right to protection from violence and abuse, feel obliged to address the direct threat to children posed by IMCO’s draft report:

Detection of unknown child sexual abuse material (CSAM) and grooming should not be optional. We must take action to address both.   

Excluding the possibility of detecting as-yet-unknown CSAM and grooming is to deliberately look away from a crime. Looking only for what has already been verified means refusing to help a child in need and refusing to prevent the harm from ever happening.

Databases of known CSAM only exist because unknown CSAM can be detected and identified. ‘Unknown’ CSAM refers to new photos and videos of children being sexually abused or exploited that have not previously been detected and categorised. Once detected and categorised by law enforcement and child protection organisations, new CSAM becomes ‘known’ CSAM.

Over one third (34%) of children have been asked to do something sexually explicit online that they were uncomfortable with or did not want to do. This is one example of grooming. Children exposed to sexually explicit content and grooming report similar levels of trauma symptoms (i.e. clinically diagnosable PTSD) to victims of penetrative offline sexual offences. To tackle this crime at scale, online service providers must deploy the preventative technologies that the Regulation would make mandatory following a risk assessment.

Prevention and user reporting are essential, but will not solve the issue alone. 

We agree with the draft report that there is a need for more prevention. However, decades of experience show that effective prevention of child sexual abuse (CSA) online requires a combination of prevention methods which include targeted technologies, education, and referral mechanisms for children and caregivers. 

While we also agree on the need for robust, accessible, and accountable user reporting mechanisms on all platforms, in reality, the proportion of known and new CSAM identified through user reports will always remain small. This is because of the massive volume of illegal content circulating online and because reporting sexual abuse is not as simple as reporting a stolen credit card. It is complicated for bystanders and highly complex for victims.  

We know that up to 83% of children do not report or tell anyone about sexual abuse or grooming happening to them. Victims may not know their abuse has been recorded, some victims are too young to speak out, and older children need help to remove their sexual images. Both adults and children often don’t report because of shame, fear, threats from the offender, or even indifference. If they do report, they are likely to do so through child-friendly and safe reporting mechanisms, such as child helplines and child-focused hotlines.

Prevention of harm starts with verifying the age of a user 

Children can be denied entry to a bar or a cinema based on their age. Why should access to online platforms which also pose a threat to their welfare be any different? Where a platform cannot be made safe by design, assessing the age of users on a platform is a safeguarding measure which must remain available to service providers and should be carried out using effective and privacy-preserving age assurance techniques.  

Technology can meet a high standard of privacy and protection 

Technologies are built to meet high standards of privacy protection, data minimisation, proportionality, and transparency; they neither restrict nor undermine encryption. What they do is detect the distribution of illegal content depicting child sexual abuse. What really matters is ensuring strong safeguards to establish a global standard. This is a policy decision, and it should be based on the best interests of the child as enshrined in EU and international law.

Detection Orders are designed to balance rights and ensure a democratic process  

Detection orders will only be issued for approved technologies and following a risk assessment, with supervision by national courts and scrutiny by an independent EU Centre. This means that all deployed technologies must meet existing EU standards before they can be used. The focus on error rates disregards the role of existing legal frameworks, such as the 2011 CSA Directive, and of human review.

The IMCO draft report seeks to make the detection process even more complicated. Checks and balances are essential and were built into the existing proposal. Excessive red tape risks letting perpetrators of sexual abuse off the hook. Proactive mitigation measures can ensure that children who are being sexually abused continue to be on our radar and can get the help they need today, not in 18-24 months. 

The scale of this problem requires us to act at scale and to use the efficiency offered by technology, just as we do in every other aspect of the digital transformation of societies. We call upon EU policymakers to ensure the new EU Regulation covers detection of known CSAM, unknown CSAM and grooming in order to continue protecting children both online and offline. We equally urge EU policymakers to establish effective age verification and assessment requirements.

Written by the Steering Group of the European Child Sexual Abuse Legislation Advocacy Group (ECLAG), a coalition of NGOs working to ensure children’s rights and protection.

Brave Movement, ECPAT, Missing Children Europe, Internet Watch Foundation, Terre des Hommes, Thorn 
 

Signatories: 

  1. Innocence in Danger e.V., Germany 
  2. Child Rescue Coalition 
  3. ECPAT Austria 
  4. Brave Movement 
  5. ECPAT Norway 
  6. NSPCC United Kingdom 
  7. ISPCC Ireland 
  8. Lightup Norway 
  9. ECPAT Korea 
  10. Hintalovon Foundation - ECPAT Hungary 
  11. The Lucy Faithfull Foundation United Kingdom 
  12. The International Centre for Missing and Exploited Children 
  13. Missing Children Switzerland 
  14. Missing Persons Families Support Centre, Lithuania 
  15. Lasten perusoikeudet - Children’s Fundamental Rights ry, Finland 
  16. Instituto de Apoio à Criança - Portugal 
  17. Marie Collins Foundation, United Kingdom 
  18. ASTRA-Anti-trafficking action, Serbia 
  19. The Smile of the Child, Greece 
  20. ITAKA Foundation, Poland 
  21. Stiftung Digitale Chancen / Digital Opportunities Foundation, Germany 
  22. “Hope For Children” CRC Policy Center, Cyprus 
  23. Child Helpline International 
  24. Association for the Prevention and Handling of Violence in the Family (SPAVO), Cyprus 
  25. S.O.S Il Telefono Azzurro Onlus, Italy 
  26. Fundación ANAR, Spain  
  27. International Justice Mission 
  28. End Violence Global Partnership  
  29. Defence for Children - ECPAT the Netherlands 
  30. Eurochild 
  31. eLiberare, Romania 
  32. Augusta Associates, LLC 
  33. Network for Children’s Rights, Greece 
  34. Child10, Sweden 
  35. Childnet, United Kingdom 
  36. UK Safer Internet Centre 
