Search Results

6 results
  1. EU Failure on Child Safety: Why CSAM Detection Laws Must Be Restored

    The EU ePrivacy derogation has lapsed, leaving children vulnerable. We must restore legal certainty for CSAM detection. Protecting children is not a threat to privacy; it is a fundamental human right.

  2. Europe is about to make it illegal to protect children online

    On 3 April, essential child protection systems used by technology companies to detect and remove online child sexual abuse material will become illegal to operate in the EU unless the European Parliament votes to extend the current legal framework. A temporary law allowing voluntary detection is expiring, and political deadlock has stalled a permanent solution. This will create a dangerous legal vacuum that perpetrators are aware of and poised to exploit. Proven tools like hash‑matching - which does not compromise privacy - would be forced offline, allowing millions of known abusive images to resurface. Research shows these systems deter offenders and make access harder; disabling them will reverse this progress. MEPs have one final chance to act by voting for an amendment that preserves protections for children across Europe.

  3. EU failure on temporary derogation puts children at risk

    The IWF warns that the EU’s failure to extend the temporary derogation will force platforms to halt proactive detection of child sexual abuse, putting children at serious risk.

  4. AI-generated child sexual abuse: now cannot be the moment the EU downs tools

    The IWF’s latest AI report exposes rapidly escalating harms to children as the EU moves to scale back the tools that detect and remove child sexual abuse material online. The charity warns that the EU must act urgently to criminalise AI‑generated abuse and preserve essential detection systems before risks intensify further.

  5. Charity urges ‘zero tolerance’ of ‘dangerous’ AI child sexual abuse in EU as content reaches record high

    A new IWF report reveals record levels of AI‑generated child sexual abuse imagery and alarming insight into how offenders are exploiting emerging technologies. The charity is urging EU lawmakers to introduce a zero‑tolerance ban on AI‑generated abuse and the tools used to create it.

  6. Tech companies and protection experts call for EU to act now to plug gap in online safety laws

    From 3 April, the EU will become the only region worldwide without legal certainty allowing technology companies to detect child sexual abuse material online, prompting urgent warnings from child protection experts and global tech organisations. A coalition of 246 civil society groups and major industry players has condemned lawmakers for failing to extend the temporary legal framework that permitted privacy‑preserving detection tools, leaving companies unsure whether safeguarding systems remain lawful. With the EU already hosting the highest concentration of known child sexual abuse material - 62% of confirmed webpages in 2024 - experts warn the situation will worsen, reducing detections, hampering investigations, and emboldening offenders. As the EU’s proposed permanent legislation remains deadlocked, industry leaders and protection advocates stress that immediate action is essential to prevent increased harm to children across Europe and beyond.