Search Results

456 results
  1. AI-generated child sexual abuse: now cannot be the moment the EU downs tools

    The IWF’s latest AI report exposes rapidly escalating harms to children as the EU moves to scale back the tools that detect and remove child sexual abuse material online. The charity warns that the EU must act urgently to criminalise AI‑generated abuse and preserve essential detection systems before risks intensify further.

  2. A million of the worst child sexual abuse images graded by ‘elite’ taskforce

    The ‘shocking’ images of children can involve penetrative sexual activity, sexual activity with an animal, and sadism.

  3. Fight against online child sexual abuse content is being won in the UK, but the global threat remains as big as ever, report says

  4. “AI child sexual abuse imagery is not a future risk – it is a current and accelerating crisis”

    IWF CEO Kerry Smith calls for complete EU ban of AI abuse content at high-level meeting of global experts in Rome

  5. UK makes use of AI tools to create child abuse material a crime

    Britain will make it illegal to use artificial intelligence tools that create child sexual abuse images.

  6. Non-photographic child sexual abuse

    Explore how IWF identifies and addresses non-photographic child sexual abuse imagery, including drawings and CGI, under UK legislation.

  7. Huw Edwards’ offences highlight how WhatsApp can be abused by predators sharing criminal imagery of children, IWF warns

    Dan Sexton, Chief Technology Officer at the IWF, appeared on national BBC Breakfast television this week (September 17) to warn Meta is not taking adequate steps to proactively prevent the sharing of child sexual abuse material on the platform.

  8. AI chatbots and child sexual abuse: a wake-up call for urgent safeguards

    IWF analysts uncover platform hosting chatbot “characters” designed to let users simulate sexual scenarios with child avatars.

  9. New clustering tech ‘revolution’ helps analysts assess child sexual abuse imagery in seconds

    Innovations in detecting and removing child sexual abuse material have been made possible by a grant from Nominet.

  10. UK teen’s sex abuse imagery identified thanks to IWF analysts’ pioneering work with policing database

    IWF analysts use CAID and victim reports to verify teen abuse victims, helping remove illegal imagery that might otherwise be missed.

  11. Public exposure to ‘chilling’ AI child sexual abuse images and videos increases

  12. Our participation at the Independent Inquiry into Child Sexual Abuse