Search Results

459 results
  1. IWF Podcast Warns Introducing End-to-end Encryption to Messaging Apps Could Hinder Detection of Child Sexual Abuse Imagery

    In a new podcast released by the Internet Watch Foundation, the charity says introducing end-to-end encryption to messaging apps could hinder the detection and removal of child sexual abuse material from the internet.

  2. High public concern at spread of child sexual abuse images and videos in end-to-end encrypted spaces

    More than nine in ten people in the UK say they are concerned about how images and videos of children being sexually abused are shared through end-to-end encrypted (E2EE) messaging services.

  3. Strong public support for EU child sexual abuse legislation as abuse imagery rockets

  4. IWF urges Apple not to abandon new plans to help keep children safe online

    The Internet Watch Foundation is supporting calls for Apple not to abandon new plans to help keep children safe online.

  5. AI advances could lead to more child sexual abuse videos, watchdog warns

    The IWF warns of more AI-made child sexual abuse videos as the tools behind them become more widespread and easier to use.

  6. UK teen’s sex abuse imagery identified thanks to IWF analysts’ pioneering work with policing database

    IWF analysts use the Child Abuse Image Database (CAID) and victim reports to verify teen abuse victims, helping remove illegal imagery that might otherwise be missed.

  7. Shorter working days, counselling and table tennis: How the Internet Watch Foundation (IWF) takes care of its staff

  8. Sharing goals globally

  9. Is this the UK’s toughest job?

  10. Changes to UK Government’s Online Safety Bill welcomed

    The Internet Watch Foundation (IWF) supports an amendment to the Online Safety Bill which will demand the development of new technologies to better detect child sexual abuse material online.

  11. Charities join forces for child sexual abuse study

    The Internet Watch Foundation and the Lucy Faithfull Foundation are embarking on a project to understand sex offenders’ internet habits when viewing online child sexual abuse material. The study is possible thanks to a grant from the International Foundation For Online Responsibility (IFFOR).

  12. ‘Disturbing’ AI-generated child sexual abuse images found on hidden chatbot website that simulates indecent fantasies