Search Results

52 results
  1. New EU-funded safety tech will help reduce viewing and demand for child sexual abuse images and videos

    A unique safety tech tool that uses machine learning in real time to detect child sexual abuse images and videos is to be developed by a collaboration of EU and UK experts.

  2. ‘Worst nightmares’ come true as predators are able to make thousands of new AI images of real child victims

    AI-Generated Child Abuse Sexual Imagery Threatens to “Overwhelm” Internet

  3. AI chatbots and child sexual abuse: a wake-up call for urgent safeguards

    IWF analysts uncover platform hosting chatbot “characters” designed to let users simulate sexual scenarios with child avatars.

  4. ‘Disturbing’ AI-generated child sexual abuse images found on hidden chatbot website that simulates indecent fantasies

  5. AI imagery getting more ‘extreme’ as IWF welcomes new rules allowing thorough testing of AI tools

  6. AI must be a force for good and not a threat to children

    The capacity for horrific images of AI-generated child sexual abuse to be reproduced at scale was underlined by IWF in the lead-up to the UK government’s AI Safety Summit.

  7. What did we learn from the US Senate hearing over online harms?

    Wednesday’s hearing brings into sharp focus the problems that organisations like ours, the Internet Watch Foundation, are dealing with every day.

  8. Hive partners with IWF to reduce the spread of child sexual abuse imagery online

  9. AI giving offenders ‘DIY child sexual abuse’ tool, as dozens of child victims used in AI models, IWF warns MPs

    The IWF has welcomed upcoming new legislation while giving evidence in Parliament this week.

  10. Safehire.ai joins IWF to reinforce digital safeguarding in recruitment

    New member Safehire.ai says it is proud to join the Internet Watch Foundation (IWF), strengthening a shared mission to protect children from online harm.

  11. Child sexual abuse material vs ‘child porn’: why language matters

    The term ‘child porn’ is misleading and harmful. Learn why the correct term is child sexual abuse material (CSAM), and how we can protect children from online abuse.

  12. Heimdal joins fight against child sexual abuse material online

    Global cybersecurity company Heimdal has joined forces with the Internet Watch Foundation to tackle child sexual abuse imagery online and make the internet a safer space for users.