Search Results

141 results
  1. Online Safety Act: UK Tech Companies must now Tackle Illegal Harms including Child Sexual Abuse Imagery

    As Ofcom’s Illegal Harms Codes come into force, platforms are required to implement robust measures to protect users from CSAM and illegal content.

  2. Evaluation of IWF Reporting Portals

    Read the key findings from the 2025 evaluation of IWF International Reporting Portals, covering global reach, impact, challenges and next steps.

  3. IWF analysts finding fifteen times more child sexual abuse content online than they were ten years ago

    Expert analysts have taken action against 200,000 websites containing child sexual abuse material.

  4. IWF joins leading policing researchers to tackle online grooming and sexual imagery of children

    The findings will be ‘invaluable’ in turning the tide on the threat children are facing from online predators.

  5. Under sixes manipulated into ‘disturbing’ sexual abuse while playing alone online as IWF says regulation can’t wait

    Internet Watch Foundation sees the most extreme year on record in 2023 Annual Report and calls for immediate action to protect very young children online.

  6. Public exposure to ‘chilling’ AI child sexual abuse images and videos increases

  7. New tech enables thousands of additional child victims to be counted in sexual abuse images for the first time

  8. 'Staggering' scale of online threat to children revealed as report says 850,000 people in UK could pose sexual risk to children

    The National Crime Agency estimates there to be between 550,000 and 850,000 people in the UK who pose varying forms of sexual risk to children.

  9. Call for experts to help tackle growing threat of ‘self-generated’ online child sexual abuse material

  10. Landmark data sharing agreement to help safeguard victims of sexual abuse imagery

    The UK’s Internet Watch Foundation (IWF) and the USA’s National Center for Missing & Exploited Children (NCMEC) announce a landmark agreement to better protect children whose sexual abuse images are shared and traded on the internet.

  11. Child sexual abuse material vs ‘child porn’: why language matters

    The term ‘child porn’ is misleading and harmful. Learn why the correct term is child sexual abuse material (CSAM), and how we can protect children from online abuse.

  12. IWF working with the adult sector is vital if we’re serious about tackling child sexual abuse imagery online

    IWF supports the Online Safety Act by helping adult sites detect, remove, and prevent child sexual abuse imagery online.