Europe remains the world’s largest host of child sexual abuse imagery, with 62% of known images and videos traced to a European Union country in 2021.
We have a powerful sense of mission, with clarity, focus and purpose in our work. Our single task, beyond all else, is the elimination of child sexual abuse material online.
OnlyFans in ‘groundbreaking’ partnership with Internet Watch Foundation: the content platform will share expertise and technical knowledge to help fight the spread of CSAM on the internet.
The Internet Watch Foundation is supporting calls for Apple not to abandon its new plans to help keep children safe online.
The Internet Watch Foundation (IWF) and its partners blocked at least 8.8 million attempts by UK internet users to access videos and images of children suffering sexual abuse during lockdown.
The term ‘child porn’ is misleading and harmful. Learn why the correct term is child sexual abuse material (CSAM), and how we can protect children from online abuse.
In conjunction with partners in the private and public sectors, we regularly run campaigns to raise awareness of child sexual abuse online and to help prevent it.
Global cybersecurity company Heimdal has joined forces with the Internet Watch Foundation to tackle child sexual abuse imagery online and make the internet a safer space for users.
The European Parliament is taking a decisive stand against the rise of AI-generated child sexual abuse material (AI-CSAM), co-hosting a high-level briefing with the Internet Watch Foundation (IWF) to address this urgent threat. With a 380% increase in AI-CSAM reports in 2024, the Parliament is pushing for robust legal reforms through the proposed Child Sexual Abuse Directive. Key priorities include criminalising all forms of AI-generated CSAM, removing legal loopholes such as the “personal use” exemption, and enhancing cross-border enforcement. The IWF and the European Child Sexual Abuse Legislation Advocacy Group (ECLAG) urge the Council of the EU to align with Parliament’s strong stance to protect children and support survivors. This article highlights the scale of the threat, the evolving technology behind synthetic abuse imagery, and the critical need for updated EU legislation.