The National Crime Agency estimates that between 550,000 and 850,000 people in the UK pose varying degrees of sexual risk to children.
A unique safety tech tool that uses machine learning to detect child sexual abuse images and videos in real time is to be developed by a collaboration of EU and UK experts.
A specialised taskforce will stop the spread of child sexual abuse images by taking ‘digital fingerprints’ of each picture.
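The 'digital fingerprints' mentioned above refer to image hashing: a compact signature is computed for each known abuse image so that re-uploads can be recognised and blocked. The sketch below is a minimal illustration of that general idea only, using a plain cryptographic hash; it is not the taskforce's actual system, which would typically rely on robust perceptual hashes that survive resizing and re-encoding. The function names and file paths are hypothetical.

```python
# Minimal sketch of hash-based "digital fingerprint" matching (illustration only).
# Real deployments use perceptual hashing rather than plain cryptographic hashes,
# so that edited or re-compressed copies of a known image still match.
import hashlib
from pathlib import Path

def fingerprint(path: Path) -> str:
    """Return a SHA-256 digest of the file's raw bytes as a hex string."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def is_known(path: Path, known_hashes: set[str]) -> bool:
    """Check a newly uploaded file against fingerprints of previously identified images."""
    return fingerprint(path) in known_hashes

# Hypothetical usage: block re-uploads of images already on a hash list.
# known = {line.strip() for line in open("hash_list.txt")}
# if is_known(Path("upload.jpg"), known):
#     ...  # refuse the upload and report it
```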
Discover the latest trends & data in the fight against online child sexual abuse imagery in the 2023 Annual Report from the Internet Watch Foundation (IWF).
IWF wants to help young people stay safe online by making sure they know what to do if they accidentally see sexual images or videos of someone who might be under 18.
IWF reveals 2024 as the worst year for online child sexual abuse imagery, urging the Prime Minister to strengthen the Online Safety Act and close critical loopholes.
New Zealand’s largest telecommunications and digital services company, Spark, joins the Internet Watch Foundation (IWF) to help keep the internet free from child sexual abuse content.
The Internet Watch Foundation (IWF) can confirm that the number of reports of online child sexual abuse imagery actioned for removal in the first half of 2015 was significantly higher than in 2014.
In conjunction with partners in the private and public sectors, we regularly run campaigns aimed at raising awareness and preventing child sexual abuse online.
We've partnered with CHI to build capacity amongst international helpline staff to deal with online child sexual exploitation and abuse.
IWF confirms it has begun to see AI-generated imagery of child sexual abuse being shared online, with some examples so realistic they are indistinguishable from real imagery.