We don’t buy in data. We’re the source.

Published: Fri 23 Jun 2017

As IWF’s Commercial Relationship Manager, I speak to internet service providers, domain registries, hosting companies, and, of course, our Members every day. I recently spoke to a filtering company that asked me where we (the IWF) buy our data from to compile our services. When I explained that we don’t buy in any data, I was left with the distinct impression that they didn’t believe me.

I understand that the nature of our work is unique, and it may be hard to believe that our analysts manually assess and remove thousands of images of child sexual abuse each week. But to those who do this every day, it’s about creating a solution.

As one of our analysts, Isobel, says in our 2016 annual report: “I know that going to work means that I'm going to remove images of child sexual abuse, I'm going to stop people stumbling across these images, I'm going to disrupt paedophiles from posting or sharing these images, and I'm going to stop these young victims being re-victimised over and over again.”

53% of the images removed last year were of children aged 10 or younger. 2% even showed children as young as 2 being sexually abused.

Our in-house expert analysts assess every single report that comes into our Hotline. If an image is found to be criminal, they then work to get it removed from the internet as quickly as possible.

Each webpage we find showing child sexual abuse imagery is added to the IWF URL List – an encrypted list of individual webpages that contain child sexual abuse material. This allows companies to filter and block these pages while we work internationally behind the scenes to get the imagery removed.
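The IWF URL List’s actual format and deployment are not public, but the filtering idea described above can be sketched as a simple membership check against a set of blocked page addresses. Everything here – the list contents, the function names, the normalisation rules – is hypothetical, for illustration only.

```python
# Minimal sketch of URL blocklist filtering, assuming a plain set of
# normalised page addresses. The real IWF URL List is encrypted and
# integrated differently; these entries are hypothetical.

from urllib.parse import urlsplit

# Hypothetical blocklist: host + path, lowercase host, no trailing slash.
BLOCKED_PAGES = {
    "example.com/abuse-page",
    "example.org/path/to/page",
}

def is_blocked(url: str) -> bool:
    """Return True if the exact page appears on the blocklist."""
    parts = urlsplit(url)
    normalised = parts.netloc.lower() + parts.path.rstrip("/")
    return normalised in BLOCKED_PAGES
```

A filter deployed at a network edge would consult a check like this before serving a request, blocking individual pages rather than whole sites.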

Each image we find to be criminal is added to the IWF Image Hash List. The Hash List is a list of ‘digital fingerprints’ – unique codes – that directly relate to individual child sexual abuse images. When deployed by a company, our ‘hashes’ stop anyone from uploading, downloading, viewing or hosting such images on our Members’ platforms. We currently have more than 247,000 hashed images. This means that our analysts have viewed, assessed and hashed almost a quarter of a million images showing children being sexually abused, tortured, and raped. Again, all done at source.
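The ‘digital fingerprint’ idea can be sketched in a few lines. Hash lists in this space typically combine cryptographic hashes with perceptual hashes (such as Microsoft’s PhotoDNA, which can match visually similar images); perceptual hashing is not publicly available, so this sketch uses SHA-256 as a stand-in, and the hash list contents are hypothetical.

```python
# Sketch of image hashing against a hash list, assuming SHA-256 as the
# fingerprint. A cryptographic hash only matches byte-identical files;
# real deployments also use perceptual hashes to catch altered copies.

import hashlib

def image_hash(data: bytes) -> str:
    """Return a hex 'digital fingerprint' of an image's raw bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical hash list of known criminal images (fingerprints only --
# the list never contains the images themselves).
HASH_LIST = {image_hash(b"known-image-bytes")}

def is_known_image(data: bytes) -> bool:
    """True if an upload exactly matches an image on the hash list."""
    return image_hash(data) in HASH_LIST
```

Because the list holds only fingerprints, a platform can check uploads against it without ever storing or seeing the imagery itself.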

We’re a not-for-profit body, set up by the online industry to help you keep your networks safe. We’re not only the source: we’re independent of police and government, and we’re the trusted body of the internet industry.
