We don’t buy in data. We’re the source.

Published: Fri 23 Jun 2017

As IWF’s Commercial Relationship Manager, I speak to internet service providers, domain registries, hosting companies, and, of course, our Members every day. I recently spoke to a filtering company that asked me where we (the IWF) buy our data from to compile our services. When I explained that we don’t buy in any data, I was left with the distinct impression that they didn’t believe me.

I understand that the nature of our work is unique, and it may be hard to believe that our analysts manually assess and remove thousands of images of child sexual abuse each week. But to the analysts who do this every day, it is about creating a solution.

As one of our analysts, Isobel, says in our 2016 annual report: “I know that going to work means that I'm going to remove images of child sexual abuse, I'm going to stop people stumbling across these images, I'm going to disrupt paedophiles from posting or sharing these images, and I'm going to stop these young victims being re-victimised over and over again.”

53% of the images removed last year were of children aged 10 or younger. 2% even showed children as young as 2 being sexually abused.

Our in-house expert analysts assess every single report that comes into our Hotline. If an image is found to be criminal, they work to get it removed from the internet as quickly as possible.

Each webpage we find showing child sexual abuse imagery is added to the IWF URL List - an encrypted list of individual webpages that contain child sexual abuse material. This allows companies to filter and block these pages while we work internationally behind the scenes to get the imagery removed.
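In practice, companies integrate the URL List into their own filtering systems. As a rough sketch only (not the IWF’s actual distribution format or integration method, which is supplied under licence), a blocklist check can be as simple as normalising a requested URL and testing it for membership in a set; the file name and functions below are hypothetical:

```python
# Illustrative sketch of URL-list filtering. The real IWF URL List is
# delivered encrypted under licence; the plain-text file and helper
# functions here are invented for demonstration.
from urllib.parse import urlsplit


def normalise(url: str) -> str:
    """Reduce a URL to a comparable form: lower-cased host plus path."""
    parts = urlsplit(url if "://" in url else "http://" + url)
    return parts.hostname.lower() + parts.path


def load_blocklist(path: str) -> set[str]:
    """Load one URL per line into a set for fast membership tests."""
    with open(path, encoding="utf-8") as f:
        return {normalise(line.strip()) for line in f if line.strip()}


def is_blocked(url: str, blocklist: set[str]) -> bool:
    """Return True if the requested URL appears on the blocklist."""
    return normalise(url) in blocklist


# Hypothetical usage inside a filtering proxy:
# blocklist = load_blocklist("url_list.txt")
# if is_blocked(requested_url, blocklist):
#     serve_block_page()
```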

Each image we find to be criminal is added to the IWF Image Hash List. The Hash List is a list of ‘digital fingerprints’ – unique codes – that each relate directly to an individual child sexual abuse image. When deployed by a company, our ‘hashes’ stop anyone from uploading, downloading, viewing or hosting these images on our Members’ platforms. We currently have more than 247,000 hashed images. This means that our analysts have viewed, assessed and hashed almost a quarter of a million images showing children being sexually abused, tortured, and raped. Again, all done at source.
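At its simplest, checking an upload against a hash list is a set-membership test on the file’s fingerprint. The sketch below is illustrative only: it uses SHA-256 as a stand-in hash, whereas deployed hash lists also rely on perceptual hashes (such as PhotoDNA) so that resized or re-encoded copies of an image can still be matched; the function names and hash set are hypothetical.

```python
# Illustrative sketch of a hash-list check at upload time.
# SHA-256 is used here purely as an example of a fingerprint;
# real deployments also use perceptual hashes (e.g. PhotoDNA).
import hashlib


def sha256_of_file(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in 1 MiB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_known_image(path: str, known_hashes: set[str]) -> bool:
    """Return True if the file's fingerprint is on the hash list."""
    return sha256_of_file(path) in known_hashes


# Hypothetical usage in an upload handler:
# known_hashes = load_hash_list()          # set of hex digests
# if is_known_image(upload_path, known_hashes):
#     reject_upload()
```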

We’re a not-for-profit body, set up by the online industry to help you keep your networks safe. We’re not only the source: we’re independent of police and government, and we’re the trusted body of the internet industry.
