No more excuses.

Published: Tue 13 Jun 2017

I remember the days before I began my career at the IWF. I loved the internet - for so many reasons. I loved sitting in my living room and talking to my friends on social media, I loved laughing at funny cat videos, I loved doing my shopping online. But I had no idea that, behind the scenes, the large companies who kept me entertained were busy implementing new technologies to fight the distribution of online child sexual abuse. Why did they do this? Because they cared about their platforms. And because they wanted to protect innocent users from having to witness distressing images of children.

Fast forward to my three years at the Internet Watch Foundation. I’ve witnessed thousands upon thousands of platforms still offering a safe place for offenders to share scenes of children being raped. I wonder now: Is this because they don’t care? Or are they doing the best they can? I don’t know the answers to these questions, but what I do know is that there is always more that can be done. There’s always more that can be achieved by working together and ensuring we make it as hard as possible for offenders to host and share this material on the internet.

I’ve sat in the seat of an Internet Content Analyst for half a year. My job was to assess and remove thousands of images of children being sexually abused. To clarify, these aren’t pictures of children enjoying family bath time, and these are certainly not pictures of children building sand castles on the beach. All of those children - some as young as newborns - were abused in the most horrific ways. As if it isn’t bad enough to have been through these horrific acts, those children have to live knowing that these images are being shared and traded on the internet – the internet that you and I use every day. These images aren’t hidden in the so-called ‘dark web’, they aren’t exclusive to invite-only forums. They are everywhere, on any platform of the internet that makes it possible to share them.

But what if a tool existed which companies could use to stop the hosting, uploading and sharing of those images once and for all? What if a list existed that contained digital fingerprints of child sexual abuse images? Well, it does. One of the most important projects I’ve been a part of at the IWF is the IWF Image Hash List, a list of thousands of digital fingerprints relating directly to images of child sexual abuse. Companies implementing this list into their systems can block the images from appearing on their services. Once and for all.
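For readers curious about the mechanics, the idea can be sketched in a few lines: every known image has a digest (its "fingerprint"), and an upload whose digest matches an entry on the list is rejected before it is ever stored. This is a minimal illustration only - the function names and blocklist here are hypothetical, and the real IWF Image Hash List uses perceptual hashes (such as Microsoft's PhotoDNA) that survive resizing and re-encoding, not the simple cryptographic hash shown below.

```python
import hashlib

# Hypothetical blocklist of known digests. Real deployments use
# perceptual hashes resilient to cropping and re-encoding; SHA-256
# is used here purely for illustration. (This entry is the SHA-256
# digest of an empty byte string, standing in for a listed image.)
BLOCKED_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_blocked(image_bytes: bytes) -> bool:
    """Return True if the upload's digest matches a listed fingerprint."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in BLOCKED_HASHES

# An upload pipeline would check each file before storing or serving it:
if is_blocked(b""):
    print("upload rejected")
```

The design point is that a platform never needs to hold or view the criminal imagery itself: matching happens entirely against fingerprints, which is what makes a shared list workable across many companies.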

Don’t get me wrong – I’m not saying you can snap your fingers and rid your services of all these criminal images. But I am saying that we’re here to help you start that process now, to make the internet a safe place to use and an impossible place to share these crimes.

Not only do we help you step by step and consult with you on building the ‘hashes’ into your services, but we also offer a cloud-based solution that stops these images from appearing on your systems from the day you sign up.

As someone who sees these images day by day, watching victims grow up, knowing their images are seen by millions a day, I ask you one thing: What's stopping you?

There can be no excuse when it comes to stopping the revictimisation of children on the internet. There should be no place on the internet that allows those images to be uploaded and shared.

This is a call to action. There are solutions out there. Now is the time to work together and make the Internet a hostile place for this horrible crime that affects too many children in the world. 

Wouldn’t it be amazing to tell the victims of these crimes that the chances of someone stumbling across their images are zero? This might look like an impossible win, but I truly believe that if we all come together and implement new technology, it is achievable. This will be the day I can finally pick up my bag and leave work. This will be the best day of my life.
