CEO's foreword

Susie Hargreaves OBE

2023 was a remarkable year. We saw our highest number of companies opting to work with the IWF, taking our datasets and services to keep their online businesses and platforms safer. More than 200 companies are now members, keeping their five billion or more customers and user accounts better protected from online child sexual abuse.

Whilst that is a positive step, and we expect this number to increase going into 2024 now that the UK has an Online Safety Act, I can’t help but wonder about all the webpages of child sexual abuse that we didn’t get around to identifying and removing last year. We work closely with our industry partners, our sister hotlines, law enforcement and policy makers, but this problem is bigger than all of our efforts put together.

It’s also important to note that the internet is a vast space. Having a near-unique ability to proactively search for child sexual abuse material has made our efforts far more effective than working from public reports alone.

Therefore, it’s important to remember this caveat with any data presented in this space: the data we report, year on year, can only ever reflect what we have discovered. We cannot possibly know the true number of webpages showing children aged 7-10, or the true number of unique domains being abused to show child sexual abuse material as, sadly, there is too much out there for any of us – yet – to know the full extent of this content.

What we can do, however, is provide a detailed analysis of what we see, to help others working across the sector. And this year we’re able to provide analysis of two distinct datasets. One is based on URLs (or webpages), each of which may contain one, but often many, individual images and videos; the other is based on individual images and videos.

In our URL/webpage-based analysis, or what we’ve always referred to as ‘reports’, our analysts log the age of the youngest child they see in a sexual abuse image, and the most severe category they identify within an image. Both may appear within the same image, or across two different images or videos on the same URL.

An image- or video-level analysis is much more granular, and the data it yields arguably provides greater depth and understanding of what is happening across a large number of individual child sexual abuse images and videos.

An example of image-level analysis can be found in our case study providing an insight into “self-generated” child sexual abuse featuring 3–6-year-olds.

In this annual report, you can read an overview of the number of webpages we assessed and confirmed as child sexual abuse; about the vast and growing number of unique image hashes (digital fingerprints) we now hold and share with industry and law enforcement; about where we traced child sexual abuse imagery, with a broader analysis of the issue across domains and not just URLs; and several case studies.

Whilst the problem is vast and needs all of us to stand together to tackle it, we hope the analysis we provide year on year adds value to this sector, and specifically your work.