Reports assessment

 

The IWF Hotline is split into two workstreams, as illustrated below: reports assessment and imagery assessment. On this page, we focus on the outcomes of the reports assessment workflow.

[Diagram: Hotline assessment overview, showing the reports assessment workstream]

Reports analysis

The IWF’s mission is to detect, disrupt, remove and prevent online child sexual abuse imagery. Our analysts assess each report against UK legal guidelines.  

424,047 reports were assessed by the IWF (an 8% increase from 2023):

  • 424,031 were reports of URLs or reports from our child reporting services.
  • 16 were reports of newsgroups.  
  • 267,788 (63%) came from the proactive work of our analysts.  

This equates to one report every 74 seconds.  
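
This rate follows directly from the total above. As a rough check, assuming a 365-day year:

  365 × 24 × 60 × 60 = 31,536,000 seconds in a year
  31,536,000 seconds ÷ 424,047 reports ≈ 74 seconds per report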

291,273 reports were confirmed to contain criminal imagery of child sexual abuse, link to such imagery, or advertise it (a 6% increase from 2023).

  • 290,637 were URLs.
  • 636 were reports received from our child reporting services.
  • 91% of the reports assessed as criminal were found to contain ‘self-generated’ imagery.  
    • 94% of these 'self-generated' reports showed only girls.  

Every 108 seconds, a report showed a child being sexually abused.
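
The same calculation, under the same 365-day assumption, applied to the confirmed criminal reports:

  31,536,000 seconds ÷ 291,273 criminal reports ≈ 108 seconds per report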

Every report to the IWF represents either a single URL or a direct report via our child reporting services (Report Remove and Meri Trustline).   

Each URL could contain one, tens, hundreds or even thousands of individual child sexual abuse images or videos.