We have a powerful sense of mission, with clarity, focus and purpose in our work. Our single task, beyond all else, is the elimination of child sexual abuse material online.
Sadly, every four minutes we find a webpage showing a child being sexually abused. We’re here to stop that.
Every year, IWF assesses millions of individual child sexual abuse images and videos and works to have them removed from the internet.
The fact that we found 118 instances of child sexual abuse imagery on Pornhub between 1 January 2017 and 29 October 2019, a period of almost three years, has been used both by those who wish to defend Pornhub and by those who campaign against it.
Our view is clear: one instance is one too many.
Understanding this data requires knowing several things, some of which set us apart from our partners who do similar work:
- That IWF performs a human-eyes assessment of every report that we receive;
- That we, just like other hotlines, are challenged by the need to accurately ‘age’ a child we see in the imagery; if there is a chance that an individual could be an adult, we are not able to take action unless we have proof, such as police evidence, that they were a victim of sexual abuse. And finally,
- That whilst child sexual abuse imagery is largely illegal the world over, different countries’ laws define what counts as ‘child sexual abuse’ differently. Put simply, there is no single international standard.
The National Center for Missing &amp; Exploited Children (NCMEC) in the USA receives many millions of reports of suspected child sexual abuse and exploitation from industry each year. This is due to a mandatory reporting law in the US, which requires US-based companies to report to NCMEC. NCMEC’s remit is also wider than IWF’s, so the reports it receives from industry do not necessarily tally with IWF’s deliberately focused remit.
Whilst IWF also receives reports from the public, police and industry, it is to a much lesser extent, and the UK does not have mandatory reporting laws. Comparing our data to theirs gives a skewed picture in the arguments presented in the media recently about levels of child sexual abuse material on social networks and adult sites alike. It is, as we say in the UK, like comparing apples with pears.
Whilst we receive many thousands of reports each year from members of the public concerned about the possibility of children being sexually exploited on adult sites, the vast majority of those reports, when assessed, are not found to be illegal under UK law.
The reality for us is that the majority of child sexual abuse content we assess is traced back to image hosts and cyberlockers in the Netherlands.
The world is lucky to have many hotlines like IWF, each with a mission to remove child sexual abuse imagery. But it works against our mission and efforts when data and working methods are conflated in reports.
There should be no excuses for any instance of child sexual abuse material. There should only be a determination to do what’s right for children. The truth is, it is individuals who make up companies, and individuals who make up governments and countries. What is clear to me is that many individuals need to step up and do more, so that children don’t pay the price for our online world.