A “low-level” image is still a child whose abuse and suffering go on.

Published:  Wed 1 Mar 2017

Blog post by IWF CEO Susie Hargreaves OBE

The recent comments by Chief Constable Simon Bailey appear to have divided people who work in the area of child sexual exploitation into two camps – supporters, or objectors.

We work closely with the police. We're aware of the tremendous pressure on their workloads and the limited resources they have to tackle the problem.

But there's no excuse to let people get away with viewing child sexual abuse images - even a low-level image is a real child being sexually abused.

As one colleague from another charity commented to me yesterday, “what a day to be a paedophile”.

Regardless of resources, you don’t send the message that viewing even a low-level image will bring few consequences.

Now for myth-busting.

Myth 1: “It’s only an image”. Or, “it’s just a low-level image”.

We have 20 years’ experience viewing and assessing child sexual abuse images and videos online. Images of children are rarely a one-off. They’re often part of a series. One so-called low-level image captures the early moments of a child being forced into more severe sexual acts. David Hill, one of 4,000 child migrants sent to Australia, emotionally told the Independent Inquiry into Child Sexual Abuse this week that some children never move on from their abuse. Imagine knowing your images are out there, being shared for someone’s sexual gratification. Images are real children, who suffered real abuse.

Myles Bradbury was an Addenbrooke’s Hospital paediatrician who abused children in his care. In 2012, Toronto Police alerted UK police at the Child Exploitation and Online Protection (CEOP) Centre, but the intelligence was not acted upon quickly enough because the offending was considered low-level. He was later found to be a serious contact abuser.

Coral Jones, mother of April Jones who was murdered by paedophile Mark Bridger in 2012, said to us today: “You don’t know what pictures Bridger was looking at before attacking my daughter. He was looking at images hours before he went after April. It doesn’t matter what images paedophiles look at. Anything can trigger them off. That child could be lucky, or might not be lucky. That family could go through hell. There are men out there looking at photos of children. A child could have a photo taken of their abuse and they will have to live with that for the rest of their life. People can be cruel.”

Myth 2: The internet industry does nothing.

The internet industry in the UK has virtually eradicated the hosting of child sexual abuse content. Our latest global figures show that just 0.1% of this content worldwide is hosted in the UK. If images are viewed and shared in the UK, it’s because people access them, from their homes here, on services hosted in other countries. We work globally to have these images removed, but there’s only so much we can do. If other countries’ internet industries took the same stance as the UK’s, there would be fewer hiding places for these criminal images.

Myth 3: Social media is “awash” with child sexual abuse imagery.

In fact, social networks are one of the least likely places we find child sexual abuse content. High-traffic sites make hosting and sharing this content difficult, because companies like Facebook and Twitter won’t stand for it.  

The former detective Mark Williams-Thomas, who exposed the Jimmy Savile offences in 2012, said on This Morning last week that social networks are “awash” with child sexual abuse images. Last year, just 1% of the child sexual abuse images we found globally were on social networking sites. Hardly awash.


Services we identified as hosting child sexual abuse content in 2016:

- Image host
- Banner site
- Image board
- Web archive
- Social networking site
- Image store
- Search provider
- Video channel
- Chat site

Myth 4: No organisation proactively identifies child sexual abuse content.

IWF has been actively searching for this content since 2014. We did this following the horrific murders of Tia Sharp and April Jones in order to make us, and the internet industry, more effective. Here’s the proof: In 2013, the last complete year of figures before IWF active searches were introduced, 13,182 reports were found to contain child sexual abuse imagery. In 2015, the first full year that we actively searched, 68,092 reports were confirmed as illegal images or videos. That’s an increase of 417%.

Myth 5: It's all on the dark web.

The “dark web”, or hidden services, is challenging for everyone: the location of the hosting server can’t be traced, which makes removing child sexual abuse images almost impossible.

We search the dark web for child sexual abuse content, but what we commonly see are hundreds or even thousands of links to child sexual abuse imagery hosted on image hosts and cyberlockers on the open web, not the dark web. That means we can get this content removed. The number of newly identified hidden services we saw declined from 79 in 2015 to 41 in 2016.

Ultimately, the job is far from done. We take a zero-tolerance approach and we're world leaders. Being successful in the UK means we’ve just got more to do internationally. There are good guys in the internet industry who work with us eagerly and responsibly.

There is more to be done, but let’s not give the impression that it’s okay to view images – even low-level images. Our ask to the public is this: keep reporting low-level images. Don’t think we won’t take action. We will.

Susie's blog piece can also be found at the Huffington Post online.


Notes to editors:

Contact: Emma Hardy, Director of External Relations [email protected] +44 (0) 1223 203030 or +44 (0) 7929 553679.

Susie Hargreaves is available for media interviews.

Contact us for a print-quality photograph of IWF CEO Susie Hargreaves, OBE.

The 2016 statistics quoted in this press release will be published in the latest IWF Annual Report 2016 on 3 April 2017.

What we do:

We make the internet a safer place. We help victims of child sexual abuse worldwide by identifying and removing online images and videos of their abuse. We search for child sexual abuse images and videos and offer a place for the public to report them anonymously. We then have them removed. We’re a not for profit organisation and are supported by the global internet industry and the European Commission.

For more information please visit www.iwf.org.uk.

The IWF is part of the UK Safer Internet Centre, working with Childnet International and the South West Grid for Learning to promote the safe and responsible use of technology.
