The IWF’s self-regulatory model is working

Published: Wed 31 Jan 2018

Last week the Prime Minister used her speech to the World Economic Forum in Davos, Switzerland, to once again berate technology companies, telling them that they “still need to do more in stepping up to their responsibilities for dealing with harmful and illegal content online.”

She particularly highlighted child sexual abuse, modern slavery, terrorist and extremist content as areas in which companies could not “simply stand by while their platforms are used to facilitate this type of activity.”

Last week was also an important week for the Internet Watch Foundation (IWF). On Monday evening a panel convened in Westminster with the aim of improving the dialogue between politicians, the technology industry and the public about the challenges of curing some of the internet’s ills mentioned by the Prime Minister.

The panel, made up of Home Affairs Select Committee Chair Yvette Cooper MP, Science and Technology Committee member Vicky Ford MP, Karim Palant from Facebook, Jamie Bartlett from the thinktank Demos and our Chair Andrew Puddephatt, launched the first paper in Demos’ technology briefing series, which focussed purely on the challenge of dealing with the spread of child sexual abuse imagery online.

The paper sought to highlight the scale of the challenge faced by the technology companies, outlining the emerging trends and analysing the industry’s response. It also aimed to debunk some of the common myths circulating in the public domain and in the media: that investment in technology is a silver bullet solution to the issue, that policing could arrest its way out of the problem, and that social media is primarily to blame for the spread of child sexual abuse material online. Finally, it made seven recommendations for how future government policy could help to address the problem.

Perhaps the most important myths to debunk, from the perspective of the IWF, are that “the industry does not self-regulate”, that “there are simply no advantages to self-regulation”, and the notion that the problem would simply disappear if the Government introduced greater regulation.

Demos report launch

The UK has been leading the way internationally in addressing the spread of child sexual abuse material online, based on a model of self-regulation. When we were founded in 1996, 18 percent of the world’s child sexual abuse material was hosted in the UK. Thanks to our partnership approach with the internet industry, law enforcement and the UK Government, that figure has now dropped to less than one percent.

This is clear evidence that our approach of self-regulation is working and we believe that if other countries were to follow the model of the IWF, there would be significantly fewer images in distribution than there are today.

We believe our success in removing this content comes from the clearly defined legal framework our analysts work within, based on the offences set out in the Protection of Children Act (1978) and the Criminal Justice Act (1988). This is supported by the Sentencing Council’s 2014 guidelines, which categorise images into three levels and which our analysts use to determine the severity of the abuse they see. No equivalent framework exists for the other forms of content, such as modern slavery, terrorism and extremism, mentioned by the Prime Minister in her Davos speech.

As an organisation, we are independent of both the Government and the industry. This means our analysts are free from interference and can objectively review content in line with UK law. Each image we remove is viewed by a highly trained analyst, and in borderline cases an image is reviewed by a second analyst; where they do not agree, the hotline manager has the final say. This ensures that our work meets the highest possible standards and has led us to develop a quality standard that is respected by Government, law enforcement and industry and is subject to annual review by a judge.

Our takedown times are the fastest in the world: in one case in the past year the offending content was removed within six minutes. 52 percent of illegal child sexual abuse material is removed in less than one hour, and every takedown notice issued in the UK in 2016 was actioned within two days.

We are investing in the latest technology to stop the spread of this material online, with the backing and technical expertise of the internet industry.

Both Google and Microsoft have also provided us with in-house technical engineering expertise to help us fight the spread of this material. The fact that our 130+ Members include some of the most influential technology companies in the world, as well as smaller start-ups, further demonstrates just how seriously the industry is taking this issue.

The Prime Minister was right to raise the issue of online responsibility in her speech at Davos and we share her vision to make the “UK the safest place to go online”. We believe that our self-regulatory model is one of global best practice which epitomises all that the Prime Minister is seeking to achieve.
