Our response to the European Commission’s consultation on tackling illegal content online

Published: Thu 5 Apr 2018

On 1 March this year, the European Commission published recommendations on measures to effectively tackle illegal content online. The Commission’s recommendations come at a time when a number of EU member states are considering their own proposals for dealing with illegal content online.

President Macron has talked about how ‘fake news’ is a very real threat to liberal democracies. In March, it was reported that the text of proposed French legislation on fake news was ready and would be read in the National Assembly in May of this year.

In Denmark, a law has recently been introduced that provides for the blocking of sites hosting illegal content and stipulates that any blocking must be subject to a court order.

In Germany, meanwhile, perhaps the most draconian legislation to tackle illegal content online came into effect in January of this year. The “NetzDG” (Network Enforcement Act) law obliges social network companies to remove content that is “manifestly unlawful” within 24 hours, or face fines of up to €50m. The law also requires social networks to produce half-yearly reports on the handling of complaints about unlawful content on their platforms.

The introduction of these laws, however, has not been without its challenges. In Germany, several politicians have called for the legislation to be reformed. They want to ensure that tackling illegal content online is not solely the responsibility of tech companies, but a matter for wider society.

In the UK, we’ve seen numerous calls from politicians, including the Prime Minister, Theresa May, for tech companies “to do more”. The Department for Digital, Culture, Media and Sport (DCMS) is proposing a new code of practice and a social media levy in its recent Internet Safety Strategy green paper.

The EU Commission’s proposals to tackle illegal content online include:

  • Hosting providers and Member States should be prepared to submit all monitoring information to the Commission, upon request, within six months (three months for terrorist content), so that the Commission can assess whether further legislation is required.
  • Definitions of “illegal content” and “trusted flagger” should be introduced.
  • Fast-track procedures should be introduced for material referred by trusted flaggers.
  • Hosting providers should publish a list of who they consider to be “trusted flaggers”.
  • Automated takedown of content is encouraged, but should be subject to safeguards such as human oversight.
  • Terrorist content should be removed within one hour.

The IWF has a long-established history of working with policymakers in Europe to tackle illegal child sexual abuse imagery online. As recently as December 2017, our work was referenced in the European Parliament’s Committee on Civil Liberties, Justice and Home Affairs report on the implementation of Directive 2011/93/EU on combating the sexual abuse and sexual exploitation of children and child pornography.

The report recommends a number of measures used by the IWF, such as blocking, proactive searching and image hashing, to tackle the spread of this material online. Our work has also been mentioned in numerous plenary debates by MEPs and in a hearing of the European Parliament’s Committee on Culture and Education.

We believe that our approach has proved effective because of our model of self-regulation and the partnership approach we take with the internet industry, governments (UK and EU) and law enforcement. This is evidenced by the impact we have had over the past 21 years in driving down the proportion of known child sexual abuse imagery hosted in the UK from 18% in 1996 to less than 1% this year. Put simply, our approach is working.

We do, however, recognise that more needs to be done, if we are to achieve our mission of eradicating child abuse imagery and videos online.

The European Commission’s recent consultation is a positive step. The attempts to define illegal content and trusted flaggers are a good starting point on which to build. However, learning the lessons from the implementation of the “NetzDG” law in Germany, we believe that tech companies should not be responsible for assessing content and defining what is and isn’t illegal. We believe it’s important that any assessment process is independent of industry and free from government interference.

For these proposals to be successful, all parties need to trust the assessments made by trusted flaggers of illegal content. The system also requires the right checks and balances: it needs to be open to challenge, and its decisions must be open to review. We believe the best way of achieving this is to ensure that the decisions of trusted flaggers can be subject to judicial review.

In 2014, the IWF published a human rights audit carried out by the former Director of Public Prosecutions, Lord Ken Macdonald. It was designed to ensure that our work was entirely consistent and compliant with human rights and EU law.
