Andrew Puddephatt was selected for the role of Chair because of his vast experience in human rights, specialising in freedom of expression, and technology policy – a unique combination which directly suits the IWF and its remit. The announcement comes as the tenure of its current chair, Sir Richard Tilt OBE, comes to an end after six years.
My professional and personal background is in human rights where I’ve campaigned to increase human rights protection for 25 years. Joining IWF – for me – is a continuation of that commitment as I believe the abuse of children is one of the most egregious human rights abuses. For the last twelve years I’ve been active in the field of internet policy, running my own successful company working with civil society groups, governments and companies to promote a human rights approach to internet policy so I have a good understanding of the internet environment, of the challenges in dealing with illegal content and the importance of bringing a rights-based framework to the issue.
I’ve had a varied career since I left Cambridge University – working as an electrician, a computer programmer, a community worker in the East End, and an NGO leader as Director of Liberty, Charter 88 and the international free expression organisation Article 19. I was also a councillor in Hackney, and its leader for a while, as well as taking a year out to look after my daughter when she was four months old.
As Director of Liberty I worked with my colleague Francesca Klug, then Director of the Civil Liberties Trust, to launch a campaign for a Bill of Rights for Britain – a campaign that led seven years later to the passing of the Human Rights Act. Working with a staff team we ran a two-year campaign highlighting how human rights were relevant to a range of social issues, and devised the parliamentary mechanism for incorporating the Bill of Rights, which was adopted in legislation. Moving to Charter 88, I was part of the coalition that worked with the incoming government in 1997 to prioritise the Human Rights Act as one of the first pieces of legislation of the new government.
I had two main reasons. Firstly, the sexual abuse of children is one of the worst human rights abuses that I can imagine and I want to be part of an organisation that tries to stop the abuse online. Secondly I’m very struck by the way the IWF works. The mood around internet-based services has become increasingly dystopian with concerns about illegality, hate content, misogyny, trolling etc. Many governments and companies are wrestling with how to deal with bad content. The Internet Watch Foundation has a unique model, providing a direct service to industry but one which is independent and accountable, and which works efficiently and effectively to remove images of child sex abuse online. Not only did I think the IWF was valuable in itself but I also believe the lessons of the Internet Watch Foundation are applicable to other issues that preoccupy policymakers.
I think there are several challenges in the future for the Internet Watch Foundation. Firstly, the technology itself is changing, and as encryption becomes more powerful and effective there is a risk that child sexual abuse images online will be harder to find and remove. On the other hand, the fact that people wish to make money from child sexual abuse images means there will always need to be a degree of publicly accessible material, so we will have to monitor this carefully. Another challenge is the political mood (incorrect in my view) that child sexual abuse online is a problem that has been fixed and that other issues, such as violent extremism or misogyny, are more important. I believe there is no permanent fix to child sexual abuse images online as long as people are willing to abuse and exploit children for personal gain, and therefore it is a constant battle to keep on top of the problem. We need to maintain political focus on the issue and explain the uniqueness of the Internet Watch Foundation model. That uniqueness lies in the fact that we’re not a government service, we are not part of the police; we are an independent self-regulatory body that seeks to operate in accordance with the law and human rights principles. I think it will be a challenge to explain to the public in the future why this is the best way of dealing with the problem.
The most important thing to achieve is to identify the maximum number of child sexual abuse images online and ensure they are removed promptly. Every hour an image stays online a child continues to be exploited, often brutally, so it is vital that we’re able to ensure a very rapid removal of images. This means reaching a wider range of internet service providers so that we can help them respond to the information we have. Secondly, we need to understand how best to work with similar projects overseas, including those which do not have our quality-assured self-regulatory model. We have to decide what kind of relationship we should have with organisations that are effectively part of foreign police forces or state services of some kind, and how we interact with hotlines or initiatives elsewhere which are much less effective. These are the challenges of the next few years.
The IWF has, in a sense, a twin aspect. It is both a membership organisation for the industry itself, but because it’s also constituted as an independent charity with independent trustees such as myself, it is not part of industry and is able to challenge industry bodies if they fail to remove images efficiently. In that sense we face both ways - outwards to the public as a service to them, removing illegal material but also able to challenge companies who fail to perform. It’s the independence of IWF as a charity, linked to a range of organisations, that gives us the ability to act independently of government and independently of the companies themselves.
We don’t need more teeth – we need to ensure that we reach an even wider range of companies, particularly in sectors such as telecommunications, while making sure that our membership responds to our recommendations rapidly.
I think there are several threats. An obvious one is technological change – if we’re not able to keep up to date with the way the technical landscape is shifting, we could fail. Secondly, there are other organisations which get information directly from the police but which don’t have the same level of quality assurance as IWF, or the same degree of independence. As a taxpayer-funded service, such an alternative could tempt companies who would otherwise have to pay for the service themselves. In the IWF model it is the companies that pay for the service of removing child sexual abuse images, rather than the taxpayer, which in my opinion is the proper approach – internet service companies make money from those services and should pay for dealing with the problems that ensue. But we will need to continue to persuade government and the companies themselves that this is the best approach.
If internet regulation does not comply with human rights norms on restrictions on free speech and information, then of course there is a clash. But in the case of IWF I’m confident that it operates within the parameters of human rights norms and values. IWF does not seek to restrict any images other than those which are illegal – we do not deal with obscene or pornographic material (even though many people dislike that it is available online). We only recommend removing images that should be removed to protect the rights of children: images which show children being abused, which are clearly illegal and defined in law as such. One criticism that could be made by human rights activists is that there is no judicial oversight of individual recommendations to remove content. The simple reason for this is that the sheer volume of images would make it impossible to establish an effective judicial process to review each one individually. Instead, experienced IWF analysts work to an exact legal definition of what constitutes child abuse, and their work is subject to a rigorous process of quality assurance. The organisation as a whole is subject to judicial review, so if someone feels we got a decision wrong they are entitled to take us to court and challenge the decisions we made. Given the global nature of the internet and the sheer volume of material that is available, there is no other practical means of dealing with the volume of child abuse online. To fail to act on child abuse would be to tolerate a wider and more serious human rights violation.
I think most industry bodies are doing the best they can to deal with child sexual abuse material online, given the volume and complex nature of the problem, and given the pressure they are under from regulators and other bodies to deal with a whole raft of other problems. Of course not every company is acting responsibly – there are some very prominent providers of services that we feel should do a lot more. And there are a very small number of providers – not our members, of course – who are major sources of child sexual abuse material and persist because they knowingly make money from the abuse of children. The only possible recourse for these is law enforcement.
I hope to see the IWF widely regarded as the exemplary model for dealing with child sexual abuse online, with a strong and engaged membership and with a good public understanding of what we do.
I think the government has taken a strong view on child sexual abuse material online and has always been supportive of the work of IWF while accepting its necessary independence. In fact there is support across the political spectrum for the IWF, I’m pleased to say. My own view is the internet can only flourish if there is an effective system of self-regulation for dealing with bad content. The complexity of operating across many different jurisdictions, with many different definitions of bad content, and with the ability of repressive governments in different parts of the world to use the concept of child sex abuse to remove perfectly legal images and content (for example of issues such as sexuality) means that self-regulation is the only reliable way of balancing free speech and the removal of abusive content. We must continue to defend the independence of the IWF. Governments are not equipped or capable of operating with the speed and efficiency necessary to remove child sex abuse material. Our independent self-regulatory model, drawing upon the relevant experience of a wide range of stakeholders, working in close partnership with industry is one that can give the public confidence.
Of course some companies could deal with criminal content themselves, but in practice it is simpler, more legitimate and far more cost-effective for there to be an independent team of expert analysts, with deep experience of identifying and dealing with this material, working for industry as a whole. In addition, it would be hard for small internet service providers to develop their own internal expertise – in practice small organisations would not have the resources to do this. With the IWF model, the larger companies effectively support the smaller, providing a resource available to all.
Not at all. I think IWF fulfils its human rights responsibilities, although it has perhaps not always been clear in the past that it thinks of itself as a human rights organisation – human rights is not language the organisation has chosen to use. Its name is also rather unfortunate, because IWF does not watch the whole internet (and has no interest in doing so) or police content in general. I think we should be clear that we are actually fulfilling a human rights responsibility in carrying out this important function. We have subjected ourselves to an independent human rights review carried out by the respected Ken MacDonald, who was Director of Public Prosecutions of England and Wales between 2003 and 2008. The review made a number of recommendations that the organisation has adopted. During my chairmanship I will continue to keep our human rights responsibilities under review and clearly in focus.