Encryption Vs. Privacy: In Conversation with Professor Hany Farid

Published: Sun 31 Jul 2022

The Government has been urged to take steps to encourage tech companies to step up in detecting child sexual abuse material online, as web safety experts say the “time is now” to take action.

Prof Hany Farid, an image analysis expert at the University of California, Berkeley, said the UK’s Online Safety Bill and legislation in Brussels set the scene for real change in the way companies prevent the spread of sexual images and videos of children.

Prof Farid made the remarks in response to a report by Dr Ian Levy and Crispin Robinson, respectively the technical heads of the UK’s National Cyber Security Centre and GCHQ.

The report, published last week, made headlines after suggesting tech companies should move ahead with client-side scanning: technology that scans for child abuse imagery directly on users’ own phones.

According to the report, there is no reason why client-side scanning cannot be implemented safely, in a way that retains users’ privacy while protecting children from grooming offences and the spread of child sexual abuse material.

Speaking exclusively to the IWF as part of its new podcast series, Prof Farid said children are facing dangers online, and that politicians need to step in to make sure tech companies are incentivised to bring in better child protection measures.

He said a “heavy hand” would be needed for companies which resisted.

Prof Farid said: “These are not just teenagers playing with their sexuality, these are incredibly young – zero to two months old to three months old being sexually assaulted, and I think we as a society have to say not on our watch. This is a technology that takes a step in the right direction.”

He added: “I think the time is now, particularly with the Online Safety Bill making its way through the UK Government, and with the DSA and the DMA making its way through Brussels, I believe this is now the time for the companies to say we are going to do this, we’re going to do it on our terms. And, if they don’t, then I think we have to step in with a very heavy hand and insist they do.”


Prof Farid said the report from Levy and Robinson made it clear that privacy does not have to come at the expense of child protection.

He said: “We have been made to believe there is a false choice here. Either you have privacy or you have security for kids, and I think that is a false choice.

“We routinely scan on our devices, on our email, on our cloud services for everything including spam and malware and viruses and ransomware, and we do that willingly because it protects us. It protects our devices and, without the ability, even within end-to-end encryption, to scan for harmful content on our devices, we would be dead in the water.

“I don’t think it is hyperbolic to say that, if we are willing to protect ourselves, then we should be willing to protect the most vulnerable among us.

“It is the same basic core technology, and I reject those that say this is somehow giving something up. I would argue this is, in fact, exactly the balance that we should have in order to protect children online and protect our privacy and our rights.”

Prof Farid said, in the light of the report’s findings, he would now urge Apple to press ahead with an initiative it floated in 2021, when the company said it would perform “on-device matching”, scanning images against a database of known child sexual abuse image hashes.

A hash is a unique digital fingerprint of a known child sexual abuse image. Under the proposal, images on devices in the US would have been flagged if their digital fingerprint strongly matched that of known abuse imagery in the database.
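
In broad terms, the matching works like the sketch below: compute a compact fingerprint for an image, then compare it against fingerprints of known material, flagging anything within a small distance threshold. This is a minimal illustration in Python using a simple 64-bit difference hash; the function names, threshold and hash choice are hypothetical, and production systems such as Microsoft’s PhotoDNA or Apple’s NeuralHash use far more robust perceptual hashing.

```python
from PIL import Image

HASH_SIZE = 8          # 8x8 comparisons -> 64-bit hash
MATCH_THRESHOLD = 10   # illustrative Hamming-distance cut-off for a "strong match"

def dhash(image_path: str) -> int:
    """Difference hash: shrink to greyscale, then record whether each
    pixel is brighter than its right-hand neighbour."""
    img = Image.open(image_path).convert("L").resize((HASH_SIZE + 1, HASH_SIZE))
    px = list(img.getdata())
    bits = 0
    for row in range(HASH_SIZE):
        for col in range(HASH_SIZE):
            left = px[row * (HASH_SIZE + 1) + col]
            right = px[row * (HASH_SIZE + 1) + col + 1]
            bits = (bits << 1) | int(left > right)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def is_known_match(candidate: int, known_hashes: set[int]) -> bool:
    """Flag the image if its hash sits within the threshold of any hash
    in the known database (in practice, hash lists maintained by bodies
    such as the IWF or NCMEC)."""
    return any(hamming(candidate, h) <= MATCH_THRESHOLD for h in known_hashes)
```

The key property is that near-duplicate images (resized, recompressed, lightly edited) produce hashes differing in only a few bits, so a distance threshold still catches them, while the hash itself reveals nothing about images that are not already in the database.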

The feature, which used a hashing algorithm called NeuralHash, was shelved after a backlash from campaigners concerned the move would be a breach of users’ privacy.

Prof Farid said: “The push back was from a relatively small number of privacy groups. I contend that the vast majority of people would have said sure, this seems perfectly reasonable, but yet a relatively small but vocal group put a huge amount of pressure on Apple and I think Apple, somewhat cowardly, succumbed to that pressure.

“I think they should have stuck their ground and said this is the right thing to do and we are going to do it. And I am a strong advocate of not just Apple doing this, but Snap doing this, and Google doing this – all the online services doing this.”

Prof Farid said the main reason steps like this have not yet been introduced is that there is no financial incentive for companies to take action.

“We can’t turn a blind eye to the real harms we are seeing on these services which are affecting tens of thousands, hundreds of thousands of kids around the world. I think we have to make a decision as to who do we want to be as a society.”

Mike Tunks, Head of Policy and Public Affairs at the IWF, said: “For the last few years, the Government has been saying we want tech companies to do more about tackling child sexual abuse in end-to-end encrypted environments.

“As we know, at the minute, there is no technology that can do that, but this paper sets out some ways in which that can be achieved.”

 
