IWF 'increasingly concerned about impact of pornography on the lives of young children' as age verification measures not included in new Bill

Published:  Wed 19 May 2021

There are fears sexual behaviour could be “normalised” for children, putting them at greater risk from internet grooming, unless greater checks are in place to prevent them accessing pornography online.

Last week, the UK Government published its draft Online Safety Bill. The measures it set out aim to “put an end to harmful practices, while ushering in a new era of accountability and protections for democratic debate”.

But importantly, the Bill also currently seeks to repeal Part 3 of the Digital Economy Act 2017, which required providers of adult websites to have effective age verification measures in place to stop children accessing their services.

Susie Hargreaves OBE, Chief Executive of the Internet Watch Foundation (IWF) and Director of the UK Safer Internet Centre, said: “While we welcome the renewed commitment to children’s safety online, we are disappointed to see a stronger line has not been taken on age verification for adult sites.

“In the last year, we have seen a 77% increase in self-generated indecent images of children. This is where children have been tricked, coerced, or bullied into performing sexually in footage which is then captured and distributed by internet sex predators.

“We are increasingly concerned about the impact of pornography on the lives of young children and their ability to form normal relationships.

“It is not healthy for children to learn about sex through pornography, and we fear this may lead to the societal normalisation of sexual violence or behaviours.”

A number of the IWF’s Parliamentary Champions have raised concerns about this during the Queen’s Speech debates in Parliament over the past week.

Miriam Cates MP (Con) for Penistone and Stocksbridge, who recently held a meeting with the IWF, said: “I welcome the laying out of the online harms Bill in the Queen’s Speech and its publication in draft, but even if it passes swiftly through Parliament, realistically we are perhaps two or more years away from the protections being enacted. Part 3 of the Digital Economy Act 2017, which would enforce age verification for access to pornography sites, is ready to go, so I urge the Government to implement that legislation now.”

Lord McNally, Lord Farmer and Lord Clement-Jones also mentioned this omission from the legislation in their speeches in Parliament yesterday, with the Liberal Democrat spokesperson for Digital stating: “In particular, there is the exclusion of commercial pornography where there is no user-generated content.”

The IWF was also one of 60 organisations, MPs and Peers who signed an open letter to the Prime Minister calling on the Government to implement Part 3 of the Digital Economy Act, led by IWF Champion Baroness Floella Benjamin ahead of the Queen’s Speech last week. The signatories also include a number of IWF Champions, among them Lord McNally, Lord Clement-Jones, Baroness Ludford and Baroness Walmsley, other members of the House of Lords Digital and Communications Committee, and the former Minister of State, Margot James.

At last week’s DCMS Select Committee the Secretary of State, Oliver Dowden, was pressed on the issue by IWF Champion Julie Elliott MP: “Why have you opted for the current online safety proposals to be limited to user-generated content only? The decision means you are excluding some commercial pornography services as things stand now. There is an obvious and predictable risk that even more companies will exploit this loophole to avoid regulation. Why have you opted for that model?”

In response, the Secretary of State, Oliver Dowden said: “We are taking this Bill forward for pre-legislative scrutiny. I do not have a closed mind on this. If we could find a commensurate way of providing wider protection for children within it—and I do not say this to belittle it—that is one bauble that I might be open to hanging on the Christmas tree. There is a strong case for doing that. That is essentially where we are.”

The Bill sets out a number of new measures, including provisions for social media sites, websites, apps and other services hosting user-generated content or allowing people to talk to others online to remove and limit the spread of illegal and harmful content such as child sexual abuse material.

While the renewed commitment to online safety and child protection has been broadly welcomed, concerns have been raised that the Bill does not include stricter age verification measures to ensure children cannot access pornographic material online.
