Published: Mon 13 Sep 2021
Tech companies have been warned not to introduce encryption unless they can guarantee they can keep platforms free of illegal content, as MPs call for more to be done to protect children from online grooming and sexual abuse.
A new report published today (September 13) by the All-Party Parliamentary Group (APPG) on Social Media highlights the increasing dangers of children being bullied or coerced into producing images or videos of their own sexual abuse by adult predators on the internet.
The APPG calls on the Home Office to review all relevant legislation to ensure it is as easy as possible for children to have their images removed from the internet.
The report, “Selfie Generation: What’s behind the rise of self-generated indecent images of children online?”, sets out 10 recommendations the UK Government and the tech industry must adopt to safeguard children online.
The report is the result of an inquiry which drew on oral evidence from academics, children’s charities, law enforcement and industry.
The APPG for Social Media was established in 2018 to mitigate the negative aspects and promote the benefits of social media. The UK Safer Internet Centre (of which the IWF is part) provides the Secretariat for the APPG and promotes the safe, responsible use of technology.
The APPG’s chairman, Labour MP Chris Elmore (Ogmore), said social media companies are “fundamentally failing” to keep children safe, and that companies need to “wake up and get a grip” in finding and removing images and videos of children suffering sexual abuse.
Mr Elmore said: “It is all too often the case that laws and lawmakers find themselves playing catch-up when it comes to effective regulation of the online media landscape.
“The pace of technological change has meant that policy, legal reform, and standards of best-practice in this area are simply not fit for purpose. And this virtual game of cat and mouse has appalling real-world consequences.
“It’s high time that we take meaningful action to fix this unacceptable mess. Children are daily at real risk of unimaginable cruelty, abuse and - in some instances - death.
“Social media companies are fundamentally failing to discharge their duties, and simply ignoring what should be an obvious moral obligation to keep young users safe.
“They need to get a grip, with institutional re-design, including the introduction of a duty-of-care on the part of companies toward their young users. Firms must be more pro-active and forthcoming when it comes to rooting out abuse images.
“There is an urgent need for social media platforms to be transparent with young users about the mechanisms available to them to remove and complain about these harmful images.”
Susie Hargreaves OBE, Director of the UK Safer Internet Centre, said: “This report serves as a stark reminder that we can all be doing more to make sure children are kept safe online.
“We see the fall-out of abuse and, when children are targeted and made to abuse themselves on camera by criminal adult predators, it has a heart-breaking effect on children and their families.
“There is hope, and there are ways for children and young people to fight back. The Report Remove tool we launched this year with Childline empowers young people to have illegal images and videos of themselves removed.
“We have also campaigned to help children and young people, as well as their parents, understand the potential dangers which lurk online.
“New legislation will also help make a difference, and the forthcoming Online Safety Bill is a unique opportunity to make the UK a safer place to be online, particularly for children.”
The report comes as MPs and Peers begin their scrutiny of the Government's Draft Online Safety Bill.
Mr Elmore said the Bill will offer an opportunity for meaningful reforms, but said there must be robust age-verification requirements on websites hosting adult content.
He added: “Social media companies should not encrypt their service, unless they can guarantee that they can still remove illegal content and cooperate with law enforcement in the same way they do now.
“They need to stop putting profits before the safety of kids online, and accept that warm words and algorithms just won’t do the job.”