IWF research into how artificial intelligence (AI) is increasingly being used to create child sexual abuse imagery online
The National Crime Agency estimates there to be between 550,000 and 850,000 people in the UK who pose varying forms of sexual risk to children.
The Internet Watch Foundation assessed more than 50,000 reports to its hotline during 2013. Today (7 April) it reveals the latest trends in assessing and removing child sexual abuse images from the internet.
IWF analysts say ‘insidious’ commercial child sexual abuse sites are driving more and more extreme content online.
“Imagine your darkest moments exposed to an unknown number of people. Then imagine strangers watching your pain for sexual satisfaction. That’s what happens for some of the children whose abuse images we see online.”
The IWF and cyber safety technology company White Bullet announce their collaboration to stop the monetisation of child sexual abuse images and videos through digital advertising.
The IWF will provide hashes of child sexual abuse images to the online industry to speed up the identification and removal of this content worldwide.
The Internet Watch Foundation welcomes the Government’s commitment to ‘upgrade’ a database in a bid to tackle online child sexual abuse material.
Quickline joins the IWF’s nationwide initiative to provide a trusted and secure service to help protect people from exposure to child sexual abuse images online.