Discover how end-to-end encryption works and why upload prevention is key to stopping the spread of child sexual abuse imagery while protecting privacy.
Chris Hughes, who has worked at the Internet Watch Foundation (IWF) for nearly nine years, oversees the IWF’s hotline and leads a team of analysts whose job is to assess images and videos of suspected child sexual abuse to help get them removed from the internet.
A chilling excerpt from a new IWF report delves into what analysts at the child protection charity are currently seeing in synthetic and AI-generated imagery of child sexual abuse.
As Ofcom’s Illegal Harms Codes come into force, platforms are required to implement robust measures to protect users from CSAM and illegal content.
Key legislation aimed at preventing online harms will return to Parliament next month following fears it could have been dropped altogether.
More people in Britain are concerned about websites showing the sexual abuse of children than about other types of illegal, illicit or ‘harmful’ internet content. However, more than half of people in Britain currently say that they either wouldn’t know how to report it if they encountered it (40%) or would simply ignore it (12%).
In an urgent letter to the Home Secretary, 10 leading children’s rights groups warn that children ‘bear the brunt’ of sexual abuse both online and offline.
Experts warn that 1% of the entire male population could be ‘interested in sex with prepubescent children’.
A unique safety tech tool that uses machine learning in real time to detect child sexual abuse images and videos is to be developed by a collaboration of EU and UK experts.
Analysts are finding 15 times as much child sexual abuse material on the internet as they were 10 years ago, leaving them battling a "tidal wave of criminal material".
The most extreme child sexual abuse imagery hosted in the EU is “spiralling out of control” as lawmakers are urged to clamp down on criminals using the continent as a toxic warehouse for dangerous material.