UK children get tool to help stop nude images being shared online

Published: Tue 22 Jun 2021

Written by: The Guardian

Children in the UK who are worried that nude pictures and videos may end up online will be able to report the material to help prevent it from being uploaded in the future.

For the first time, young people will be able to flag the content with the Internet Watch Foundation (IWF) via a tool on Childline’s website before it has appeared online.

Under-18s will be able to flag the material via the “report remove” tool, described by the IWF as a “world first”. Analysts at the internet safety charity will then review the content and create a unique digital fingerprint known as a hash, which will be shared with tech companies to help prevent it from being uploaded and shared.
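
The hash-matching idea can be pictured with a short sketch. This is only an illustration of the general technique, assuming a plain SHA-256 file digest and a hypothetical known_hashes list; it is not the IWF's actual system, which typically also relies on perceptual image hashes that survive resizing and re-encoding.

```python
import hashlib

def file_hash(path: str) -> str:
    """Compute a SHA-256 digest of a file's bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical hash list shared with platforms: digests of reported images.
# The value below is a placeholder, not a real entry.
known_hashes = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def should_block(upload_path: str) -> bool:
    """Refuse an upload whose digest matches a previously reported image."""
    return file_hash(upload_path) in known_hashes
```

Because only the fingerprint is shared, platforms can check uploads against the list without ever holding or redistributing the original image.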

Read more at The Guardian

AI image generators giving rise to child sex abuse material - BBC Newsnight

The BBC’s been investigating the rise in child sex abuse material resulting from the rapid proliferation of open-source AI image generators.

17 July 2023 IWF In The News
Charity wants AI summit to address child sexual abuse imagery

A leading children's charity is calling on Prime Minister Rishi Sunak to tackle AI-generated child sexual abuse imagery when the UK hosts the first global summit on AI safety this autumn.

17 July 2023 IWF In The News
Webpages containing the most extreme child abuse have doubled since 2020

Images of children as young as seven being abused online have risen by almost two thirds.

25 April 2023 IWF In The News