UK children get tool to help stop nude images being shared online

Published: Tue 22 Jun 2021

Written by: The Guardian

Children in the UK who are worried that nude pictures and videos may end up online will be able to report the material to help prevent it from being uploaded in the future.

For the first time, young people will be able to flag the content with the Internet Watch Foundation (IWF) via a tool on Childline’s website before it has appeared online.

Under-18s will be able to flag the material via the “report remove” tool, described by the IWF as a “world first”. Analysts at the internet safety charity will then review the content and create a unique digital fingerprint, known as a hash, which will be shared with tech companies to help prevent the material from being uploaded and shared.
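The matching step described above can be sketched roughly as follows. This is a minimal illustration only: it uses an exact cryptographic hash (SHA-256), whereas hash-sharing schemes of this kind typically rely on perceptual hashes (such as PhotoDNA) that also match re-encoded or lightly altered copies of an image. The function and variable names are hypothetical.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Illustrative stand-in: SHA-256 matches only byte-identical files.
    # Real systems use perceptual hashing to catch visually similar copies.
    return hashlib.sha256(data).hexdigest()

# Hypothetical hash list shared with participating platforms.
blocked_hashes = {fingerprint(b"reported-image-bytes")}

def should_block(upload: bytes) -> bool:
    # A platform checks each incoming upload against the shared list.
    return fingerprint(upload) in blocked_hashes
```

In this sketch, an upload whose fingerprint appears in the shared list is refused, so the reported image itself never has to be redistributed to the platforms doing the checking.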

Read more at The Guardian

AI advances could lead to more child sexual abuse videos, watchdog warns

IWF warns of more AI-made child sexual abuse videos as tools behind them get more widespread and easier to use

22 July 2024 IWF In The News
AI being used to generate deepfake child sex abuse images based on real victims, report finds

The tools used to create the images remain legal in the UK, the Internet Watch Foundation says, even though AI child sexual abuse images are illegal.

22 July 2024 IWF In The News
More AI-generated child sex abuse material is being posted online

In a review of material posted on the dark web, the Internet Watch Foundation found that deepfakes featuring children were becoming more extreme.

22 July 2024 IWF In The News