AI advances could lead to more child sexual abuse videos, watchdog warns

Published: Mon 22 Jul 2024

Paedophiles are exploiting advances in artificial intelligence to produce AI-generated videos of child sexual abuse, and the volume of such material could increase as the technology improves, according to a safety watchdog.

The majority of such cases seen by the Internet Watch Foundation involve manipulation of existing child sexual abuse material (CSAM) or adult pornography, with a child’s face transplanted on to the footage. A handful of examples involve entirely AI-made videos lasting about 20 seconds, the IWF said.

The organisation, which monitors CSAM around the world, said it was concerned that more AI-made CSAM videos could emerge as the tools behind them become more widespread and easier to use.

Read the full article at The Guardian

How the sending of one photo led an 11-year-old girl to become a victim of physical sex abuse

The girl sent a photo to a boy in her class before the image and her phone number were added to all-male online chat groups. She later began going missing before being abused by "unknown men".

23 July 2024 IWF In The News
AI being used to generate deepfake child sex abuse images based on real victims, report finds

The tools used to create the images remain legal in the UK, the Internet Watch Foundation says, even though AI child sexual abuse images are illegal.

22 July 2024 IWF In The News
AI-generated images of child sexual abuse use real victims as reference material

AI tools used to generate deepfake images of child sexual abuse draw on photos of real victims as reference material, a report has found.

22 July 2024 IWF In The News