Paedophiles are using advances in artificial intelligence to generate videos of child sexual abuse, and the volume of such material could grow as the technology improves, according to a safety watchdog.
The majority of such cases seen by the Internet Watch Foundation involve manipulation of existing child sexual abuse material (CSAM) or adult pornography, with a child’s face transplanted on to the footage. A handful of examples involve entirely AI-made videos lasting about 20 seconds, the IWF said.
The organisation, which monitors CSAM around the world, said it was concerned that more AI-made CSAM videos could emerge as the tools behind them become more widespread and easier to use.