A growing and dangerous trend
Nudifying tools make it easy to create photo-realistic nude or sexual imagery of people.
Though marketed to adults, they are increasingly misused to create indecent images of children, often by other children, and have been linked to sexual extortion. Once created, these images can be shared and reused endlessly, causing lasting harm.
In the UK, nearly one in five confirmed reports of nude or sexual imagery of children and young people made to the Report Remove service involve digitally altered or manipulated content, including through AI or nudification apps.
Report Remove, run by the Internet Watch Foundation and Childline, allows children and young people to confidentially report nude or sexual imagery of themselves that has been shared or is at risk of being shared online. The growing presence of fake imagery in these reports shows how rapidly this technology is being abused.