AI ‘nudification’ apps are to be banned following an IWF campaign as new data reveals nearly one in five reports of nude or sexual imagery of young people involves some form of faked or digitally altered imagery.
New data released today (Thursday, December 18) shows 19% of confirmed reports of nude or sexual imagery of children and young people made to the UK’s Report Remove helpline involved imagery which has been digitally altered or manipulated, including with AI or nudification apps.
Report Remove, which is run by the Internet Watch Foundation and Childline, is a first-of-its-kind service which allows children and young people in the UK to confidentially self-report nude or sexual imagery of themselves which has been, or is at risk of being, shared online.
Now, the UK Government is announcing plans to outlaw AI apps which digitally remove clothing or ‘nudify’ subjects of photographs – apps which have been abused to create nude imagery of children.
The move comes after months of campaigning by the IWF and others who have argued the technology makes it too easy to create fake nude or sexual imagery of real children.
Alongside this, the Government has announced plans to encourage tech firms to bring in on-device protections for children, saying its ambition is to make it impossible to send, receive, or share nude or sexual imagery via children’s devices.
The plans are being brought in under the Government’s Violence Against Women and Girls (VAWG) strategy. Labour’s 2024 manifesto pledged to halve violence against women and girls in a decade.
IWF data shows that, in 2024, 98% of confirmed AI-generated child sexual abuse images and videos in which the sex of the child was recorded involved girls.