CSA partners from around the world join forces to say No to Nudify Apps

Published: Tue 10 Feb 2026

On Safer Internet Day 2026, child protection organisations from around the world are united behind a clear message: nudifying apps and AI functionalities that remove clothing from imagery should have no place in our digital future.

The IWF, alongside Child Helpline International, INHOPE, NCMEC, Offlimits, Safe Online and the WeProtect Global Alliance, and supported by more than 100 organisations and individuals globally, is calling for the explicit and universal banning of nudifying tools.

There is no positive use case for these tools: they exist only to humiliate, harass and exploit, and can be used to inflict further abuse.

A growing and dangerous trend

Nudifying tools make it easy to create photo-realistic nude or sexual imagery of people.

Though marketed for adults, they’re increasingly misused to create indecent images of children, often by other children, and have been linked to sexual extortion. Once created, these images can be shared and reused endlessly, causing lasting harm.

In the UK, nearly one in five confirmed reports of nude or sexual imagery of children and young people made to the Report Remove service involves digitally altered or manipulated content, including through AI or nudification apps.

Report Remove, run by the Internet Watch Foundation and Childline, allows children and young people to confidentially report nude or sexual imagery of themselves that has been shared or is at risk of being shared online. The growing presence of fake imagery in these reports shows how rapidly this technology is being abused.

Progress in the UK

In December 2025, following campaigning by the IWF and partners, the UK Government announced plans to outlaw AI apps that digitally remove clothing or nudify people in images.

These measures sit within the Government’s Violence Against Women and Girls (VAWG) strategy and align with Labour’s 2024 manifesto commitment to halve violence against women and girls within a decade, an ambition that cannot be met without addressing technology-facilitated abuse.

Our own data underlines why this pledge matters. In 2024, 98 percent of confirmed AI-generated child sexual abuse imagery where the sex of the child was depicted involved girls.

We urge the UK Government to urgently introduce and implement a ban on nudify apps and tools, including requirements on services to take adequate steps to prevent users from generating sexual imagery.

Why global alignment is essential

Nudifying tools are developed, distributed and monetised across borders. As long as they remain legal and accessible somewhere, they will continue to be used to harm children everywhere.

That is why this global coalition is calling on governments and legislators to urgently enact and enforce regulation, within the next two years at the latest, to prohibit nudifying tools and ensure they are universally inaccessible.

For example, the European Commission should formally classify AI-powered nudification apps as a prohibited practice under the EU AI Act, and should use the full powers of the Digital Services Act to push platforms to mitigate systemic risks to users, remove illegal content, and restrict functionality for generating non-consensual intimate imagery.

Saying No to Nudify

Every day these tools remain available is another day they can be used to cause irreversible harm. The message from child protection organisations worldwide is clear.

Nudifying apps should not exist.

On Safer Internet Day 2026, we are calling on governments, technology companies and citizens everywhere to act now.

The statement is available here.  

Click here to sign up to the statement. 
