AI imagery getting more ‘extreme’ as IWF welcomes new rules allowing thorough testing of AI tools
The IWF welcomes new measures to help make sure digital tools are safe as new data shows AI child sexual abuse is still spreading.
While providing legal certainty is desirable, the IWF says voluntary detection alone is not enough to meet the scale of the child sexual abuse crisis online.
More than nine in ten people in the UK say they are concerned at how images and videos of children being sexually abused are shared through end-to-end encrypted (E2EE) messaging services.
The debate on the EU’s proposed Child Sexual Abuse Regulation (CSAR) has been dominated by one loud slogan, a slogan which may have dire consequences for the safety and wellbeing of millions of children worldwide.
Three years ago, when Pinsent Masons set out to unite its communities to raise money for the Internet Watch Foundation (IWF), no one could have predicted how far the idea would go, or how many people would still be moving for the cause three years on.