We’re calling for an AI Bill that includes key measures to ensure safety-by-design becomes a non-negotiable standard in AI development.
AI tools used to generate deepfake images of child sexual abuse are drawing on photos of real victims as reference material, a report has found.
A new report from the IWF shows that the pace of AI development has not slowed, with offenders using better, faster and more accessible tools to generate new criminal images and videos.
Throughout the lockdown period, the IWF Hotline remained operational, but social distancing measures meant some activities had to be scaled back.
Explore how IWF analyses child sexual abuse imagery by age group, highlighting trends and challenges in identifying and removing illegal content.
Discover how IWF classifies child sexual abuse imagery by severity, using UK Sentencing Council guidelines to inform removal and prevention efforts.