Low-Rank Adaptations (LoRAs) are being abused to fine-tune image generation models to produce photo-realistic child sexual abuse images based on real images of a child. AI-generated child sexual abuse imagery is now so realistic that it is being treated as real abuse. The generation of this imagery has also increased the potential for the revictimisation of survivors of this heinous crime.
You can read more about the threat of AI technology being used to produce images of child sexual abuse in our report published in 2023 here.
As an organisation providing AI tools, you can take measures to protect your technology from this frightening development by working with the Internet Watch Foundation. Our Image Hash List, containing over 2.6 million confirmed images of child sexual abuse material (CSAM), can be a vital tool in stopping known CSAM images from being uploaded and used to train your tools to produce more synthetic CSAM.
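To illustrate where hash-list screening fits in a training pipeline, here is a minimal sketch in Python. It is not the IWF's actual integration: the file names, paths and one-digest-per-line list format are assumptions made for the example, and it uses plain SHA-256 file hashes, whereas production systems also rely on perceptual hashing (such as PhotoDNA) so that re-encoded or resized copies of known images are still caught.

```python
# Minimal sketch: screening an image training corpus against a hash list of
# known CSAM before ingestion. Paths, file format and function names are
# illustrative assumptions, not the IWF's actual integration. Cryptographic
# hashing only matches byte-identical files; real deployments pair it with
# perceptual hashing (e.g. PhotoDNA) to catch altered copies.
import hashlib
from pathlib import Path

def load_hash_list(path: str) -> set[str]:
    """Load one lowercase hex digest per line into a set for O(1) lookups."""
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}

def sha256_of_file(path: Path) -> str:
    """Stream the file in chunks so large images need not fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def filter_training_set(image_dir: str, hash_list_path: str) -> list[Path]:
    """Return only the images whose digests are NOT on the block list."""
    blocked = load_hash_list(hash_list_path)
    return [
        img for img in Path(image_dir).rglob("*")
        if img.is_file() and sha256_of_file(img) not in blocked
    ]

if __name__ == "__main__":
    clean = filter_training_set("training_images/", "hash_list.txt")
    print(f"{len(clean)} images passed the hash screen")
```

In practice this check would run both at upload time and before any dataset is assembled for training, so that known material is excluded at every point of entry.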
AI-generated imagery of child sexual abuse has progressed at such a “frightening” rate that the Internet Watch Foundation (IWF) is now seeing the first convincing examples of AI videos depicting the sexual abuse of children.
Our updated AI report, published in July 2024, provides further insight into the use of AI technology to produce incredibly realistic 'deepfake', or partially synthetic, videos of child rape and torture. These videos are made by offenders using AI tools that add the face or likeness of another person to a real video. Read more here.
AI technology providers can help stop this abuse by becoming IWF Members and gaining access to our full suite of services.
By joining IWF as a Member, you can free victims of child sexual abuse from the repeated trauma and anxiety caused by having images of their exploitation shared online via AI technologies and services, sometimes long after the physical abuse itself has stopped.
Our fees are based on your company size and industry sector, ensuring that companies of all sizes can access our services.
Generative AI model providers can block access to webpages and sites containing child sexual abuse content, which causes significant harm to children, without impacting user rights, protecting survivors from repeat victimisation and making the internet a safer place for everyone.
Interested in joining or want to hear more? Complete our membership enquiry form, email our team at [email protected] and we'll get back to you, or call us on +44 (0)1223 20 30 30.