Our new Taskforce of analysts shows just how this partnership with the tech community works. Thanks to a grant from Thorn, we recruited a team to work with the UK Government to hash two million images from their Child Abuse Image Database (CAID). Hashing is a process that produces a unique code, a kind of digital fingerprint, for each picture of confirmed child sexual abuse.
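For readers who want a concrete picture of what hashing means, here is a minimal sketch in Python using a standard cryptographic hash. It is an illustration only: the file name is hypothetical, and hash lists in this field typically also rely on perceptual hashes (which can match edited copies of an image), which this simple example does not attempt.

```python
import hashlib
from pathlib import Path

def fingerprint(image_path: str) -> str:
    """Return a SHA-256 digest: a fixed-length 'digital fingerprint' of the file.

    Changing the file by even a single byte produces a completely different
    digest, so matching digests identify identical copies of the same image.
    """
    data = Path(image_path).read_bytes()
    return hashlib.sha256(data).hexdigest()

# Hypothetical usage:
# print(fingerprint("example.jpg"))
```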
The CAID images are collected from the computers of suspects and offenders during police investigations, and some of these records of abuse might never have made it onto the internet.
By assessing, hashing and classifying these images, and by sharing our hash lists with our partners, we hope to prevent those images from ever being uploaded online. If we succeed, if we can stop this suffering from being shared online, we will be protecting victims and making the internet a safer place.
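As a rough illustration of how a partner platform might use a shared hash list, the sketch below checks an upload against a set of known hashes before accepting it. The KNOWN_ABUSE_HASHES set, its placeholder entry and the handle_upload function are assumptions invented for this example; real deployments distribute and match hashes through dedicated, secure tooling.

```python
import hashlib

# Hypothetical hash list shared by a hotline with its partners.
# The entry below is a placeholder, not a real hash.
KNOWN_ABUSE_HASHES = {
    "9f2c0000000000000000000000000000",
}

def is_known_abuse_image(file_bytes: bytes) -> bool:
    """Return True if the uploaded file exactly matches an entry on the list."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_ABUSE_HASHES

def handle_upload(file_bytes: bytes) -> str:
    # A platform would block the upload (and report it) rather than accept it.
    if is_known_abuse_image(file_bytes):
        return "blocked"
    return "accepted"
```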
To help our Taskforce of analysts do this, our Tech Team built a pioneering new tool called IntelliGrade. It allows us to grade and hash criminal images in a way that meets the legal and classification rules of different countries around the world.
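IntelliGrade's internals are not public, so the sketch below only illustrates the general idea the paragraph describes: a single internal assessment translated into the category labels that different jurisdictions use. The jurisdiction names, grade numbers and labels are invented placeholders, not real legal classifications.

```python
from dataclasses import dataclass

# Hypothetical mapping from one internal grade to per-country labels.
JURISDICTION_SCALES = {
    "country_a": {1: "most severe", 2: "severe", 3: "less severe"},
    "country_b": {1: "class-1", 2: "class-1", 3: "class-2"},
}

@dataclass
class GradedHash:
    sha256: str
    internal_grade: int  # 1 = most severe, under this sketch's assumption

    def label_for(self, jurisdiction: str) -> str:
        """Translate the single internal grade into a partner country's scale."""
        return JURISDICTION_SCALES[jurisdiction][self.internal_grade]
```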
For the first time, we can share hashes of child sexual abuse imagery with multiple partners and countries worldwide. It’s a huge step forward and will help give more survivors of abuse peace of mind in the knowledge that we’re blocking criminals from repeatedly sharing their suffering.
You can support our work and help children by donating, joining us as a corporate member, or getting involved in fundraising.
Do the right thing and become an IWF hero.