We build tech-for-good to help protect children online.

Defending young people online and stopping child sexual abuse imagery from being shared across the internet is a tough battle. We believe in giving our people the very best weaponry: the most advanced technical tools to fight for what’s right.

Technology is at the centre of everything we do. That’s why we’re developing tech-for-good: to help us stay ahead of the criminals who ruthlessly abuse children and exploit legitimate services online, and to give our expert human analysts a technological advantage.

Spearheading our tech-for-good approach is vital. The fight against criminals who abuse children, then upload records of that suffering online, and sometimes even profit from this torture, is ever-changing. The challenge our analysts face doesn’t stand still. Technology never sleeps.

We share our portfolio of tools and services with our Members to help the tech community keep their services and products safe, and to protect survivors of child sexual abuse from having a record of that suffering shared on the internet, over and over again.

We know that criminals are often quick to abuse new developments in technology. So, it’s critical we work in partnership with leading tech companies, law enforcement, governments and charities to constantly stay one step ahead.

Our new Taskforce of analysts shows just how this partnership with the tech community works. Thanks to a grant from Thorn, we recruited a team to work with the UK Government to hash two million images from its Child Abuse Image Database (CAID). Hashing is a process that produces a unique code, a kind of digital fingerprint, for each picture of confirmed child sexual abuse.
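To give a concrete sense of what a hash is, here is a minimal sketch in Python. It uses SHA-256, a standard cryptographic hash, to fingerprint a file; in practice, hash lists used against this material also include perceptual hashes (such as Microsoft’s PhotoDNA), which can match visually similar copies of an image rather than only exact duplicates. The function and file name below are purely illustrative.

```python
import hashlib
from pathlib import Path

def fingerprint(path: Path) -> str:
    """Return a hex digest acting as a digital fingerprint for a file.

    SHA-256 is deterministic: the same bytes always produce the same
    digest, and any change to the file yields a completely different one.
    """
    digest = hashlib.sha256()
    with path.open("rb") as f:
        # Read in chunks so large files never need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    print(fingerprint(Path("example.jpg")))  # hypothetical file name
```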

The CAID images are collected from the computers of suspects and offenders during police investigations. Some of these records of abuse may never have made it onto the internet.

By assessing, hashing and classifying these images, and sharing our hash lists with our partners, we hope to prevent those images from ever being uploaded online. If we succeed in stopping this suffering from being shared, we will be protecting victims and making the internet a safer place.
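As a rough sketch of how a partner service might use a shared hash list at upload time, consider the example below. This is an assumption-laden illustration, not the IWF’s actual integration: the `HashList` class, the exact-match SHA-256 lookup and the sample bytes are all hypothetical.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Digital fingerprint of a file's bytes."""
    return hashlib.sha256(data).hexdigest()

class HashList:
    """Hypothetical in-memory hash list shared with a platform."""

    def __init__(self, known_hashes: set[str]) -> None:
        self.known_hashes = known_hashes

    def blocks(self, upload: bytes) -> bool:
        # Exact-match lookup: if the upload's fingerprint is on the
        # list, the file can be rejected before it is ever published.
        return sha256_hex(upload) in self.known_hashes

if __name__ == "__main__":
    confirmed = b"bytes of a confirmed image (illustrative only)"
    hash_list = HashList({sha256_hex(confirmed)})
    print(hash_list.blocks(confirmed))               # True: upload blocked
    print(hash_list.blocks(b"an unrelated upload"))  # False: upload allowed
```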

To help our Taskforce of analysts do this, our Tech Team built a pioneering new tool called IntelliGrade, which allows us to grade and hash criminal images in a way that meets legal and classification rules in different countries around the world.

For the first time, we can share hashes of child sexual abuse imagery with multiple partners and countries worldwide. It’s a huge step forward, and it will give more survivors of abuse the peace of mind of knowing that we’re blocking criminals from repeatedly sharing their suffering.
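One way to picture what grading for multiple jurisdictions could involve is sketched below. IntelliGrade’s real schema and rules are not public, so every field, category and mapping here is a hypothetical illustration of the underlying idea: assess an image once with granular metadata, then filter the hashes into each partner’s classification scheme.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class GradedHash:
    """Hypothetical record pairing an image hash with grading metadata."""
    image_hash: str
    uk_category: str            # "A", "B" or "C" under UK sentencing guidelines
    depicts_prepubescent: bool  # example of a granular field other schemes may need

def export_for(jurisdiction: str, entries: list[GradedHash]) -> list[str]:
    """Filter one shared assessment into a jurisdiction-specific hash list."""
    if jurisdiction == "UK":
        return [e.image_hash for e in entries]
    if jurisdiction == "EXAMPLE":
        # Illustrative partner whose legal definition is narrower than the UK's.
        return [e.image_hash for e in entries
                if e.depicts_prepubescent or e.uk_category == "A"]
    raise ValueError(f"no mapping for {jurisdiction!r}")
```

Recording the assessment once, at this level of granularity, is what would let a single grading effort serve many countries’ rules instead of re-grading every image per partner.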

You can support our work and help children by donating, joining us as a corporate member, or getting involved in fundraising.

Do the right thing and become an IWF hero.

Our services

We provide a unique range of services to help our Members defend their customers, staff and children, wherever they are in the world. Today, over 160 tech companies, including some of the industry’s giants, have signed up to use our services.

IntelliGrade

Ground-breaking tech to help companies and law enforcement fight back against criminals who trade, share and upload imagery showing the sexual abuse of children.

Report Remove

Report Remove, a joint service from the IWF and the NSPCC, supports young people in reporting sexual images or videos shared online and enables them to get the imagery removed if it is illegal.

IWF Crawler

Our intelligent web crawler uses pioneering technology to scan web pages across the internet, seeking out images and videos showing the sexual abuse of children, so our analysts can work to have them removed.

reThink Chatbot

We’re working in partnership with the End Violence Fund and the Lucy Faithfull Foundation to develop an innovative new chatbot that intervenes and stops people from looking at child sexual abuse imagery online before they’ve committed a crime.