How the IWF helps victims and makes the internet a safer place

Published: Fri 26 Jan 2018

Every time Hayley, whose name has been changed to protect her identity, receives a brown envelope through the door, her heart pounds. She throws it onto a pile with the others and vows never to look at it. Each envelope is a reminder of the darkest time of her life – when she was being sexually abused as a child.

Hayley and her sister were abused from birth. Both told the Internet Watch Foundation (IWF) that they receive two to three of these brown envelopes from law enforcement each week – notifications detailing yet another incident in which an image of their horrific abuse was shared by an offender on the internet. Hayley also told us that whenever she is out in public she worries that anyone she encounters might have seen pictures of her, and that she lives in constant fear that someone could approach her and physically hurt her.

Hayley and her sister are just two of the many victims of child sexual abuse whom the IWF has helped. The IWF Hotline remains constantly vigilant, not only processing reports from distressed members of the public who have stumbled across child sexual abuse content, but also proactively hunting that content down so that fewer offenders worldwide can access it or continue sharing the images on the open and dark web.

The IWF’s journey started in 1996 in a small office on the outskirts of Cambridge. At that point, the UK hosted a staggering 18 percent of known child sexual abuse imagery on the internet. The analysts on the team got to work: by 2003 the figure had fallen below one percent, and it now stands at 0.1 percent.

Working with major internet industry partners, the IWF has made the internet a safer place where fewer people stumble across traumatising child sexual abuse content. Meanwhile, our Image Hash List has allowed victims of child sexual abuse to sleep better at night, knowing their images are far less likely to resurface. This year, after developing the latest technology, the IWF will be able to detect previously unseen images, not just those already on our database of digital fingerprints (hashes) – which could see law enforcement across the world finding and saving new victims from abuse more quickly than ever before. A third beneficiary of the IWF’s work is our Members, whose networks are kept free from illegal content through a range of services, including the URL List and Keywords List.

Every time a child sexual abuse victim’s image is shared, they suffer revictimisation. As the IWF spreads its roots across the world through its international Reporting Portals, it is saving more and more victims from rape and torture. Children’s lives are irrevocably changed by child sexual abuse, and the IWF is here to make sure they no longer have to fear the images of their suffering being seen and shared. The IWF’s work as one of three partners in the UK Safer Internet Centre furthers its goal of protecting children online by helping to prevent children getting into unsafe situations, on top of its work helping victims after sexual abuse.

Please help victims like Hayley by spreading the word about the IWF’s work. You can sign up to our Facebook, Twitter or LinkedIn pages, and encourage your local schools and other organisations to sign up to Safer Internet Day.

 

*Hayley’s name has been changed to protect her identity.
