Teens face ‘crisis’ of online sexual exploitation as charity says major new Government-backed scheme will ‘put rocket boosters’ on Online Safety Act

Published:  Wed 23 Apr 2025

IWF announces ‘ground-breaking’ decision to give thousands of smaller platforms free protection from millions of child sexual abuse images and videos as new report reveals scale of online threat to children.  

  • IWF’s Annual Data and Insights Report reveals it discovered record levels of child sexual abuse webpages online in 2024, with a spike in sexual imagery of teens. 
  • Teenagers facing escalating threats from AI, grooming, sextortion, and leaked imagery, experts warn.  
  • Minister welcomes ‘vital’ new Image Intercept tool which will shield platforms from criminal imagery. 
  • Digital fingerprints of almost three million images and videos of child sexual abuse can now be used to block criminal content, for free, in move which will ‘instil hope’ in thousands of victims and survivors across the UK, and help bring platforms in line with requirements of the Online Safety Act.  

Teenagers are facing unprecedented online sexual abuse and exploitation, new data reveals, as the IWF unveils a “ground-breaking” move to give thousands of sites and online platforms free access to a powerful new safety tool. 

New data published today (April 23) in the Internet Watch Foundation (IWF) Annual Data and Insights Report reveals the charity has seen a spike in sexual abuse imagery of teenagers under 18. 

According to the report, 2024 saw the discovery of more child sexual abuse webpages than ever before, with the escalating numbers driven in part by new threats such as AI-generated child sexual abuse, sexually coerced extortion or “sextortion”, and the malicious sharing of nudes and sexual imagery.  

The IWF says under 18s are now facing a “crisis” of sexual exploitation and risk online. 

In response, the charity is today announcing it will be making a powerful new safety tool available for free to smaller platforms - a move which will protect a raft of sites from millions of child sexual abuse images and videos.  


Image Intercept is a new tool which has been made possible through funding support from the Home Office. 

It is the first free-to-use initiative in the UK aimed at preventing the spread of known illegal content depicting the sexual abuse of children.  

Small businesses, online platforms, and start-ups can now, with a simple code plug-in, block criminal imagery using the IWF’s database of 2,883,015 individual hashes which the charity has amassed.  

It will help give whole swathes of the internet 24-hour protection and, crucially, will help achieve the safety aims set out under the Online Safety Act. 
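The “simple code plug-in” described above follows a familiar pattern: hash each upload and reject it if the hash appears on the IWF’s list. The sketch below is only a minimal illustration of that pattern; the class name, method names and the way the hash list is obtained are hypothetical, and a real Image Intercept integration should follow the IWF’s own documentation at iwf.org.uk/imageintercept.

```python
# Minimal sketch of upload-time hash checking, assuming a hypothetical
# ImageInterceptClient. Names and the hash-list source are illustrative;
# see the IWF's Image Intercept documentation for the real integration.
import hashlib


class ImageInterceptClient:
    """Checks an image's hash against a set of known criminal-image hashes."""

    def __init__(self, known_hashes: set[str]):
        # In practice the hash list would be supplied by the IWF service,
        # not bundled locally as a plain set.
        self.known_hashes = known_hashes

    def is_known_csam(self, file_bytes: bytes) -> bool:
        # A cryptographic hash is shown for simplicity; real deployments
        # typically also use perceptual hashing so that re-encoded or
        # resized copies of the same image still match.
        digest = hashlib.sha256(file_bytes).hexdigest()
        return digest in self.known_hashes


def handle_upload(client: ImageInterceptClient, file_bytes: bytes) -> str:
    """Reject an upload if it matches a known hash; otherwise accept it."""
    if client.is_known_csam(file_bytes):
        # Matched content is never published; a platform would also follow
        # its legal reporting obligations at this point.
        return "rejected"
    return "accepted"
```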

Derek Ray-Hill, IWF Interim CEO

Derek Ray-Hill, Interim Chief Executive at the IWF, said: “Young people are facing rising threats online where they risk sexual exploitation, and where images and videos of that exploitation can spread like wildfire. New threats like AI and sexually coerced extortion are only making things more dangerous.  

“Many well-intentioned and responsible platforms do not have the resources to protect their sites against people who deliberately upload child sexual abuse material. That is why we have taken the initiative to help these operators create safer online spaces by providing a free-of-charge hash-checking service that will identify known criminal content. 

“This is a major moment in online safety, and anyone with an online platform where users can upload content now has the chance to join the fight and help deliver the aspirations of the Online Safety Act. Together we can present a stone wall to those looking to spread child sexual abuse imagery.” 

Technology Secretary Peter Kyle said Image Intercept is a “powerful” example of how new tech innovations can start to turn the tide on child sexual abuse imagery online.  

He said: “Visiting the IWF earlier this year was one of the most shocking and moving days I have experienced as Technology Secretary. I saw firsthand the scale and sinister methods criminals are using to prey on young people, often beginning in what should be the safest place for a child – their bedroom. 

“The rise in cases of sextortion, child sexual abuse material, and most recently AI-generated abuse material shows how threats to young people online are constantly evolving. But during my visit I also saw the extraordinary dedication of the IWF teams working daily to protect children from further harm and meet these new threats. 

“Their new Image Intercept tool is a powerful example of how innovation can be part of the solution in making online spaces safer for children, a goal which - working with IWF and other partners - this government is committed to delivering.” 

In 2024, the IWF confirmed a record-breaking 291,273 reports of child sexual abuse imagery.  

Today’s Data and Insights report reveals 97% of the reports in which victims’ sex was recorded* (or 278,492 reports) showed the sexual abuse of girls only – an increase of 14,246 since 2023. 

The report also shows a shift towards older age groups appearing in abuse imagery in 2024. Reports** where 14-to-15-year-olds were recorded as the youngest child seen have risen 35%, with 5,457 recorded in 2024 compared with 4,056 in 2023.  

The number of reports** where 16-to-17-year-olds were the youngest child seen has risen 67% from 1,202 reports in 2023 to 2,010 in 2024.  

This age group can be challenging to identify in imagery but, through the IWF’s active work with UK law enforcement, images can be cross-referenced with the Government’s Child Abuse Image Database (CAID) or with reports received via Report Remove. This means IWF analysts are able to determine the precise age of victims and correctly identify criminal imagery. 

This is important for older teens, whose imagery is most at risk of being mistaken for imagery of adults and so is not always picked up by moderators on platforms.  

Now, as Ofcom begins to enforce aspects of the Online Safety Act requiring companies to tackle illegal harms, including child sexual abuse imagery, the UK-based IWF is encouraging companies to use Image Intercept to protect their platforms.    

Megan Hinton was 14 when sexual images and videos of her were shared online. She said the knowledge that this imagery may still be shared and viewed by strangers continues to affect her, and she lives in fear. 

She said today’s move from the IWF will ‘instil hope’ in thousands of victims and survivors across the UK. 

Megan, who is now a Victim and Survivor Advocate for child protection charity the Marie Collins Foundation***, said: “The impact of child sexual abuse material cannot be overstated. The images and videos captured as part of my abuse at age 14 remain an omnipresent fear in my life and I am revictimised every time these are viewed.  

“As a survivor I live in fear of being recognised by those who may have seen my images, and this very real anxiety is only increasing with technological advancements in AI now providing offenders with further opportunities to manipulate the imagery. 

“Today’s announcement will instil hope in thousands of victims and survivors across the UK. To know that tools exist which proactively identify, block and prevent the spread of child sexual abuse material is a powerful remedy to alleviate the pervasive anxiety that survivors live with. 

“The ground-breaking decision to extend IWF’s protection, free of charge, to smaller platforms demonstrates to survivors that they are not alone and that there is unwavering and steadfast commitment to eradicate child sexual abuse material online.” 

Jess Phillips, Minister for Safeguarding and Violence against Women and Girls, said the Government is committed to improving safety, and said Image Intercept will be “vital” in protecting children online.  

She said: “The IWF’s latest findings are deeply disturbing, and show how urgent it is for us to tackle the growth of online child sexual abuse and intensify our efforts to protect children from these heinous crimes. 

“I want to thank the IWF for all the work they do to shine a light on this issue. Their ‘Image Intercept’ initiative, funded by the Home Office, will be vital in helping to stop the re-victimisation of children who have been exploited online.  

“But we must also hold technology platforms accountable. If they are to be safe places for our children, they must invest in technologies that block this harmful content and stop predators being able to access and groom children online. 

“This government is going further by introducing new measures so anyone who possesses an AI tool designed to create illicit images or owns manuals teaching them how to do so will rightly face time behind bars. We will continue to support the robust implementation of the Online Safety Act and will not hesitate to go further if necessary to keep our children safe.” 

As well as having powers to proactively seek out child sexual abuse imagery, the IWF’s expert taskforce hashes imagery reported by the public and police, along with imagery from CAID, to which it has privileged access.  

Children and young people also report sexual imagery of themselves to the IWF via the dedicated Report Remove system in the UK, and Meri Trustline in India.  

Today’s IWF report reveals that almost a third of the images and videos the IWF hashed in 2024 (210,572, or 29%) included Category A child sexual abuse – the most extreme imagery, which can depict rape, sadism, or even bestiality.  

By using the IWF’s new hash checking tool, businesses and platforms can make sure none of the images or videos on our Hash List can ever be shared again on their sites.  
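Beyond checking new uploads, the same hash-matching idea could, in principle, be applied to content a platform already hosts, so that previously uploaded copies of hashed material are also caught. The snippet below is a purely illustrative back-scan under that assumption; the directory layout and the source of the hash list are hypothetical, not part of the IWF’s published service.

```python
# Illustrative back-scan of an existing media library against a hash list.
# The directory path and hash-list source are hypothetical.
import hashlib
from pathlib import Path


def scan_existing_library(known_hashes: set[str], media_dir: str) -> list[Path]:
    """Return paths of stored files whose SHA-256 hashes match the hash list."""
    flagged = []
    for path in Path(media_dir).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            if digest in known_hashes:
                flagged.append(path)
    return flagged
```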

Dan Sexton, IWF Chief Technology Officer

Dan Sexton, Chief Technology Officer at the IWF, said: “By making millions of hashes of known child sexual abuse imagery available for free, in a simple tool for businesses and platforms, we can dramatically change the landscape and make thousands of platforms hostile spaces for criminals. It will put rocket boosters on the Online Safety Act.  

“It’s revolutionary. Blocking child sexual abuse imagery is technically a solved problem. Our skilled analysts know where criminal content is and can go after it on the open web, proactively stepping in to get it removed at source, and building the tools to stop them being shared again.  

“Once we’ve found and hashed a child sexual abuse image, it should never be allowed to be uploaded online again. Children are facing a crisis online and deserve a safer internet. Image Intercept takes us one step closer to making that a reality.” 

Almudena Lara, Child Protection Policy Director at Ofcom, welcomed the development of new protections online.  

She said: “These findings are deeply concerning and underpin why it’s so important that companies act quickly to prevent this sickening material from being shared on their sites and apps. It’s good to see new protections being developed to deal with this horrific content. 

 “Under the UK’s online safety laws, platforms must now take steps to tackle illegal content and activity, and the fight against child sexual abuse is our first priority. On the day these new duties came into force, we launched an enforcement programme in this area. 

 “We’ve been working closely with the IWF and others as we’ve developed our detailed rules and identified platforms at high risk of being used by predators. Online service providers who fail to introduce the necessary protections should expect to face enforcement action.” 

Chris Sherwood, CEO at the NSPCC, said child sexual abuse imagery online is an “urgent crisis” and welcomed Image Intercept and the protection it will bring.  

He said: “The record levels of child sexual abuse imagery being reported by the IWF are a stark and urgent crisis. Behind each image is a child whose life has been irrevocably changed through grooming and exploitation, who faces re-traumatisation every time that image is reshared. 

“We welcome the IWF’s new Image Intercept tool, which will give smaller platforms the support they need to meet their obligations under the Online Safety Act and remove this abhorrent content. Ofcom must hold tech companies accountable for using tools like this to help stop the spread of child sexual abuse material. Likewise, they must also focus on identifying and disrupting perpetrators at the source, stopping abuse before it damages lives. 

 “Childline’s Report Remove service is here for any young person under 18 who wants to speak to a professional and confidentially report sexual images and videos of themselves. Through the service, the IWF and Childline can help get these images removed and prevent them from being shared in the future.” 


How do I sign up? 

Anyone running a small platform, forum, tech business, or start-up which could be at risk of abuse by criminals sharing child sexual abuse imagery should contact the IWF to ask about the new hash-checking service.  

The IWF has a wide range of tools and services, and can advise on the best ways to prevent your platform becoming a target for abuse by criminals. 

Visit iwf.org.uk/imageintercept for more information.  

* In 2024, the IWF discovered child sexual abuse, or links to it, on a record-breaking 291,273 webpages. Some of these reports were pages with links, rather than imagery, so no data on sex of children was recorded. The total number of reports with sex recorded is 287,406. 

** IWF analysts record the age of the youngest child seen in any imagery on a URL or in imagery reported via child reporting services. This means other children, of the same or different ages, may also be visible on the same URL, but they are not recorded.    

Each report assessed by IWF could contain one, tens, hundreds or thousands of individual child sexual abuse images or videos.  

*** The Marie Collins Foundation is a UK charity dedicated to supporting victims of technology-assisted Child Sexual Abuse on their recovery journey. Visit www.mariecollinsfoundation.org.uk/  
