The latest data from the Internet Watch Foundation (IWF) reveals a record rise in UK children reporting online sexual extortion, with the Report Remove service now handling an average of nine cases a week.

In 2025, the helpline saw a 66% increase in self-reports from under‑18s, confirming 1,175 cases involving harmful imagery, more than a third of them linked to sexually coerced extortion. Criminals are increasingly exploiting young people's nude imagery to demand money, further content, or compliance, often using aggressive threats and personal information to create fear and control. The majority of sextortion cases involved boys aged 14–17, highlighting a growing trend in targeted online abuse.

Report Remove, run by the IWF in partnership with Childline, allows young people to block or remove nude images of themselves from the internet, even before they are shared. Childline counsellors continue to support children facing blackmail, fear, and isolation. The service remains free, confidential, and available to any young person worried about their imagery being shared online.
The National Crime Agency estimates there to be between 550,000 and 850,000 people in the UK who pose varying forms of sexual risk to children.
The Queen used her speech at the state opening of Parliament to reaffirm the Government's commitment to develop legislation to make the internet safer for children and "vulnerable" users.
The UK’s Internet Watch Foundation (IWF) and the USA’s National Center for Missing & Exploited Children (NCMEC) announce a landmark agreement to better protect children whose sexual abuse images are shared and traded on the internet.
IWF CEO Kerry Smith calls for a complete EU ban on AI-generated abuse content at a high-level meeting of global experts in Rome.
On 28 April 2025, the IWF hosted MPs, peers, and staffers in Parliament to discuss the urgent findings of our 2024 Annual Data & Insights Report.
IWF analysts uncover platform hosting chatbot “characters” designed to let users simulate sexual scenarios with child avatars.
The IWF’s latest AI report exposes rapidly escalating harms to children as the EU moves to scale back the tools that detect and remove child sexual abuse material online. The charity warns that the EU must act urgently to criminalise AI‑generated abuse and preserve essential detection systems before risks intensify further.