Reporting services for children

 

The IWF provides two services that enable children to report sexual images and videos of themselves. The first, Report Remove, is available to children based in the UK. In 2023, we additionally launched a new service, in partnership with the RATI Foundation, for children based in India.

It’s important to note that each service operates differently, and within different legal frameworks.

 

Report Remove

To help young people remove sexual images or videos of themselves online, the IWF and the NSPCC developed Report Remove, a world-first tool launched in June 2021.

The NSPCC’s Childline service ensures that the young person is safeguarded and supported throughout the process, and the IWF assesses the reported content and takes action if it meets the threshold of illegality. The content is given a unique digital fingerprint (a hash), which is then shared with internet companies to help prevent the imagery from being uploaded or redistributed online.
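
Conceptually, this hash-based blocking works like the minimal Python sketch below. It is illustrative only: the function and variable names and the example hash list are hypothetical, it uses a simple cryptographic hash (SHA-256) of the file bytes, and production hash lists typically also include perceptual hashes (such as PhotoDNA) that can match visually similar copies of an image.

```python
import hashlib

def hash_image(path: str) -> str:
    """Return a SHA-256 digest: the image's 'digital fingerprint'."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Hypothetical hash list shared with internet companies.
known_csam_hashes = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def should_block_upload(path: str) -> bool:
    """Refuse an upload whose fingerprint matches the shared list."""
    return hash_image(path) in known_csam_hashes
```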

This solution provides a child-centred approach to image removal which can be done entirely online. The young person does not need to tell anyone who they are; they can make the report at any time, and further information and support is always available from Childline.

Young people create or sign into a Childline account, which allows them to receive Childline email updates about their report. They can use this email service for ongoing support, and they can contact a Childline counsellor via online chat or via Childline’s freephone number. They can also access relevant information and advice, self-help tools and peer support on the Childline website.

  • In 2023, we received 792 reports through the Report Remove tool, a 324% increase on 2022.
  • We were able to take action on five times as many of these reports in 2023 (508) as in 2022 (101).

Report Remove and sexually-coerced extortion

  • 201 reports received through Report Remove contained comments indicating sextortion (sexually-coerced extortion).
  • 160 (45%) of the 354 actionable reports from boys were due to sexually-coerced extortion.

We’ve seen how boys are typically lured into what they believe are mutual exchanges of sexual images, often thinking that they are sharing images with a peer or an older person.

We know that sexually-coerced extortion is behind these specific reports from boys because they have included evidence of it within their reports. This could be a chat log showing that the young person is being coerced, or a collage of images created by the offender and overlaid with threatening text.

 

Analysis of all ‘actionable’ reports

These are reports which were assessed as containing images and/or videos of child sexual abuse according to UK legislation. Via the Report Remove service, children can report either URLs that contain their sexual imagery or the individual images and videos themselves.

Of the 508 reports assessed as containing criminal images, most contained Category C images (76%) and, similar to last year, more boys than girls reported actionable images to us, with boys making up 70% of the total.

Report Remove - overview by sex

This chart provides an overview of the sex of the child depicted in the images or videos sent through the Report Remove tool.

 

Breakdown by sex and age group of child depicted


 

This chart shows that boys aged 16 and 17 represent the biggest user group of the Report Remove service.

 

Report Remove - breakdown by sex and severity


 

Most images and videos reported by young people through Report Remove and assessed as child sexual abuse material fall into Category C, with a notable proportion of the imagery of boys assessed as Category B.

  • Category A: Images involving penetrative sexual activity; images involving sexual activity with an animal or sadism.
  • Category B: Images involving non-penetrative sexual activity.
  • Category C: Other indecent images not falling within categories A or B.

 

Report Remove image-level analysis

 

Reports can contain one or more images or videos, so below we provide an analysis of the individual images and videos reported to us. This data is drawn from IntelliGrade, where we create hashes of this imagery to provide to industry Members, helping to prevent its recirculation online.

Report Remove images and videos - overview by sex

 

Report Remove images and videos - overview by sex and age group


 

Report Remove - breakdown of images and videos by sex and severity


 

Meri Trustline

In collaboration with the RATI Foundation, an Indian child protection organisation, we give young people in India a way to submit online content that the IWF will assess and seek to have blocked and/or removed if it breaches UK law.

The platform has been developed by non-profit tech organisation Tech Matters in collaboration with the IWF. It builds on existing Tech Matters software, Aselo, a cloud-based contact centre platform used by the RATI Foundation to run Meri Trustline, its helpline for children facing online harms in India.

The Trustline can be accessed via a range of communication channels, such as WhatsApp, email or X (formerly known as Twitter), as well as a traditional phone number.

If a young person raises a concern about online sexual images, a Trustline counsellor can use the tailored software to generate a reporting link and share it in the chat with the user.

Clicking on the link will then give the young person a secure single-use connection to a webpage where they can directly submit images and URLs to IWF analysts for assessment. Designing the platform in this way reduces the burden on young users, making it easier for them to report content that may be very distressing and perhaps shameful to discuss, in a safe and supported counsellor-led environment.
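
As a rough sketch of how a secure, single-use reporting link might be issued and redeemed, consider the minimal Python below. The token store, URL, token length and expiry policy are assumptions made for illustration, not details of the Tech Matters/Aselo implementation.

```python
import secrets
import time

# Hypothetical in-memory token store; a real deployment would use a
# database with proper expiry and audit handling.
_active_tokens: dict[str, float] = {}

TOKEN_TTL_SECONDS = 3600  # assumed policy: links expire after one hour

def generate_report_link(base_url: str = "https://example.org/report") -> str:
    """Create a one-time reporting link for a counsellor to share in chat."""
    token = secrets.token_urlsafe(32)         # unguessable token
    _active_tokens[token] = time.time() + TOKEN_TTL_SECONDS
    return f"{base_url}/{token}"

def redeem_token(token: str) -> bool:
    """Open the submission page at most once, then invalidate the token."""
    expiry = _active_tokens.pop(token, None)  # single use: removed on access
    return expiry is not None and time.time() < expiry
```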

In 2023, the IWF received six reports through Meri Trustline, three of which were actionable.