Child sexual abuse material vs ‘child porn’: why language matters

Published: Tue 1 Jul 2025

In 2024, the Internet Watch Foundation confirmed 291,273 reports of child sexual abuse material (CSAM) online, the highest number ever recorded. That’s nearly 800 reports a day, each containing content that shows the sexual abuse of a child.

Yet the phrase often still used to describe this content is ‘child pornography.’

This article will break down what CSAM is, what makes it different, and why language matters in the fight to protect children. We’ll also explore the legalities behind CSAM, the growing role of technology, what’s being done to stop this abuse, and how you can be part of the solution.

What is ‘child porn’?

There is no such thing as ‘child pornography’. This phrasing implies consent and even legality, as pornography is legal in most countries for consenting adults. But children cannot legally or ethically consent to sexual activity, and they can never be complicit in their abuse.

That’s why the Internet Watch Foundation (IWF) is calling for an end to the outdated and damaging use of the phrase ‘child pornography’. It’s time to use language that reflects the true nature of these acts and respects the victims of these crimes.

While the term is still used in some legislation and media, it’s not the right language. What people often call ‘child porn’ is more accurately known as child sexual abuse material (CSAM).

What is CSAM (child sexual abuse material)?

CSAM includes any images or videos that show the sexual abuse or exploitation of a child.

CSAM takes many forms. Sometimes it’s the result of grooming, where someone builds trust with a child online and then manipulates them into sharing explicit images. In other cases, it involves sexually coerced extortion (sometimes called ‘sextortion’), which is when a child is blackmailed into sending more imagery or money under threat.

And now, with the rise of generative AI, some CSAM is AI-generated yet disturbingly realistic. Even if no real child was directly involved in the sexual abuse, these images still feed demand and normalise abuse, especially when the AI models have been trained on images of real children.

In most countries, creating, sharing or viewing child sexual abuse material (CSAM) is a serious criminal offence. But beyond the law, it’s about protecting children and treating them with the care and respect they deserve. Using the term CSAM helps us focus on what matters: stopping abuse and standing up for children’s safety and dignity.

CSAM vs ‘child porn’: why language matters

The words we use matter. When people say ‘child porn,’ it can sound almost like a category of adult content, but it’s evidence of a child being abused. 

This has real-world consequences. During Operation Ore, a major UK police investigation into people accessing child abuse images, media reports often incorrectly used the phrase ‘child porn.’ The result was sensationalist coverage and, in some cases, diminished public empathy for victims. It blurred the reality of what those images represented. In some historical cases, courts have handed down lighter sentences because the material was framed as pornography rather than what it truly is: abuse.

That’s why we use the term CSAM, or child sexual abuse material: children cannot consent to their own abuse. By avoiding the phrase ‘child porn’ and using clear, accurate language like CSAM, we put responsibility where it belongs: on the offender.

Is CSAM illegal?

Yes, CSAM is illegal. Across the globe, international agreements such as the UN Convention on the Rights of the Child oblige countries to criminalise it. Laws vary slightly, but the core message is the same: children must be protected from exploitation, and CSAM is a direct violation of their rights and safety.

Every image or video showing CSAM documents a moment of exploitation. It’s not just illegal because it’s disturbing or offensive; it’s illegal because it records a crime. That material often continues to circulate online for years, repeatedly victimising the person every time it’s viewed, shared, sold or downloaded. This ongoing harm is part of why most countries take such a strong stance against it.

Unfortunately, not every country enforces these laws equally. Some developing nations have weaker legal frameworks, limited law enforcement capacity or outdated definitions that don’t fully cover online content. In certain places, possession might not be explicitly criminalised, or enforcement may be inconsistent, making it harder to prosecute offenders or protect victims effectively.

That said, there is a growing global effort to strengthen laws and improve cooperation between countries.

What is being done to stop CSAM?

Thankfully, powerful work is being done to stop the spread of CSAM. At the Internet Watch Foundation (IWF), we are one of the key organisations leading this effort.

Our latest report highlights a sharp rise in material involving children who have been groomed or coerced online. To combat this and other types of CSAM, the IWF uses advanced tools like Image Intercept technology to identify and block known abuse material before it can even be uploaded.

Organisations like INTERPOL, ECPAT and INHOPE also work with governments worldwide to detect, report and remove CSAM and bring offenders to justice.

Want to be part of the solution? Whether you’re an individual or a company, you can help fund vital work and raise awareness. Visit the IWF’s Support Us page to get involved and, if you work for an organisation that handles online images and videos, consider becoming a Member.

What you can do to protect children and help survivors

If a child is in immediate danger, call 999 or the police emergency phone number in your country.

Understanding that child sexual abuse material (CSAM) is evidence of real abuse is a critical first step. Once you know that, what else can you do to help?

If someone, child or adult, has experienced child sexual abuse, it’s vital to respond with compassion, urgency and belief. Learn to spot the signs of sexual abuse: sudden changes in behaviour, withdrawal, anxiety or physical signs that don’t seem right. Trust your instincts and speak up.

Who to contact:

  • IWF: If you’ve encountered online material that involves child sexual abuse, report it anonymously here.
  • NSPCC: If you’re worried about a child, the NSPCC is here to listen. You can call their helpline 24/7 at 0808 800 5000 or visit their website for advice and support.
  • Marie Collins Foundation: The Marie Collins Foundation offers support to anyone harmed by technology-assisted child sexual abuse, which includes online child sexual abuse.
  • Stop It Now: If you’re worried about your own feelings or behaviour towards children, the Stop It Now service from the Lucy Faithfull Foundation offers counselling and advice. 

For more parenting advice and additional support, visit our useful links and parenting helplines.
