In 2024, the Internet Watch Foundation confirmed 291,273 reports of child sexual abuse material (CSAM) online, the highest number ever recorded. That’s nearly 800 reports a day, each containing content that shows the sexual abuse of a child.
Yet the phrase often still used to describe this content is ‘child pornography.’
This article will break down what CSAM is, what makes it different, and why language matters in the fight to protect children. We’ll also explore the legalities behind CSAM, the growing role of technology, what’s being done to stop this abuse, and how you can be part of the solution.
What is ‘child porn’?
There is no such thing as ‘child pornography’. The phrasing implies consent and even legality, since pornography is legal in most countries for consenting adults. But children cannot legally or ethically consent to sexual activity, and they can never be complicit in their own abuse.
That’s why the Internet Watch Foundation (IWF) is calling for an end to the outdated and damaging use of the phrase ‘child pornography’. It’s time to use language that reflects the true nature of these acts and respects the victims of these crimes.
While the term is still used in some legislation and media, it’s not the right language. What people often call ‘child porn’ is more accurately known as child sexual abuse material (CSAM).
What is CSAM (child sexual abuse material)?
CSAM includes any images or videos that show the sexual abuse or exploitation of a child.
CSAM takes many forms. Sometimes it’s the result of grooming, where someone builds trust with a child online and then manipulates them into sharing explicit images. In other cases, it involves sexually coerced extortion (sometimes called ‘sextortion’), in which a child is blackmailed into sending further imagery or money under threat.
And now, with the rise of generative AI, some CSAM is entirely computer-generated yet disturbingly realistic. Even when no real child was directly involved in the sexual abuse, these images still feed demand and normalise abuse, especially when the AI models have been trained on images of real children.
In most countries, creating, sharing or viewing child sexual abuse material (CSAM) is a serious criminal offence. But beyond the law, it’s about protecting children and treating them with the care and respect they deserve. Using the term CSAM helps us focus on what matters: stopping abuse and standing up for children’s safety and dignity.
CSAM vs ‘child porn’: why language matters
The words we use matter. When people say ‘child porn,’ it can sound almost like a category of adult content, when in fact it is evidence of a child being abused.
This has real-world consequences. During Operation Ore, a major UK police investigation into people accessing child abuse images, media reports often incorrectly used the phrase ‘child porn.’ The result was sensationalism and, in some cases, diminished public empathy for victims. It blurred the reality of what those images represented. In some historical cases, courts have handed down lighter sentences because the material was framed as pornography rather than what it truly is: abuse.
That’s why we use the term CSAM, or child sexual abuse material: children cannot consent to their own abuse. By avoiding the phrase ‘child porn’ and using clear, accurate language like CSAM, we put responsibility where it belongs: on the offender.