‘Significant breakthrough’ as IWF analysts crack code used by predators to share child sexual abuse material online
The expanded keywords list from the IWF will help prevent criminals from accessing images and videos of children being sexually abused.
A “significant” breakthrough has been made in cracking the “predators’ code” – a secret language developed by criminals to help them access and spread child sexual abuse material on the internet.
Analysts at the Internet Watch Foundation (IWF) have spent months investigating and deciphering thousands of coded words and phrases being used as a secret “slang” by online sex predators who are searching for child sexual abuse material on the open web.
The IWF, the UK-based charity responsible for finding and removing images and videos of child sexual abuse from the internet, has added thousands of new words and phrases to its watch list after they were discovered to be secret keywords being used to find illegal content.
Until now, the IWF has maintained a watch list of 450 words and phrases which its members use to block searches for images and videos of child sexual abuse.
Now, 3,681 new keywords and phrases have been added to that list, with several hundred more still to come.
The seemingly innocent words and phrases have been adopted by online “communities” involved in sharing child sexual abuse material.
The new keywords will be used by internet companies, including mainstream search engines, to scan for the words and block any searches which would return illegal content.
Organisations that sign up as IWF members can use the keywords list. Interested parties should contact the IWF to find out which of its services would best help keep their users safe online.
Susie Hargreaves, Chief Executive of the IWF, said: “It’s a kind of predators’ code. These people have developed their own secret language over time to help them find and access criminal images of children being sexually abused without attracting the attention of their internet provider.
“These new additions to our keywords list represent a major breakthrough in stopping people finding this illegal material online and disrupting their access to images and videos that perpetuate children’s suffering and pain.”
The words which have been added to the IWF’s keywords list can appear to be normal search terms and are not always explicitly linked to child sexual abuse. The criminals who are using them know they have “double meanings” or secret combinations which are used to find the worst material.
IWF’s hotline manager, who asked to be identified only as Chris, manages a team of 13 analysts who have been deciphering words and terms used by online predators.
Chris said: “We’ve got quite strict rules around what the analysts can and can’t do, so they are not allowed to solicit any information from individuals, they are not allowed to try and coerce or tempt people into making disclosures.
“We’ll read the logs and the chat room conversations and even newsgroup conversations and see people’s exchanges, and in those exchanges it becomes apparent people are using key phrases or key words.
“It’s almost like cockney rhyming slang – when you see those phrases being used in those forums, and on those platforms, that are not common parlance anywhere else, you get an instinct for what they mean from the context in which they are being used.”
Sarah Smith, Technical Projects Officer at the IWF, said: “There are certain phrases or words that are used that might have a double meaning but, equally, people who are in these communities will be aware of particular series of images because they get labelled with particular names, so people will request images from a particular series, they will use particular words which the lay person would not understand.
“Some of them might be very straightforward, others not so much. There is a certain slang that is used.”
Ms Smith said identifying new keywords can, in turn, help analysts at the IWF find and remove new child abuse material they have not previously seen.
She said, however, that the criminals’ language is “constantly evolving” to help them evade detection and to prevent the criminal content they are sharing from being identified and removed.
She added: “It is always an arms race when we are dealing with the offender community. It is a high stakes, high risk game for them to try to ensure they can maintain their access to this content and also to ensure they are not identified.”
Ms Smith said the new keywords will make a “significant impact in our ability to disrupt access to this content”.
Bino Joseph is a developer at the IWF. He has been part of the team working to build the tools which help to expose criminal content and get it removed from the internet.
He said: “In the current list there are words which, on their own, straight away mean child abuse terms. And there are other words which, on their own, don’t mean child abuse but, if you are using them in combination with any other words in the list, then they become a key word.”
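The two-tier matching Mr Joseph describes could be sketched roughly as follows. This is a minimal illustration only: the IWF’s actual list and matching rules are not public, and the keywords below are neutral placeholders invented for the example, not real list entries.

```python
# A minimal sketch of two-tier keyword matching, as described above.
# Both keyword sets are hypothetical placeholders for illustration.

STANDALONE = {"termx"}                    # flagged whenever they appear alone
COMBINATION = {"blue", "kite", "marble"}  # flagged only when two or more co-occur

def is_flagged(query: str) -> bool:
    """Return True if a search query matches the (illustrative) watch list."""
    words = set(query.lower().split())
    # Tier 1: any standalone keyword is enough on its own.
    if words & STANDALONE:
        return True
    # Tier 2: combination keywords only trigger when used together.
    return len(words & COMBINATION) >= 2

print(is_flagged("buy a kite"))          # one combination word: not flagged
print(is_flagged("blue kite for sale"))  # two combination words: flagged
```

In practice a real deployment would need stemming, phrase matching and constant updating as the slang evolves, but the core idea – some terms flagged outright, others only in combination – is what the quote describes.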
He said the new keywords list will help a range of companies, from internet service providers to companies which run chat functions or messaging services, as well as web hosting companies, to keep their platforms safe.