Glossary

Regularly used terms and acronyms

 

Action or Actioned (see also confirmed):  A report which has been assessed and found to contain child sexual abuse material and where we took active steps to remove this material from the internet.

Adverts: Text or imagery that promotes access to or the sale of child sexual abuse material.

Agentic AI tools: Autonomous software systems powered by Large Language Models (LLMs) that set goals, plan tasks, and execute multi-step workflows with limited human oversight. Unlike passive chatbots, these agents proactively solve complex problems by interacting with external tools, APIs, and software. They adapt to feedback and learn from their environment to achieve specific, high-level objectives.

Assess or Assessed: The term ‘assessed’ means an analyst has taken time to review a report, which may contain URLs, images or videos, or methods for accessing imagery, to determine if it contains or links to criminal child sexual abuse content or not.

Bad actors: Any individual or group that engages in harmful, unethical, or malicious behaviour through the production, distribution and/or consumption of child sexual abuse material.

Banner site: A website or webpage made up of adverts for other websites with text links or images that take you to third-party websites when you click on them.

Blog: A blog is a discussion or information site made up of separate entries, or posts. Most are interactive, and visitors can leave comments and even message each other on the blog. The interactivity is what makes them different from other static websites.

CAID: The Child Abuse Image Database (CAID) is a project led by the Home Office which enables UK law enforcement to assess, categorise and generate unique hashes for tens of millions of child abuse images and videos found during their investigations.

Category A, B and C: We assess child sexual abuse material according to the levels detailed in the Sentencing Council's Sexual Offences Definitive Guideline. The Indecent Photographs of Children section (Page 34) outlines the different categories of child sexual abuse material.

  • Category A: Images or videos involving penetrative sexual activity; images involving sexual activity with an animal; or sadism.
  • Category B: Images or videos involving non-penetrative sexual activity.
  • Category C: Other indecent images or videos not falling within categories A or B.

Child sexual abuse content/material: Media (images, videos and manuals) that show the sexual abuse of children, as well as links and other methods used to access it.

Child sexual abuse imagery: Images and videos that show the sexual abuse of children.

Clear web: The publicly accessible portion of the internet indexed by traditional search engines like Google, Bing and Yahoo.

Collage (see also grid images): An image made up of multiple smaller images, often arranged in a grid pattern.

Coms Groups: Interconnected, transnational online networks involved in serious, high-harm criminal activities, including child sexual abuse, serious violence, cybercrime, and extremism.

Confirmed (see also action / actioned): A report which has been assessed and found to contain child sexual abuse material and where we took active steps to remove this material from the internet.

Cookies: A small piece of data sent to the web browser by the server. This data is recorded in the Cookie field in the HTTP request. Just like with referrers, the server can check this data and show you a different result depending on the cookies you have acquired.

Cryptographic hash: A cryptographic hash is a digital fingerprint of any form of digital data. Cryptographic algorithms can hash a single word, an mp3, a zip file – anything digital. Cryptographic hashes can be used to identify exact matches of that digital data.
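As an illustration, Python's standard hashlib library can compute such a fingerprint. Even a one-character change to the input produces a completely different hash, which is why cryptographic hashes only ever identify exact matches:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return the SHA-256 hash of any digital data as a hex string."""
    return hashlib.sha256(data).hexdigest()

h1 = fingerprint(b"example file contents")
h2 = fingerprint(b"example file contents!")  # one character added

print(h1 == h2)  # False: even a tiny change gives a different fingerprint
print(len(h1))   # 64 hex characters for a SHA-256 hash
```

The same function works unchanged on the bytes of a word, an mp3 or a zip file, since it operates on raw binary data.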

Cyberlockers: File hosting services, cloud storage services or online file storage providers. They are internet hosting services specifically designed to host users’ files.

Dark web: The dark web, also known as the dark net, is the hidden part of the internet accessed using Tor or a similar dark web-compatible browser. Tor is anonymity software that makes it difficult to trace users’ online activity.

Deepfakes: Media (images, videos, or audio) that has been digitally manipulated through AI tools or software to replace one person’s likeness convincingly with that of another. 

Disguised websites: Websites which, when loaded directly into a browser, show legal content - but when accessed through a particular pathway (or referrer website) show illegal content, for example child sexual abuse images.

Domain alerts: Details of domain names that are known to be hosting child sexual abuse content.

Domain Name System (DNS): A system that translates human-readable domain names (such as www.example.com) into machine-readable IP addresses (such as 192.0.2.1). It allows users to access websites using easy-to-remember names instead of numeric IP addresses.

DNS operates through a network of servers that resolve domain names into corresponding IP addresses, enabling web browsers and other internet services to connect to the correct servers.
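The lookup itself can be pictured as a name-to-address table. A minimal sketch of the idea (the table entries below are invented for illustration; a real resolver queries a hierarchy of DNS servers rather than a local dictionary):

```python
# Toy DNS table mapping human-readable names to IP addresses.
# These records are illustrative placeholders, not real DNS data.
DNS_TABLE = {
    "www.example.com": "192.0.2.1",
    "mail.example.com": "192.0.2.25",
}

def resolve(domain: str) -> str:
    """Return the IP address for a domain name, as a resolver would."""
    try:
        return DNS_TABLE[domain]
    except KeyError:
        raise LookupError(f"NXDOMAIN: {domain} does not exist")

print(resolve("www.example.com"))  # 192.0.2.1
```

Once the browser has the IP address, it connects to that server directly; the domain name plays no further part in the connection.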

Forum: An online chat site where people talk or upload files in the form of posts. A forum can hold sub-forums, and each of these could have several topics. Within a topic, each new discussion started is called a thread, and any forum user can reply to this thread.

Gateway sites (see also referral sites): A webpage that provides direct access to child sexual abuse material, e.g. via a link, but does not itself contain it.

Graphics Processing Units (GPUs): Sometimes called 'graphics cards', GPUs are designed to perform many simultaneous calculations independently. They are typically used in gaming, but can also be used to solve more general-purpose computational problems on large-scale data.

Grid images (see also collages): An image made up of multiple smaller images, often arranged in a grid pattern.

Hash/hashes: A string of letters and numbers generated from the binary data of a digital file using a hashing algorithm. A hash acts as a digital fingerprint of the file; even a very small change to the file will result in a different hash value.

Once an image or video has been assessed as child sexual abuse material it is given a hash value to enable it to be identified without needing to visually examine it. 

Hash List: The IWF Hash List contains hashes (see hashes) of known criminal images and videos that we have assessed as showing child sexual abuse. It is updated daily and manually verified by our expert analysts.

By using our Hash List of criminal child sexual abuse material, tech companies can look for matching hashes in their systems and, when a match is identified, stop criminals from uploading, downloading, viewing, sharing or hosting these known images and videos showing child sexual abuse. 
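In programming terms, this matching is a set-membership check: hash the incoming file and test whether the result appears in the list. A hedged sketch of the idea (the 'known' hashes below are placeholders computed from dummy bytes; a real deployment would load the licensed list itself):

```python
import hashlib

# Placeholder hash set standing in for a licensed hash list.
# The entries are derived from dummy strings purely for illustration.
KNOWN_HASHES = {
    hashlib.sha256(b"known-file-1").hexdigest(),
    hashlib.sha256(b"known-file-2").hexdigest(),
}

def matches_hash_list(file_bytes: bytes) -> bool:
    """Return True if this file's hash matches an entry on the hash list."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES

print(matches_hash_list(b"known-file-1"))    # True: match found, block the upload
print(matches_hash_list(b"unrelated-file"))  # False: no match, no action needed
```

Because only hashes are compared, a platform can detect known material without ever needing to store or view the imagery itself.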

Hidden services: Websites that are hosted within a proxy network, so their location can’t be traced.

Hosting: The process of storing and managing digital content such as websites, applications, or data on a specialised computer called a server, making it accessible to users over the internet or a network.

Image board: A type of internet forum that operates mostly through posting images. They’re used for discussions on a variety of topics, and are similar to bulletin board systems, but with a focus on images.

Image host/ Image hosting site: A service that lets users upload images which are then available through a unique URL. This URL can be used to make online links, or be embedded in other websites, forums and social networking sites.

Image store: A service or system designed as a digital container to hold images or visual media files for later retrieval and use.

Internet Service Provider (ISP): A company or organisation that provides access to the internet, internet connectivity and other related services, like hosting websites.

Invite Child Abuse Pyramid (ICAP): Custom-built websites using viral marketing techniques, similar to that of a pyramid scheme, where people are incentivised to share links to child sexual abuse sites far and wide in a ‘scattergun’ approach and with the aim of recruiting as many buyers as possible.  

Inchoate sites: Sites that share non-clickable (text) links and other information - such as URLs - that enable or assist others in committing the offence of accessing child sexual abuse material. Distribution methods can include innocent-looking videos with links and instructions in the comments section on how to view and download content elsewhere; some videos even hide passwords that unlock illegal content on secondary sites.

IntelliGrade: Our powerful in-house software tool that enables our analysts to accurately grade child sexual abuse images and videos, while automatically generating unique hashes which are used to identify and eliminate these images wherever they appear.   

Keywords: A collection of words and phrases that are identified as being associated with child sexual abuse material.

Manuals: Documents that provide instructions or guidance on how to locate or perpetrate child sexual abuse online.

Metadata: The additional information recorded about images and videos that have been assessed as child sexual abuse. This could include age, sex, type of sexual activity, or whether the image was ‘self-generated’.

Multichild: A new feature (2024) that allows IWF analysts to track information about all the children seen in an image. It is integrated into the IWF’s IntelliGrade software, a powerful tool that enables us to accurately assess individual child sexual abuse images, while automatically generating unique hashes.

By recording child data for all children present in a unique image, we are able to more accurately represent the number of children seen in criminal imagery.

Newsgroups: Internet discussion groups dedicated to a variety of subjects. Users make posts to a newsgroup and others can see them and comment. Sometimes called ‘Usenet’, newsgroups were the original online forums and a precursor to the World Wide Web.

Non-photographic imagery (NPI): Images and videos of child sexual abuse which are not photographs, for example computer-generated images.

Notice and takedown (NTD): Formal notice to take down content in accordance with The Electronic Commerce (EC Directive) Regulations 2002 (specifically Regulation 19).

The communication is formal notice to take down the URL(s) which we have determined contains, advertises or links to child sexual abuse material which we believe to be criminal under UK law.

Nudify/Nudification: AI apps which allow users to easily remove the clothing from images of real people, including children.

Open source: Software whose source code is released under a license in which the copyright holder grants users the rights to use, study, change, and distribute the software and its source code to anyone and for any purpose.

.onion: A special-use top level domain designation for anonymous "hidden services" reachable only through the Tor network.

Perceptual hash: A perceptual hash is a digital fingerprint of an image which has been created using an algorithm. Perceptual hashes enable near-duplicates of that image to be identified.
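One simple perceptual-hash scheme is the 'average hash': shrink the image to a tiny grayscale grid, then record for each pixel whether it is brighter than the grid's mean. Near-duplicates then differ in only a few bits, measured by Hamming distance. A minimal sketch operating on an already-shrunk 4×4 grid of brightness values (real implementations resize and grayscale full images first, and typically use larger grids):

```python
def average_hash(pixels: list[int]) -> int:
    """Build a bit-per-pixel hash: 1 where the pixel exceeds the mean brightness."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1: int, h2: int) -> int:
    """Count differing bits; a small distance suggests near-duplicate images."""
    return bin(h1 ^ h2).count("1")

# Two 4x4 brightness grids: the second is the first with slight pixel changes,
# as might result from re-saving or lightly editing the same image.
original = [10, 200, 30, 220, 15, 210, 25, 205, 12, 198, 28, 215, 18, 202, 22, 208]
tweaked  = [12, 201, 30, 220, 15, 210, 25, 205, 12, 198, 28, 215, 18, 202, 22, 209]

d = hamming_distance(average_hash(original), average_hash(tweaked))
print(d)  # small distance: likely the same image, slightly altered
```

This is what distinguishes perceptual hashes from cryptographic hashes: a cryptographic hash changes completely on any edit, while a perceptual hash changes only a little when the image changes only a little.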

Proactive: We can actively search for child sexual abuse content, and create reports for action by our analysts. We call this ‘proactive searching’, and it accounts for a large proportion of our work.

Prohibited reports/ images: Non-photographic images of child sexual abuse. They include computer generated images (CGI), cartoons, manga images and drawings.

Proxy network: Systems that enable online anonymity and can provide faster service requests, encryption, security and other features. Some proxy software, such as Tor, attempts to conceal the true location of services.

Redirector: A technique used by websites to make a web page available under more than one URL address. When a web browser attempts to open a URL that has been redirected, a page with a different URL is opened. 

Referral sites (see also gateway sites): A webpage that provides direct access to child sexual abuse material, e.g. via a link, but does not itself contain it.

Referrer: An optional field in the HTTP request which identifies the page making the request, allowing the server to see which web page you have come from. When your browser requests a web page, the server can check this field and a web developer can program which version of the site you see depending on its contents.

Reporting Portals: A world-class reporting solution for child sexual abuse content, provided by the IWF for countries which don’t have an existing Hotline.

Revictimisation: Revictimisation, or repeated victimisation, is what happens to a victim each time their image is shared online. A single image of a victim can be shared once, tens, hundreds or even thousands of times.

Self-reporter: Children or young people who report their own intimate, sexual or explicit imagery directly to us or via our child reporting services.

‘Self-generated’ child sexual abuse imagery: ‘Self-generated’ images and videos are those where a child or children can be seen alone, with no perpetrator physically present with them at the time the imagery was captured - though they may be digitally present.

These children are most often groomed, deceived or extorted into producing and sharing sexual imagery of themselves. Sometimes children are completely unaware they are being recorded and that an image or video of them is then being watched and shared by abusers.

We regard the term ‘self-generated’ child sexual abuse as an inadequate and potentially misleading term which does not fully encompass the full range of factors often present within this imagery, and which appears to place the blame with the victim themselves. Children are not responsible for their own sexual abuse. Until a better term is found, however, we will continue to use the term ‘self-generated’ as, within the online safety and law enforcement sectors, it is well recognised.

Sexually coerced extortion/ sextortion: A form of blackmail in which intimate, nude, or sexual images or videos are used to threaten or exploit an individual, often to demand additional imagery or money. In some cases, the images or videos are captured without the individual’s knowledge.

Social networking site: A social networking service is a platform to build social relations. It usually has a representation of each user (often a profile), their social links and a variety of other services. Popular examples include Facebook and X.

Top-level domain (TLD): Domains at the top of the domain name hierarchy. For example, .com, .org and .info are all examples of generic top-level domains (gTLDs). The term also covers country code top-level domains (ccTLDs) like .uk for UK or .us for US, and sponsored top-level domains (sTLDs) like .mobi or .xxx.

URL: An acronym for Uniform Resource Locator. A URL is the specific location where a file is saved online. For example, the URL of the IWF logo which appears on the webpage www.iwf.org.uk is www.iwf.org.uk/themes/iwf/images/theme-images/logo.png.
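A URL breaks down into standard components (scheme, host, path), which Python's standard urllib can separate:

```python
from urllib.parse import urlparse

# Split a URL into its components (same logo URL as the example above,
# with the https:// scheme written out explicitly).
parts = urlparse("https://www.iwf.org.uk/themes/iwf/images/theme-images/logo.png")

print(parts.scheme)  # https
print(parts.netloc)  # www.iwf.org.uk  (the domain)
print(parts.path)    # /themes/iwf/images/theme-images/logo.png
```

The domain part is what entries like 'Domain alerts' and 'Top-level domain' refer to, while the full URL pinpoints one specific file on that domain.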

URL List: Our dynamic URL List provides a comprehensive list of webpages where we’ve confirmed images and videos of child sexual abuse. We update it twice a day, adding new URLs as our analysts find them and removing URLs that no longer contain the criminal content.

All IWF Members can use this List, under license, so that they can block access to these criminal webpages. While access to the images and videos is blocked, we work to have the actual image or video removed from the internet. 

Webpage: A document which can be seen using a web browser. A single webpage can hold lots of images, text, videos or hyperlinks and many websites will have lots of webpages. www.iwf.org.uk/about-iwf and www.iwf.org.uk/Hotline are both examples of webpages.

Website: A website is a set of related webpages typically served from a single web domain. Most websites have several webpages.