In 2023, the Internet Watch Foundation (IWF) announced a partnership with Aylo (formerly MindGeek) to develop a robust standard of good practice to guide the adult industry in the fight against child sexual abuse material (CSAM) online.
The IWF assembled an Expert Advisory Board to lead the development of these standards. The board includes representatives from the British Board of Film Classification, SWGfL, Aylo, the Marie Collins Foundation, the Home Office, PA Consulting, the National Crime Agency (NCA), and academia, represented by Middlesex University and Exeter University.
Chaired by Sir Richard Tilt, former Chair of the IWF, the board spent the following 18 months developing the standards. It engaged with a range of stakeholders - including academics, child protection organisations and adult service providers - to assess the current landscape and refine the proposals.
We are proud to announce the culmination of these efforts: the IWF's standard of good practice for adult content providers. This framework sets out the essential criteria that adult services should meet in order to have confidence that child sexual abuse imagery is prevented from appearing on their services.
These efforts by the IWF and its partners mark a significant step towards safer online environments, protecting children and young people while setting an exemplary standard for the adult industry.
Download: Standards and Supporting Document (PDF)
This document sets out the steps that should be taken by adult services that wish to prevent child sexual abuse imagery from appearing on their services.
The Advisory Board has set out six principles for adult services tackling child sexual abuse (CSA), which apply in all countries where services are provided and across all sites:
Baseline Standard
a. Adult services must comply with legislation and regulations¹. This includes requirements to assess the level of risk on their services and to embrace safety by design as an approach to mitigating harms in the configuration of their platform or service.
b. Adult services must adopt mechanisms to detect and prevent known child sexual abuse material (CSAM), as deemed appropriate by the IWF (a sketch of one common approach, hash matching, follows this list).
c. Adult services must ensure that all published material complies with the following standards (taken from the British Board of Film Classification's (BBFC) standards for an R18 certificate) and therefore excludes:
d. Adult services must publish transparency reports every six months².
e. Adult services must establish and maintain a dedicated portal to facilitate accessible and secure communication between law enforcement and the platform regarding specific investigations, ensuring a channel for victim redress is readily available.
f. Adult services must have a clear reporting function for users to flag harmful content³.
g. Adult services must subject anyone visiting, publishing, or appearing in material on the platform to age verification measures, confirming that they are over 18 years old at the time of production, before content is permitted to be published⁴.
h. Adult services must ensure that consent is secured from all individuals appearing in content. All individuals must be allowed to withdraw consent at any time. Cases involving professional contracts should be assessed on a case-by-case basis.
i. Adult services must not support technologies that obscure the content of messaging and other communications in ways that would inhibit moderation.
j. Adult services must not adopt, or encourage the adoption of, technologies that can be used to bypass content filtering and content blocking mechanisms, whether for accessing their services or hosting them.
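Principle (b) above is most commonly implemented through hash matching: the service computes a digest of every upload and compares it against a vetted list of digests of known material, such as the hash list the IWF provides to its members. The sketch below is a minimal illustration only, in Python; the file paths, list format, and function names are hypothetical, and real hash lists are distributed under licence in their own formats.

```python
import hashlib
from pathlib import Path

def load_hash_list(path: str) -> set[str]:
    # Hypothetical format: one lowercase hex digest per line.
    # Real vetted lists (e.g. the IWF hash list) use their own formats.
    return {line.strip().lower()
            for line in Path(path).read_text().splitlines()
            if line.strip()}

def sha256_of(path: str) -> str:
    # Stream the file in 1 MiB chunks so large uploads fit in memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_material(upload_path: str, known_hashes: set[str]) -> bool:
    # Reject the upload before publication if its digest is on the list.
    return sha256_of(upload_path) in known_hashes
```

Cryptographic hashes such as SHA-256 only catch exact byte-for-byte duplicates; in practice, services pair them with perceptual hashes (PhotoDNA-style matching) so that resized or re-encoded copies of known material still match.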
Higher Standard
a. Human moderators must review all content before it can be published on the platform.
b. Human moderators must be well-supported.
c. Adult services must deploy tools and changes on all historical content and accounts where reasonably possible, even if this requires removing old content.
d. Adult services should display clear and effective deterrence messaging if a user attempts to search for words and terms associated with underage content or other types of illegal content. This should be supplemented by a message encouraging users to contact an appropriate support organisation (a sketch of such an intercept follows this list).
e. Adult services should seek to engage with other relevant organisations to use their tools, participate in initiatives, and seek their expertise.
f. Adult services should invest in horizon scanning for future issues and share intelligence with other platforms about threats to children on their sites or services.
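As a simple illustration of principle (d) above, the sketch below intercepts searches against a term blocklist and returns a deterrence message in place of results. The term list, message wording, and function name are hypothetical; real deployments source their term lists from expert organisations and signpost a specific support service.

```python
# Hypothetical deterrence intercept; names and wording are illustrative.
BLOCKED_TERMS = {"example-blocked-term"}  # in practice, an expert-curated list

DETERRENCE_MESSAGE = (
    "Searching for this material is illegal. "
    "Confidential, anonymous support is available if you are worried "
    "about your own behaviour - contact an appropriate support organisation."
)

def handle_search(query: str) -> str | None:
    # Return the deterrence message instead of results when the query
    # contains a blocked term; return None to let the search proceed.
    words = set(query.lower().split())
    if words & BLOCKED_TERMS:
        return DETERRENCE_MESSAGE
    return None
```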
1. Legal requirement: implicit.
2. Legal requirement: Digital Services Act, Article 24(2).
3. Legal requirement: Online Safety Act, Part 3, Section 20, Subsection 3.
4. Legal requirement: Online Safety Act, Part 5, Section 81, Subsection 2.