For many years, the IWF has worked with a range of stakeholders, including politicians and campaigners, to advocate for strong regulations combating Child Sexual Abuse Material (CSAM) and improving safety online.
We welcomed the UK Online Safety Act as a pivotal opportunity in safeguarding children from the potential dangers and harms they may encounter in the digital world.
Regulation can help ensure children are consistently provided with age-appropriate experiences that prioritise their safety online. By holding tech companies accountable, it shifts the responsibility on to platforms to minimise harm and deliver more positive outcomes for children and young people.
The UK Online Safety Act (2023) is a new law that introduces measures to protect children and adults online. The Act requires tech companies that provide user-to-user services or search engines to improve online safety by removing illegal content (which includes CSAM), addressing harmful material for children, and enforcing age limits for adult content.
Under this law, these companies are required to safeguard UK users by evaluating potential risks of harm and taking steps to address them. The new regulation applies to all relevant services with a significant number of UK users, or those targeting the UK market, irrespective of the companies’ locations.
The Act officially became law in October 2023.
Ofcom, the independent communications regulator in the UK, has since been working to implement the new legislation. Ofcom has drafted codes of practice setting out specific steps that providers can take to meet their safety obligations. Under the new regulatory framework, Ofcom possesses the power to assess and enforce compliance among service providers.
With the regulatory regime now in effect, Ofcom has said that "2025 is the year of action for services. Sites and apps must now act to better protect users online, especially children".
The Online Safety Act represents a crucial advancement in safeguarding children from online sexual abuse.
In 2024, IWF’s team of analysts acted on over 290,000 web pages displaying this illegal content, with each page containing hundreds, if not thousands, of indecent images of children.
This is the highest number of child sexual abuse webpages the IWF has ever discovered in its 29-year history, and a five per cent increase on the 275,650 webpages identified in 2023.
To tackle the circulation of CSAM online, the Act establishes safety duties for online services. Platforms are required to conduct risk assessments to evaluate the likelihood and impact of child sexual exploitation and abuse on their sites. These services must then take appropriate measures to mitigate identified risks, as well as actively identify and remove illegal content.
Under the Illegal Harms Codes, platforms are, for the first time, legally required to tackle CSAM, for example by deploying hash-matching technology to detect and remove child sexual abuse material, and by detecting and removing content that matches listed CSAM URLs.
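To illustrate the principle behind hash matching (a sketch only, not the IWF's or any vendor's actual implementation), the hypothetical Python snippet below compares a digital fingerprint of an uploaded file against a set of fingerprints of previously identified CSAM. Real deployments typically rely on licensed perceptual hashes (such as PhotoDNA or PDQ) supplied via services like the IWF Hash List, so that visually similar copies are matched as well as exact duplicates; the file names and use of SHA-256 here are assumptions for illustration.

```python
# Illustrative sketch of hash matching against a known-image list.
# Assumption: "known_hashes.txt" holds one hex SHA-256 digest per line.
# Real systems use licensed perceptual hashes (e.g. PhotoDNA/PDQ), not SHA-256.
import hashlib
from pathlib import Path


def load_known_hashes(path: str) -> set[str]:
    """Load previously identified digital fingerprints into a lookup set."""
    lines = Path(path).read_text().splitlines()
    return {line.strip().lower() for line in lines if line.strip()}


def fingerprint(file_bytes: bytes) -> str:
    """Compute a digital fingerprint of the uploaded content (exact match only)."""
    return hashlib.sha256(file_bytes).hexdigest()


def should_block(file_bytes: bytes, known_hashes: set[str]) -> bool:
    """Return True if the upload matches a previously identified item."""
    return fingerprint(file_bytes) in known_hashes


if __name__ == "__main__":
    known = load_known_hashes("known_hashes.txt")       # hypothetical hash list
    upload = Path("incoming_upload.jpg").read_bytes()   # hypothetical upload
    if should_block(upload, known):
        print("Match found: block the upload and escalate for review and reporting.")
    else:
        print("No match: continue normal processing.")
```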
The greater the risk on a service, the more measures and safeguards needed to keep users safe from harm and prevent grooming and exploitation of children. With the implementation of the Act, the sharing of CSAM should become significantly more difficult.
The Act requires online user-to-user services and search services to take robust action against illegal content and activity. Platforms will be required to implement measures to reduce the risks their services are used for illegal offending. They will also need to put in place systems for removing illegal content when it does appear.
Companies that fail to comply with the new regulations face fines of up to £18 million or 10% of their qualifying worldwide revenue, whichever is greater. Criminal action may also be pursued against senior managers if their company does not ensure compliance with information requests from Ofcom. The regulator will also have the authority to hold companies and senior managers criminally liable if they fail to adhere to enforcement notices regarding specific child safety duties related to child sexual abuse and exploitation on their platforms.
In December 2024, Ofcom published its first codes of practice and guidance for services to prevent illegal content on their platforms. For the first time, platforms will be legally required to detect and remove known child sexual abuse material and the IWF stands ready to support platforms to meet this obligation.
Search services will also have to take steps to reduce the risk of users encountering illegal content via their services. Companies with websites that are likely to be accessed by children need to take steps to protect children from harmful content (including pornography) and behaviour.
The Codes also include measures to tackle online grooming. For example, children’s profiles and locations will not be visible to other users, and non-connected accounts cannot send them direct messages.
The Online Safety Act’s illegal duties came into force on 17 March 2025. We expect to see a significant increase in the range of providers hash matching for known CSAM. For example, all file-storage and file-sharing services will have to undertake hash matching, regardless of size.
Ofcom will also assess the measures being taken to stop the spread of CSAM, by file-sharing and file-storage providers, through their new enforcement programme. You can read more in our blog post on the Illegal Harms Codes.
| Number in our Codes | Recommended measure | Who should implement this |
| --- | --- | --- |
| ICU C9 | Providers should ensure that hash-matching technology is used to detect and remove child sexual abuse material (CSAM). This involves analysing images and videos communicated publicly on the service and comparing a digital fingerprint of that content to digital fingerprints of previously identified CSAM. | |
| ICU C10 | Providers should detect and remove content communicated publicly on the service which matches a URL on a list of URLs previously identified as hosting CSAM. | |
Table extract from: Ofcom: Protecting people from illegal harms online Volume 2: Service design and user choice. p149.
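As a rough illustration of the kind of URL matching ICU C10 describes (a sketch under assumptions, not Ofcom's or the IWF's specification), the hypothetical Python snippet below normalises a posted link and checks it against a blocklist of URLs previously identified as hosting CSAM. In practice, services would consume a managed list such as the IWF URL List under licence, and normalisation rules would be far more thorough; the placeholder blocklist entry and normalisation steps here are assumptions.

```python
# Minimal sketch of matching user-posted links against a CSAM URL list.
# The blocklist contents and normalisation rules are illustrative only.
from urllib.parse import urlsplit


def normalise(url: str) -> str:
    """Reduce a URL to a comparable form: lowercase host, drop 'www.' and trailing slashes."""
    parts = urlsplit(url.strip())
    host = parts.netloc.lower().removeprefix("www.")
    path = parts.path.rstrip("/") or "/"
    return f"{host}{path}"


def build_blocklist(raw_urls: list[str]) -> set[str]:
    """Pre-normalise the list of previously identified URLs."""
    return {normalise(u) for u in raw_urls}


def is_listed(url: str, blocklist: set[str]) -> bool:
    """Return True if the posted link matches a previously identified CSAM URL."""
    return normalise(url) in blocklist


if __name__ == "__main__":
    blocklist = build_blocklist(["https://example.invalid/bad-page/"])  # placeholder entry
    post_link = "http://www.EXAMPLE.invalid/bad-page"
    if is_listed(post_link, blocklist):
        print("Listed URL detected: remove the content and log it for review.")
```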
The Act states that Ofcom may not recommend the use of proactive technology to analyse user-generated content communicated privately, or metadata relating to user-generated content communicated privately. This means that Ofcom cannot direct service providers to deploy proactive technology, such as the IWF’s services, in private communications.
Regarding end-to-end encrypted (E2EE) environments, Ofcom has outlined criteria for when an E2EE service would be considered private, and in what circumstances it would be considered public.
Section 121 of the Act sets out Ofcom’s powers to require services to use accredited technology, including in private messaging, to tackle child sexual exploitation and abuse (as well as terrorism).
The section states that if Ofcom considers that it is necessary and proportionate to do so, the regulator may give a notice to a service provider. The notice requires the provider to use its “best endeavours to develop or source technology for use on or in relation to the service or part of the service”.
This proactive approach is critical for the detection of CSAM, ensuring that safeguards are in place wherever there is a risk to children or potential exposure to CSAM. By enforcing the requirement for companies to use their best endeavours in detecting CSAM, Ofcom can help ensure that technology is effectively deployed to combat these threats.
It is crucial that Ofcom fully leverages its authority under Section 121 of the Act, compelling tech companies to take all necessary measures to detect, block, and prevent the upload, sharing, and storage of images and videos depicting child sexual abuse.
The Act requires that services hosting pornography or other harmful content implement 'age assurance' measures to ensure children are not normally able to access such material.
Age assurance methods - such as age verification, age estimation, or a combination of both - must be ‘highly effective’ in accurately determining whether a user is a child.
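By way of illustration only (the Act and Ofcom's guidance define what counts as 'highly effective' age assurance, not this sketch), the hypothetical Python snippet below shows how a service might gate access to adult content on the outcome of an age-assurance check, accepting either a verified age or an age estimate with a safety margin. The result structure, field names and the 25+ estimation buffer are assumptions made for the example.

```python
# Hypothetical gate on an age-assurance outcome before serving adult content.
# The AgeAssuranceResult fields and the 25+ estimation margin are illustrative
# assumptions, not Ofcom's criteria for "highly effective" age assurance.
from dataclasses import dataclass
from typing import Optional


@dataclass
class AgeAssuranceResult:
    verified_age: Optional[int]     # from an age-verification check, if performed
    estimated_age: Optional[float]  # from an age-estimation check, if performed


def may_access_adult_content(result: AgeAssuranceResult) -> bool:
    """Allow access only when the assurance outcome clearly indicates an adult."""
    if result.verified_age is not None:
        return result.verified_age >= 18
    if result.estimated_age is not None:
        # Apply a buffer so borderline estimates are treated as children.
        return result.estimated_age >= 25
    return False  # no assurance signal: default to denying access


print(may_access_adult_content(AgeAssuranceResult(verified_age=None, estimated_age=21.0)))  # False
```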
On 25 July 2025, the children’s safety duties under the Online Safety Act came into force. This means user-to-user services that are “likely to be accessed by children” must ensure that those children are prevented from encountering primary priority content (including pornography) and protected from being exposed to a list of other priority content.
For some services, such as adult services, this means they need to introduce “highly effective” age assurance to keep children off the entire service.
On 24 April 2025, Ofcom published its final proposals for the protection of children online.
The Children’s Codes outline the recommended safety measures providers will need to comply with if their services are likely to be accessed by a child. Where a service is likely to be accessed by a child, the platform is required to carry out a children’s risk assessment and use proportionate safety measures to keep children safe online.
Services are not obligated to follow the recommended measures in the Children’s Codes as long as they introduce measures that achieve the same safeguarding outcome.
Ofcom has made some positive changes in the Children’s Codes to ensure children have age-appropriate experiences. It has met the IWF’s call for “age-appropriate access to content, features, and functionalities” by adjusting its content moderation and recommender system measures.
Ofcom has also acknowledged the need for platforms to better align with the ICO’s Age-Appropriate Design Code (AADC). However, the extent to which the incorporation of the AADC is enforceable remains unclear.
The Codes now require services to take a child’s age into account when deciding how to respond to primary priority content and non-designated content. They also require platforms to consider the risks to children in different age groups when developing their content and search moderation measures.
Services have also been asked to take a more “protective approach” to content moderation for under 16s, including image blurring, “while maintaining a robust level of protections for 16-17-year-olds.”
The children’s safety duties came into force on 25 July 2025, and these are welcome content moderation measures to ensure children’s interactions online are age appropriate.
To assess and monitor industry compliance with the illegal content risk assessment duties under the Act, Ofcom has launched an enforcement programme.
On 31 March 2025, certain large services, as well as small but risky sites, submitted illegal harms risk assessments to the regulator.
Providers must determine how likely it is that users could encounter illegal content on their service, or, in the case of user-to-user services, how they could be used to commit or facilitate certain criminal offences. Providers must also make and keep a written record of their risk assessment, including details about how it was carried out and its findings.
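As a purely illustrative sketch of the kind of written record a provider might keep (the structure, field names and example values are assumptions; Ofcom's risk assessment guidance defines what must actually be recorded), the Python snippet below captures the methodology of an assessment together with one risk area, its likelihood, impact, findings and mitigations.

```python
# Hypothetical structure for a written illegal-content risk assessment record.
# Field names and values are illustrative; Ofcom's guidance sets the real requirements.
from dataclasses import dataclass, field
from datetime import date


@dataclass
class RiskAssessmentEntry:
    risk_area: str                # e.g. an illegal harm such as CSEA
    likelihood: str               # e.g. "low" / "medium" / "high"
    impact: str
    findings: str                 # how the conclusion was reached
    mitigations: list[str] = field(default_factory=list)


@dataclass
class RiskAssessmentRecord:
    service_name: str
    completed_on: date
    methodology: str              # details of how the assessment was carried out
    entries: list[RiskAssessmentEntry] = field(default_factory=list)


record = RiskAssessmentRecord(
    service_name="ExampleShare",  # hypothetical service
    completed_on=date(2025, 3, 31),
    methodology="Review against Ofcom risk profiles plus internal abuse-report data.",
    entries=[RiskAssessmentEntry(
        risk_area="Child sexual exploitation and abuse (CSEA)",
        likelihood="high",
        impact="high",
        findings="Public image sharing with no hash matching currently deployed.",
        mitigations=["Deploy hash matching (ICU C9)", "Deploy URL list matching (ICU C10)"],
    )],
)
```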
Ofcom will scrutinise the compliance of sites and apps that may present particular risks of harm from illegal content due to their size or nature – for example because their users may risk encountering some of the most harmful forms of online content and conduct, including child sexual exploitation.
Providers are required, by law, to respond to any statutory request for information by Ofcom in an accurate, complete and timely way. If any platform does not provide a satisfactory response by the deadline, Ofcom will open investigations into individual service providers.
Ofcom has strong enforcement powers at its disposal, including being able to issue fines of up to 10% of turnover or £18m – whichever is greater – or to apply to a court to block a site in the UK in the most serious cases.
As of August 2025, Ofcom has issued its Illegal Harms Codes and its codes for the protection of children.
Ofcom is referring to these published codes as “first edition” codes, reflecting that the regime must be iterative and that further codes will be developed by the regulator to address emerging technologies and the associated harms.
In June 2025, Ofcom launched a consultation on additional safety measures to be included in its existing online safety codes of practice on illegal harms and the protection of children. The new measures on which Ofcom is consulting focus on additions to existing measures, gaps identified in previous codes, and responses to new threats.
The deadline for responses to Ofcom's consultation is 20 October 2025.
For more details of the timeline of the Act’s implementation, see Ofcom’s updated roadmap.
Companies looking to ensure their platforms comply with the provisions set out in the UK Online Safety Act and contribute to making the internet a safer place for all can apply to join the Internet Watch Foundation as Members.
Members can gain access to a range of cutting-edge datasets and alerts, to protect their platforms, brands and customers from known CSAM content, as well as early insights into threats and trends.
Find out more here or contact our team directly at [email protected].
Our new tool Image Intercept allows eligible small businesses to proactively detect and stop known child sexual abuse images and videos, by leveraging our advanced hash-matching technologies. Find out more here or email [email protected].
For information on complying with the Online Safety Act, head to Ofcom’s guide for services.