Yesterday (Thursday 8th May) the Department for Science, Innovation and Technology published its final Statement of Strategic Priorities (SSP) for Online Safety. The SSP sets out the UK Government’s strategic priorities and desired outcomes which Ofcom must have regard to when exercising its regulatory functions.
The Online Safety Act (the Act) is a crucial child protection measure with the potential to transform children’s safety online. We welcome the Secretary of State’s strong Statement on the importance of the online safety agenda, and the call for greater pace, urgency, and ambition in Ofcom’s implementation process.
The Statement follows a consultation held late last year, to which the IWF submitted a response. We are pleased to see that the Government has adopted some of the tangible recommendations we put forward. This includes explicitly referencing child sexual abuse material alongside terrorist content when directing Ofcom to actively tackle material across all platforms and functionalities.
The Government has made clear that “Children should also not be responsible for protecting themselves from harmful content. We want to see services that are safe by design, where features are chosen and designed to limit the risk of harm to users. This should be a basic principle for operating in the UK market”. We share the Government’s view that safety by design should be a basic principle for operating in the UK market, and encourage clear guidelines on how platforms should operationalise it.
On Terms of Service, the Government has been more ambitious with its wording, explicitly directing Ofcom to “hold services accountable to ensure that they apply these terms consistently, fulfilling their promise to users that if content is in breach of their terms then it must be removed.”
Additionally, the Statement makes explicit that age assurance technology is already sophisticated and should be implemented by services “to identify child users and ensure that they cannot access harmful content on their services – this includes both age estimation and age verification technology.”
The need for strong, ambitious foundations that place children’s safety at the core of regulatory efforts is underscored by escalating concerns over the widespread presence of CSAM online. In 2024, the Internet Watch Foundation (IWF) uncovered over 290,000 web pages containing CSAM. This is the highest number of child sexual abuse webpages the IWF has discovered in its 29-year history, and a five per cent increase on the 275,650 webpages identified in 2023.
As implementation enters this next phase, the IWF stands ready to work alongside Ofcom as it enforces the Online Safety Act, and to help companies do everything they can to comply with the new duties.