Out-Law News 4 min. read

Online Safety Act becomes UK law with more detail on duties imminent


Content platforms in the UK can expect to receive more details on the steps they might need to take to address the risk of illegal online harms on 9 November, a regulator has confirmed after the Online Safety Act passed into UK law.

The complex new law places significant obligations on many online service providers, requiring the removal of illegal content by all in-scope services, as well as the removal of content that is legal but harmful to children by in-scope services that are likely to be accessed by children.

For certain types of “priority” illegal content and content that is harmful to children, services will have an obligation to proactively monitor their platforms and remove this content before users encounter it.

Ofcom’s initial analysis is that more than 100,000 online services could be subject to the new law; however, the greatest obligations will fall on certain high-risk, high-reach services. Different safety measures will be appropriate for different types of service, and Ofcom’s recommendations will vary depending on a service’s size and degree of risk. The Act provides for the creation of new regulations to set out the thresholds for determining which category online services fall into, while other important details that can support compliance will be provided by Ofcom in guidance and codes of practice. Ofcom is the regulator tasked with overseeing and enforcing the new regime.

Lottie Peach of Pinsent Masons, an expert in the regulation of online services, said: “Importantly, Ofcom makes it clear that online services are not required to follow the recommendations in Ofcom’s codes – services may choose to use other measures than those set out in the codes, but they will need to explain how their chosen approach meets the duties in the Act. Where services follow the recommendations in Ofcom’s codes, they will be deemed to have complied with the relevant duties in the Act.” 

In a statement issued after the Online Safety Act received Royal Assent on Thursday, Ofcom said it is “moving quickly to implement the new rules”, which it has provided more detail on (15-page / 1.37MB PDF) in a new paper it has also published. There are three distinct phases to its planned work.

The first phase is focused on the ‘illegal harms’ duties arising under the Act. Ofcom said it will publish a series of draft codes of practice and guidance relevant to those duties on 9 November.

Included in those publications will be Ofcom’s analysis of the causes and impacts of online harm, which it said would “support services in carrying out their risk assessments”, as well as draft guidance on a recommended process for assessing risk. The papers will also include “draft codes of practice, setting out what services can do to mitigate the risk of harm” and further draft guidelines outlining the regulator’s proposed approach to enforcement.

Ofcom will hold a consultation on the draft codes and guidance and said it plans to publish a statement on its “final decisions” in autumn 2024. The codes of practice will be given a statutory footing. When finalised by Ofcom, they will be submitted to the Secretary of State for Science, Innovation and Technology and, subject to their approval, laid before parliament.

Phase two of Ofcom’s work relates to child safety, pornography and the protection of women and girls.

Ofcom expects to publish draft guidance on age assurance in December this year. In spring 2024, the regulator expects to consult on further draft codes of practice relating to the protection of children, as well as analysis of the causes and impacts of online harm to children and draft risk assessment guidance focusing on children’s harms. Specific draft guidance on protecting women and girls will then be produced by spring 2025, Ofcom said, after it has finalised its codes on the protection of children.

The third phase concerns transparency, user empowerment, and other duties on categorised services.

Ofcom said only a “small proportion of regulated services” will fall within the categorised services facing additional requirements under the Act. Those requirements include duties to produce transparency reports; provide user empowerment tools; operate in line with terms of service; protect certain types of journalistic content; and prevent fraudulent advertising.

Ofcom intends to issue a call for evidence early next year in relation to its approach to those duties and follow that up with a further consultation on draft transparency guidance in mid-2024. It will further advise the government on the thresholds for determining categorised services in early 2024 and expects the government to make regulations on the matter next summer. Assuming no delay to that timeline, Ofcom said it will publish the register of categorised services by the end of 2024; publish draft proposals regarding the additional duties on these services in early 2025; and issue transparency notices in mid-2025.

The Online Safety Act provides Ofcom with significant powers of enforcement, including the power to issue fines of up to £18 million, or 10% of a company’s annual global revenue, whichever is greater. Failure to meet certain child protection duties could also give rise to criminal liability for senior managers – including a risk of imprisonment of up to two years. Experts at Pinsent Masons have previously advised content platforms to seek to understand the extent to which the legislation will impact their services.

Dame Melanie Dawes, Ofcom chief executive, said: “Ofcom is not a censor, and our new powers are not about taking content down. Our job is to tackle the root causes of harm. We will set new standards online, making sure sites and apps are safer by design. Importantly, we’ll also take full account of people’s rights to privacy and freedom of expression. We know a safer life online cannot be achieved overnight; but Ofcom is ready to meet the scale and urgency of the challenge.”
