Out-Law Analysis

Ofcom provides detail on Online Safety Act ‘illegal harms’ regime


Online content platforms operating in the UK should study recent publications issued by Ofcom to get a better handle on how the Online Safety Act will apply to them, and what they can do to meet the new legal requirements around addressing illegal content.

The draft codes of practice and guidance published by Ofcom on 9 November supplement the wording in the Online Safety Act, which became UK law late last month. While they will not be binding when they are finalised – it is open to service providers to adopt other approaches to comply with their duties under the Act – any service provider that implements the recommendations in those documents will be deemed to have complied with the legislation. This means that, for many service providers, compliance with the codes and associated guidance Ofcom issues will be the easiest way to meet their duties under the Act.

The initial tranche of draft codes and guidance addresses just one aspect of the new legislative regime – the ‘illegal harms’ duties arising under the Act. The papers include useful summaries of Ofcom’s proposals and who they apply to, as well as of the six chapters of its detailed illegal harms consultation paper.

Ofcom intends to follow up with further phases of work addressing other requirements arising under the Act over the next 18 months. However, for the providers of the more than 100,000 online services – UK and non-UK based – that Ofcom has estimated will be subject to the Online Safety Act, the immediate task is to digest the significant detail that Ofcom has outlined, understand how it will apply to their business, consider whether to engage with the regulator’s consultation – open until 23 February 2024 – and think about the practical measures they will need to implement to ensure compliance with the new regime.

The illegal harms duties in brief

The Online Safety Act (OSA) requires the removal of illegal content by providers of in-scope services, being user-to-user (U2U) services, search services, or services on which provider pornographic content is published or displayed. It also requires the removal of content that is legal but harmful to children by in-scope services that are likely to be accessed by children. For certain types of “priority” illegal content and content that is harmful to children, services will have an obligation to proactively monitor their platforms and remove this content before users encounter it.


The greatest obligations will fall on certain high-risk and high-reach services, under a new categorisation system for regulated services.

An overview of in-scope services

In chapter 3 of its illegal harms consultation, Ofcom provides detailed information about the types of services it considers to be in-scope of the OSA as either a user-to-user (U2U) service or a search service. Ofcom estimates that the number of online services regulated by the Act could be 100,000 or even significantly higher, and that many of these services are based overseas and may be unfamiliar with the UK’s online safety regime.

What constitutes a user-to-user service?

Under the OSA, a U2U service is defined as “an internet service by means of which content that is generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service”.

Ofcom suggests that U2U services in scope of the OSA are likely to include:

  • social media services;
  • video-sharing services;
  • messaging services;
  • marketplaces and listing services;
  • dating services;
  • review services;
  • gaming services;
  • file sharing services;
  • audio sharing services;
  • discussion forums and chat rooms;
  • information sharing services – such as online encyclopaedias and question and answer services; and
  • fundraising services.

According to Ofcom, generative AI content could also constitute user-generated content in some circumstances – for example, where a user uploads generative AI content to a U2U service where it can be encountered by other users of the service. Where a user embeds a generative AI-enabled bot on a U2U service, the content generated by that bot would also constitute user-generated content.

What search services are in-scope?

Search services in scope include both general search services and vertical search services that enable users to search for specific products or services offered by third-party operators, such as flights, financial products or insurance.

Categorisation of regulated services

The application of particular measures proposed by Ofcom depends on whether a service is a U2U service or a search service, and further depends on the size of the service and how risky it is considered to be.

Ofcom proposes to define a service as “large” where it has an average monthly user base in the UK greater than seven million, approximately equivalent to 10% of the UK population. Services with a user base below this threshold are categorised as “smaller” services – a category that includes small and micro businesses.

The proposals also subdivide the large and smaller services into three different risk levels, as illustrated in the sketch that follows this list:

  • ‘low-risk’, where a risk assessment indicates that the service is low risk for all kinds of illegal harm;
  • ‘specific risk’, where the service is medium or high risk for a specific kind of harm for which a particular measure is proposed;
  • ‘multi-risk’, for services that face significant risks of illegal harm – this being where the service is assessed as medium or high risk for at least two different kinds of harms out of the 15 kinds of priority illegal harms specified in Ofcom’s draft risk assessment guidance.
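
Read together, the size threshold and the risk levels amount to a fairly simple decision rule. The sketch below is our own illustration of how a provider might record the categorisation, not Ofcom’s methodology: the seven million user threshold and the “medium or high risk for at least two kinds of harm” test come from the consultation, while the function names, data shapes and the treatment of the three risk levels as mutually exclusive buckets are our assumptions.

```python
# Illustrative sketch only, not Ofcom's methodology.
LARGE_SERVICE_THRESHOLD = 7_000_000  # average monthly UK users (~10% of UK population)

def size_category(avg_monthly_uk_users: int) -> str:
    """Ofcom's proposed size split: 'large' above ~7m average monthly UK users."""
    return "large" if avg_monthly_uk_users > LARGE_SERVICE_THRESHOLD else "smaller"

def risk_category(harm_risk_levels: dict[str, str]) -> str:
    """Simplified mapping of per-harm risk levels (across the 15 kinds of
    priority illegal harm) onto the low-risk / specific risk / multi-risk split.
    Treated as mutually exclusive here for simplicity; in Ofcom's framework a
    multi-risk service can also attract harm-specific measures."""
    elevated = [harm for harm, level in harm_risk_levels.items()
                if level in ("medium", "high")]
    if len(elevated) >= 2:
        return "multi-risk"     # medium or high risk for at least two kinds of harm
    if len(elevated) == 1:
        return "specific risk"  # medium or high risk for one specific kind of harm
    return "low-risk"           # low risk for all kinds of illegal harm

# e.g. a service with 9m average monthly UK users that is medium risk for two harms:
# size_category(9_000_000) -> "large"
# risk_category({"terrorism": "low", "harassment": "medium", "fraud": "medium"}) -> "multi-risk"
```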

Ofcom has recommended harm-specific measures for certain risks, although it has not identified specific measures for all types of illegal harm. Its draft codes of practice outline the proposed measures services can take to meet the duties in the Act, separated according to whether the service is a U2U service or a search service.

For example, providers of U2U services will need to ensure their terms of service have provisions on how individuals are protected from illegal content, any proactive technology used, and how relevant complaints are handled and resolved, whereas search services will have to include such a provision in a publicly available statement.

Risk management

Ofcom has suggested a range of measures, including around governance and accountability, to help in-scope providers manage the risk of illegal content appearing on their service and other online harms.

For example, Ofcom has said that, for all U2U services assessed at all risk levels, providers could ensure there is a named person accountable to the most senior internal governance body for compliance with illegal content safety duties and other reporting and complaints duties.

Ofcom also made clear that, whereas a smaller, low-risk U2U service would not be required to carry out internal monitoring to assess the effectiveness of measures to mitigate and manage the risks of harm and report to a governance body, a large service assessed as ‘multi-risk’ would be required to do so. This approach aligns with Ofcom’s intention of placing the most onerous expectations on the largest multi-risk services.

The number of measures that Ofcom has indicated in-scope services should implement in relation to illegal content alone is significant, and these measures may need to be applied in respect of at least 15 different kinds of illegal harm – the number of harms set out in Ofcom’s draft risk assessment guidance. There is likely to be a significant burden, particularly for small and micro services, in working out how Ofcom’s very extensive proposals should be interpreted and implemented for a particular service.

Whilst the measures in relation to illegal harms will apply to the broadest range of services, some of those services will have to undertake further measures in relation to other types of content, such as content that may be harmful to children. Ofcom’s chief executive Dame Melanie Dawes has emphasised that protecting children online is the regulator’s first priority. Given the detail and length of the proposed guidance, we would encourage providers of services that are likely to be in scope to begin engaging with it, and seeking to understand how it might affect them, as early as possible.

The approach to illegal content risk assessment

The OSA requires providers of in-scope services to periodically undertake a suitable and sufficient illegal content risk assessment. This exercise will involve the providers having to make an assessment of the level of risk their service presents – including, among other things, in the context of the likelihood of harm arising to individuals from being presented with illegal content. Corresponding safety duties arise under the OSA, requiring action to be taken by in-scope service providers.

In its draft illegal content risk assessment guidance, Ofcom has proposed a four-step risk assessment process:

1. Understand the harms

In its draft guidance, Ofcom has split the priority illegal harms into 15 different types. The list includes terrorism offences; child sexual exploitation; encouraging or assisting suicide or serious self-harm; and harassment. Certain service features are also considered to exacerbate risk, such as image sharing and livestreaming.

2. Assess the risk of harm

According to Ofcom, providers of in-scope services should consider other characteristics that may increase or decrease risks of harm, such as their user base, design features, algorithmic systems, business model, any user protection or risk mitigation measures, and other relevant aspects of the service’s design and operation, and the way it is used. This is the point at which the likelihood and impact of each kind of harm should be considered and a risk level assigned accordingly.
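
Ofcom’s draft guidance does not prescribe a formula for combining likelihood and impact at this step, but one way a provider might record the judgement is a familiar likelihood-and-impact matrix. The sketch below is purely illustrative: the three-level scale and the particular pairings in the matrix are our assumptions, not Ofcom’s.

```python
# Illustrative only: a simple likelihood x impact matrix for recording the
# risk level assigned to each kind of harm. The levels and pairings below
# are assumptions for demonstration, not Ofcom's methodology.
LEVELS = ("low", "medium", "high")

RISK_MATRIX = {
    ("low", "low"): "low",       ("low", "medium"): "low",    ("low", "high"): "medium",
    ("medium", "low"): "low",    ("medium", "medium"): "medium", ("medium", "high"): "high",
    ("high", "low"): "medium",   ("high", "medium"): "high",  ("high", "high"): "high",
}

def assign_risk_level(likelihood: str, impact: str) -> str:
    """Look up the risk level for one kind of harm from its likelihood and impact."""
    if likelihood not in LEVELS or impact not in LEVELS:
        raise ValueError("likelihood and impact must each be low, medium or high")
    return RISK_MATRIX[(likelihood, impact)]

# e.g. a harm judged likely to occur with moderate impact:
# assign_risk_level("high", "medium") -> "high"
```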

3. Decide measures, implement and record

Providers can decide how to comply with a safety duty, either by taking the measures recommended in Ofcom’s codes of practice or by implementing their own policies to address risks. They should also consider any additional measures needed to respond to the risks they have identified that may not be addressed in Ofcom’s codes of practice.

4. Report, review and update risk assessments

Ofcom will provide guidance on best practice arrangements for governance reporting, as well as best practice around ongoing risk management and mitigation.

The illegal content codes of practice

The core of Ofcom’s consultation is its proposed illegal content codes of practice, which provide extensive advice – running to 370 pages – on how providers of online services can mitigate the risks of illegal content on their services.

Core measures proposed by Ofcom include:

  • governance and accountability arrangements around the management of online safety risks, including senior management visibility of and accountability for key risks;
  • content moderation and search moderation measures to ensure that regulated services comply with duties to take down or deprioritise illegal content;
  • reporting and complaints recommendations for all services to make procedures easy for people to use, to allow users to provide extra information about their complaints, and to ensure action in response to complaints;
  • terms of service, and publicly available statements, providing accessible information about these processes;
  • large general search services handling complaints about predictive search recommendations;
  • multi-risk services undertaking on-platform tests of recommender systems, collecting safety metrics, and testing for safety outcomes.

In addition, certain specific measures are proposed to address specific kinds of illegal harm. Some measures would require technical changes to be made to services, such as adjustments to default settings for under-18s, and mechanisms enabling users to block and mute other users or disable comments.

The proposed codes of practice are extensive and prescriptive.

Ofcom proposes 34 measures for U2U services, although the applicability of each measure depends on the service size and risk categorisation. For search services, 28 measures are proposed.

For U2U services, certain measures, such as implementing a content moderation system or process to take down illegal content swiftly, and implementing a complaints process that is easy to find and use, apply to all services in all risk categories. The proposed measures are divided into the categories of governance and accountability, content moderation, automated content moderation, reporting and complaints, terms of service, recommender systems, enhanced user control, and user access measures.  

The draft code also acknowledges concerns around the potential impact of the measures on users’ rights to freedom of expression and privacy. Ofcom notes that these are ‘qualified’ rights, as enshrined in the European Convention on Human Rights, and that interference with these rights may be justified on specified grounds, including the prevention of crime, the protection of health and morals, and the protection of the rights and freedoms of others. Ofcom acknowledges that there is a risk of error when online services take down content to protect UK users from illegal content, but notes that this risk is inherent in the scheme of the Act.

Ofcom has suggested that the risk of unjustified interference with users’ rights to freedom of expression and privacy could be limited if service providers apply “accuracy targets” in respect of content moderation and operate effective complaints procedures.
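
The consultation does not define how such accuracy targets should be calculated. As one hypothetical approach, a provider might track the share of moderation decisions upheld after complaint and compare it against an internally set target; the metric, the function name and the 0.95 figure below are all our own illustration, not figures proposed by Ofcom.

```python
def moderation_accuracy(decisions_reviewed: int, decisions_overturned: int) -> float:
    """Share of reviewed moderation decisions that were upheld on complaint."""
    if decisions_reviewed <= 0:
        raise ValueError("decisions_reviewed must be positive")
    return 1 - (decisions_overturned / decisions_reviewed)

ACCURACY_TARGET = 0.95  # hypothetical internal target, not an Ofcom figure

# e.g. 1,000 takedowns reviewed on complaint, 30 overturned:
# moderation_accuracy(1000, 30) -> 0.97, which would meet a 0.95 target
assert moderation_accuracy(1000, 30) >= ACCURACY_TARGET
```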

Next steps for in-scope services

Ofcom’s consultation provides additional information about its current thinking, which should be helpful to online services seeking to understand whether they will fall within the Act’s scope. The questions at annexes 1-4 of the consultation give service providers an opportunity to share their views with Ofcom on a range of aspects of the proposals. Online services that have concerns or suggestions about the proposed approach are encouraged to respond to the consultation questions by 23 February 2024.

Co-written by Lottie Peach and Sadie Welch of Pinsent Masons.
