Out-Law Analysis

The UK’s Online Safety Bill: what next for online service providers?


Content platforms should seek to understand the extent to which new laws passed by the UK parliament will impact their services.

The Online Safety Bill passed its final parliamentary stage on Tuesday 19 September 2023 and now awaits Royal Assent. The legislation is controversial, but it is now here to stay and requires business action. It is unclear, however, whether it will achieve its purpose of protecting people from online harm – or simply lead to a complex and confused regulatory landscape which deters online service providers from participating in the UK market.

The Online Safety Bill in brief

The finalised Online Safety Bill has been years in the making. In 2017, the government published its internet safety strategy green paper, followed in 2019 by the publication of, and consultation on, a white paper on ‘online harms’ – an initiative that led to a draft Online Safety Bill being published in 2021. As it progressed through parliament, the Bill was subject to significant amendment. The outcome of the process is a highly complex new law that places significant obligations on many online service providers.

The Bill requires the removal of illegal content by all in-scope services, as well as the removal of content that is legal but harmful to children by in-scope services that are likely to be accessed by children. For certain types of “priority” illegal content and content that is harmful to children, services will have an obligation to proactively monitor their platforms and remove that content before users encounter it. In imposing this new “general monitoring” obligation, the Bill diverges from the existing intermediary safe harbours regime, which has been in place for over 20 years. It also diverges from the approach taken in the EU’s Digital Services Act. The duties in relation to certain ‘priority’ illegal content apply both to an extensive list of existing offences and to new offences introduced by the government through the Bill.

As the various iterations of the Bill emerged during its passage through parliament, it became clear that UK lawmakers were increasingly focused on providing protections for children online. The Bill contains provisions that address various steps online services can take to comply with their duties, such as the use of age verification technologies. More information about content that should be considered harmful to children will be provided in additional regulations and guidance.

Some controversial provisions have fallen away over time. The Bill had at one time controversially imposed duties on the largest service providers in relation to content that was “legal but harmful” to adult users. It now provides instead for a ‘triple shield’ of protection for adult users of the largest search services and social media platforms by:

  • ensuring that illegal content is removed;
  • placing a legal responsibility on social media platforms to enforce the promises they make to users when they sign up through their terms and conditions; and
  • offering users the option to filter out harmful content, such as bullying, that they do not want to see online.

When the new legislation begins to apply, the government will also be required to set out secondary legislation designating the thresholds that will determine which category online services fall into. Certain high-risk, high-reach services that fall into Category 1, 2A or 2B will face greater obligations. Ofcom is the regulator and is required to produce a register of these categorised services and to advise the government on the thresholds for these categories.

There are significant powers of enforcement under the Bill. These include the power to issue fines of up to £18 million or 10% of a company’s annual global revenue, whichever is higher. Perhaps most notably, failure to meet certain child protection duties could give rise to criminal liability for senior managers – including a risk of imprisonment of up to two years.
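The fine cap therefore scales with company size: 10% of annual global revenue becomes the operative maximum once revenue exceeds £180 million. A minimal sketch of that calculation in Python, assuming revenue figures in pounds sterling (the function name and parameter below are illustrative only, not drawn from the legislation):

    # Maximum fine under the Bill: the greater of £18 million
    # or 10% of a company's annual global revenue.
    def max_fine(annual_global_revenue: float) -> float:
        return max(18_000_000, 0.10 * annual_global_revenue)

    # Illustrative figures only: a company with £1bn in annual
    # global revenue could face a fine of up to £100m.
    print(max_fine(1_000_000_000))  # 100000000.0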

The Bill has been subject to extensive scrutiny in parliament and there have been a series of recent amendments. The Bill contains provisions that tackle online fraud and violence against women and girls, including provisions that seek to make it easier to charge abusers who share intimate images. It also includes provisions intended to prevent users from being exposed to fraudulent and scam advertising. Recent amendments include provisions intended to make it easier for coroners and parents to access deceased children’s data. Others provide that social media firms must prevent activity that facilitates animal cruelty and torture – including where the activity takes place outside the UK but the content is seen by users in the UK.

Obligations

Online services fall within the geographic scope of the Bill if they have a significant number of UK users, focus on the UK as a target market, or can be used by UK users and contain content that could be considered harmful. These online services will need to consider whether they are regulated because they are user-to-user services that facilitate the sharing of user-generated content, because they are search services, or because they publish pornographic content that is accessible to UK users.

In-scope services – which the UK government has estimated to number more than 25,000 businesses in the UK – now need to consider the steps they will take to comply. Although Ofcom will issue further guidance on how regulated services can comply with their duties, it would be sensible for regulated services to take steps now to consider:

  • whether they fall within the Bill’s scope, and if so which elements of their businesses will be regulated;
  • the obligations that are likely to apply to that service given their business model;
  • how they will be able to assess the risks associated with content on their platforms;
  • whether their services are likely to be accessed by children;
  • the current systems in place which allow users to report concerns about content or make complaints; and
  • the services’ current terms and conditions and where they are likely to require improvement.

Ofcom guidance

Ofcom is the designated regulator under the Bill. It last updated its guidance relevant to the Bill, titled the ‘roadmap’ to regulation, in June 2023. Ofcom has duties to publish codes of practice and guidance on how in-scope services can comply with the various duties contained in the legislation.

Ofcom’s June ‘roadmap’ explains that the regulator plans to publish its codes and guidance in three phases.

Phase one includes publishing the first draft codes of practice shortly after the commencement of the Act.

Phase two covers child safety duties, including consulting on a register of risks and risk profiles relating to harms to children and producing draft risk assessment guidance focused on children’s harms.

Phase three involves setting out the additional requirements for regulated services that fall into the Category 1, 2A or 2B thresholds. These requirements include duties to:

  • produce transparency reports;
  • provide user empowerment tools;
  • operate in line with terms of service;
  • protect certain types of journalistic content; and
  • prevent fraudulent advertising.

Action needed by businesses

The Bill has been criticised for attempting to cover all ills. It is a complex piece of legislation and imposes significant obligations on service providers that fall within its scope – this includes larger services that have more tools at their disposal to seek to meet the duties, but also smaller services that are less well-equipped.

The stated purpose of the legislation is to make the UK the safest place in the world to be online, yet critics have raised concerns that the wide-ranging duties risk restricting free speech and undermining privacy. Some technology and social media companies have threatened to simply leave the UK rather than seek to comply with the strict provisions that they fear would undermine users’ privacy.

Providers of messaging services such as WhatsApp and Signal have warned that the Bill’s powers to require the scanning of private messages for child sexual abuse content risk undermining end-to-end encryption and violating users’ privacy rights, and could embolden hostile governments to draft similar laws. The government has said that these powers would only be used “where technically feasible and where technology has been accredited as meeting minimum standards of accuracy in detecting only child sexual abuse and exploitation content”. The Home Office published guidance on end-to-end encryption and child safety to coincide with the Bill receiving its final approval in the House of Lords.

Co-written by Lottie Peach of Pinsent Masons.
