Out-Law News

UK Online Safety Bill updated to require content to be assessed in context

Proposed amendments to legislation in the UK would require online service providers to consider “all relevant information that is reasonably available” to them when determining the status of content on their platforms – including whether it is illegal content and needs to be removed from their platforms.

The proposed new requirement has been added to the Online Safety Bill by the UK government. The Bill was introduced into the UK parliament earlier this year. It is wide-ranging and would, if given effect in its current form, impose extensive obligations on in-scope online service providers to address illegal and harmful content that appears on their platforms. However, the Bill’s progress has been slowed by ongoing uncertainty over who will become the new leader of the governing Conservative party.

In a factsheet issued to explain the new provisions it has added to the Bill, the government said the proposed duty to consider “all relevant information that is reasonably available” should involve consideration of “all reasonably-available contextual information”. It said this includes “content judgements relating to other duties in the Bill (content of democratic importance, journalistic content, harmful-to-children and harmful-to-adults content), as well as content judgements relating to the illegal content duties”.

The new clause on providers’ judgments about the status of content also includes specific duties in relation to judgments about whether content is illegal content, illegal content of a particular kind, or a fraudulent advertisement. The provisions require online service providers to put in place systems enabling them to treat content as falling within those categories where there are “reasonable grounds to infer that content is content of the kind in question”.

Under the proposed legal test, a provider will have ‘reasonable grounds for that inference’ in relation to content and an offence where, having considered all relevant information that is reasonably available to it, the provider has reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied, and does not have reasonable grounds to infer that a defence to the offence may be successfully relied upon.

Where content has been generated by bots or other automated tools, the proposed legal test is to be applied “in relation to the conduct or mental state of a person who may be assumed to control the bot or tool (or, depending what a provider knows in a particular case, the actual person who controls the bot or tool)”, the government said.

A further new clause added to the Bill by the government would require Ofcom to issue guidance to help online service providers make judgments on whether content on their platforms is illegal.

The government said: “We expect [the Ofcom guidance] will include examples of the kind of contextual and other information that is likely to be relevant when drawing inferences about mental elements and defences, and how far providers should go in looking for that information. This will provide greater certainty to companies about when they must take action against illegal content, particularly in relation to offences that rely on mental elements, or absence of defences.”

Meghan Higgins of Pinsent Masons said that while additional Ofcom guidance would be welcome, the new provisions could make it harder, not easier, for providers to meet their obligations to assess the status of content. She said that while it was helpful to see the government acknowledge the difficulties many online service providers would encounter in assessing content under the Bill, the requirement to consider additional contextual information could add to the difficulty of making complex judgments about whether content satisfies the elements necessary for the commission of a criminal offence, particularly when using automated systems and processes.

In a separate factsheet, the government highlighted other new clauses it has added to the Bill, which are designed to enhance protections for journalism. These include ‘temporary must carry’ provisions that would oblige online service providers to notify news publishers and enable those publishers to appeal before removing or moderating their content.

Experts at Pinsent Masons have previously expressed concern that the aims of the Bill – to reduce online harms – could be undermined by a lack of clarity over the way the legislation is to be implemented and enforced.
