ICO guide addresses Online Safety Act and GDPR cross-over

The UK’s Information Commissioner’s Office (ICO) has issued new guidance to help content platforms meet their duties under the Online Safety Act in a way that accords with UK data protection law.

Meghan Higgins, an expert in technology law at Pinsent Masons in London, said that the ICO’s guidance on content moderation and data protection explains how various aspects of data protection law – from fundamental principles on data minimisation, accuracy, retention and security, to rules governing data subject rights and automated decision-making – are engaged by content moderation requirements arising under the Online Safety Act.

The Online Safety Act was passed into law in October 2023 and imposes significant and complex obligations on certain online service providers. It requires, among other things, the removal of illegal content by all in-scope services, as well as the removal of content that is legal but harmful to children by in-scope services that are likely to be accessed by children.

In 2022, in anticipation of the new online safety regime being introduced in the UK, Ofcom and the ICO – the UK’s data protection authority – committed to “work closely together to achieve maximum alignment and consistency between the data protection and online safety regimes”. The ICO said its new guidance reflects that joint statement.

Higgins said: “Data protection considerations add a layer of complexity to the process of implementing compliance measures under the Online Safety Act. The ICO’s guidance is timely as it coincides with Ofcom moving forward with measures of its own to support compliance with the Online Safety Act – its consultation on an initial tranche of draft codes and guidance pertinent to the ‘illegal harms’ duties under the Online Safety Act closed at the end of last week – and with the Digital Services Act, which also imposes content moderation requirements, taking full effect in the EU.

“The guidance highlights that in-scope services will need to assess whether they are meeting their obligations under data protection law when implementing content moderation techniques to comply with the new duties under the online safety regime,” she said.

In its guidance, the ICO explained how the processing of personal data could arise during the different stages involved in content moderation. It highlighted, in particular, that user-generated content is likely to constitute personal data in and of itself, and that content moderation can also involve using personal data linked to the content or a user’s account.

According to the ICO, because content moderation involves personal data processing that is likely to result in a high risk to people’s rights and freedoms, organisations must carry out a data protection impact assessment (DPIA) prior to the processing. It said service providers that engage in content moderation must also carry out a DPIA if they are using children’s personal data as part of offering an online service directly to them.

The ICO guidance also addressed the fact that some online service providers that host large volumes of content rely heavily on algorithms and other automated systems to meet their obligations on removing content under the Online Safety Act. It highlighted the need for those businesses to determine whether their content moderation “involves solely automated decisions that have legal or similarly significant effects on people” – and to comply with the restrictions on such decision-making set out in Article 22 of the UK General Data Protection Regulation (GDPR).

Higgins said: “Services should be aware that where solely automated decisions are made about content that have legal or significant effects on people, they will need to consider whether one of the exceptions under Article 22 of the UK GDPR applies to allow this type of decision-making. This could be because the decision-making is authorised by domestic law, necessary for a contract, or based on the user’s explicit consent. The ICO also recommends that services implement an appeals process for content moderation that users can easily find and make use of.”

The ICO’s guidance also addressed the potential for the processing of ‘special category’ data by content moderation systems. Special category data is data considered, under UK data protection rules, to be particularly sensitive and to which additional restrictions on processing and safeguards apply. The term captures data about a person’s race, ethnic origin, political opinions or religious and philosophical beliefs, among other things.

The ICO highlighted that content moderation may involve processing special category information both directly and by inference – for example, where special category information is used to provide context, or where the content being moderated itself includes special category information about users, such as their political views. It emphasised the need for online service providers processing special category information in their content moderation systems not only to have a lawful basis for such processing but also to identify a specific condition in the UK GDPR that provides for it.

The ICO further confirmed that where the details a content moderator can infer about someone are likely to fall within special category data – if someone is wearing clothing associated with a particular religious affiliation, for example – and that information is used to inform the moderation, the service must meet the requirements for processing special category information under the UK GDPR.

In a similar vein, the new guidance also addressed the potential for online service providers to process personal data relating to criminal convictions and offences – including suspicions or allegations of criminal activity – when moderating content. As with special category data, particular restrictions apply to the processing of such data under the UK GDPR. The ICO said services may be processing content relating to criminal convictions and offences when making a judgment that there are reasonable grounds to infer that particular content is illegal. In those cases, services must identify a condition for processing, as well as a lawful basis, it said.

Higgins said that online service providers also need to be aware of the cross-over between the child safety duties arising under the Online Safety Act and the specific data protection rules applicable to the processing of children’s data in the UK under the ICO’s Age Appropriate Design Code. Providers may process children’s personal data when using age assurance technologies to comply with the child safety duties under the Online Safety Act, she added.

“Services working to meet their obligations under the Online Safety Act by implementing new technologies for age assurance and content moderation purposes should factor in the need to carry out a DPIA before implementing these technologies, and to ensure that they update users about their plans to process personal data as part of content moderation obligations,” Higgins said.
