
Child protection and disinformation in Ofcom’s sights



Ofcom has recommended measures that online platforms can implement to meet their obligations under the UK’s Online Safety Act to protect children from harmful content.

In respect of children’s safety, the Online Safety Act requires providers of ‘user-to-user’ services or search services to assess whether children are likely to access their service, or part of it; to complete a children’s risk assessment identifying the risks their service poses to children; and, where children are likely to access their services, to implement a series of safety measures – including measures to mitigate and manage the risks of harm to children and to prevent them from encountering ‘priority’ content deemed harmful to children, such as pornography and material that promotes self-harm.

Ofcom is obliged, under the Act, to prepare codes of practice to help in-scope providers meet their duties. On Wednesday, it opened a consultation on two draft children’s safety codes – one aimed at providers of ‘user-to-user’ services, and the other aimed at providers of search services.

Providers are not obliged to adopt the codes – in theory, they can meet their legal duties in other ways. However, Ofcom’s new papers provide guidance on how providers can assess whether their service is likely to be accessed by children, along with information on the causes and impacts of harm to children, and the draft codes set out a series of “recommended measures” that providers can adopt to assist with compliance.

The measures proposed span issues of governance and accountability, content moderation, and user reporting and complaints, among other areas. For providers of user-to-user services, a focus of the measures is also on the use of “highly effective age assurance”.

Some of the specific measures Ofcom has recommended apply to all providers, but others are targeted at larger services – those with an average UK user base of more than seven million per month – or at services exposed to medium or high risks, while other measures depend on the principal purpose of the service.

Ofcom said: “We recognise that the size, capacity, functionalities, user base and risks of online services differ widely, and so have not taken a one-size-fits-all approach. Instead, we propose that services who pose the greatest risk to children need to do more to keep them safe.”

Ofcom is consulting on its proposals until 17 July. Its latest publications follow other announcements last week that relate to its broad regulatory brief in respect of online content.

First, Ofcom opened an investigation into whether Fenix International Limited, provider of the adult video-sharing platform OnlyFans, breached its duties to implement appropriate measures to protect under-18s from encountering restricted material such as pornography. The regulator is also investigating whether the provider failed to provide complete and accurate responses to statutory information requests.

While OnlyFans is a platform intended for use by over-18s, it is subject to rules set out in the UK’s Communications Act 2003 that require it to put measures in place to protect under-18s from videos and audiovisual commercial communications containing restricted material, such as pornography.

The current rules governing video-sharing platforms in the UK will be repealed and replaced by new rules under the UK’s Online Safety Act once those rules are fully implemented. The Online Safety Act imposes duties on services that publish pornographic content to ensure that age assurance is used to prevent children from accessing that content. In December 2023, Ofcom issued draft guidance on effective means of age assurance for providers of services that publish pornographic content. In an interview with the Financial Times earlier this year, OnlyFans chief executive Keily Blair said “there’s not really that many … big operational shifts” that the platform has to make to be ready for the Online Safety Act.

Ofcom sent two information request notices to OnlyFans, in June 2022 and June 2023 respectively, seeking information about its age assurance measures. However, the regulator has now said that “the available evidence suggests” that the information OnlyFans shared in response “may not have been complete and accurate, and that the age assurance measures it had taken may not have been implemented in such a way as to protect under 18s from restricted material”.

Lottie Peach of Pinsent Masons said: “Ofcom’s investigation should serve as a reminder to platforms that Ofcom is taking age assurance very seriously and the measures that some platforms have in place may not go far enough to meet the standards set by Ofcom. We expect increasing regulatory scrutiny of measures taken by online services to protect young people in the UK and other jurisdictions.”

In a separate development last week, Ofcom published a consultation on its three-year media literacy strategy, setting out how it proposes to exercise its regulatory functions on media literacy. Ofcom has media literacy responsibilities under the Online Safety Act, which include building awareness of how people can protect themselves and others online, and encouraging the development and use of technologies and systems that help users of regulated services protect themselves and others. It is also obliged to publish a media literacy strategy.

Meghan Higgins of Pinsent Masons said: “Ofcom has recognised the increasing complexity for users navigating their experiences online and has updated its description of media literacy to encompass ‘the ability to use, understand and create media and communications across multiple formats and services’. Ofcom wants people to understand the steps they can take to be safer online and to identify trusted media. Its focus is on developing skills for people to be safer online, targeting online harms against women and girls, protecting content of democratic importance, and helping users recognise and respond to mis- and dis-information.”

A strand of Ofcom’s work, under the proposed new strategy, is to engage with online platforms to support efforts to develop media literacy. Its plans include building a collective understanding of what users consider helpful to navigate the online environment and what they expect from online providers, and turning these into action points to influence platforms to alter their products or services in accordance with user expectations. Ofcom highlighted that some platforms and online services already deploy features and tools, such as pop-ups and notifications, to provide context to content seen by users.

Ofcom further intends to provide evidence-based recommendations to encourage online services to develop robust media literacy interventions that take effect both offline and online. It also said it will work with platforms on the funding of media literacy programmes, encouraging them to support and fund user media literacy skills and to build consensus on best practice principles for media literacy.

Higgins said: “The rapid acceleration and development of AI technologies over the past 12 to 18 months has underscored the challenges that Ofcom faces in helping users identify and respond to misleading content such as deepfakes. Ofcom’s proposed research and data-driven approach will be welcomed by online services trying to implement practices to help their users understand what they are seeing.”

Ofcom’s draft strategy is open to consultation until 24 June. The regulator intends to publish its finalised strategy in the autumn.
