OUT-LAW NEWS 5 min. read

Online Safety Act: content around body stigma and depression could be regulated



Online service providers in the UK could face new duties to monitor for and address content that shames or otherwise stigmatises body types or physical features, as well as content that promotes depression, hopelessness and despair, under new plans being considered to protect children against online harm.

Last month, Ofcom confirmed that it will consider whether to re-categorise ‘body stigma content’ and ‘depression content’ in a way that would bring that content within the scope of increased regulation under the UK’s Online Safety Act (OSA). Ofcom is the UK’s online safety regulator.




The OSA requires providers of in-scope services – user-to-user (U2U) services, search services, and services on which provider pornographic content is published or displayed – to design and operate their services in a way that prevents users from encountering certain types of illegal content, and to take down illegal content quickly upon becoming aware of it. It also requires services that are likely to be accessed by children to take similar measures in respect of content that is legal but harmful to children.

The safety duties relating to illegal content and those protecting children vary depending on the severity of the harm associated with the regulated content. Services that are likely to be accessed by children have the most onerous duties in respect of ‘primary priority content that is harmful to children’ (PPC), including an obligation to prevent children of any age from encountering such content through age verification or age estimation. Services have less onerous obligations in relation to ‘priority content that is harmful to children’ (PC) and ‘non-designated content that is harmful to children’ (NDC).

The OSA currently defines the types of content that qualify as primary priority content and priority content harmful to children at sections 61 and 62. These provisions can be updated to change the types of content that services need to address to protect children.

Currently, body stigma content and depression content are categorised as NDC. In respect of content classed in this way, providers of in-scope services must assess whether it is likely to be accessed by children on their platforms, run a risk assessment where that is the case, and take proportionate steps to address the risks they identify.

Ofcom is now considering whether body stigma content and depression content should be treated as either PC or PPC instead. Additional duties apply to providers in respect of content categorised this way – most significantly that services need to prevent child users from encountering PPC.

To meet their obligations around PC, providers of in-scope services must run the same assessments of the likelihood of children accessing the content, and of the risks it poses, as they do for NDC, but must also take more targeted action, such as in how they moderate, review and age-filter PC. Examples of PC include violent content, content that is abusive or incites hatred, bullying content, harmful substances content, and dangerous stunts and challenges content.

The PPC category covers content considered so harmful to children that providers must proactively monitor for it and remove it before users encounter it, using ‘highly effective age assurance’ methods. Examples include pornographic content, suicide content, self-injury content and eating disorder content.

The potential re-categorisation of body stigma content and depression content was flagged by Ofcom in a recent call for evidence (16-page / 319KB PDF), in which it invited stakeholders to provide evidence for use in its first statutory report on content harmful to children, which it is required to publish by 26 October this year. The report may provide advice to the Secretary of State on whether the PPC and PC currently identified in the OSA should be changed.

Specifically, Ofcom is seeking evidence of the incidence of content harmful to children on regulated user-to-user, search and/or combined services. This includes the quantity, prevalence or presence of such content on these services and the frequency with which children are encountering it.

It is also seeking evidence of the harm that children in the UK suffer or may suffer as a result of encountering content that is harmful to children, in particular evidence of physical or psychological harm, as well as evidence suggesting it may be appropriate to make changes to the kinds of PPC and PC in the Act and, if so, what changes would be appropriate.

Sadie Welch of Pinsent Masons said the statutory duty on Ofcom to produce an annual report on content harmful to children is aimed at ensuring the regulator continuously assesses the evolving nature of online harms. In its report, Ofcom must include advice to the Secretary of State as to whether it is appropriate to make changes to the categories of content Ofcom has identified as harmful to date.

Welch said: “Ofcom has acknowledged that, given the length of time it takes to build an evidence base, it may be too soon to recommend significant changes to the kinds of PPC and PC in the Act. However, Ofcom has expressly said it plans to assess evidence relating to body stigma and depression content and consider whether it is appropriate to add either kind of content to the categories of PPC or PC in the OSA.”

“The Ofcom call for evidence comes at the same time as the children’s commissioner has published a report on children’s exposure to appearance-changing products online, which include prescription-only weight loss drugs, skin lightening products and muscle building supplements. That report found that the children surveyed felt that viewing body image content presented alongside the use or marketing of such products put undue pressure on them and had a negative impact on their self-esteem. It calls on Ofcom to amend its protection of children code of practice to require regulated online services to put in place proactive measures to prevent all content that is harmful to children, including NDC relating to body stigma, from being served to children,” she said.

“The children’s commissioner report highlights that, despite naming advertising-based business models as a risk of harm, Ofcom has not included any related safety measure in its corresponding code. This means technology companies are not currently required to take action against the risk of children being served body stigma content through advertising in order to be considered compliant with their duties under the OSA. We consider that the report could lead to the re-categorisation of body image content from NDC to one of the more severe categories of content deemed harmful to children,” she said.

For a type of content to be categorised as PC or PPC, the Secretary of State must consider that "there is a material risk of significant harm to an appreciable number of children presented by content of that kind". In addition, to add a content type to the PPC category, the Secretary of State must further consider whether it is appropriate for the duties placed on online services to prevent children from encountering PPC to apply to that kind of content.

Ofcom’s call for evidence will close on 10 March 2026. 

Earlier this week, the UK government opened a separate consultation on possible wider interventions relating to children’s digital wellbeing. These include a possible social media ban for children, daily screen time limits, restrictions on overnight access, and age-related restrictions – including on services that have personalised algorithms and on the use of certain features and functionalities, such as those that allow children to make in-service purchases or which encourage them to use services for longer.
