OUT-LAW NEWS
Child social media ban among new UK proposals
The consultation closes on 26 May 2026. Matt Cardy/Getty Images.
02 Mar 2026, 6:03 pm
Social media platforms could be forced to prevent children from accessing their services, under one of several reform options the UK government is considering that would change how social media companies engage with young people.
Australia became the first country in the world to introduce a ban on children under the age of 16 accessing social media platforms late last year. That law imposes user age assurance requirements on platforms. In January, the UK government promised to consult on its own ban. It has now opened a consultation (76-page / 830KB PDF) on the matter. It has asked stakeholders whether they would support a legal requirement for social media services to have a minimum age of access – and is seeking views on whether that measure should apply to under-16s or to younger children.
Alternative interventions are also under consideration, however. These include daily screen time limits for individual apps, restricting overnight access for individual apps, and imposing age-related restrictions – including on services that have personalised algorithms or on the use of certain features and functionalities, like those that allow children to make in-service purchases or which encourage them to use services for longer.
Read more from Pinsent Masons on child online safety
- The UN principles shaping child online safety regulation
- The risks addressed by child online safety regulation
- UAE introduces new online restrictions with child safety legislation
- UK to consult on social media ban for children
- Under 16s social media ban now in force in Australia
Through its consultation, which will close on 26 May 2026, the government is also seeking views on whether the digital age of consent, under the General Data Protection Regulation (GDPR), should be raised from the current 13 years. The digital age of consent is 14 in Italy and Spain, 15 in France, and 16 in Germany and Ireland.
The government said it will act “swiftly on the evidence gathered” from the consultation. Its response is expected this summer.
Lauro Fava of Pinsent Masons welcomed the fact that the government is considering a range of options rather than focusing solely on a ban.
“A lot of charities are against a ban because they perceive it not to be in the best interests of children – a concept recognised in international law,” said Fava.
“Arguably, proponents of a ban place an over-emphasis on rights to privacy and safety and overlook the many benefits of social media access for children – the ability to learn and be creative, among others – as well as the other relevant rights at play under international law, including children’s rights to play and leisure time and to freedom of expression,” he said.
“Implementing a ban would come with problems – for example: how to effectively address the risk of circumvention; how to address risks associated with young people gaining overnight access to a digital environment that they have no prior experience of; and how to ensure vulnerable children can get access to support services,” he said.
“More targeted interventions on specific features and functionalities of social media platforms, such as infinite scrolling, autoplay, and personalised algorithms, would be a more proportionate response and build on existing obligations platforms face under UK frameworks. Under the UK GDPR, for example, platforms have obligations in respect of privacy-by-design, fair processing and profiling, while risks associated with children spending long periods of time on social media are also something platforms might be expected to act on following their children’s risk assessments under the Online Safety Act and to avoid detrimental uses of data flagged in the Information Commissioner's Office’s ‘children’s code’,” according to Fava.
“Many platforms already see the direction of travel on the issue of ‘stickiness’ and, notwithstanding the lack of conclusive research around the impact on children’s mental health, are already making it easier for children and parents to limit screentime. Platforms now have an opportunity to encourage the UK government to ensure that any further regulatory intervention is aligned with that already taken in the EU under the Digital Services Act, to enable consistency in what technical measures need to be deployed cross-border,” he said.
“What is important is that children are given their place in feeding their thoughts in on any prospective ban or restrictions. While there may be some parents that would welcome a ban as a solution to the problem of keeping up with fast-moving changing technologies and practices, children will have their own ideas about what constitutes their ‘best interests’ in relation to balancing the benefits of having access to services with the acknowledgement of potential harms that can arise from that,” Fava added.