OUT-LAW GUIDE

The UN principles shaping child online safety regulation



UN principles developed in the late 1980s, including one promoting the ‘best interests’ of the child, shape how policymakers worldwide regulate the online environment today, amid the proliferation of digital services and a growing legislative focus on the protection of children.

The ‘best interests’ principle, set out in Article 3 of the UN Convention on the Rights of the Child (UNCRC), is reflected in a suite of legal frameworks – from the EU’s Digital Services Act (DSA) and the UK’s Age Appropriate Design Code to Australia’s amended Online Safety Act.

In this guide, we look in more detail at the rights and protections the UNCRC provides and explain how it has come to influence online safety regulation.

The best interests of the child

The UNCRC is the most widely adopted human rights convention in history, having been ratified by 196 states. It identifies a number of rights for children, who are defined in the Convention as those under 18, unless a signatory state has set a lower legal age for adulthood. The UNCRC imposes basic, flexible standards for children’s rights that can be implemented in countries with widely differing political and economic circumstances around the world.

The Convention also imposes a number of duties on those states to ensure that children can enjoy their rights. In particular, Article 3 of the Convention requires that in all actions, whether undertaken by public or private social welfare institutions, courts of law, administrative authorities or legislative bodies, the best interests of the child shall be a primary consideration. The best interests assessment is accordingly a flexible set of variables that can guide decision makers in educational settings and family court proceedings, as well as the development of legislative frameworks. The ‘best interests’ principle has been adopted in a growing number of legislative frameworks, such as the UK’s Age Appropriate Design Code and the European Commission’s guidelines on the protection of minors, developed in accordance with Article 28 of the DSA.

Many policymakers and regulators have drawn on the ‘best interests’ principle and other provisions of the UNCRC when developing frameworks relating to children, including online safety regimes, and when deciding how children should be protected in different settings.

The Convention also recognises other rights that should be made available to children, including:

  • A child’s right to express his or her views freely;
  • A child’s right to freedom of expression, including freedom to seek, receive and impart information and ideas of all kinds through media of the child’s choice;
  • A child’s freedom of thought, conscience and religion;
  • A child’s freedom of association and of peaceful assembly;
  • A child’s right to protection from arbitrary or unlawful interference with his or her privacy, family, home or correspondence, and from unlawful attacks on his or her honour and reputation;
  • A child’s right to engage in play and recreational activities;
  • A child’s right to access to information and material from a diversity of national and international sources, especially those aimed at the promotion of his or her social, spiritual and moral wellbeing.

In 2021, the United Nations Committee on the Rights of the Child (the Committee) issued ‘General Comment 25’ on children’s rights in relation to the digital environment. It explains how states that are parties to the UNCRC should implement the Convention in relation to the digital environment, and provides guidance on relevant legislative, policy and other measures to ensure compliance with their obligations under the Convention.

The Committee’s recommendations for assessing best interests of children online

General Comment 25 contains a detailed series of recommendations for states on how to apply the UNCRC’s principles to measures relevant to a child’s experience online.

In relation to the ‘best interests’ principle specifically, General Comment 25 provides that states should “ensure that, in all actions regarding the provision, regulation, design, management and use of the digital environment, the best interests of every child is a primary consideration” and “involve the national and local bodies that oversee the fulfilment of the rights of children in such actions”.

Further, the Committee said that, in considering the best interests of the child, states “should have regard for all children’s rights, including their rights to seek, receive and impart information, to be protected from harm and to have their views given due weight, and ensure transparency in the assessment of the best interests of the child and the criteria that have been applied”.

The ‘best interests’ principle sits alongside three other core principles that are to govern states’ approach to children’s rights and protection in the digital environment. Those principles – of non-discrimination; right to life, survival and development; and respect for the views of the child – all interlink with the ‘best interests’ principle and have relevance in the context of online safety regulation.

For example, the Committee said that states should “take all appropriate measures to protect children from risks to their right to life, survival and development”, citing risks relating to content, contact, conduct and contract and giving examples such as violent and sexual content, cyberaggression and harassment, gambling, exploitation and abuse, including sexual exploitation and abuse, and the promotion of or incitement to suicide or life-threatening activities. Children’s views on the nature of the risks they face should be listened to, it added.

General Comment 25 also said that the use of digital devices should not be harmful and should not be a substitute for in-person interactions among children or between children and parents or caregivers. It states that precautions may be required, depending on the design, purpose and uses of technologies.

General Comment 25 advocates for states to adopt comprehensive policies and strategies that are aimed at providing children with the opportunity to benefit from engaging with the digital environment and ensuring their safe access to it. The Committee said states have a duty to protect children from infringement of their rights by businesses – including the right to be protected from all forms of violence in the digital environment. It highlighted the potential for the way digital services are designed or operated to cause or contribute to violations of children’s rights to freedom from violence. The Committee recommended that laws and regulations be put in place to try to prevent violations of the right to protection from violence, and that compliance with those frameworks be monitored and enforced.

States have further been advised by the Committee to take steps to protect children from harmful and untrustworthy content, citing the risk of children experiencing gender-stereotyped, discriminatory, racist, violent, pornographic and exploitative information, as well as false narratives, misinformation and disinformation and information encouraging them to engage in unlawful or harmful activities. It said states should ensure relevant businesses and other providers of digital content develop and implement guidelines to enable children to safely access diverse content – but that children’s rights to information and freedom of expression should be recognised when protecting children from harmful material.

Rights of access to information and around freedom of expression are referred to regularly by the Committee as UNCRC rights that children enjoy that should be factored into measures regulating how they engage with the digital environment.

For example, the Committee said that while digital service providers should be made to comply with relevant guidelines, standards and codes, and enforce lawful, necessary and proportionate content moderation rules, content controls, school filtering systems and other safety-oriented technologies should not be used to restrict children’s access to information in the digital environment, but only to prevent the flow of harmful material to children. The other rights children enjoy – including around freedom of expression and privacy – must also be recognised in the context of content moderation and content controls.

Another consideration, the Committee said, should be a child’s evolving capacities. It said that risks and opportunities online differ depending on a child’s age and stage of development and that this should be factored into how measures to protect children online, or facilitate their access to digital services, are designed.

Specifically on the design of age-appropriate measures, the Committee said these should be informed by the best and most up-to-date research available, from a range of disciplines, and that states should ensure that digital service providers offer services that are appropriate for children’s evolving capacities.

The Committee has further encouraged states to introduce or update data protection regulation and design standards that identify, define and prohibit practices that manipulate or interfere with children’s right to freedom of thought and belief in the digital environment, for example by emotional analytics or inference. It said they should ensure that automated systems or information filtering systems are not used to affect or influence children’s behaviour or emotions or to limit their opportunities or development. On data protection principles, however, the Committee said that age-based or content-based systems designed to protect children from age-inappropriate content should be consistent with the principle of data minimisation.

States are recommended to prohibit the profiling or targeting of children for commercial purposes on the basis of a digital record of their actual or inferred characteristics, including group or collective data, targeting by association or affinity profiling. Practices that rely on neuromarketing, emotional analytics, immersive advertising and advertising in virtual and augmented reality environments should also be prohibited from being used to promote products, applications and services directly or indirectly to children, the Committee said.

Transparency and redress are also considered in General Comment 25, with states advised to provide children with child-sensitive and age-appropriate information in child-friendly language on their rights and on the reporting and complaint mechanisms, services and remedies available to them in cases where their rights in relation to the digital environment are violated or abused.

Enforcement of Convention rights

While states that have ratified the UNCRC are bound by its provisions, not all states have implemented the provisions of the Convention into domestic law. In the UK, Scotland became the first nation to incorporate the UNCRC into domestic law in July 2024, giving children the ability to enforce their rights before the Scottish courts.

States that have ratified the UNCRC must regularly report on implementation of the Convention to the Committee. Some states, although not the UK, have also ratified an optional third protocol to the Convention which establishes a process for children whose rights have been violated to complain to the Committee.

A complex picture

For policymakers and businesses alike, the UNCRC and the Committee’s General Comment 25 present some guiding principles for action in relation to children’s rights and protection online. However, it is a complex picture, with often competing principles at play. Both states and private companies should consider undertaking a children’s rights impact assessment when implementing measures that affect children, to consider the risks and benefits in more detail along with supporting evidence.
