Further, the Committee said that, in considering the best interests of the child, states “should have regard for all children’s rights, including their rights to seek, receive and impart information, to be protected from harm and to have their views given due weight, and ensure transparency in the assessment of the best interests of the child and the criteria that have been applied”.
The ‘best interests’ principle sits alongside three other core principles that are to govern states’ approach to children’s rights and protection in the digital environment. Those principles – of non-discrimination; right to life, survival and development; and respect for the views of the child – all interlink with the ‘best interests’ principle and have relevance in the context of online safety regulation.
For example, the Committee said that states should “take all appropriate measures to protect children from risks to their right to life, survival and development”, citing risks relating to content, contact, conduct and contract and giving examples such as violent and sexual content, cyberaggression and harassment, gambling, exploitation and abuse, including sexual exploitation and abuse, and the promotion of or incitement to suicide or life-threatening activities. Children’s views on the nature of the risks they face should be listened to, it added.
General Comment 25 also said that the use of digital devices should not be harmful and should not be a substitute for in-person interactions among children or between children and parents or caregivers. It said that precautions may be required, depending on the design, purpose and uses of technologies.
General Comment 25 advocates for states to adopt comprehensive policies and strategies that are aimed at providing children with the opportunity to benefit from engaging with the digital environment and ensuring their safe access to it. The Committee said states have a duty to protect children from infringement of their rights by businesses – including the right to be protected from all forms of violence in the digital environment. It highlighted the potential for the way digital services are designed or operated to cause or contribute to violations of children’s rights to freedom from violence. The Committee recommended that laws and regulations be put in place to try to prevent violations of the right to protection from violence, and that compliance with those frameworks be monitored and enforced.
States have further been advised by the Committee to take steps to protect children from harmful and untrustworthy content, citing the risk of children experiencing gender-stereotyped, discriminatory, racist, violent, pornographic and exploitative information, as well as false narratives, misinformation and disinformation and information encouraging them to engage in unlawful or harmful activities. It said states should ensure relevant businesses and other providers of digital content develop and implement guidelines to enable children to safely access diverse content – but that children’s rights to information and freedom of expression should be recognised when protecting children from harmful material.
Rights of access to information and around freedom of expression are referred to regularly by the Committee as UNCRC rights that children enjoy that should be factored into measures regulating how they engage with the digital environment.
For example, the Committee said that while digital service providers should be made to comply with relevant guidelines, standards and codes, and enforce lawful, necessary and proportionate content moderation rules, content controls, school filtering systems and other safety-oriented technologies should not be used to restrict children’s access to information in the digital environment, but only to prevent the flow of harmful material to children. It added that the other rights children enjoy – including around freedom of expression and privacy – must be recognised in the context of content moderation and content controls.
Another consideration, the Committee said, should be a child’s evolving capacities. It said that risks and opportunities online differ depending on a child’s age and stage of development and that this should be factored into how measures to protect children online, or facilitate their access to digital services, are designed.
Specifically on the design of age-appropriate measures, the Committee said these should be informed by the best and most up-to-date research available, from a range of disciplines, and that states should ensure that digital service providers offer services that are appropriate for children’s evolving capacities.
The Committee has further encouraged states to introduce or update data protection regulation and design standards that identify, define and prohibit practices that manipulate or interfere with children’s right to freedom of thought and belief in the digital environment, for example by emotional analytics or inference. It said they should ensure that automated systems or information filtering systems are not used to affect or influence children’s behaviour or emotions or to limit their opportunities or development. On data protection principles, however, the Committee said that age-based or content-based systems designed to protect children from age-inappropriate content should be consistent with the principle of data minimisation.
States are recommended to prohibit the profiling or targeting of children for commercial purposes on the basis of a digital record of their actual or inferred characteristics, including group or collective data, targeting by association or affinity profiling. The Committee said practices that rely on neuromarketing, emotional analytics, immersive advertising and advertising in virtual and augmented reality environments to promote products, applications and services should also be prohibited from engaging directly or indirectly with children.
Transparency and redress are also considered in General Comment 25, with states advised to provide children with child-sensitive and age-appropriate information in child-friendly language on their rights and on the reporting and complaint mechanisms, services and remedies available to them in cases where their rights in relation to the digital environment are violated or abused.
Enforcement of Convention rights
While states that have ratified the UNCRC are bound by its provisions, not all states have implemented the provisions of the Convention into domestic law. In the UK, Scotland became the first nation to incorporate the UNCRC into domestic law in July 2024, giving children the ability to enforce their rights before the Scottish courts.
States that have ratified the UNCRC must regularly report on implementation of the Convention to the Committee. Some states, although not the UK, have also ratified an optional third protocol to the Convention which establishes a process for children whose rights have been violated to complain to the Committee.
A complex picture
For policymakers and businesses alike, the UNCRC and the Committee’s General Comment 25 present some guiding principles for action in relation to children’s rights and protection online. However, it is a complex picture, with often competing principles at play. Both states and private companies should consider undertaking a child rights impact assessment when implementing measures that affect children, to consider the risks and benefits in more detail along with supporting evidence.