OUT-LAW ANALYSIS
Why online platform Ts and Cs may need to be rewritten for children
31 Mar 2026
Online platforms in the UK may need to rewrite their terms and conditions so that they can be easily understood by children – even if they do not intend for children to access their services.
That requirement will only kick in if platforms fail to operate effective age-gating to prevent children from accessing their services.
This position was confirmed recently by the UK’s data protection authority, the Information Commissioner’s Office (ICO). It highlights the increased scrutiny around children’s experiences in the digital world and comes at a time when online service providers operating in the UK must get a handle on legislative duties relating to children arising under both online safety laws and the data protection framework – requirements that can overlap.
Child data protection rules
Under UK data protection law, the processing of personal data must be fair, lawful and transparent, as well as necessary for the purposes being pursued, among other things. These, and other, basic principles apply to the processing of children’s data as much as they do to adults. However, specific protections regarding children’s personal data are provided for under the framework.
Among other things, the UK General Data Protection Regulation (GDPR) states that any information and communication, where processing is addressed to a child, should be in such clear and plain language that the child can easily understand it. It also provides that specific protections for children should apply particularly to the processing of personal data for the purposes of marketing or creating personality or user profiles, as well as to the collection of personal data with regard to children when using services offered directly to a child.
In that latter regard, Article 8 of the GDPR sets specific standards around the processing of children’s personal data by providers of ‘information society services’ where those services are ‘offered’ directly to children. In those circumstances, where the processing is based on consent, the provider must in the UK obtain parental consent to the processing of data for children under the age of 13.
The ICO has, in accordance with its duties under the Data Protection Act 2018, developed a specific code of practice, the age-appropriate design code, also known as the children’s code, which sets standards relevant to information society services likely to be accessed by children.
Those requirements have been supplemented by new provisions contained in Section 81 of the Data (Use and Access) Act 2025 (DUAA), which took effect in February.
Under Section 81 of the DUAA, providers of information society services likely to be accessed by children will need to consider, when designing those services, how children can best be protected and supported when using them. That is just one of the listed “children’s higher protection matters” that providers must factor in. They must also consider that children merit specific protection with regard to their personal data, because they may be less aware of the risks and consequences of the processing and of their rights in relation to it, and that children have different needs at different ages and at different stages of development.
The Reddit case
In February, the ICO imposed a fine of more than £14.4 million on online platform Reddit, citing non-compliance with rules relating to child data protection.
At the heart of its findings – which relate to the period between 25 May 2018, when the GDPR began to apply, and 8 July 2025 – was the determination that Reddit was a platform likely to be accessed by children. This brought it within the scope of the ICO’s children’s code.
The ICO found Reddit was likely to be accessed by children even though the company’s privacy policy and user agreement prohibited children under the age of 13 from using the platform.
During the period of infringement, Reddit asked users to self-declare that they were over the age of 18 in order to access content classed as ‘not safe for work’. In its engagement with the ICO, the company also cited the age ratings given to its app on Apple’s App Store, which it said would have prevented children under the age of 16 or 17 from downloading the app where parents had set up content restrictions on their devices.
Despite these measures, the ICO assessed that Reddit “had no form of age-gating anywhere on the platform”.
Because children were considered likely to have accessed the platform, Reddit’s compliance with the UK’s regime on child data protection fell to be scrutinised by the ICO.
The ICO found the company had unlawfully processed the data of children during the period of infringement – including to personalise their feeds and inform targeted advertising – and said that it had not attempted to obtain parental consent for the processing.
The ICO further found that Reddit should have carried out a data protection impact assessment to understand the risks associated with the processing of children’s data but had not done so, adding that child users may have been exposed to unsuitable or potentially harmful content on the platform.
The ICO said the processing had not been fair or transparent either. In this regard, the ICO highlighted that for the processing of personal data to be fair, the data subject must be able to understand the contractual terms to which they are being asked to agree. It said children under 13 years old, who had been able to access the platform during the period of infringement, “were not in a position to understand the contractual terms that Reddit asked them to agree to”. It said some of the terms were opaque and the language used was complex.
Online service providers may find that part of the case particularly noteworthy, as it concerns an apparent paradox: that they can be punished for not having contract terms appropriate to children – even if their user agreement prohibits children from accessing the service.
For providers, the enforcement action taken by the ICO shows that if their service is likely to be accessed by children, and they do not genuinely enforce effective age assurance or age verification, then the service must have terms and conditions that are comprehensible to children and their practices must accord with the higher levels of protection given to children’s personal data.
Reddit has said it will appeal against the ICO’s decision.
Cross-over with the online safety regime
Under the Online Safety Act, online service providers must carry out a risk assessment to determine whether their platform is likely to be accessed by children. It is this assessment that drives the rest of the compliance obligations.
The outcome of the risk assessment dictates the compliance route. Where a service is found likely to be accessed by children, companies must then either prevent access – for example, through effective age-gating – or put in place appropriate age-assurance measures.
The risk assessments are not optional, and they do not depend on a platform’s stated target audience or on what its terms and conditions say. What matters is whether the service is likely to be accessed by children.
For service providers, a complicating factor is that while the UK GDPR’s parental consent threshold treats a child as a person under 13 years of age, the Online Safety Act deems a child to be a person under 18 years of age.
Beyond the Reddit case, there has been other recent enforcement action focused on the issue of age assurance – under both the UK data protection and online safety regimes.
In early February, the ICO imposed a near-£250,000 fine on MediaLab.AI for child data protection failings stemming from its failure to verify the age of users. Earlier this month, Ofcom imposed a fine of £520,000 on 4Chan, in part for age-check failings that the regulator said enabled children to access pornography.
The two authorities have been increasingly coordinating their regulatory activity. The ICO’s recent open letter to social media and video-sharing platforms, calling on them to strengthen age checks and protect children’s data, was issued on the same day that Ofcom wrote to a number of sites and apps calling on them to enforce their minimum age rules with highly effective age checks.
To help providers of services likely to be accessed by children to meet their obligations under both the online safety and data protection regimes, Ofcom and the ICO published a joint statement (14-page / 708KB PDF) that highlights how their remits interact.
The regulators said that where a service is likely to be accessed by children, the provider “must have an age assurance process that is highly effective at determining whether or not a user is a child”, and that where they set a minimum age for the service, they should “use an effective age gate to prevent underage access and avoid unlawful processing under UK GDPR”. Among other things, they confirmed their view that “self-declaration alone is not an effective means to determine the age or age range of users and prevent access by underage users”.
For the ICO specifically, its scrutiny of child data protection issues represents a broadening of its regulatory attention beyond data security matters, the area where its enforcement action has traditionally been focused.
Co-written by Gemma Erskine and Prune Corel of Pinsent Masons.