Out-Law News 3 min. read
04 Apr 2023, 3:07 pm
Regulators’ growing focus on ensuring that children can use online services safely has been brought into sharp relief by recent actions of the UK’s data protection authority, a technology law expert has said.
Meghan Higgins of Pinsent Masons was commenting after the Information Commissioner’s Office (ICO) opened a consultation on new draft guidance that aims to help information society service (ISS) providers determine whether their services fall within the scope of the UK’s children’s code, compliance with which the ICO oversees.
The draft guidance is designed to help ISS providers determine whether children are likely to access their service. If children are likely to access an ISS, the provider must ensure compliance with the ICO’s children’s code, which was introduced in 2020 and is officially called the age appropriate design code.
The code not only applies to services intended for use by children but also to adult-only services that are nonetheless likely to be used by under-18s, even if they are not specifically targeted at those users.
ISS providers are obliged to assess whether their services fall within the scope of the code. The ICO’s draft guidance clarifies, among other things, how providers can establish whether a ‘significant number’ of children are accessing their service – in which case the service falls in-scope.
The ICO said: “‘Significant’ in this context does not mean that a large number of children must be using the service or that children form a substantial proportion of your users. It means that there are more than a de minimis or insignificant number of children using the service.”
The ICO has set out ‘FAQs’ and a non-exhaustive list of factors that it encourages ISS providers to consider to determine whether the number of children accessing their service qualifies as ‘significant’.
It said internal analytics, business intelligence and market research can offer insight into user behaviour and ultimately allow providers to infer the age range of their users; that data about the advertising appearing on a service can help providers determine whether the ads are likely to appeal to children; and that complaints data may also give an indication of the age of people accessing the service.
The ICO has also prepared a series of case studies that demonstrate how online dating, pornography, gaming, and social media ISS providers can assess whether their services are accessed by a “significant number of children” and ensure they are compliant.
Higgins said: “The ICO’s consultation is consistent with an increasing focus amongst both law makers and the public on ensuring that there are protections for children and young people in the online world. This was also highlighted recently when MPs tabled an amendment to the Online Safety Bill, which is currently before the UK parliament, that would extend liability to senior managers at online services that had failed to engage with their safety duties in relation to children.”
Beyond the children’s code, the Online Safety Bill provides for in-scope service providers to carry out a children’s access assessment. If an in-scope service is judged to be ‘likely to be accessed by children’, the Bill’s risk assessment and safety duties would apply: sections 10 and 11 for regulated user-to-user services, and sections 24 and 25 for providers of regulated search services.
Under the Bill, Ofcom would be required to publish guidance to assist providers in determining whether their services are ‘likely to be accessed by children’ and the first children’s access assessment must be completed within three months from the date the guidance is published by Ofcom. Lottie Peach, also of Pinsent Masons, said it is likely that the outcome of the ICO’s consultation will influence Ofcom’s thinking on setting guidance on children’s access assessments. Ofcom has announced that it will consult on the duties under the Bill to protect children from "legal but harmful" content in autumn 2023.
Peach said: “Now may be a good time for service providers that expect to fall within the scope of the Online Safety Bill’s children’s access requirements to engage with the ICO’s consultation and have their say on the guidance, as the outcome may well impact Ofcom’s approach when producing their guidance for the Bill.”
The ICO’s consultation on its draft guidance is open to feedback until 19 May 2023.
Separately, on Tuesday the ICO announced that it had imposed a £12.7 million fine on TikTok for breaches of the company’s obligations under the UK General Data Protection Regulation (GDPR) between May 2018 and July 2020. It found that, during this period, TikTok had provided its services to UK children and processed their personal data without obtaining the required consent or authorisation from the children’s parents or carers. It also identified failings in the information shared with users to explain how their data is collected, used and shared, and further considered that UK users’ data had not been processed lawfully, fairly and in a transparent manner. TikTok has said it disagrees with the findings and is reviewing its next steps, according to the BBC.