Out-Law News | 12 Oct 2017 | 8:11 am | 2 min. read
Labour's Lord Stevenson of Balmacara questioned the proposals, which are contained in the draft Data Protection Bill introduced into parliament last month, during a debate on the Bill in the House of Lords on Tuesday.
"Understandably, there is much concern about this low age limit, particularly as the General Data Protection Regulation gives discretion in a range up to 16 years of age," Lord Stevenson said. "Setting an age limit of 13, or even 16, would almost certainly be illegal under the UN Convention on the Rights of the Child, to which the UK is a signatory."
"The Children’s Society argues that if companies continue to rely on their current practices – whereby they allow only over-13s to have an account but have no age verification process to check that children who are consenting are the age they state themselves to be – then there will continue to be widespread breaches of both the companies’ own rules and this new Data Protection Act. In the Bill, it is unclear how breaches will be handled by the information commissioner and what penalties will be put in place for those companies failing to verify age properly," he said.
Lord Stevenson also said that the Data Protection Bill makes "no consideration" of a child's capacity to consent to data processing, and also does not deal with "protection for vulnerable children".
"Although there are arguments for setting the age limit higher – or indeed lower – there is surely a need both for proper evidence to be gathered and for a minimum requirement for companies to have robust age verification systems and other safeguards in place before any such legislation is passed," Lord Stevenson said. "We will pursue that. There is also the question of the overlap this derogation has with the right to be forgotten... That right kicks in only at age 18; we need to probe why that is the case and how that will work in practice."
In the debate, Lord Stevenson also said that concern about the increasing use of algorithms and automatic data processing "needs to be addressed".
The peer said that businesses that rely on algorithms could be required to participate in "recording, testing and some level of disclosure" about their use and analysis of data, "particularly when algorithms might affect employment or are used in a public policy context".
Lord Stevenson also questioned whether the UK would win a so-called 'adequacy decision' from the European Commission to enable data flows between organisations in the UK and EU to continue uninterrupted post-Brexit.
"On the UK’s exit from the EU, the UK will need to satisfy the European Commission that our legislative framework ensures an 'adequate level of protection', but achieving a positive adequacy decision for the UK is not as uncontentious as the government think," Lord Stevenson said. "Under article 45, the GDPR requires the European Commission to consider a wide array of issues such as the rule of law, respect for fundamental rights, and legislation on national security, public security and criminal law when it makes its decision."
"As has already been pointed out by several commentators, the current surveillance practices of the UK intelligence services may jeopardise a positive adequacy decision, as the UK’s data protection rules do not offer an equivalent standard of protection to that available in the rest of the EU. We will need to pursue this disjuncture in Committee," he said.
Lord Stevenson also said that the new Data Protection Bill will need to address how the UK ensures that its data protection laws remain aligned with the EU's as new policy and case law develop across the remaining 27 EU member states.
He suggested that the information commissioner could be placed under a duty in the Bill to "make regulations which reflect the changes taking place in the EU". Alternatively, "the Bill could provide for some form of lock-step arrangement under which statutory instruments would be triggered when UK laws need to be amended", he said.