
Out-Law News

Exam results put reliance on algorithms in the spotlight


The recent outcry over the use of algorithms in determining school pupil qualifications across the UK has highlighted the need for organisations to remember their legal duty to process personal data fairly, a data protection law expert has said.

Stephanie Lees of Pinsent Masons, the law firm behind Out-Law, said data protection principles, enshrined in law, should guide the use of algorithms at a time when many organisations will be exploring the potential of new digital technologies such as artificial intelligence (AI).

"The General Data Protection Regulation and the Data Protection Act in the UK both contain specific provisions that apply to algorithms which involve processing personal data," Lees said. "In those circumstances, the data protection principles and requirements need to be considered from the outset, before organisations proceed to use any technologies that use algorithms, to ensure they can satisfy their 'accountability' obligations."

Across the UK, qualifications bodies such as the SQA in Scotland and Ofqual in England put in place algorithmic modelling in a bid to moderate the grades recommended for pupils by their teachers.

The system of moderation was devised after pupils were unable to sit national exams earlier this year due to coronavirus restrictions. Changes have subsequently been applied to the grading of pupils following complaints that some pupils had had their teacher-assessed grades disproportionately adjusted by reference to the past performance of the school they attended.
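To illustrate why that kind of moderation can produce results that feel unfair, the sketch below shows a deliberately simplified standardisation step: pupils are ranked by their teacher-assessed grades and then re-graded so the cohort matches the school's historical grade distribution. This is a purely hypothetical example for illustration only; it is not Ofqual's or the SQA's actual model, and the function and data names are assumptions made for this sketch.

from collections import Counter

# Illustrative grade scale, best first.
GRADES = ["A", "B", "C", "D", "E"]


def moderate(teacher_grades, historical_grades):
    """Re-grade a cohort so its distribution matches the school's history.

    teacher_grades: dict mapping pupil -> teacher-assessed grade.
    historical_grades: list of grades the school awarded in past years.
    Returns a dict mapping pupil -> moderated grade.
    """
    cohort = len(teacher_grades)
    hist = Counter(historical_grades)
    total = sum(hist.values())

    # Number of each grade to award this year, scaled to cohort size.
    quota = {g: round(hist[g] / total * cohort) for g in GRADES}

    # Rank pupils by teacher-assessed grade (best first), then hand out
    # grades from the historical quota, topping up with the lowest grade
    # if rounding leaves the quota short.
    ranked = sorted(teacher_grades, key=lambda p: GRADES.index(teacher_grades[p]))
    awarded = [g for g in GRADES for _ in range(quota[g])]
    awarded += [GRADES[-1]] * (cohort - len(awarded))

    return {pupil: grade for pupil, grade in zip(ranked, awarded[:cohort])}


if __name__ == "__main__":
    teachers = {"p1": "A", "p2": "A", "p3": "B", "p4": "C"}
    history = ["A", "B", "C", "C", "D", "D", "E", "E"]  # weaker past results
    print(moderate(teachers, history))
    # Pupils assessed at "A" can end up with lower moderated grades because
    # the school's past distribution contains few top grades.

Even in this toy version, a pupil's final grade depends heavily on how the school performed in previous years rather than on that pupil's own assessment, which is the essence of the complaints described above.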

Lees said that the growth of AI and the use of algorithms by organisations across many sectors, particularly adtech and fintech, has been followed closely by the Information Commissioner's Office (ICO), the UK's data protection authority. While the technology offers potential benefits, the ICO has acknowledged it poses risks relating to transparency, accuracy and unfair prejudicial outcomes.

Last month the ICO, together with the Alan Turing Institute, published guidance on AI and data protection. The guidance provides practical advice to organisations, many of which have been seeking further regulatory guidance on how data protection laws apply to new advanced technologies.

Lees said: "The clear lessons that need to be learned from the exam grades case and also the problems encountered in developing and rolling out the coronavirus contact tracing app, are the need for caution and moment of reflection, that organisations must take when using technology and AI to solve problems. The stakes and risks arising from such technologies are higher when personal data is involved and data protection laws are designed to try and safeguard against any harms, from the outset."

Following the publication of A-level results in England earlier this month, the ICO said: "We understand how important A-level results and other qualifications are to students across the country. When so much is at stake, it’s especially important that their personal data is used fairly and transparently. We have been engaging with Ofqual to understand how it has responded to the exceptional circumstances posed by the COVID-19 pandemic, and we will continue to discuss any concerns that may arise following the publication of results."

"The GDPR places strict restrictions on organisations making solely automated decisions that have a legal or similarly significant effect on individuals. The law also requires the processing to be fair, even where decisions are not automated. Ofqual has stated that automated decision making does not take place when the standardisation model is applied, and that teachers and exam board officers are involved in decisions on calculated grades. Anyone with any concerns about how their data has been handled should raise those concerns with the exam boards first, then report to us if they are not satisfied. The ICO will continue to monitor the situation and engage with Ofqual," it said.

Lees also said that while it is unclear why Ofqual took the view that automated decision making had not taken place in this case, the level of human intervention in the grade-setting process, from teachers and exam boards initially and through the appeals process, could be a determinative factor.

"The general position under Article 22 of the GDPR is that any automated decision making using personal data that produces 'legal' or 'similarly significant' effects, requires an individual's consent. There are, however, exceptions to this, in limited circumstances," Lees said.

"Whilst the ICO statement reveals some level of insight into its interpretation of Article 22, regrettably it does not address their view on the other data protection law considerations here, particularly surrounding the implementation of the algorithm. Data protection law requires data protection impact assessments (DPIAs) to be carried out where 'high risk' processing of personal data is being carried out. DPIAs allow organisations to map the data protection principles and assess the end-to-end risks from the outset," she said.

It is also unclear from the ICO's statement whether Ofqual involved the ICO in the design of the system of moderation. Lees said the interest in and response to the issue could spur the ICO and Ofqual to put in place a memorandum of understanding, similar to those the ICO already has in place with other UK regulators such as the Competition and Markets Authority, Financial Conduct Authority and Ofcom, to strengthen future cooperation.
