Out-Law Analysis
05 Sep 2023, 9:14 am
The UK’s Information Commissioner’s Office (ICO) recently opened a consultation on draft new guidance on biometric data. The draft is the first phase of guidance the ICO is preparing in this area and looks at how data protection law applies when biometric data is used in biometric recognition systems.
Biometric recognition systems are increasingly popular tools used by businesses to verify customers’ identity in the digital world. The systems rely on biometric templates being developed for individuals and then matched against data gathered at the point of an action – for example, where a person photographs their face, applies a fingerprint or scans their eye when seeking to log into a user account.
There has been little general guidance to date in this complex area. The European Data Protection Board finalised its guidelines on the use of facial recognition technology in law enforcement in May this year, while the previous UK information commissioner published opinions on the use of facial recognition technology in public places and by law enforcement. However, there has been little guidance concerning use of biometric data beyond the context of facial recognition.
The guidance comes at a time when the ICO is expected to take a greater role in guidance in this area. Under the proposed Data Protection and Digital Information Bill, the government would no longer be obliged to publish a statutory surveillance camera code of practice and so the office of the surveillance camera commissioner is to be scrapped. The functions of the commissioner for the retention and use of biometrics remain but they are to be carried out by the investigatory powers commissioner. The current holder of the roles of both biometrics commissioner and surveillance camera commissioner, Fraser Sampson, will leave his post on 31 October this year. Sampson has highlighted the need for regulation in areas beyond data protection.
In its draft guidance, the ICO explained that the processing of biometric data to uniquely identify an individual is inherent in the use of biometric recognition systems and that that means businesses using those systems will be processing ‘special category data’ for the purposes of data protection law. The processing of special category data is subject to greater restrictions than ordinary personal data.
As the ICO highlighted, not all biometric data will qualify as special category data – whether it does or not will depend on the purpose for which the biometric data is processed, not the outcome of the processing. The definition of biometric data is contained in the UK GDPR. In its guidance, the ICO summarised it in the form of a three-point test, under which data constitutes biometric data where it:

- relates to someone’s physical, physiological or behavioural characteristics;
- results from specific technical processing relating to those characteristics; and
- allows or confirms that person’s unique identification.
However, biometric data will only constitute special category data where it is used for the purpose of uniquely identifying an individual.
The ICO said: “In order to uniquely identify someone using biometric data, your purpose involves: collecting personal data relating to someone’s characteristics and processing it in a certain way (e.g. to create a biometric template); and comparing that data with other biometric data that you hold in order to identify a match. If you intend to take these steps, then you are processing biometric data for the purpose of unique identification.”
“This means that you will be processing special category biometric data from the moment you collect the data as described in the first step, not from the point that you attempt any comparison for identification purposes. This purpose test is met whenever you use a biometric recognition system, because your purpose for doing so will be either to establish: who someone is (identification); or if someone is who they claim to be (verification). Both of these involve comparing a biometric template against another (reference) template for the purpose of finding a match,” it said.
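The distinction the ICO draws between identification (who someone is, a one-to-many comparison) and verification (whether someone is who they claim to be, a one-to-one comparison) can be sketched in code. The following Python example is purely illustrative: the cosine-similarity comparison, the 0.9 threshold and the representation of templates as numeric vectors are all hypothetical assumptions, not anything prescribed by the guidance or by any real system.

```python
import math

THRESHOLD = 0.9  # hypothetical match threshold


def similarity(a, b):
    """Cosine similarity between two biometric feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def verify(sample, reference):
    """Verification (1:1): is this person who they claim to be?"""
    return similarity(sample, reference) >= THRESHOLD


def identify(sample, enrolled):
    """Identification (1:N): which enrolled person, if any, is this?"""
    best_name, best_score = None, 0.0
    for name, reference in enrolled.items():
        score = similarity(sample, reference)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= THRESHOLD else None
```

Both functions compare a freshly captured template against one or more stored reference templates, which is exactly the step the ICO says engages the “unique identification” purpose from the moment the data is collected.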
The ICO said that businesses should undertake a data protection impact assessment prior to using a biometric recognition system, and that they will typically require the explicit consent of data subjects to process biometric data when using those systems. Consent will only be valid if individuals have a “suitable alternative” to agreeing to the processing of that data, it stressed. It said, for example, that customers could be given the option of entering a PIN to gain access to services as an alternative to using facial recognition technology-based access control systems.
There may be other lawful bases for processing biometric data when using biometric recognition systems, the ICO said – such as where the processing is necessary for the prevention and detection of unlawful acts – but this condition can only be relied on if the controller can show that using the special category biometric data is necessary both for the prevention and detection of crime and for reasons of substantial public interest.
The guidance flags the need for a clear assessment of who acts as controller and who acts as processor. It also highlights that providers of AI solutions commonly want to use data generated by customers to further develop their models. The ICO has said that organisations should establish whether providers want to do this, confirm that the system provider would act as controller for that use, and ensure data subjects are notified of the use. Organisations should regularly review their services and switch to another provider if their provider’s use is no longer compliant.
The ICO has flagged that the guidance is not intended to be a comprehensive guide to compliance when using biometric data. However, there are some areas where further guidance would be welcome: for example, more technical detail, like that provided in the ICO’s guidance on AI, and guidance on how privacy-enhancing technologies could be used in biometric recognition systems.
We are also seeing a trend towards the adoption of biometric recognition systems for fraud prevention, particularly in the financial services sector. However, while the guidance focuses largely on traditional biometric technologies such as fingerprint and facial recognition, emerging market trends point to increased adoption of technologies using behavioural biometric data, where biometric templates or profiles are created based on samples of how users enter their passwords or the angle at which they hold their phones. Future biometric samples of this nature are then checked against the behavioural biometric profile to establish a match. The guidance could provide more technical examples dealing with this type of processing, given its prevalence and the likelihood that it will be used to a greater extent over the next few years.
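At a very simplified level, matching a fresh behavioural sample against a stored behavioural biometric profile might look something like the following Python sketch. The per-key timing feature, the enrolment averaging and the 25% tolerance are all hypothetical illustrations, not a description of any real product.

```python
from statistics import mean


def build_profile(enrolment_samples):
    """Average per-key timings (ms) across enrolment samples to form a profile.

    Each sample is a list of timings captured while the user typed their
    password (a hypothetical behavioural feature).
    """
    return [mean(values) for values in zip(*enrolment_samples)]


def matches_profile(sample, profile, tolerance=0.25):
    """Match if every timing is within `tolerance` (25%) of the profile value."""
    return all(abs(s - p) <= tolerance * p for s, p in zip(sample, profile))
```

The point the guidance could usefully address is that the profile here is created and compared for the purpose of uniquely identifying the user, so the same special category data analysis applies even though no face or fingerprint is involved.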
Similarly, the guidance does not look at systems which briefly process personal data of groups of people to confirm a non-match. This is the type of technology used, for example, in live facial recognition systems that check individuals in a particular area against a watchlist; information about individuals who are not on the watchlist will be processed briefly to confirm there is no match. The Home Office is reportedly looking to expand its use of technologies to find and track criminals, and to potentially deploy new biometric systems nationally over the next 12 to 18 months.
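A watchlist check of the kind live facial recognition systems perform could be sketched as below. This is an illustrative, assumption-laden example: the `screen_frame` function, the pluggable `similarity` measure and the 0.9 threshold are hypothetical. The key point for the guidance is that templates of people who are not on the watchlist are still processed, however briefly, before being discarded.

```python
def screen_frame(face_templates, watchlist, similarity, threshold=0.9):
    """Check each face template detected in a frame against a watchlist.

    Returns the names of watchlist matches. Templates of non-matching
    individuals exist only transiently within this function and are
    discarded when it returns (a hypothetical sketch of the non-match
    processing described above).
    """
    alerts = []
    for template in face_templates:
        for name, reference in watchlist.items():
            if similarity(template, reference) >= threshold:
                alerts.append(name)
                break
    return alerts
```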
There are ethical considerations in processing biometric data that the guidance does not currently address either.
As it currently stands, there is an argument that some novel uses of biometric data, such as technology that recognises and analyses emotion, are not subject to the same restrictions and controls that apply to processing special category data, because they are not used for the purpose of identifying individuals per se.
For example, the faces of individuals in a crowd could be analysed by technology to see how they are reacting to a speech, but the data processed would not be classified as special category data even though the processing could be considered quite intrusive. The ICO has previously blogged about the risks around emotional analysis technologies. Some further guidance on balancing biometric technologies with the rights of data subjects would be valuable to ensure these technologies are used ethically.
The guidance also provides only brief coverage of the risks around bias and discrimination. It highlights that if a system detects the characteristics of certain groups less well than others, it is likely to produce biased outcomes. However, it does not currently include practical steps on how these risks could be addressed. It also does not highlight the need for caution around racial bias in facial recognition, despite the risks being well documented.
It is possible that the ICO will fill in some of these gaps in the guidance in due course. The first phase explicitly does not cover regimes for law enforcement and security purposes, so these may also be covered in future phases.
The ICO’s draft guidance is open to consultation until 20 October 2023. A second part to the guidance, regarding biometric classification and data protection, is expected to be issued in due course – the ICO said it will hold a call for evidence in relation to that guidance early next year.
Co-written by Holly Lambert of Pinsent Masons.