Out-Law News

Guidelines highlight challenges of facial recognition technology as Clearview AI fined


Guidelines recently issued by the European Data Protection Board (EDPB) focus on the use of facial recognition technology in the context of law enforcement, but one expert has highlighted how some commentary within the guidelines has broader application and how the EDPB’s views align with those of the UK’s Information Commissioner’s Office (ICO).

Lauro Fava of Pinsent Masons said: “The guidelines are yet another instalment in a series of guidance that reaffirms the severe restriction of facial recognition technology by the current data protection framework. Other guidance includes the EDPB’s guidelines on video surveillance, the ICO’s opinion on use of live facial recognition technology in public places and its separate opinion on use of live facial recognition technology by law enforcement in public places.”

“It is clear that the EDPB views facial recognition technology (FRT) as a serious threat to fundamental rights, even stating that all processing of biometric data constitutes a serious interference with those rights. This viewpoint underpins the entire guidance. But it has to be noted that FRT can serve various objectives, both in the context of commercial use and in addressing public safety or law enforcement concerns, and these present different levels of risk,” he said.

The EDPB draws a clear distinction between lower risk situations, such as using FRT for authentication purposes, and higher risk situations, where facial recognition is used for widescale monitoring and identification.
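
By way of illustration, the difference between those two modes can be sketched in a few lines of Python. The fragment below is purely illustrative and makes no claim about any real system: random vectors stand in for the face embeddings a neural network would produce, the similarity threshold is arbitrary, and the `verify` and `identify` functions are hypothetical names. Verification compares a captured face against a single claimed template, while identification searches an entire database, which is why the EDPB treats the latter as the greater interference.

```python
import numpy as np

# Toy face "embeddings": in a real system these come from a neural network;
# here random vectors stand in for faces (all values are illustrative).
rng = np.random.default_rng(0)
enrolled = rng.normal(size=128)  # one person's stored template
# A watchlist-style database that happens to contain the enrolled person:
watchlist = np.vstack([rng.normal(size=(9_999, 128)), enrolled])

def cosine_similarity(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def verify(probe, template, threshold=0.6):
    """1:1 authentication: does the probe match one claimed identity?"""
    return cosine_similarity(probe, template) >= threshold

def identify(probe, database, threshold=0.6):
    """1:N identification: compare the probe against everyone in the database."""
    sims = database @ probe / (
        np.linalg.norm(database, axis=1) * np.linalg.norm(probe)
    )
    best = int(np.argmax(sims))
    return (best, float(sims[best])) if sims[best] >= threshold else None

probe = enrolled + rng.normal(scale=0.1, size=128)  # noisy capture of the same face
print(verify(probe, enrolled))     # one consent-based comparison, e.g. a device unlock
print(identify(probe, watchlist))  # scans the whole database: the higher-risk mode
```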

Fava highlighted sections of the new guidelines on making information known to data subjects in a concise, intelligible and easily accessible form, and on the scope for processing special category data that is “manifestly made public”, both of which are complex issues in the FRT space. However, he said the guidance does have shortcomings: “It does not analyse in much detail some of the more foundational legal points, such as the nuances of the definition of ‘biometric data’. It remains unclear whether the definition covers soft biometrics – traits that are not unique to an individual, such as a person’s height.”

According to Fava, probably the most helpful parts of the guidelines are the appendices. One of these describes practical steps that should be followed when procuring and deploying FRT, which Fava said may serve as a good guide even for businesses in the commercial space. For those involved in law enforcement, appendix III is also useful as it sets out a series of common scenarios and the EDPB’s views on each one.

Fava said that one of the law enforcement scenarios highlighted bears a striking resemblance to the facts considered in the case of Bridges v South Wales Police, which concerned the monitoring of a public space using FRT to identify individuals on a watchlist. Fava said the views expressed by the EDPB in appendix III align with comments made by the ICO and the judgment of the Court of Appeal, in that the legal framework was considered to lack the clarity needed to allow the use of FRT.

A further example in the appendix concerns a scenario that bears similarities with a use of FRT that was recently the subject of enforcement action in the UK, Fava said. On Monday, facial recognition software provider Clearview AI was fined more than £7.5 million by the ICO and ordered to stop obtaining and using the personal data of people in the UK sourced from the public internet, and to delete the data of UK residents that it has already gathered from its systems.

Clearview AI scrapes images and biometric data from the internet. Its service allows its customers, which include law enforcement agencies, to upload their own images to see if it returns a match against the images on Clearview AI’s database. According to the ICO, the company holds a database of more than 20 billion images of people’s faces, together with associated data.
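
Clearview AI’s actual architecture is proprietary and is not described in the ICO’s notice, but the general shape of a scrape-and-search service can be sketched under clearly labelled assumptions: the URLs, embeddings and `search` function below are all hypothetical, with random vectors again standing in for face embeddings. What the sketch makes concrete is that each match links a face back to the pages where it appeared online, which is what makes such a service identifying.

```python
import numpy as np

# Hypothetical index built from scraped images: each entry pairs a face
# embedding with the URL the image was found at (all values illustrative).
rng = np.random.default_rng(1)
source_urls = [f"https://example.org/photo/{i}" for i in range(5)]
index_embeddings = rng.normal(size=(5, 128))

def search(probe, embeddings, sources, threshold=0.6):
    """Return the source URLs of every indexed face similar to the probe."""
    sims = embeddings @ probe / (
        np.linalg.norm(embeddings, axis=1) * np.linalg.norm(probe)
    )
    return [(sources[i], float(s)) for i, s in enumerate(sims) if s >= threshold]

# A customer uploads a photo of the same (hypothetical) person as entry 2:
probe = index_embeddings[2] + rng.normal(scale=0.1, size=128)
print(search(probe, index_embeddings, source_urls))
# e.g. [('https://example.org/photo/2', 0.99)] - the face is traced back to
# where it was published, without the person's knowledge or consent.
```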

The use of FRT entails the processing of biometric data, which qualifies for special protection under both UK and EU data protection law. A joint investigation by the ICO and its Australian counterpart, the Office of the Australian Information Commissioner (OAIC), found that Clearview AI had not met these high standards of data protection in respect of its use of biometric data. The ICO said that the company was responsible for a series of data protection failings, including that it had failed to use the information of people in the UK in a way that was fair and transparent, did not have a lawful basis for collecting the data, and lacked a process for stopping data being retained indefinitely.

While the ICO said Clearview AI no longer offers its services to UK organisations, it said the company retains customers in other countries and so still uses the personal data of UK residents. The ICO’s enforcement notice is designed to put an end to that data use.

The company has been the subject of other regulatory investigations across the globe too, including in the US, Canada, France and Italy. The EDPB previously concluded that Clearview’s facial recognition software does not meet the conditions set out in the EU’s Law Enforcement Directive. John Edwards, the UK information commissioner, said “international cooperation is essential to protect people’s privacy rights in 2022”.

Fava said cases such as Clearview AI’s cast the technology in a negative light and do not represent its full potential. They do nothing to change the impression many regulators and the public more generally have about FRT, he said.

The EDPB has called for certain uses of FRT to be banned under the proposed EU AI Act, which is currently being scrutinised by EU legislators.

“Remote biometric identification of individuals in publicly accessible spaces poses a high risk of intrusion into individuals’ private lives and does not have a place in a democratic society as by their nature it entails mass surveillance,” the EDPB said. “In the same vein, the EDPB considers AI-supported facial recognition systems categorising individuals based on their biometrics into clusters according to ethnicity, gender, as well as political or sexual orientation as not compatible with the [EU] Charter [of Fundamental Rights].”

“Furthermore, the EDPB is convinced that the use of facial recognition or similar technologies, to infer emotions of a natural person is highly undesirable and should be prohibited, possibly with few duly justified exceptions. In addition, the EDPB considers that processing of personal data in a law enforcement context that would rely on a database populated by collection of personal data on a mass-scale and in an indiscriminate way, e.g. by ‘scraping’ photographs and facial pictures accessible online, in particular those made available via social networks, would, as such, not meet the strict necessity requirement provided for by Union law,” it said.

Fava said: “It is sensible that regulators put their primary focus on the use of FRT in law enforcement, interpreting the law in a way that protects fundamental rights and saves us from an Orwellian future; but more thought by regulators and legislators on how the technology can be harnessed in a safe and positive way would be welcomed by many.”

“No technology is inherently good or bad and there are potential uses of FRT in the private sector that would be beneficial in a modern society. These often involve situations where a customer wants to be automatically identified in a crowd, but such uses are prevented by the way the current legal framework is interpreted to apply to the personal data of bystanders. Other situations are where FRT offers an unparalleled level of security, but these uses are hindered by the current consent requirements. As things stand, businesses wanting to deploy FRT are faced with great uncertainty,” he said.
