Digital & Technology: Facing the future – framing the debate on biometric data processing

Tuesday 23rd January 2024

On 22 December 2023, the Information Commissioner’s Office (ICO) issued a short statement in response to a parliamentary letter calling for a ban on facial recognition technology in the UK due to privacy and security concerns.

Whilst we await the full response, the ICO notes:

“Facial recognition technology can bring benefits in helping to prevent and detect crime, but it relies on processing large amounts of sensitive personal data. That is why the law places a high bar for its usage: its use must be necessary and proportionate, and its design must meet expectations of fairness and accuracy.”


What is facial recognition technology?

Facial recognition technology (FRT) identifies an individual from a digital facial image. When a camera captures the image, FRT creates a biometric template of the individual’s face by analysing its unique features. That template can then be used either to verify a person’s identity (a one-to-one comparison against a stored template) or to identify an individual by matching the template against another image or set of images (a one-to-many search).

Live facial recognition (LFR) is a type of FRT which processes camera footage in real time and is sometimes deployed in public places.
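
To make the distinction between verification and identification concrete, the following is a minimal, illustrative Python sketch. The embed_face function is a hypothetical placeholder for the step that turns a facial image into a numeric biometric template (in real systems, a trained neural network performs this step), and the matching threshold is purely illustrative.

    import numpy as np

    def embed_face(image):
        # Hypothetical placeholder: a real FRT system would use a trained
        # neural network to map a face image to a fixed-length template.
        raise NotImplementedError("stand-in for a real face-embedding model")

    def similarity(a, b):
        # Cosine similarity between two biometric templates (1.0 = identical).
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def verify(probe_image, enrolled_template, threshold=0.8):
        # Verification (one-to-one): does the probe image match a single
        # stored template? This is the device-unlock / passport-gate pattern.
        return similarity(embed_face(probe_image), enrolled_template) >= threshold

    def identify(probe_image, gallery, threshold=0.8):
        # Identification (one-to-many): search a gallery {person_id: template}
        # for the best match above the threshold. This is the LFR pattern.
        probe = embed_face(probe_image)
        best_id, best_score = None, threshold
        for person_id, template in gallery.items():
            score = similarity(probe, template)
            if score >= best_score:
                best_id, best_score = person_id, score
        return best_id  # None means no match confident enough to report

The threshold is where the accuracy and fairness concerns raised by the ICO bite: set it too low and the system falsely matches innocent people; set it too high and it misses genuine matches.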


FRT & Privacy Laws

The use of FRT is not expressly prohibited under data protection laws. However, because it involves the processing of personal data, biometric data and often special category data, a high threshold must be met for the processing to be lawful, including:

  • a lawful basis for processing under Article 6 UK GDPR;
  • a ‘special category’ lawful basis for processing under Article 9 UK GDPR;
  • the requirement for a Data Protection Impact Assessment; and
  • the requirement for meaningful human intervention in any significant decision-making about people.

Sitting above these specific requirements is the overarching need to show reasonableness and proportionality in the data controller’s processing activities, which can be difficult to navigate when justifying intrusive processing.

The ICO guidance in this area also notes that FRT systems often allow data to be integrated into ‘big data’ systems, which could constitute ‘profiling’ under the UK GDPR and is prohibited in many cases.


Use cases

A common example of FRT is Apple’s Face ID, where a user can choose to unlock their device using facial verification rather than a passcode, for added security and convenience. Another use case is passport control, where FRT links an individual to the identification documents they present, thereby verifying their identity. FRT is also increasingly used by retailers to recognise individuals with shoplifting convictions and, ultimately, to prevent and detect crime.

However, the use of these technologies extends to wider government and corporate contexts where consent is not relied upon to process the relevant data, which attracts a host of privacy concerns. The Metropolitan Police were criticised by privacy advocates for using LFR in public spaces during King Charles III’s Coronation in May of last year. Whilst the Met Police argued this was necessary to promote safety and security, and to minimise disruption at the Coronation by identifying wanted criminals in crowds, critics argued the deployment was ‘Orwellian’ and constituted excessive monitoring of individuals.


Debate

Advocates and opponents of FRT have been fiercely debating its use, with such debates reaching the European Parliament. Arguments in favour of FRT include that it proportionately promotes the safety, security and protection of the public, as well as the prevention and detection of crime. Arguments against include its potential impact on human rights, the limited accuracy of current FRT systems and the consequent risk of discrimination, and its role in surveillance capitalism.

The law has not yet fully evolved here and is inconsistent across jurisdictions, meaning there is no one-size-fits-all approach for businesses operating internationally. For example, whilst use of the technology is widespread in the US, the EU’s proposed AI Act specifically identifies certain uses of FRT as ‘unacceptable risk’ AI systems. However, it was recently reported that exemptions allowing law enforcement agencies to use the technology in limited circumstances have been included in the proposed drafting, leading some to describe the legislation as ‘draconian’.


Summary

Whilst data protection and privacy laws are not the only laws which apply to the use of FRT, they are a fundamental consideration when adopting the technology. Those wishing to adopt it should fully consider the legal and regulatory implications and create a full audit trail to demonstrate that the assessments required under the UK GDPR have been undertaken.

If you have any questions about the use of facial recognition technology, or more widely, personal data processing, please get in touch with one of our privacy experts.