UK High Court upholds police use of automated facial recognition technology to identify suspects

R (on the application of Edward Bridges) v The Chief Constable of South Wales Police [2019] EWHC 2341 (Admin)

Summary

The High Court of England and Wales has confirmed that the use of automated facial recognition technology (AFR) to match the faces of members of the public against police watchlists is lawful. The Court found that although the use of AFR infringes an individual’s right to respect for their privacy, the interference is justifiable for law enforcement purposes, and the current UK legal regime is adequate to ensure its appropriate and non-arbitrary use.

This is the first time any court has considered AFR, and the decision marks an important test of the legal parameters for this technology as it develops and is deployed more widely.

Facts

AFR uses biometric data processing to assess whether two facial images depict the same person. Since mid-2017, the South Wales Police (SWP) has piloted a program called AFR Locate, which takes digital facial images from live CCTV streams, extracts biometric facial data in real time and compares it against watchlists, alerting police to matches so that they may act to apprehend the suspect. Where there is no match, the biometric data is discarded almost immediately. In the SWP pilot, AFR Locate was trialled at specific events or in certain areas over discrete time periods, such as at a football match, at an exhibition and along a busy shopping strip.
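To make the match-and-discard pipeline described above concrete, the following is a minimal, purely illustrative Python sketch. It is not SWP’s actual system: extract_biometric_template is a hypothetical stand-in for a real face-embedding model, and the 128-dimension vectors and 0.8 similarity threshold are arbitrary assumptions.

```python
import numpy as np

SIMILARITY_THRESHOLD = 0.8  # arbitrary illustrative value, not SWP's setting


def extract_biometric_template(face_image: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a face-embedding model that maps a face
    image to a fixed-length, unit-norm biometric feature vector."""
    rng = np.random.default_rng(abs(hash(face_image.tobytes())) % (2**32))
    vec = rng.standard_normal(128)
    return vec / np.linalg.norm(vec)


def check_against_watchlist(face_image, watchlist_templates):
    """Compare one face against the watchlist; return an identity on a
    match, otherwise discard the biometric data and return None."""
    template = extract_biometric_template(face_image)
    for identity, reference in watchlist_templates.items():
        # cosine similarity; vectors are unit-norm, so the dot product suffices
        if float(np.dot(template, reference)) >= SIMILARITY_THRESHOLD:
            return identity  # alert: an officer would review this match
    del template  # no match: the biometric data is not retained
    return None


# Illustrative usage with random "frames" and a two-person watchlist
watchlist = {name: extract_biometric_template(np.random.rand(64, 64))
             for name in ("suspect_A", "suspect_B")}
frame = np.random.rand(64, 64)
print(check_against_watchlist(frame, watchlist))  # usually None
```

In the actual SWP deployment, every alert was reviewed by a police officer before any intervention was taken, a human safeguard the Court later treated as significant (see Commentary below).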

The claimant, Mr Bridges, claimed that the SWP had processed his image in two locations using AFR Locate. He was not on any SWP watchlist.

The primary claim was that AFR Locate was unlawful because it breached Bridges’ “right to respect for his private and family life, his home and his correspondence” under article 8(1) of the European Convention on Human Rights (ECHR). It was further argued that this interference was not justifiable under article 8(2), because it was not “in accordance with the law” nor “necessary in a democratic society” for any of the relevant purposes under that article (such as national security, public safety or crime prevention) (ECHR claim).

Bridges also contended that AFR Locate breached:

  • the public sector duty to eliminate discrimination under the Equality Act 2010 (UK), because the technology was said to indirectly discriminate by generating a higher proportion of false positive matches for women and minority ethnic groups (equality claim); and

  • various provisions of UK data protection law, which require the fair and lawful processing of personal data (data protection claim).

Decision

Interference with right to privacy

The Court accepted that SWP’s use of AFR interfered with Bridges’ rights under article 8(1) of the ECHR. AFR Locate goes beyond the “expected and unsurprising” events that might ordinarily occur in public places, such as being photographed, because it extracts and analyses biometric facial data, which is unique to the individual and an important source of personal information. The fact that the data is deleted where no match is identified does not affect this conclusion.

In accordance with the law

Mr Bridges argued that SWP had no legal basis to deploy AFR Locate, and that there was no legal framework for its use. The Court rejected this argument, finding that the common law powers of police to prevent and detect crime were “amply sufficient” to support AFR Locate. It noted that new technology does not necessarily require an express or bespoke statutory power or framework. Existing data protection laws, codes of practice and SWP policies were deemed sufficient to govern SWP’s use of AFR.

Necessary in a democratic society

The Court considered the test from Bank Mellat v Her Majesty’s Treasury (No 2) [2014] AC 700, which provides that any interference with article 8(1) rights must:

(i) pursue a sufficiently important objective to justify the limitation of a fundamental right;

(ii) be rationally connected to that objective;

(iii) not be replaceable by a less intrusive measure without unacceptably compromising that objective; and

(iv) strike a fair balance between individual rights and community interests, having regard to the above matters and the severity of the consequences.

Factors (i) and (ii) were not disputed. With respect to factors (iii) and (iv), the Court found that AFR Locate strikes the requisite fair balance of interests and is not disproportionate in achieving its objective. The technology is used openly and transparently, with significant public engagement (such as large notices and social media announcements). It is deployed for a discrete period of time and for the specific purpose of attempting to identify people whose presence is of justifiable interest to the police. The fact that an individual’s data is deleted almost instantaneously where no match is made supports its proportionality. Furthermore, the fact that AFR Locate had resulted in at least one watchlist individual being identified on each occasion of its use, and had not produced any wrongful arrests, demonstrated that its use was rationally connected to the objective of crime prevention.

Equality claim

The Court dismissed the equality claim because there was no firm evidence that AFR Locate produced the indirectly discriminatory results that were alleged. 

Data protection claim

The Court accepted that AFR involves the processing of personal data, even where the individuals pictured are not identifiable by name. However, the processing was fair and lawful in accordance with the principles of UK data protection law.

Commentary

Implications for future AFR use

The Court acknowledged that AFR carries the potential for misuse by public authorities, and emphasised that the sensitive processing of an individual’s personal data should not be undertaken except where there are cogent and robust reasons. In light of this, the Court was cautious to confine its judgment to the particular circumstances in which AFR Locate has been deployed by SWP. It was acknowledged that questions of proportionality are inherently fact-sensitive, and as such, the Court did not draw any broader conclusions about the defensibility of possible future uses of AFR technology.

The Court also considered the fact that facial matches made by AFR Locate were reviewed by a police officer to be an important safeguard for use of the technology. This suggests that the judgment, although affirming the role of AFR in modern law enforcement, is not intended to provide carte blanche for police arrests based on determinations made by AFR software alone, without human intervention.

AFR in Australian law enforcement

The Federal Government has proposed two bills, the Identity-Matching Services Bill 2019 (Cth) and the Australian Passports Amendment (Identity-Matching Services) Bill 2019 (Cth), which would give effect to a 2017 Council of Australian Governments agreement, the Intergovernmental Agreement on Identity Matching Services. The Bills provide for the sharing and matching of identity information between State and Federal government agencies, law enforcement and private companies. The proposed laws could facilitate live facial recognition in circumstances much broader than those in this case.

These Bills have been rejected by the Parliamentary Joint Committee on Intelligence and Security on the basis that they do not include adequate safeguards for people’s privacy, are too broad and vaguely drafted, and do not require sufficient oversight and transparency of government agencies.

Pilot programs similar to AFR Locate have reportedly been deployed in Western Australia, Queensland and Victoria. These developments raise troubling questions for democracy and privacy in Australia, and the area demands greater public scrutiny.

The full text of the decision can be found here.

 

Gabrielle Jack is a Law Graduate at King & Wood Mallesons.