Police use of facial recognition found ‘unlawful’ in court
In a landmark decision, the Court of Appeal has ruled that South Wales Police’s facial recognition deployments breached human rights and data protection laws
South Wales Police (SWP) has confirmed it will not contest the Court of Appeal’s ruling that deemed its use of automatic facial-recognition (AFR) technology unlawful.
The appeal, brought by human rights group Liberty on behalf of Cardiff resident Ed Bridges, overturns a September 2019 High Court ruling that deemed SWP’s use of AFR lawful.
According to the Court of Appeal’s judgment, the decision was made on the grounds that SWP’s use of the technology was “not in accordance” with Bridges’s Article 8 privacy rights; that the force did not conduct an appropriate Data Protection Impact Assessment (DPIA); and that it did not comply with its Public Sector Equality Duty (PSED) to consider how its policies and practices could be discriminatory.
“I’m incredibly, ecstatically pleased by today’s judgment… Automatic facial-recognition technology is an intrusive and discriminatory mass surveillance tool. It has been used without the public’s consent and often without their knowledge. We should all be able to use public spaces without being subjected to oppressive surveillance,” said Bridges on Twitter.
Liberty lawyer Megan Goulding added that it was “a major victory in the fight against discriminatory and oppressive facial recognition”.
“The court has agreed that this dystopian surveillance tool violates our rights and threatens our liberties,” she said.
The original legal claim was raised in the wake of Bridges’s concerns that he had been tracked by the technology on two occasions: once during a peaceful anti-arms protest, and again while he was out shopping in Cardiff.
Despite ruling at the time that it “does entail infringement” of Bridges’s Article 8 privacy rights, the two presiding High Court judges decided that South Wales Police’s use of AFR had “struck a fair balance and was not disproportionate”, making its deployment justified.
However, the Court of Appeal has now said that the legal framework that the previous decision regarded as being sufficient for making the deployments lawful under Article 8 “is, on further analysis, insufficient”.
“Too much discretion is currently left to individual police officers. It is not clear who can be placed on the watchlist nor is it clear that there are any criteria for determining where AFR can be deployed,” it said.
In terms of the discriminatory impact of the technology, the ruling stated that “SWP have never sought to satisfy themselves, either directly or by way of independent verification, that the software program in this case does not have an unacceptable bias on grounds of race or sex”.
It added: “For reasons of commercial confidentiality, the manufacturer is not prepared to divulge the details so that it could be tested. That may be understandable but, in our view, it does not enable a public authority to discharge its own, non-delegable, duty under section 149 [of the Equality Act 2010].”
SWP’s facial-recognition software, which is also used by the Metropolitan Police Service (MPS), is provided by Japan’s NEC Corporation, but it has never undergone demographic testing for potential bias.
Speaking on 24 January at the launch of a report by the Royal United Services Institute (Rusi), which found police algorithms need national guidance “as a matter of urgency”, MPS commissioner Cressida Dick claimed the AFR technology used by the MPS is proven not to have an “ethnic bias”, adding that the only bias is that “it is slightly harder to identify a wanted woman than a wanted man”.
While calling for a legislative framework to govern how the technology is used, Dick also claimed there is a “very strong” legal basis for AFR use by police, and that human officers will always make the final decision.
The future of police AFR
Chief constable Matt Jukes said that SWP will give the court’s findings “serious attention”, and that its policies have already evolved since the deployments considered in the case.
“We are now in discussions with the Home Office and Surveillance Camera Commissioner about the further adjustments we should make and any other interventions that are required,” he said.
“We are pleased that the court has acknowledged that there was no evidence of bias or discrimination in our use of the technology. But questions of public confidence, fairness and transparency are vitally important, and the Court of Appeal is clear that further work is needed to ensure that there is no risk of us breaching our duties around equality.”
Deputy chief constable Jeremy Vaughan, the national policing lead for facial recognition, added that there is nothing in the judgment that fundamentally undermines the use of AFR by police. “This judgement will only strengthen the work, which is already underway, to ensure that the operational policies we have in place can withstand robust legal challenge and public scrutiny,” he added.
Speaking to Computer Weekly, Tim James-Matthews, a barrister at Matrix Chambers, said the problem the Court of Appeal ultimately found was an absence of regulation around how the technology was deployed, “as opposed to anything particular in the technology itself”.
“The court didn’t know, or have any evidence, that [NEC’s] NeoFace Watch itself is racially biased in the way there is evidence that some AFR is racially biased. But what they said was that, essentially, the South Wales Police hadn’t done the work of identifying and determining whether or not there were equalities implications in using the technology, and how they might guard against or protect from those,” he said.
“The Metropolitan Police must be in a similar situation, they will have to consider whether or not they’ve also done the groundwork in determining, so far as they can be sure, that NeoFace Watch doesn’t show a racial bias when they deploy it.”
In light of the ruling, police forces would likely have to conduct more thorough bias testing of their facial-recognition systems to work out whether the information given to them by suppliers about their systems’ potential bias is reliable, he added.
“It may also be… that they can’t deploy the technology if the company doesn’t give them more information about how it’s trained, how they’ve tested it and how the company has satisfied themselves that their technology isn’t biased in any way,” he said.
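To illustrate the kind of independent verification the court said was missing, the sketch below shows one way a force could check false-match rates across demographic groups on its own evaluation data. It is a minimal, hypothetical Python example: the threshold, group labels and sample figures are assumptions for illustration only, and do not describe NEC’s NeoFace Watch or any real police evaluation.

```python
# Minimal sketch of per-group bias testing for a face-matching system.
# All names, thresholds and data are hypothetical placeholders; they do not
# describe NEC's NeoFace Watch or any force's actual testing regime.
from collections import defaultdict

MATCH_THRESHOLD = 0.6  # assumed operating threshold for declaring a match


def false_match_rates(trials):
    """Compute the false-match rate per demographic group.

    `trials` is an iterable of (group, score, same_person) tuples, where
    `score` is the system's similarity score for a probe/watchlist pair and
    `same_person` records whether the pair really is the same individual.
    """
    impostor_counts = defaultdict(int)   # impostor comparisons seen per group
    false_matches = defaultdict(int)     # impostor pairs wrongly declared matches

    for group, score, same_person in trials:
        if not same_person:              # only impostor pairs can produce false matches
            impostor_counts[group] += 1
            if score >= MATCH_THRESHOLD:
                false_matches[group] += 1

    return {g: false_matches[g] / impostor_counts[g] for g in impostor_counts}


if __name__ == "__main__":
    # Toy data: (group, similarity score, genuinely the same person?)
    sample = [
        ("group_a", 0.72, False), ("group_a", 0.41, False), ("group_a", 0.55, False),
        ("group_b", 0.65, False), ("group_b", 0.68, False), ("group_b", 0.30, False),
    ]
    for group, rate in false_match_rates(sample).items():
        print(f"{group}: false-match rate {rate:.2f}")
```

A materially higher false-match rate for one group on such a test would be the sort of disparity the court said a force must actively look for, rather than relying on a supplier’s assurances, before deploying the technology.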
Responding to the court’s decision, surveillance camera commissioner (SCC) Tony Porter said he has noted the issues of bias arising in the judgment and that he will consider how his guidance can be amended accordingly, including what can be done on the manufacturer side to eliminate the problem.
“In so far as the legal deficiencies identified by the court are concerned, I am sure that others will share my frustration with the Home Office. The police worked hard to apply themselves in adhering to the Surveillance Camera Code of Practice, a statutory-based document, which, for more than five years, I have fruitlessly and repeatedly been calling on the Home Office to update,” he said.
The Surveillance Camera Code of Practice was produced by the Home Office in June 2013 to help strike a balance between public security and upholding civil liberties, and it is the duty of the SCC to review how it is working and to provide advice to ministers on whether it needs amending.
The SCC has previously called for the code to be updated with “stronger guidance as to the authorisation and use of such technology”, as it has “been allowed to diminish” over the years.
“My considered view is that the Home Office and the Secretary of State have been asleep on watch and should reflect on the comments of the court and now act in the public interest,” said Porter.
“I hope they step away from their self-generated plan to dilute both my role and that of the biometrics commissioner into a single commission, and I hope that they review and update the code and commission an independent review of the legal framework which governs overt state surveillance – hope, but not hopeful.”
Read more about facial-recognition technology
- The ethics of police forces collaborating with private entities over the use of live facial recognition (LFR) technology is under investigation by an advisory body to the Home Office.
- The number of companies claiming to have developed facial-recognition tools that can identify masked faces has skyrocketed since the start of the Covid-19 coronavirus pandemic.
- Three major technology companies have committed to not selling facial-recognition tech to law enforcement, but remain silent on how other technologies can be used to fuel racial injustice and discriminatory policing.