
UK police double down on ‘improved’ facial recognition

The Met and South Wales Police have doubled down on their use of facial recognition technology after research found improved accuracy in their algorithms when using certain settings, but civil society groups maintain that the tech will still be used in a discriminatory fashion

The Metropolitan Police Service (MPS) and South Wales Police (SWP) have reaffirmed their commitment to using facial recognition technology, after research commissioned by the forces found “substantial improvement” in the accuracy of their systems.

Conducted by the National Physical Laboratory (NPL), the research tested the facial detection and recognition algorithms being used by both forces, and found there is “no statistical significance between demographic performance” if certain settings are used.

It specifically found that when deploying the Neoface V4 facial recognition software provided by Japanese biometrics firm NEC, the two police forces can achieve “equitable” outcomes across gender and ethnicity by setting the “face-match threshold” to 0.6, on a similarity scale where zero is the lowest and one the highest.

The NPL said that using this setting would reduce the likelihood of false positives, with one in 6,000 people scanned being falsely matched when using a watchlist of 10,000 facial images, and one in 60,000 when using a watchlist of 1,000 images.

Previous versions of the Neoface software, it said, had an average false positive rate of one in 1,000 when tested against watchlists of between 2,000 and 4,000 images. However, it noted that “false positive identifications increase at lower face-match thresholds of 0.58 and 0.56 and start to show a statistically significant imbalance between demographics with more Black subjects having a false positive than Asian or White subjects”.
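To make the threshold mechanics concrete, here is a minimal Python sketch. The similarity scores and subject names are invented for illustration, as NEC does not publish Neoface’s scoring internals, and the `matches` function is hypothetical; only the 0–1 similarity scale, the 0.6 threshold and the false positive rates come from the NPL report. Applying those rates to the MPS’s 2022 scan count is an assumption, since those deployments predate the settings the NPL tested.

```python
# Minimal illustration of a face-match threshold. The scores below are
# invented: NEC does not publish Neoface's scoring internals. Per the NPL
# report, scores run from 0.0 (lowest similarity) to 1.0 (highest).

def matches(similarity_scores: dict[str, float],
            threshold: float = 0.60) -> dict[str, float]:
    """Return the watchlist entries whose similarity score clears the threshold."""
    return {wid: s for wid, s in similarity_scores.items() if s >= threshold}

# Hypothetical scores for one probe face against three watchlist entries.
scores = {"subject_A": 0.82, "subject_B": 0.59, "subject_C": 0.57}

print(matches(scores))                  # {'subject_A': 0.82}
print(matches(scores, threshold=0.56))  # all three now clear the lower bar

# Back-of-envelope arithmetic using the NPL's quoted rates at threshold 0.6.
# Applying them to the MPS's 2022 scan count is illustrative only, since
# those deployments predate the settings the NPL tested.
scanned = 144_366  # people scanned across the MPS's six 2022 deployments
for watchlist_size, fp_rate in [(10_000, 1 / 6_000), (1_000, 1 / 60_000)]:
    print(f"watchlist of {watchlist_size:,}: "
          f"~{scanned * fp_rate:.0f} expected false matches")
```

The sketch shows why the threshold matters: dropping it from 0.60 to 0.56 admits borderline scores as alerts, which is exactly the regime where the NPL found false positives rising and skewing by demographic.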

While these results are for live facial recognition (LFR), the NPL also tested Neoface for use in retrospective facial recognition (RFR) and operator-initiated facial recognition (OIFR) systems.

Unlike LFR, which compares a live video feed to pre-determined watchlists to find matches, RFR is used to identify people in images or videos after the fact, while OIFR involves an officer taking and submitting a photo for an immediate search against a reference image database. For both RFR and OIFR, the algorithm correctly identified the person from their facial image every time.

MPS and SWP press forward

“Live Facial Recognition technology is a precise community crime fighting tool,” said Lindsey Chiswick, director of intelligence at the MPS. “Led by intelligence, we place our effort where it is likely to have the greatest effect. It enables us to be more focused in our approach to tackle crime, including robbery and violence against women and girls.

“This is a significant report for policing, as it is the first time we have had independent scientific evidence to advise us on the accuracy and any demographic differences of our facial recognition technology.”

SWP chief constable Jeremy Vaughan, who is also the national policing lead on biometrics, said the force is now in “a stronger position than ever” to demonstrate to the public that “use of facial recognition technology is fair, legitimate, ethical and proportionate”.

In August 2020, SWP’s use of LFR was deemed unlawful by the Court of Appeal, in part because the force did not comply with its Public Sector Equality Duty (PSED) to consider how its policies and practices could be discriminatory.

It was noted in the judgment that NEC refused to divulge details of its system to SWP, meaning the force could not fully assess the tech and its impacts.

“It is right and proper that our use of technology is subject to legal challenge and scrutiny, and the work that has been carried out to scrutinise and test this ground-breaking technology gives me confidence that we are meeting our equality obligations,” said Vaughan.

“The study confirms that the way South Wales Police uses the technology does not discriminate on the grounds of gender, age or race, and this reinforces my long-standing belief that the use of facial recognition technology is a force for good and will help us keep the public safe and assist us in identifying serious offenders in order to protect our communities from individuals who pose significant risks.”


SWP claimed that, prior to the Court of Appeal decision, its LFR deployments resulted in 61 people being arrested for offences including robbery, violence, theft and failure to respond to court warrants.

“There was nothing in the Court of Appeal judgment that fundamentally undermined the use of facial recognition to protect the public, and I am pleased that the work that has been carried out has given us confidence in meeting our public sector equality impact obligations and strengthened the operational policies and guidance we have in place to withstand the most robust legal challenge and public scrutiny,” said Vaughan.

Over the course of six deployments between February and July 2022, the MPS scanned 144,366 people’s biometric information, resulting in eight arrests for offences including possession of Class A drugs with intent to supply, assaulting an emergency worker, failures to appear in court, and an unspecified traffic offence.

Based on the gulf between the number of people scanned and the number of arrests made, civil society groups, lawyers and politicians previously told Computer Weekly that the MPS’s approach to LFR is fundamentally flawed and “irresponsible”.

Chiswick, commenting on the NPL research, said: “We understand the concerns raised by some groups and individuals about emerging technology and the potential for bias. We have listened to these voices. This research means we better understand the performance of our algorithm. We understand how we can operate to ensure the performance across race and gender is equal.”

Civil society reacts

Despite the improved accuracy of the Neoface system, civil society groups maintain that the technology is “discriminatory and oppressive”.

Big Brother Watch’s legal and policy officer, Madeleine Stone, for example, described LFR as a “suspicionless mass surveillance [tool] that turns us into walking ID cards”, adding that “one in 6,000 people being wrongly flagged by facial recognition is nothing to boast about, particularly at deployments in large cities where tens of thousands of people are scanned per day”.

Stone said that if rolled out UK-wide, LFR could mean tens of thousands of people being wrongly flagged as criminals and forced to prove their innocence.

“This report confirms that live facial recognition does have significant race and sex biases, but says that police can use settings to mitigate them,” she said. “Given repeated findings of institutional racism and sexism within the police, forces should not be using such discriminatory technology at all.”

Liberty lawyer Katy Watts – who described the tech as “oppressive by design” – also said the report offers no new insights. “We know that this technology violates our rights and threatens our liberties, and we are deeply concerned to see the Met Police ramp up its use of live facial recognition,” she said. “The expansion of mass surveillance tools has no place on the streets of a rights-respecting democracy.

“Facial recognition doesn’t make people safer, it entrenches patterns of discrimination and sows division,” added Watts. “History shows us that surveillance technology will always be disproportionately used on communities of colour and, at a time when racism in UK policing has rightly been highlighted, it is unjustifiable to use a technology that will make this even worse.”

Dangerously authoritarian

Stone added that the “dangerously authoritarian surveillance tech” is not referenced in a single piece of legislation, and that it has never been debated in Parliament.

Both Parliament and civil society have repeatedly called for new legal frameworks to govern law enforcement’s use of biometrics. Those calls have come from a House of Lords inquiry into police use of advanced algorithmic technologies; the UK’s former biometrics commissioner, Paul Wiles; an independent legal review by Matthew Ryder QC; the UK’s Equality and Human Rights Commission; and the House of Commons Science and Technology Committee, which called for a moratorium on LFR as far back as July 2019.

Most recently, current biometrics commissioner Fraser Sampson said in February 2023 that clear, comprehensive and coherent frameworks are needed to regulate police use of artificial intelligence (AI) and biometrics in the UK.

The government, however, maintains there is “already a comprehensive framework” in place.
