Lords question legality of police facial recognition
Lords committee ‘deeply concerned’ about the rapid expansion of live facial recognition by UK police, which it says is progressing without any real oversight or accountability
UK police are expanding their use of live facial recognition (LFR) technology without proper scrutiny or accountability, despite lacking a clear legal basis for their deployments, according to a Lords committee.
Writing to the Home Secretary on 27 January 2024, the Lords Justice and Home Affairs Committee (JHAC) outlined the findings of its brief investigation into the use of LFR by UK police, noting there are no rigorous standards or systems of regulation in place to control forces’ use of the technology.
This stands in contrast to claims from senior police officers and government ministers that there is a sound legal basis for LFR in the UK.
The committee’s letter added that, looking to the future, there is a real possibility that networked facial recognition cameras capable of trawling entire regions of the UK could be introduced, and that nothing is in place to regulate such a development.
A previous investigation by the JHAC into how police are using a variety of algorithmic technologies described the situation as “a new Wild West” characterised by a lack of strategy, accountability and transparency from the top down.
“Does the use of LFR have a basis in law? Is it actually legal? It is essential that the public trusts LFR and how it is used,” said JHAC chair Baroness Hamwee in a statement accompanying the letter. “It is fundamental that the legal basis is clear. Current regulation is not sufficient. Oversight is inadequate.
“Technology is developing so fast that regulation must be future-proofed. Police forces may soon be able to link LFR cameras to trawl large populations, such as Greater London, and not just specific localities. We are an outlier as a democratic state in the speed at which we are applying this technology. We question why there is such disparity between the approach in England and Wales and other democratic states in the regulation of LFR.”
Lack of scrutiny
The Lords’ letter added that while they accept the value of LFR to police, they are “deeply concerned” that its use is being expanded without proper scrutiny or accountability.
“We believe that, as well as a clear, and clearly understood, legal foundation, there should be a legislative framework, authorised by Parliament for the regulation of the deployment of LFR technology,” said the JHAC. “We also believe that the government must lead a wider public debate about the use of LFR technology, as used now and as it develops, to ensure public confidence and support.”
Lords added that “government should not wait for the legality of LFR deployment to be tested again in the courts”, referring to an August 2020 court decision that found South Wales Police used the tech unlawfully after failing to conduct a data protection impact assessment or fulfil its Public Sector Equality Duty (PSED) to consider how its policies and practices could be discriminatory.
Evidence session
In the LFR inquiry’s only evidence session, Lords heard from senior Metropolitan Police and South Wales Police officers about the improving accuracy of the technology, as well as how both forces are managing their deployments.
Claiming there was a “very clear focus” on the most serious criminality, they told the Lords about the operational benefits of LFR technology, which include the ability to find people they would otherwise be unable to locate and its use as a preventative measure to deter criminal conduct.
At the same time, they confirmed that both forces use generic “crime categories” to determine targets for their LFR deployments, bringing into question claims that their use of the technology is concentrated on specific offenders who present the greatest risk to society.
Academic Karen Yeung, an interdisciplinary professorial fellow in law, ethics and informatics at Birmingham Law School, challenged the proportionality and necessity of this approach during the evidence session, claiming the coercive power of the state means police must be able to justify each entry to the watchlists based on the specific circumstances involved, rather than their blanket inclusion via “crime types”.
Commenting on this in the letter, Lords also questioned the necessity and proportionality of police LFR deployments, noting that while police themselves argue it is targeted on “the most serious crime and the most serious vulnerability” such as murderers and rapists, the watchlists also contain images of shoplifters and others committing far less serious crimes.
“How is ‘serious crime’ defined, and is the definition standard for all forces? How does any definition impact the police’s assessments of necessity and proportionality when deploying LFR technology in a particular situation?” it asked, questioning whether there should be independent third-party approval of any police watchlist.
In its recommendations, Lords called on the government to publish a national regulation, or at least guidelines, on how “extensive crowd-scanning activity” is being assessed for lawfulness, necessity and proportionality, both before and after the deployment of LFR; subject watchlist selection to “specific compulsory statutory criteria and standardised training”; and standardise pre-deployment communication with the public to ensure better public trust.
Lords added that they expect a government response to their questions and recommendations by 26 March 2024.
Long-running concerns
The Lords’ latest inquiry follows a 10-month-long JHAC investigation into the use of advanced algorithmic technologies by UK police, including facial recognition and various crime “prediction” tools. It found that police are essentially “making it up as they go along”, and are deploying a range of technologies without a thorough examination of their efficacy or outcomes.
Throughout that inquiry, the JHAC heard from expert witnesses that UK police are introducing new technologies with very little scrutiny or training, are continuing to deploy them without clear evidence of their efficacy or impacts, and face potential conflicts of interest with their own tech suppliers.
One of the main conclusions of the investigation was that new legislation is urgently necessary to control how technologies are used by UK police.
In July 2022, however, the UK government largely rejected the inquiry’s findings and recommendations, claiming there was already “a comprehensive network of checks and balances”. It has since maintained this position, while also pushing for much more widespread adoption of the technology by UK police.
There have been repeated calls for new legal frameworks to govern law enforcement’s use of biometrics from across Parliament and civil society, as well as from the UK’s former biometrics commissioner, Paul Wiles; an independent legal review by Matthew Ryder QC; the UK’s Equality and Human Rights Commission; and the House of Commons Science and Technology Committee, which called for a moratorium on LFR as far back as July 2019.
In an exclusive interview with Computer Weekly, the outgoing biometrics and surveillance camera commissioner for England and Wales, Fraser Sampson, also highlighted a number of issues with how UK police had approached deploying their facial recognition capabilities, and warned that the future oversight of police tech is at risk as a result of the government’s proposed data reforms.