Home Office and MoD seeking new facial-recognition tech
The UK’s Defence and Security Accelerator is running a ‘market exploration’ exercise on behalf of the Home Office to identify new facial-recognition capabilities for security and policing bodies in the UK
The UK government’s Defence and Security Accelerator (Dasa) has launched a “market exploration” of facial-recognition technologies to identify mature capabilities that can be deployed by the Home Office for “policing and security purposes” within the next 18 months.
Part of the Ministry of Defence (MoD), Dasa regularly conducts market exploration exercises to determine which technologies already exist, which are currently in development, and which need further investment.
Run on behalf of the Home Office, Dasa’s latest market exploration into facial-recognition technologies will not provide funding or support to suppliers, and is instead designed to gain a better understanding of current technologies that could enhance the facial-recognition capabilities of policing and security bodies in the UK.
The market exploration document noted facial recognition is “an increasingly important capability for law enforcement and the Home Office”, and that it provides “significant opportunities” to improve public safety.
In increasing its use of facial-recognition technologies – which fall into three broad categories of live, retrospective and operator-initiated (meaning an officer decides to take a photo and run it through facial-recognition software) – the Home Office has said it is committed to ensuring the technology is explainable, accountable, effective, ethical and good value for money.
“The minister of state for crime, policing and fire and I strongly support the development and implementation of facial-recognition technology within the law enforcement sector and are encouraged by its potential,” said Paul Taylor, the national policing chief scientific adviser, who added that “industry is pivotal to realisation” of advanced capabilities that can enhance public safety while respecting individual rights and privacy.
“It is essential to acknowledge the concerns surrounding FR technology, particularly those relating to privacy and potential biases. However, responsible development and implementation of FR systems can address these concerns effectively. By establishing robust governance frameworks, implementing strict data protection protocols, and ensuring transparency and accountability, we can strike the right balance between public safety and individual privacy rights.
“To maximise the technological benefits and minimise the risks associated with FR, it is crucial that we support and encourage industry to continue developing capabilities which can be deployed effectively and ethically.”
As part of the market exploration, Dasa and the Home Office will also look for technologies that support algorithmic development, integration and analytics capabilities.
However, they noted that the technologies being investigated will be limited to facial recognition, and that there will be no exploration of other biometric technologies like iris and gait detection, behavioural analytics or age estimation.
“By completing the Capability Submission Form, neither DASA, the Home Office nor yourselves are committing to anything, but your submissions will be compiled by DASA and provided to policing and Home Office partners for their due consideration. Your submission will also help us to identify your interests in this area,” they said, adding that the deadline for submissions is 12 October 2023.
Ongoing concerns
According to a May 2023 report co-authored by academics Pete Fussey and William Webster on behalf of Fraser Sampson – the biometrics and surveillance camera commissioner for England and Wales – policing minister Chris Philp expressed his desire to expand the use of facial-recognition systems in policing during a closed-door meeting.
While the report was conducted independently of Sampson’s office, the commissioner told Fussey and Webster in an interview that he felt it was “extremely likely” that the embedding of facial recognition in policing would also include “exploring integration of this technology with police body-worn video”.
The report further noted there is significant public concern over the police’s use of facial recognition, and that a number of other experts and public bodies have been calling for more detailed rules for deployers of the technology.
This includes a House of Lords inquiry into police use of advanced algorithmic technologies; the UK’s former biometrics commissioner, Paul Wiles; an independent legal review by Matthew Ryder QC; the UK’s Equalities and Human Rights Commission; and the House of Commons Science and Technology Committee, which called for a moratorium on live facial recognition (LFR) as far back as July 2019.
Most recently, in February 2023, Sampson called for clear, comprehensive and coherent frameworks to regulate police use of artificial intelligence (AI) and biometrics in the UK.
During an appearance before Parliament’s Joint Committee on Human Rights in February 2023, Sampson highlighted ongoing issues around the unlawful retention of custody images and other biometric material used to populate facial-recognition watchlists.
“There are probably several million of those records still,” he said, adding that despite a 2012 High Court ruling that found the retention of custody images to be unlawful, the response from policing bodies and the Home Office (which owns most of the biometric database used by UK police) has been to point out that the information is held on a database with no bulk deletion capability.
“I’m not sure that works for public trust and confidence, but even if it did…you can’t [legally] rely on a flaw in a database you built for unlawfully retaining stuff…that’s a technical problem that’s of the country’s and the police’s making rather than the people whose images you’ve kept.”
In April 2023, both the Met Police and South Wales Police reaffirmed their commitments to using facial-recognition technologies, after research commissioned by the forces found “substantial improvement” in the accuracy of their systems when using specific settings.
In an appearance before the Science, Innovation and Technology Committee in June 2023 – which launched an inquiry into AI governance in October 2022 – the Met Police’s director of intelligence Lindsey Chiswick told MPs that use of facial-recognition tech has already led to “a number of significant arrests”, including for conspiracy to supply class A drugs, assault on emergency workers, possession with the intent to supply class A drugs, grievous bodily harm, and being unlawfully at large having escaped from prison.
The Home Office and policing ministers have repeatedly justified using the technology on the basis it “plays a crucial role in helping the police tackle serious offences including murder, knife crime, rape, child sexual exploitation and terrorism”.
Computer Weekly has asked the Home Office on multiple occasions for evidence to back up this claim, but has never received a response. The Met, however, confirmed to Computer Weekly in January 2023 that no arrests had been made for those offences as a result of LFR use.
Read more about facial recognition and biometrics
- Police Scotland use cloud for biometric data despite clear risks: Police Scotland confirms it has stored significant volumes of biometric data on a cloud-based digital evidence sharing system despite major ongoing data protection concerns, bringing into question the effectiveness of the current regulatory approach and the overall legality of using hyperscale public cloud technologies in a policing context.
- Biometrics watchdog calls for public space surveillance review: The biometrics and surveillance camera commissioner is calling for a review of public space surveillance to gain a clearer picture about the proliferation of Chinese surveillance technology across the public sector, but warns against applying double standards on companies just because they are from China.
- UK to surveil convicted migrants with facial recognition: A Home Office scheme to biometrically scan the faces of convicted migrants who have already carried out punishments has come under fire from privacy and human rights groups for being discriminatory.