
UK police plan national roll-out of facial-recognition phone app

UK police chiefs plan to expand use of facial-recognition technology through a nationwide roll-out of mobile-based tools and increased use of retrospective facial recognition, but oversight problems persist

UK police chiefs have announced plans to equip officers with a mobile-based facial-recognition tool that will enable them to use their phones to cross-reference photos of suspects against a database of millions of custody images.

Known as operator-initiated facial recognition (OIFR), the tool uses software supplier NEC’s NeoFace facial-recognition algorithm, and is currently being jointly trialled by South Wales, Gwent and Cheshire police.

The National Police Chiefs’ Council (NPCC) has said the Police National Database (PND)-linked tool will be rolled out nationwide in 2024, and that it has further plans to increase the police’s use of retrospective facial-recognition (RFR) software by 100% before May that year.

Jeremy Vaughan, chief constable at South Wales Police (SWP) and the NPCC’s national lead on facial recognition, said that while the technology “can cut the amount of time spent trying to identify an offender from days and months to just minutes … we recognise the need to balance the use of new technology with the right to privacy”.

Described by South Wales Police as “an ‘on-street’ intelligence tool”, the OIFR app works by automatically comparing the photos taken by officers with a predetermined “watchlist” of suspects, which is primarily made up of custody images.

Once a search is made, the app reorders the watchlist matches from most to least likely, surfacing the six most likely candidates on officers’ phones so they can decide whether a match is accurate.

SWP noted that use of the app is limited to situations where someone is unable to provide their details (because they are deceased, unconscious, facing mental health or age barriers, or incapacitated by drink or drugs); has refused to provide their details; or is reasonably suspected of having provided false details. “The image, captured on the mobile device, and biometric data are automatically and immediately deleted after a search is carried out,” the force said.
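NEC’s NeoFace matching algorithm is proprietary, but the workflow South Wales Police describes (photograph a subject, compare against watchlist templates, surface the six strongest candidates, then delete the probe data) can be illustrated with a generic face-embedding and similarity-ranking sketch. Everything in the example below, from the embedding size to the function and variable names, is an illustrative assumption rather than a detail of the actual app.

```python
# Illustrative sketch only: NEC's NeoFace is proprietary, so this uses a
# stand-in embedding function and cosine similarity to show the general
# match-and-rank workflow described above. All names are hypothetical.
import numpy as np

EMBEDDING_SIZE = 128  # assumed template size, not NeoFace's actual format


def embed_face(image: np.ndarray) -> np.ndarray:
    """Stand-in for a real face-embedding model: returns a unit-length vector."""
    seed = int.from_bytes(image.tobytes()[:8].ljust(8, b"\0"), "little")
    rng = np.random.default_rng(seed)
    v = rng.normal(size=EMBEDDING_SIZE)
    return v / np.linalg.norm(v)


def search_watchlist(probe_image: np.ndarray,
                     watchlist: dict[str, np.ndarray],
                     top_n: int = 6) -> list[tuple[str, float]]:
    """Compare a probe photo against watchlist templates and return the
    top_n candidates ordered from most to least likely."""
    probe = embed_face(probe_image)
    scores = {
        subject_id: float(np.dot(probe, template))  # cosine similarity of unit vectors
        for subject_id, template in watchlist.items()
    }
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    # The probe image and its biometric template are discarded after the
    # search, mirroring the force's stated immediate-deletion policy.
    del probe, probe_image
    return ranked[:top_n]


if __name__ == "__main__":
    # Hypothetical watchlist of custody-image templates keyed by subject ID.
    watchlist = {f"subject-{i:03d}": embed_face(np.full((8, 8), i, dtype=np.uint8))
                 for i in range(50)}
    candidates = search_watchlist(np.ones((8, 8), dtype=np.uint8), watchlist)
    for subject_id, score in candidates:
        print(f"{subject_id}: similarity {score:.3f}")
```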

SWP trial

In a breakdown of a three-month trial conducted by SWP at the start of 2022, the force said it deployed the app-based OIFR tool 42 times to scan the biometric information of 35 people.

This led to 20 follow-up actions, including 11 arrests, four people being reported for summons for previous offences and five people being subject to “safeguarding measures”.

In a separate Facial Recognition Equality Impact Assessment, SWP said the discrepancy between the number of scans and the number of people scanned can be explained by the app having been used more than once with the same subject.

It noted, for example, that one subject’s face was obstructed by clothing, and that another subject had to be photographed multiple times “as it was difficult to get a good picture as they were deceased following a fatal RTC [road traffic collision]”.

It also referenced research commissioned by the Met and South Wales Police, which found “substantial improvement” in the accuracy of their facial-recognition systems when using specific settings; the findings prompted both forces to reaffirm their commitments to using the technology when the research was published in April 2023.

The equality assessment shows that, of those photographed by South Wales Police, around a third were from ethnic minority backgrounds. The force’s policing area covers roughly 1.3m people, “6.7% of whom are from Black, Asian or Minority Ethnic backgrounds”, according to a web page for South Wales Police’s bursary scheme.

Unlawful custody image retention

Neither the NPCC announcement nor the SWP OIFR trial breakdown mentioned long-running issues with the legality of how certain biometric information is stored by UK police, specifically the custody images that police use to compile their facial-recognition watchlists.

In 2012, a High Court ruling found the retention of custody images in the PND to be unlawful on the basis that information about unconvicted people was being treated in the same way as information about people who were ultimately convicted, and that the six-year retention period was disproportionate. Despite the 2012 ruling, millions of custody images are still being unlawfully retained.

Writing to other chief constables to outline some of the issues around custody image retention in February 2022, the NPCC lead for records management, Lee Freeman, said the potentially unlawful retention of an estimated 19 million images “poses a significant risk in terms of potential litigation, police legitimacy and wider support and challenge in our use of these images for technologies such as facial recognition”.

Computer Weekly contacted South Wales Police about how it was dealing with the issue, but was referred to the NPCC.

“Custody images are one of the most valuable sources of intelligence for front line officers and investigators, but policing needs to ensure transparency and legitimacy in the way we control the use of this important biometric information,” an NPCC spokesperson said. “A national programme between policing and the Home Office has recently been launched [in October 2023] to ensure consistency and co-ordination across UK policing in how it retains, processes and then uses custody images, particularly in the use of facial recognition.

“Through the programme, we will agree and implement a robust management regime for custody images to ensure compliance to agreed policies and legislation. It is vital to public confidence that this programme is adopted nationally to make sure that we are holding data lawfully and ethically, both now and in the future.”

The spokesperson added that while the programme is yet to be publicised, it will seek to establish a management regime for custody images, alongside a review of all data currently held by UK police forces, and will be implemented over a two-year period.

Rapidly expanding facial-recognition use

The NPCC announcement regarding the OIFR roll-out and its intention to scale up RFR use is in line with a wider push from government and law enforcement bodies to expand the use of facial-recognition technology throughout UK policing.

On 15 November 2023, in a keynote address to the Council’s annual summit in London, NPCC chair Gavin Stephens noted that facial recognition would play a “significant role” in helping UK policing become “an effective science-led service”, and further highlighted a study by South Wales Police that found its retrospective use of the tech has reduced the time taken to identify suspects from 14 days to a matter of minutes.

“The same force identifies 200 suspects every single month through its use [of retrospective facial-recognition],” he said.

Other forces across England and Wales are also rapidly increasing their RFR deployments, often without much oversight.  

Home Office data disclosed to The i newspaper and human rights group Liberty under freedom of information rules shows the number of retrospective facial-recognition searches of the PND carried out by forces last year reached a little over 85,000 – more than three times as many as in 2021. Figures for the first four months of this year suggest this year’s total is on course to exceed 130,000 – a further 52% annual increase.

The analysis also found that while 13 of the 45 UK territorial police forces denied having used retrospective facial recognition in 2022, the Home Office figures show they had carried out thousands of searches between them.

Read more about police technology

  • Met police data platform deployed with data protection issues: Met Police deploys integrated record management system despite data protection ‘compliance issues’ that would inhibit its ability to retrieve data, meet its statutory logging requirements, and respond to subject access requests.
  • Scottish biometrics watchdog outlines police cloud concerns: Police Scotland’s response to the biometrics commissioner’s formal information notice ‘did not ameliorate’ his concerns about the sovereignty and security of the sensitive biometric information being uploaded to cloud infrastructure that is subject to intrusive US surveillance laws.
  • Newham Council rejects use of live facial-recognition tech by police: Live facial-recognition technology should not be used by police in Newham until biometric and anti-discrimination safeguards are in place, according to a motion passed unanimously by the council, but the Met Police and the Home Office have indicated they will not suspend its use. 

Separate data obtained by Liberty and The Ferret shows Police Scotland carried out more than 2,000 retrospective facial-recognition searches in the first four months of 2023, and has tripled its use of this facial-recognition technique over the past five years, from just under 1,300 searches in 2018 to nearly 4,000 in 2022.

In May 2023, an interim report into upcoming UK government data reforms revealed that policing minister Chris Philp was pushing for facial-recognition technology to be rolled out by police forces across England and Wales, and would likely push to integrate the tech with police body-worn video cameras.

He later wrote to police chiefs in October 2023 setting out the importance of harnessing new technologies for policing, urging them to double the number of RFR searches they are conducting and to deploy live facial recognition (LFR) much more widely.

At the start of the same month, Philp, speaking at a fringe event of the Conservative Party Conference, outlined his plans to integrate data from the Police National Database, the Passport Office and other national databases with facial-recognition technology to help catch shoplifters and other criminals.

The plan was met with criticism from campaigners, academics and Scottish biometrics commissioner Brian Plastow, who said the “egregious proposal” to link the UK’s passport database with facial-recognition systems is “unethical and potentially unlawful”.

Lack of oversight

Despite UK police looking to massively expand their facial-recognition use, many observers are concerned about the lack of oversight of biometric technologies in the UK, and have called for clear legal frameworks to be put in place.

In an exclusive interview with Computer Weekly, the outgoing biometrics and surveillance camera commissioner for England and Wales, Fraser Sampson, highlighted a number of issues with how UK police had approached deploying their facial-recognition capabilities, and warned that the future oversight of police tech is at risk as a result of the government’s proposed data reforms.

For Sampson, the answer to the complexities around facial recognition and other biometric-capturing technologies is to have “a very robust, very clear, intuitive oversight accountability framework”.

Others have also called for clear legal frameworks around biometrics, including a House of Lords inquiry into police use of advanced algorithmic technologies; an independent legal review by Matthew Ryder QC; the UK’s Equality and Human Rights Commission; the former biometrics commissioner for England and Wales, Paul Wiles; and the House of Commons Science and Technology Committee, which called for a moratorium on LFR as far back as July 2019.

The Home Office, however, maintains there is already “a comprehensive legal framework in the UK” to deal with police use of facial-recognition and other biometric technologies.

During his Summit keynote, NPCC chair Stephens further outlined plans to transform policing with cutting-edge science and technology, which alongside facial-recognition will include rolling out drones, a new digital fingerprint-matching system and Robotic Process Automation to deal with administrative tasks.

Stephens also announced the NPCC would launch its own Science and Technology Committee to drive the adoption of new technologies in policing. “I believe that science and technology will be the single biggest driver of reform in policing in the coming years,” he said.
