
Biometrics ethics group takes evidence on public-private use of facial recognition

Collaboration between police forces and private entities on facial recognition technology comes under scrutiny by the Home Office’s Biometrics and Forensics Ethics Group

The ethics of police forces collaborating with private entities over the use of live facial recognition (LFR) technology is under investigation by an advisory body to the Home Office.

The ongoing investigation is being conducted by the Biometrics and Forensics Ethics Group (BFEG), an advisory non-departmental public body sponsored by the Home Office, which has a remit to focus on the ethical aspects of the application and operation of technologies that produce biometric data and identifiers.

It has already taken evidence from civil liberties groups and regulators, and is now seeking to gather evidence from manufacturers of LFR technology, public and private sector users of LFR, as well as previous and potential users, such as councils, land owners and police.

The call for written or oral evidence will be open until 17 April 2020, and will culminate in an evidence-gathering event for users and tech firms on 30 April.

The Information Commissioner’s Office (ICO) is currently conducting a separate investigation into the use of LFR by the private sector, including where the technology is used in partnership with law enforcement.

The most notable example of UK law enforcement collaborating with private sector entities on LFR is the King’s Cross Estate, which was revealed to be using the technology across a 67-acre area of central London in August 2019.

Having initially denied any involvement, the Metropolitan Police Service (MPS) and British Transport Police eventually admitted last September to supplying the estate’s operator with images for its facial recognition database.

The collaborative use of LFR between the police and private sector has also come under greater scrutiny after it was revealed in late February this year that Clearview AI, a controversial startup that scrapes and sells billions of photos of people from social media to police forces without consent, was hacked.

Documents obtained and reviewed by BuzzFeed show the startup is working with more than 2,200 law enforcement agencies, companies and individuals around the world.

In the UK, this includes the MPS, which has run 170 searches since December 2019, and the National Crime Agency (NCA), which has carried out more than 500 since October 2019.

An MPS spokesperson told BuzzFeed that Clearview was not being used with its operational LFR tool, although the force had previously denied using the startup’s services at all in response to an earlier Freedom of Information request submitted by the Metro.

Accounts associated with a number of other UK police forces or law enforcement agencies were also revealed to be using Clearview, including North Yorkshire Police, Suffolk Constabulary, Surrey Police, Northamptonshire Police, Hampshire Police and the Ministry of Defence.


Although it does not explicitly mention facial recognition technology, the National Policing Digital Strategy 2020-2030, released on 21 January 2020, states that UK police “will strengthen our relationships with the private sector to empower it to appropriately share in public safety responsibilities”, suggesting that such collaborations will become much more common.

Three days after the digital strategy was released, the MPS announced its intentions to begin deploying LFR operationally for the first time.

This was in spite of numerous calls for a statutory and legally binding framework for the technology’s use, which is still not in place.

Speaking at an event on 24 January, MPS commissioner Cressida Dick called for a legislative framework to govern police use of new technologies, but remained adamant there was already a “very strong” legal basis for the use of LFR by police.

However, the first independent review of the MPS’s LFR trials, conducted by the Human Rights, Big Data & Technology Project, concluded that the “implicit legal authorisation claimed by the MPS… is likely inadequate when compared with the ‘in accordance with the law’ requirement established under human rights law”.
