Met police deploy facial recognition in Westminster
Police facial recognition deployment resulted in four arrests, but questions remain about the necessity, proportionality and legality of the technology’s use
London police deployed live facial recognition (LFR) technology in Westminster on Friday 28 January, resulting in the arrest of four people and attracting significant criticism from civil rights groups.
The Metropolitan Police Service (MPS) said its facial recognition deployment – which took place the day after the UK government relaxed mask-wearing requirements – was part of a wider operation to tackle serious and violent crime in the borough.
According to the MPS, one arrest was of a man wanted on an extradition warrant related to alleged drugs offences and serious assault, while the other three arrests were for unspecified drugs offences, an unspecified traffic offence, and, in the case of one man, alleged death threats.
The suspects were engaged and detained by officers following alerts from the vehicle-mounted LFR system, which enables police to identify people in real time by scanning their faces and matching them against a database of facial images, or “watchlist”, as they walk by.
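In broad terms, systems of this kind reduce each detected face to a numerical “embedding” and compare it against embeddings of the watchlist images, raising an alert when the similarity clears a threshold. The Python sketch below is a minimal illustration of that matching step only – it is not the Met’s actual system, and the embedding size, threshold and identifiers are placeholder assumptions:

```python
import numpy as np

# Illustrative sketch only: real LFR systems use trained face-embedding
# models; here faces are stand-in random vectors (assumed values throughout).

EMBEDDING_DIM = 128       # assumed embedding size
MATCH_THRESHOLD = 0.75    # assumed similarity threshold: raising it trades
                          # fewer false alerts against more missed matches

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_against_watchlist(face, watchlist):
    """Return (person_id, score) for the best match above threshold, else None.

    A positive return is only an alert; in the deployments described in
    this article, an officer still decides whether to stop the person.
    """
    best_id, best_score = None, MATCH_THRESHOLD
    for person_id, reference in watchlist.items():
        score = cosine_similarity(face, reference)
        if score > best_score:
            best_id, best_score = person_id, score
    return (best_id, best_score) if best_id else None

# Example run with random stand-in embeddings (no real biometric data):
rng = np.random.default_rng(0)
watchlist = {f"subject_{i}": rng.normal(size=EMBEDDING_DIM) for i in range(9756)}
passer_by = rng.normal(size=EMBEDDING_DIM)
print(check_against_watchlist(passer_by, watchlist))  # most likely None
```

An alert from such a system is a candidate match rather than an identification, which is why, as the MPS notes later in this article, officers are expected to make their own decision before engaging anyone.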
Computer Weekly contacted the MPS for more information about the deployment. According to a post-deployment review document shared by the MPS, only one facial recognition van was deployed in Oxford Circus, which generated 11 match alerts, one of which it said was false. As a result, seven stops were carried out, while the system processed the biometric information of 12,120 people.
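Figures like these can be expressed as an error rate in more than one way, depending on whether false alerts are counted against the alerts raised or against everyone scanned – a recurring point of dispute in the debate over LFR accuracy. A quick calculation using the review’s own numbers:

```python
# Figures from the MPS post-deployment review cited above
alerts = 11               # match alerts generated
false_alerts = 1          # alerts the MPS said were false
faces_processed = 12_120  # people whose biometric data was processed

# False alerts as a share of alerts raised (~9.1%)
print(f"per alert raised:   {false_alerts / alerts:.1%}")
# False alerts as a share of everyone scanned (~0.008%)
print(f"per person scanned: {false_alerts / faces_processed:.3%}")
```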
Silkie Carlo, director of civil liberties group Big Brother Watch, who was present at the Oxford Circus deployment, told Computer Weekly she witnessed four stops while she was there.
“Of the four stops we saw, two shouldn’t have happened – one was in relation to outdated data and another was a straightforward misidentification,” she said, adding that the person who was misidentified, a young black boy, had his fingerprints taken. “There may have been many more misidentifications – that was simply in the short time I was there.”
Carlo said the experience of misidentification by facial recognition could have “a profound effect” on the individuals subjected to it. “This boy who was stopped on Friday had four or five police officers surrounding him, they took his fingerprints, barking questions at him,” she said.
“The police officers aren’t saying, ‘You match the description’ – which is obviously what a lot of teenagers have had to put up with in London for a long time – but, ‘You’ve been flagged by our system’. That must be incredibly disempowering. It’s a lot more [they’re] up against [because they] have to then start proving who they are, and trying to prove they’re not the person they [the police] think they are.”
On the four arrests made, Carlo said there needs to be more clarity from the MPS about the nature of the offences, especially considering its claim that the technology was only deployed to look for “serious and violent offenders”.
“If [the traffic offence] is something like speeding, a lot of people are going to think very, very differently about facial recognition,” she said.
“We’ve always had this problem – that they put out in their press releases, ‘We’re only looking for serious and violent offenders’, but in the stops we witness that’s very rarely the case. A 31-year-old man being wanted for drug offences? Is that possession of a small amount of marijuana? Do we really need facial recognition for that?”
A necessary and proportionate deployment?
Before it can deploy facial recognition technology, the MPS must meet a number of requirements related to necessity, proportionality and legality.
For example, the MPS’s legal mandate document – which sets out the complex patchwork of legislation the force claims allows it to deploy the technology – says the “authorising officers need to decide the use of LFR is necessary and not just desirable to enable the MPS to achieve its legitimate aim”.
In response to questions about how the force decided the Westminster deployments were necessary, the MPS said “the deployment was authorised on the basis of an intelligence case and operational necessity to deploy, in line with the Met’s LFR Documents.”
In terms of the basis on which the deployment was deemed proportionate, it added: “The proportionality of this deployment was assessed giving due regard to the intelligence case and operational necessity to deploy, whilst weighing up the impact on those added to the watchlist and those who could be expected to pass the LFR system.”
The MPS’s Data Protection Impact Assessment (DPIA) also says that “all images submitted for inclusion on a watchlist must be lawfully held by the MPS”.
In 2012, a High Court ruling found the Metropolitan Police’s retention of custody images – which are used as the primary source of watchlists – to be unlawful, because unconvicted people’s information was kept in the same way as that of people who were ultimately convicted. It also deemed the minimum six-year retention period disproportionate.
Addressing the House of Commons Science and Technology Committee on 19 March 2019, then-biometrics commissioner Paul Wiles said there was “very poor understanding” of the retention period surrounding custody images across police forces in England and Wales.
He further noted that while both convicted and unconvicted people could apply to have their images removed – with the presumption being that the police would delete them if there was no good reason not to – there was “little evidence it was being carried out”.
“I’m not sure that the legal case [for retention] is strong enough, and I’m not sure that it would withstand a further court challenge,” he said.
Asked how it had resolved this issue of lawful retention, and whether it could guarantee that every image in the 28 January watchlist was held lawfully, the MPS cited section 64A of the Police and Criminal Evidence Act 1984, which gives police the power to photograph people detained in custody and to retain that image.
It added that the custody images are also held in accordance with the Management of Police Information (MOPI) Authorised Professional Practice (APP) guidelines.
“To comply with the MOPI APP, the MPS has simplified the Custody Image deletion process and has put in place measures to proactively facilitate custody image deletion,” said an MPS spokesperson. “Individuals can now make a formal request for their image to be deleted via a link on MPS website, or attend any MPS police station to make their request in person and the deletion decision will be made with reference to the current MOPI APP guidelines on custody images.”
Carlo told Computer Weekly that a police officer on the ground informed her there were 9,500 images on the watchlist for the deployment. The MPS’s post-deployment document confirmed the watchlist contained 9,756 images.
“That’s not a targeted and specified deployment because of a pressing need – it’s a catch net,” she said.
In July 2019, a report from the Human Rights, Big Data & Technology Project, based at the University of Essex Human Rights Centre – the first independent review of the Metropolitan Police’s LFR trials – highlighted a discernible “presumption to intervene” among police officers using the technology: they tended to trust the outcomes of the system and engage individuals it flagged as matching the watchlist, even when they did not in fact match.
On how it has resolved this issue, the MPS said it had implemented additional training for officers involved in facial recognition operations.
“This input is given prior to every LFR deployment to ensure officers are aware of the current system’s capabilities. LFR is a tool that is used to help achieve the wider objectives of the policing operation; it does not replace human decision-making,” it said. “Officers are reminded during the training of the importance of making their own decisions on whether to engage with a member of the public or not.”
Civil society groups react
In response to the Westminster deployment, Emmanuelle Andrews, policy and campaigns director at human rights group Liberty, described facial recognition as “oppressive by design” and said “its inaccuracy and intrusion will fall hardest on people of colour, especially black men, who face routine police oppression”.
Andrews added: “The Court of Appeal has agreed that this technology violates our rights and threatens our liberty. Yet the Met has trialled it repeatedly. These tools are neither necessary nor compatible with the type of society we want to live in. To keep everyone safe, we must reject divisive and oppressive surveillance technology, we must reject ever-increasing and unaccountable police powers, and demand that government works with communities to develop strategies based in fairness, participation and support.”
Carlo said the facial recognition van at Oxford Street on 28 January was accompanied by a significant police presence, including around 25 uniformed officers and 25 plainclothes officers.
“The Metropolitan Police have been mired in scandals all year, and there are serious trust issues… To see such a huge deployment of police experimenting with a very intrusive technology, whilst also handing out leaflets and having to explain to members of the public why they’re standing there staring at iPads, waiting for match alerts and scanning their faces, seems extraordinary and very, very misguided to me,” she said.
This story has been updated to include information and comment from the MPS that was provided after publication.
Read more about police technology
- UK police continue to deploy facial recognition technology disproportionately with no clear legal basis and highly questionable efficacy.
- Retrospective facial recognition software purchased for £3m by the Met Police will be deployed in coming months.
- The UK’s policing minister suggests use of advanced algorithmic technologies by police should be tested in court rather than defined by new legislation.