
New Metropolitan Police database raises fears of a Gangs Matrix repeat

The Metropolitan Police Service has been secretly developing a new database, but similarities to the controversial Gangs Matrix have raised concerns among data protection and racial equality activists

The Metropolitan Police Service (MPS) has been secretly developing a new criminal database similar to the Gangs Matrix, whose use was found in November 2018 to have led to serious breaches of data protection law.

According to a report from The Independent, the so-called Concern Hub is designed to document individuals considered vulnerable to getting involved with gangs.

While the project was not meant to be officially announced until April, it is already being piloted in Lewisham and, according to a written statement from the Met, “is being developed in collaboration with local authorities across London to identify those at risk at the earliest possible opportunity”.

The MPS further characterised the Concern Hub as “a new multi-agency diversion initiative”, adding that the project “aims to safeguard young people at significant risk of becoming involved in violence, drugs or gang activity”.

It is unclear, however, what safeguards are in place to ensure any data sharing between the police, local authorities and other agencies complies with data protection legislation, or how those safeguards will prevent a repeat of the discriminatory effects of the Gangs Matrix.

The Information Commissioner’s Office (ICO) is still actively investigating how third parties used the Gangs Matrix data that police shared with them, but when asked whether it was aware of the Concern Hub’s development, it declined to comment on either the investigation or the new database.

It is also unclear exactly what data will be collected and how it will be stored. While this was also an issue with the Matrix, the MPS is currently reporting monthly to the ICO to show how it is progressing against each of the measures set out in the regulator’s 27-page enforcement notice.


One of those measures was the need to conduct a data protection impact assessment, which had not been done once in the seven years the Matrix had been in operation.

The Met did not respond to Computer Weekly when asked if any privacy, data protection, or equality impact assessments had been carried out for the new Concern Hub. It also did not respond to a separate request asking how the Concern Hub was being differentiated from the Matrix. The ICO could only confirm that the Met is still reporting to it.

In a separate review of the Matrix, the findings of which were published in December 2018, the Mayor’s Office for Policing and Crime set out a number of guiding principles, which included the need to “be informed by community and partner views” and “increase transparency”.

One of the groups involved in the review’s consultation process was StopWatch, a coalition that works to promote effective, accountable and fair policing.

“They haven’t been in touch with us since November,” said Katrina Ffrench, CEO of StopWatch. “We were meant to see the review before it went public, but they just emailed and called us the day it went live. We felt it was buried, being released the Friday before Christmas.”

Ffrench added that StopWatch and others were disappointed not to have been kept informed of the new database being developed, especially given the previous controversy surrounding the Matrix. “We feel MOPAC [the Mayor’s Office for Policing and Crime] should have kept us informed of the Met’s plan to either rebrand or overhaul it,” she said.

“We’re deeply worried about replications of what the Gangs Matrix did. We’ve yet to see any evidence or research of its efficacy as a diversion mechanism, so for us there’s again a lack of transparency about what’s going on in the Concern Hub.”

Lessons learnt?

“The Gangs Matrix – which labels young people as gang members based on vague criteria and untested evidence, conflates victims and perpetrators, and entrenches discrimination in public services – should serve as a cautionary tale,” said Rosalind Comyn, policy and campaigns manager at Liberty.

“Support services for young people play a critical role in addressing the root causes of serious violence, but police-led initiatives characterised by widespread data sharing in the absence of robust safeguards risk increasing the surveillance of young people, and may have debilitating effects in terms of their access to housing, education and employment.”

The Met, however, claims that although the database is in its infancy, “individuals identified as being at risk will be provided support and pathways away from violence through partnership working with local authorities and a range of initiatives”.

Ffrench, on the other hand, has doubts about just how early a stage the database is really at. “It can’t be at such an early stage that they’re going to launch it next month,” she said. “There’s only a week and a half left of this month, so they say it’s early stages, but they knew from when Amnesty published that report [in May 2018] that the Matrix was dead.”

Following the official launch of the first hub in April, the plan is to have each local authority managing a hub with a central team at New Scotland Yard providing oversight of the whole hub system.

Police use of data, algorithms and machine learning

Closely tied to the use of police data is the idea of predictive policing, whereby historical data is used to predict future behaviour.

In the context of policing, this usually means taking the data and running it through a machine learning tool to do one of two things.

The first is to map out crime “hotspots” in an attempt to figure out where new crimes will be committed, while the second focuses on profiling individuals to try to predict their “risk” of offending.
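As a rough illustration only (the actual systems used by police forces are not public, and every feature name and weight below is invented), the two approaches might be sketched in Python like this:

```python
from collections import Counter

# Hypothetical past incidents, binned into grid cells over a map of a borough.
incidents = [(3, 4), (3, 4), (3, 5), (7, 1), (3, 4), (7, 1)]

# Approach 1: "hotspot" mapping -- rank cells by historical incident counts
# and treat the busiest cells as predictions of where crime will occur next.
hotspots = Counter(incidents).most_common(2)
print("Predicted hotspots:", hotspots)

# Approach 2: individual risk profiling -- collapse weighted features about a
# person into a single "risk" score. Features and weights here are invented.
RISK_WEIGHTS = {"prior_offences": 0.6, "police_contacts": 0.3, "under_25": 0.1}

def risk_score(person):
    return sum(w * person.get(feature, 0) for feature, w in RISK_WEIGHTS.items())

print("Risk score:", risk_score({"prior_offences": 2, "police_contacts": 4, "under_25": 1}))
```

Both approaches learn from historical police records, which is precisely why critics argue they reproduce whatever bias those records already contain.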

The Gangs Matrix, for example, was originally set up in the wake of the 2011 Tottenham Riots to identify and monitor individuals considered to be linked to gangs.

So-called “gang nominals” (individuals on the Matrix) were given algorithmically generated “harm scores” to indicate how likely they were to commit a violent crime, with just two pieces of corroborating evidence needed to justify their inclusion.

This could range from what estate the individual lived on to who they were friends with on Facebook or what music videos they had shared on social media.
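To make concrete how thin that evidential bar is, here is a purely hypothetical sketch; the Matrix’s real scoring method has never been published, and these “evidence” types and weights are invented:

```python
# Invented weights for the kinds of signal the article describes; none of
# this reflects the Matrix's actual, unpublished scoring method.
EVIDENCE_WEIGHTS = {
    "lives_on_flagged_estate": 1,
    "facebook_friend_of_nominal": 1,
    "shared_gang_music_video": 1,
    "prior_violent_offence": 5,
}

def harm_score(evidence):
    # Inclusion reportedly required only two corroborating pieces of evidence.
    if len(evidence) < 2:
        return None  # not enough to list the individual
    return sum(EVIDENCE_WEIGHTS.get(item, 0) for item in evidence)

# Two weak, entirely non-criminal signals are enough to put someone on the list.
print(harm_score(["lives_on_flagged_estate", "shared_gang_music_video"]))  # 2
```

Under a scheme like this, someone with no criminal record at all can end up scored alongside convicted offenders.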

Although senior government officials at the time, including former prime minister David Cameron, asserted that gangs had orchestrated the riots, no evidence has ever been produced to confirm this claim.

Matrix seen as discriminatory tool

According to veteran racial equality activist Stafford Scott, writing in The Guardian in January 2019, “there now appears to be a slowly growing recognition, even among regulators, that the Matrix is, and has been used as, a discriminatory tool in the Met’s so-called war on gangs”.

“The Met claims the Matrix, informed by intelligence, helps identify and assess the most harmful gang members in each of London’s boroughs,” he said. “But those of us who have witnessed the impact of the Matrix know this is far from the truth.”

The Met is not the only police force in the UK to be using these kinds of tools. A recent report by Liberty, Policing by machine, highlights at least 14 forces using similar tools.

Durham Police, for example, have used a program called the Harm Assessment Risk Tool, which bases its predictions on 34 pieces of data.

Of these, 29 relate to the individual’s previous criminal history, while the rest include characteristics such as postcode, which, according to the Liberty report, “act as ‘proxies for race’ by indirectly indicating a person’s ethnicity and fuelling the same biases”.

The report added that research on algorithms in the United States’ criminal justice system shows that even if no race-related data is included in an algorithm, it still learns characteristics in a discriminatory way, “because other pieces of data it did use correlated with race and led to inadvertent profiling by the algorithm”.
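That proxy effect is easy to reproduce with synthetic data. In the hypothetical sketch below, a model never sees the protected attribute, only a postcode that correlates with it, yet its flagging rates still diverge sharply between groups:

```python
import random

random.seed(0)

# Synthetic population: 'group' is a protected attribute the model never sees,
# but postcode correlates strongly with it, as postcodes do with ethnicity.
people = []
for _ in range(10_000):
    group = random.choice(["A", "B"])
    p_x1 = 0.8 if group == "A" else 0.2
    postcode = "X1" if random.random() < p_x1 else "Y2"
    people.append({"group": group, "postcode": postcode})

# A naive "risk model" that uses only postcode: flag anyone from the area
# with more recorded police activity (here, X1).
def flagged(person):
    return person["postcode"] == "X1"

for g in ("A", "B"):
    members = [p for p in people if p["group"] == g]
    rate = sum(flagged(p) for p in members) / len(members)
    print(f"Group {g}: flagged {rate:.0%} of the time")

# Group A ends up flagged roughly four times as often as group B, even though
# group membership never entered the model.
```

Removing the race field, in other words, does not remove race from the model; it only hides the channel through which the bias flows.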
