Police algorithms need national guidance ‘as a matter of urgency’
A study by the Royal United Services Institute makes a number of recommendations for how police algorithms can be kept in check
A new set of nationally approved guidelines is needed to ensure police algorithms are deployed in lawful and ethical ways, claims a report by the Royal United Services Institute (Rusi) security think tank.
Compiled as part of a review into algorithmic bias in policing, the report suggests police algorithms are often created without a “clear business case” or any empirical evidence to support claims about the benefits such technology will bring.
To remedy this, the report goes on to recommend police forces carry out an “integrated impact assessment” before investing in new algorithm-based software to ensure there is a clear legal basis for it, and put guidelines in place to govern how it should be used.
This should incorporate data protection, human rights and equality impact assessments, a review of any relevant legal requirements, and an independent ethical assessment.
The report further recommends that individual forces should keep any analytics projects “under constant review” to ensure they meet their original goals and stay in line with the initial assessments, and that a “senior responsible owner” should be assigned to each project for “full accountability”.
The Rusi report is the second of two papers commissioned by the Centre for Data Ethics and Innovation (CDEI) as part of its review into algorithmic bias in policing, which will be used to help the organisation develop a new code of practice for the use of data analytics and algorithms by law enforcement.
While the first focuses specifically on the use of machine learning in predictive crime mapping and individual risk assessments, and on how algorithmic bias can arise, the second takes a wider view of the “advanced algorithms used by the police” to make recommendations for a potential future framework. This includes separate recommendations for police forces, regulators and policing bodies, and software developers, as well as general guidance on regulation and oversight.
Police forces, regulators and software developers
According to the report, which draws on interviews with 27 people from police forces, civil society organisations and government departments, as well as academics and legal experts, the “research participants universally recognised a lack of any official national guidelines for police”.
It added that establishing these new guidelines would require a joint approach between various regulators and policing bodies, including the National Police Chiefs’ Council (NPCC), the Home Office and the College of Policing.
These new guidelines should include the “integrated impact assessment” as a core requirement, the report said.
To ensure a coordinated development and deployment of any new data-driven technologies, the NPCC and Association of Police and Crime Commissioners (APCC) should also establish a national coordinating group for data analytics, which “should maintain a high-level catalogue of all algorithms used by police forces nationwide to inform operational decision-making”.
For software developers, who usually provide “commercial off-the-shelf” products to police, the report recommends that their products be made fully auditable, adding that “intellectual property rights must not be a restriction on this scrutiny”.
It also said that human-interpretable features were essential to gaining transparency over how an algorithm comes to its decisions.
Regulation and ethical oversight
Rather than establishing prescriptive rules and standards for specific data technologies, Rusi recommends that any future framework be “tech-agnostic” and “principles based”.
“The framework should instead establish standardised processes to ensure that data analytics projects follow recommended routes for empirical evaluation of algorithms in their operational context and evaluate the project against legal requirements and ethical standards,” said the report.
From its interviews, Rusi found that the development of police algorithms is not underpinned by a robust empirical evidence base to support the purported benefits of data-driven technologies, concluding that “a clear business case is therefore often absent”.
At the moment, police have adopted more of an “exploratory approach”, whereby analysis is carried out without establishing a clear purpose until after insights have already been generated.
On top of this, the report found “widespread recognition of the need for meaningful and independent ‘ethical oversight’ of police data analytics projects”.
The explosion in police algorithms
The Rusi report details how police use of algorithms has expanded significantly in scale and complexity in recent years, driven by three main factors.
The first is the increasing volume and complexity of digital data, which has “necessitated the use of more sophisticated analysis tools”.
The second is ongoing austerity measures, which were described by every police officer interviewed as the primary driver of developing new data capabilities.
“As discussed in a recent report from Cardiff University, these developments are part of a wider trend across the UK public sector of the use of algorithms and ‘data scoring’ to inform decision making and resource allocation in an age of austerity,” said the report.
The third driver is the UK police’s increasingly preventative – rather than reactive – approach, where the emphasis is on anticipating potential harm before it occurs.
The report added, however, that there was no consensus on how ethical oversight should be applied in practice. “In particular, it remains unclear whether this oversight should be delivered at the local force level, or in the form of a centralised national structure,” it said.
“While various groups currently provide ethical oversight to policing, there remains a lack of consistency in approach between forces, and unclear delineation of local, regional and national responsibilities.”
To address this, the report said the UK Police Ethics Guidance Group should conduct a comprehensive review of current police ethics committees to discover whether they would be effective mechanisms for oversight.
This would include looking at whether or not the committees could be “upskilled”, or whether bespoke digital committees could be established in parallel.
The report was written by Alexander Babuta, a research fellow in national security studies at Rusi, and Marion Oswald, the vice-chancellor’s senior fellow in law at the University of Northumbria.