Scottish police tech piloted despite major data protection issues

Scottish policing bodies are pressing ahead with a data sharing pilot despite data protection issues around the use of US cloud providers, placing sensitive personal data of tens of thousands of people at risk

A cloud-based digital evidence-sharing system for police in Scotland is being piloted, despite major data protection concerns raised by watchdogs about how the use of Microsoft Azure could be putting people’s sensitive personal data at risk.  

The Scottish government’s Digital Evidence Sharing Capability (DESC) service is aiming to digitally transform how the criminal justice system manages evidence by creating a unified system for prosecutors, court staff, police officers and defence lawyers to access and handle digital evidence safely and securely.  

Body-worn video technology provider Axon has been contracted by the Scottish government to deliver the digital evidence system, which is in turn hosted on Microsoft Azure.

However, in a data protection impact assessment (DPIA), and through email exchanges with the Information Commissioner’s Office (ICO) dating back to summer 2022, the Scottish Police Authority (SPA) – an oversight body set up to scrutinise policing in Scotland – identified that the system could not fully comply with specific data protection requirements for UK law enforcement bodies. 

Emails from both Axon and Microsoft’s legal teams, which have been shared with Computer Weekly, also reveal that the issues now being flagged by the SPA were known and understood by the companies themselves for at least two years, during which time no action was taken to remedy the concerns raised.  

Both companies were asked about the emails and what action they have since taken to resolve the issues raised. Axon said it works “closely with customers to ensure robust and effective safeguards are in place”, while Microsoft did not respond. Axon’s full response is detailed below.  

As detailed by the SPA, the DESC system “will provide the capability to collect and securely share digital evidence between criminal justice partners”, and the types of digital evidence being stored and processed include “public and private space CCTV, body-worn video, evidential calls to police control rooms, police interviews, photographs and videos of victims, accused, crime scenes, documents and fingerprints.”  

This, in turn, could leave witnesses and victims of crime, plus suspects and people being prosecuted, open to a number of risks, including the potential of their data being transferred to a jurisdiction with demonstrably lower data protection standards. It could also negatively impact their data rights to rectification, erasure and not being subject to automated decision-making. 

The level of data processing being conducted is not “novel”, the SPA added, but it is new for policing in Scotland and therefore presents “additional risks” that do not exist in the current processes, requiring a DPIA to be completed. 

The SPA specifically noted that “transfers to overseas cloud providers, Axon USA or sub-processors outside of the UK [such as Microsoft] would not be legal”, and that there were a number of other unresolved high risks to data subjects, such as US government access, Microsoft’s use of generic rather than specific contracts, and Axon’s inability to comply with contractual clauses around data sovereignty.  

The DPIA was released via a freedom of information (FOI) request by Owen Sayers, an independent security consultant and enterprise architect with over 20 years’ experience in delivering national policing systems, alongside the SPA’s correspondence with the ICO. All of this material has been seen by Computer Weekly.  

Sayers said the issues brought to light by the FOI disclosures have been well understood for a number of years, but rarely acknowledged publicly. “Thanks to the honesty of these Scottish FOI disclosures, painful though they might have been for the parties to make them, that secret is out and will hopefully be positively discussed and addressed.”

Under the terms of the Data Protection Act (DPA) 2018, policing bodies are obliged to conduct a DPIA before starting any new type of personal data processing that is likely to result in a high risk to the rights and freedoms of individuals. This includes situations where a system is not yet live but real personal data is being used.

The release of the DPIA and ICO correspondence also brings into question the lawfulness of cloud deployments by policing and criminal justice bodies throughout England and Wales, as a range of other DPIAs seen by Computer Weekly do not assess the risks outlined by the SPA around US cloud providers, despite being governed by the same data protection rules. 

Computer Weekly contacted the ICO about the contents of the DPIAs and the use of US cloud providers by UK law enforcement – including whether the ICO has sought formal legal advice since offering an “initial view” to the SPA, and whether it was aware the DESC pilot had launched with two unmitigated high risks.

“Police Scotland and the Scottish Police Authority approached the ICO to gain advice on the Digital Evidence Sharing Capability service, including use of cloud hosting. Engagement is ongoing as we consider the issues and next steps,” said an ICO spokesperson.

“It is important to maintain high standards of data protection compliance when processing law enforcement data, in order to ensure public trust and confidence and safeguard against harm. The purpose of a DPIA is to identify risks, and then to look to mitigate them: that latter step cannot be skipped.”

Computer Weekly also asked the ICO about the prevalence of US cloud providers throughout the UK criminal justice sector, and whether their use is compatible with UK data protection rules. The ICO said it had referred Computer Weekly’s questions to the FOI team for further responses.

Unresolved, high-risk issues

According to the SPA’s DPIA, “there are concerns that the processing may breach the tight controls that apply to international transfers as defined in s73 of the DPA [Data Protection Act 2018]. Those concerns relate to the provider, a wholly owned US company and its sub-processor, Microsoft Azure”, both of which it added are subject to invasive legislation that allows the US government to access their data. 

Specifically, this includes section 702 of the Foreign Intelligence Surveillance Act (FISA), which allows the US attorney general and the director of national intelligence to jointly authorise the targeted surveillance of people outside the US, as long as they are not US citizens; and the Cloud Act, which effectively gives the US government access to any data stored in the cloud by US corporations, wherever it is held.

While the DPIA noted the risk of US government access via the Cloud Act was “unlikely”, it warned that “the fallout would be cataclysmic”.

Alex Lawrence-Archer, a solicitor at data protection specialist law firm AWO, told Computer Weekly: “There seems to be a tension between what the Data Protection Act 2018 requires and that US legislation.”

He added that even if everything was processed from within the UK without routine data transfers to Microsoft for support purposes, the US government “would still be able to compel UK law enforcement personal data from Microsoft… all of this data, regardless of where it’s stored or processed can be accessed by Microsoft US, which is within reach of the US authorities. And to the extent that it is accessed, even the police won’t necessarily know that has happened.” 

The DPIA noted a number of other “high-risk” issues with Microsoft’s terms and conditions that further bring into question its suitability to process UK policing data.  

This includes the fact that Microsoft’s standard data processing addendum is drafted primarily to apply to processing related to the General Data Protection Regulation (GDPR) rather than Part 3 processing (the specific law enforcement requirements); that the contract between Axon and Microsoft does not contain the “granular level of detail” required to satisfy either GDPR or Part 3; and that Microsoft’s use of generic terms and conditions means the DPA’s section 59 requirement for a specific contract detailing the nature of the processing cannot be met.  

It added that while the Microsoft addendum states data is held in the UK, it also states that data may be transferred to or processed in the US, or any other country in which Microsoft or its processors operate, opening up the risk of it processing UK policing data outside of the UK “without any visibility or control over this processing for the controllers”.

For Axon, there is also a high risk of it transferring UK policing data to the US without the knowledge or consent of the data controllers.  

“The terms of the contract were clear in respect of data sovereignty, however, during due diligence it became clear that Axon may not have been fully conversant/understanding of this term as services within the solution processed data in the USA,” it said.  

All of these issues were marked as “high risk” in the DPIA. In the following section on measures that could mitigate these risks, the SPA noted that it was either still awaiting assurances from Microsoft, or that no mitigation was possible.

“In early June 2022, the Scottish Police Authority requested, via Microsoft reseller Phoenix, that Microsoft confirm in writing that MS Azure operates in compliance with Part 3 of the Data Protection Act 2018, and in particular is compliant with the s73 [international transfer] requirements,” it said. “The response was that ‘Microsoft would consult their CELA [corporate, external, and legal affairs] and respond, however, it may take some time’. This response does not give the controllers the level of confidence they might have hoped for.” 

The SPA’s correspondence with the ICO also reveals that the regulator largely agreed with its assessments of the risks. Regarding international transfer requirements, for example, it noted that technical support provided from the US by either Axon or Microsoft would constitute an international data transfer, as would a US government request for data made via the Cloud Act.  

“These transfers would be unlikely to meet the conditions for a compliant transfer,” it said. “To avoid a potential infringement of data protection law, we strongly recommend ensuring that personal data remains in the UK by seeking out UK-based tech support.” 

Commenting on the use of US cloud providers by UK law enforcement generally, Lawrence-Archer said the legal position around tech support from a third country is clear, in that each instance of access counts as a transfer. 

“I'm sure it’s right that not every time such access takes place from the United States will personal data falling within the scope of Part 3 be involved, but it’s probably fair to assume that at least for some of those transfers… there is, at a minimum, a very real risk of international transfers taking place in a way that is not really within the control of the controller,” he said. “It’s difficult to see how such international transfers could be lawful.” 

Conflicting DPIAs 

Other justice system bodies involved in the DESC project as joint controllers were also FOI’d for the DPIAs they had conducted, including the Crown Office and Procurator Fiscal Service (COPFS) and the Police Service of Scotland (PSoS).

The Scottish government itself was also FOI’d as the contracting body in charge of funding the system, but is not acting as a data controller and therefore is not legally obliged to complete a DPIA.  

Responding to an FOI, the Scottish government provided an extract of the contract between Axon and Scottish ministers but, as highlighted in the SPA DPIA, the contract is built around GDPR requirements and makes no explicit mention of the Part 3 rules or of the fact that law enforcement data is being processed.

While the excerpt does mention the need for data to remain in the UK in two of the clauses, there is no indication that Axon and Microsoft are able to achieve this, given the concerns raised by the SPA.  

Further, the Scottish government disclosed that it has no idea which sub-processors are involved in the contract, or which countries data might actually be transferred to, calling into question what diligence it applied during the process.

Under section 59 of the DPA, controllers (the Scottish criminal justice bodies) themselves must also have a direct written contract with the processors (Axon, in this case), but there is no indication this is the case. Although “data processing agreements” have been created, Sayers said the example provided falls far short of being an actual contract meeting the section 59 terms. 

Computer Weekly contacted the Scottish government about DESC to ask why its basic due diligence did not identify the US-based sub-processors and which countries data would be stored or processed in, as well as its thoughts on whether Axon and Microsoft are technically capable of keeping data in the UK, given the concerns clearly identified by the SPA.  

“The Scottish government takes the privacy of citizens’ data very seriously and is collaborating with DESC partners to deliver a ground-breaking service which complies with statutory requirements and reduces risks related to storing and transporting physical evidence. This process includes engagement with the Information Commissioner’s Office and ensures that robust protections are in place to support the six-month pilot,” said a Scottish government spokesperson. 

“All digital evidence in the DESC system, currently limited to a number of more minor cases in a single court jurisdiction, is held securely and only accessible to approved personnel such as police officers, prosecutors and defence agents. Access to this information is fully audited and monitored, and processes are in place to ensure any data risks are quickly identified, assessed and mitigated.” 

The COPFS DPIA asserts – contrary to the SPA DPIA – that all data stays within Microsoft’s UK police-assured secure facility (PASF) datacentres, and that although there is a small risk of data being transferred to a third country, “this has been mitigated”. What that mitigation is remains unclear.

However, it does acknowledge that “evidence contained in DESC falls under the jurisdiction of the US Cloud Act, which in principle, could provide a legal gateway for evidence (content data) to be provided to the US government without the knowledge of the data controllers or contracting authority. If this were to happen, this would have an adverse impact on a data subject’s rights”.

While the COPFS DPIA acknowledged that Axon and Microsoft are also subject to UK data protection laws, it again cited UK GDPR and not the law enforcement-specific Part 3.  

“On the basis of analysis of the legal advice received (which is consistent with the advice received by other DESC partners and which formed the basis of the DESC partners’ collective decision-making on this issue), the legal frameworks in place, the supplier data processor and UK GDPR legal and contractual responsibilities… the US Cloud Act and GDPR risks relating to Axon and Microsoft US-owned company status are mitigated or can be managed appropriately,” it said. 

However, the sections outlining all the risks and any mitigations have been entirely redacted, making it impossible to assess the level of risk COPFS believes it is carrying. The reason given for withholding this information was: “A purpose of the ongoing pilot is to identify and respond to any concerns that are identified. Premature disclosure of this information would prejudice our ability to do this effectively. I have therefore taken the view that it is not in the public interest to disclose this information.” 

On international transfers, it added that while Axon’s statements to COPFS and its agreements with the organisation “do not fully mitigate the sovereignty risk”, they do provide “important controls to reduce risks and the likelihood of this issue happening”. As part of the mitigations, COPFS referred to standard contractual clauses (SCCs), which Sayers said have no relevance to Part 3 processing.

COPFS also disclosed a sub-processor list, which explicitly states that certain parts of the service will be provided by US companies, including Axon’s parent company Axon Enterprise, as well as Twilio, Mixpanel and Qualcomm.

In response to the FOI, COPFS confirmed it also does not hold information on the contract with Axon. 

The PSoS FOI disclosures highlight similar inconsistencies, and are arguably more significant given that PSoS is acting as the lead data controller. Its correspondence with the regulator, for example, shows PSoS had meetings with the ICO in December 2022 and January 2023 at which DESC was discussed.

An ICO email from 20 January 2023 summarised the meetings, noting that the DESC pilot would begin on 24 January and would involve live personal data; that “there will be international transfers involved in the provision of technical services”; and that PSoS is “assured as the controller” that it is meeting all of the law enforcement data protection obligations. 

However, it noted: “If you have a remaining residual high risk in your DPIA that cannot be mitigated, prior consultation with the ICO is required under section 65 DPA 2018. You cannot go ahead with the processing until you have consulted us.”

The DPIA disclosed by PSoS under FOI, which was completed and signed off on 19 January, shows two unmitigated high risks remained: that Axon’s sub-processors are not subject to the terms and conditions, and that the suppliers are subject to the US Cloud Act. These risks were identified by the ICO in the December meetings, where it made clear they would contravene sections 59, 64 and 66 of Part 3 if not resolved.

While the risks are marked as high, the DPIA shows they have been accepted by the senior information risk owner (SIRO). However, according to Sayers, these risks cannot be accepted by the SIRO because they relate to the rights and interests of data subjects, and are not security-related risks.  

“On the face of it, then, PSoS have proceeded to pilot without disclosing two high risks, which they have accepted but cannot mitigate,” he said. “They’ve done so immediately after a meeting and email exchange with ICO, where the ICO have been quite clear – you cannot proceed with high risks, and must formally refer them to us for our consideration first.”  

PSoS and COPFS responses

Responding to specific questions about their DPIAs and the DESC service, a PSoS spokesperson said: “Police Scotland takes data management and security very seriously and is working alongside criminal justice partners to ensure robust, effective and secure processes are in place to support the introduction of DESC. 

“All digital evidence on the DESC system is held securely and only accessible to approved personnel, such as police officers, COPFS and defence agents. Access to this information is fully audited and monitored, and processes are in place to ensure any data risks are quickly identified, assessed and mitigated.” 

It did not respond to direct questions about the unmitigated high risks accepted by the organisation, whether it has been approached by the suppliers about known compliance issues, and if it had formally consulted the ICO. 

In a similar response to Computer Weekly’s questions – which included further queries about how it has accounted for the discrepancy between its claims that all data stays within the UK and the disclosed sub-processor list, which explicitly states that certain parts of the service will be provided by US companies – a COPFS spokesperson said the organisation “takes seriously its responsibility to handle sensitive information carefully” and has robust processes in place to keep data secure.

“We continue to collaborate with partners to deliver a secure and robust pilot scheme for the digital sharing of evidence, working to ensure data is processed securely, appropriately and in compliance with relevant statutory obligations,” they said. “The pilot scheme is innovative in approach, and is one part of a sector-wide transformation which will improve the experience of those involved in criminal cases.” 

According to Sayers, however, “it is not clear how COPFS can claim or reasonably believe that they have met all statutory requirements based on their own DPIA”. He further noted that while COPFS is claiming to have met all of its statutory obligations, the PSoS response has not made such claims.   

Prior knowledge

The issues highlighted by this spate of FOI disclosures have been ongoing for a number of years.  

Sayers noted that FOI responses from policing bodies are usually heavily redacted and will only be released after a series of internal reviews or ICO appeals. “Regardless of the content, the participants in this round of FOI requests do deserve full marks for their transparency, even if that’s exposed issues they might not have wanted to presently publish,” he said.

Previous coverage

As noted in the SPA DPIA, “there have been a number of journalistic articles recently highlighting that the use of hyperscale cloud, such as Microsoft and Amazon, does not comply with the requirements of Part 3 of the Data Protection Act 2018 and as such could lead to class actions by data subjects in the future”.

In December 2020, a Computer Weekly investigation revealed that UK police forces were unlawfully processing more than one million people’s personal data on the hyperscale public cloud service Microsoft 365, after failing to comply with key contractual and processing requirements within the Data Protection Act 2018, such as restrictions placed on international transfers. 

Computer Weekly also found that UK police forces had failed to conduct the necessary data protection checks before proceeding with their Microsoft 365 deployments. 

In May 2021, Computer Weekly also reported on Microsoft’s commitment to storing all of its European Union (EU) customers’ data within the bloc by creating an “EU Data Boundary”. 

“There does not appear to have been a denial or rebuttal published by Microsoft. Indeed, there may even have been a tacit admission of weaknesses in their product when they stated in a blog post that they would make Azure GDPR-compliant by the end of 2022,” said the SPA DPIA in reference to that story.  

The SPA’s correspondence with the ICO also refutes previous claims made by other policing bodies – including from England and Wales – that the regulator had been consulted on, and subsequently signed off, the use of US cloud providers throughout the UK criminal justice sector.  

“We are not aware of any approval or assurance in terms of processing in the cloud,” the regulator said, in reference to being told by the SPA data protection officer that the roll-out of cloud services to police across the UK had been agreed upon by the National Police Chiefs’ Council (NPCC) and the ICO.

Similar claims were made to Computer Weekly regarding ICO sign-off on the roll-out of Microsoft 365, which the ICO refuted at the time in December 2020. Many of the previously disclosed FOIs also do not assess the risks highlighted in the DPIAs completed by Scottish policing bodies. For example, there is no mention of the US Cloud Act, or the fact that Microsoft only provides generic contracts.  

On top of policing bodies’ claims to have obtained regulatory sign-off, the suppliers themselves have also been aware of the issues around servicing UK policing customers for a number of years.  

In November 2019, after initially discussing the problems at a conference in Scotland, Sayers provided Axon with a detailed breakdown of the Part 3 data protection issues associated with their services, particularly Microsoft’s use of generic terms and conditions, which Axon employees noted they would “do a proper review” of.  

The same issues were also highlighted by Sayers to Microsoft in February 2019, which were then reviewed by Microsoft’s legal team. Responding to Sayers in April 2019, the legal team noted: “Microsoft is ready to work with customers in connection with their compliance with the updated law [referring to the Data Protection Act 2018, which went into effect the previous May]. 

“However, as you indicate, our customers have a range of deployment landscapes and service scenarios with us and we trust that, as in the past, they have each done a detailed review of their own circumstances. As a result, we do not plan to distribute any proactive communications to our customers regarding the proposed changes to Part 3 of the DPA, but expect that those customers with questions or concerns will contact us directly with their queries.”  

It added that if customers of Microsoft have questions or concerns, “we would be happy to work with them to address those on a case-by-case basis through the usual process, so please instruct them to contact their Microsoft account executive or contact”.

Computer Weekly contacted both Axon and Microsoft about the emails and what actions the organisations have taken with their law enforcement customers to remedy the issues raised. 

“Axon has established and continues to enhance data protection measures to support all of our customers, including our contract with the Scottish government. Axon’s information security and privacy information management systems are independently certified to best practices, including ISO 27001 and 27701,” said an Axon spokesperson. 
 
“Given the dynamic regulatory and security environment in combination with our commitment to protect the data and services our products provide, we work closely with customers to ensure robust and effective safeguards are in place. 
 
“We are committed to continuing to develop and enhance Axon’s products to ensure customers can meet data protection and privacy expectations from their communities and regulatory environment when using Axon products.” 

Microsoft did not respond.

Moving forward

Sayers said, however, that ultimate responsibility lies with the policing and criminal justice bodies. 

“Whilst there is no doubt that both Microsoft and Axon have known about these issues for some time, and have arguably done nothing to resolve them, the obligation to analyse and surface the problems for them to be addressed, or the services to be confirmed as unsuitable for police use, has always sat with the law enforcement community,” he said. 

“The problem at its root is that whilst others knew or suspected these problems existed, they haven’t applied the diligence or effort that all of the Scottish DESC controllers, but in particular the SPA, have.” 

Sayers said the solution for UK policing has already been set out in ICO guidance and in the emails with Axon and Microsoft described above: move the data processing to a platform that is 100% UK-based and meets all of the Part 3 requirements.

“The size of police data pools is huge, and their systems are complex,” said Sayers. “But until recently, each force has had two or more of their own datacentres that, if combined with commercial police-assured sites in the UK, could be used to create a Federated UK Justice Cloud relatively quickly.  

“The means and the skills to do it certainly exist – all that’s needed is the will to do so. Whilst it might take a few years to transition all this data from these legally difficult-to-use public clouds to new UK-based and legal alternatives, during which time risks for compensation or legal challenges might continue to exist, the end result – a UK sovereign capability employing thousands of skilled workers and controlling UK citizen data inside the UK – has to hold some attractiveness.” 

Computer Weekly asked the ICO whether it would investigate the use of US cloud providers by UK law enforcement bodies, but received no response by the time of publication.
