Swedish police fined for unlawful use of facial-recognition app

Sweden’s data watchdog has found that Swedish police failed to conduct the data protection checks required by law before using a controversial facial-recognition tool

Sweden’s data protection authority, IMY, has fined the Swedish Police Authority €250,000 for unlawfully using facial-recognition technology from the controversial company Clearview AI to identify individuals, in breach of the country’s Criminal Data Act.

Following reports in local media, an investigation by the IMY found that between the autumn of 2019 and 3 March 2020, Clearview AI’s app had been used by Swedish police on a number of occasions, and that several employees had used the facial-recognition tool without prior authorisation.

“The police has not fulfilled its obligations as a data controller on a number of accounts with regards to the use of Clearview AI. The police has failed to implement sufficient organisational measures to ensure and be able to demonstrate that the processing of personal data in this case has been carried out in compliance with the Criminal Data Act,” wrote IMY in a statement.

“When using Clearview AI the Police has unlawfully processed biometric data for facial recognition, as well as having failed to conduct a data protection impact assessment [DPIA] which this case of processing would require.”

As part of the enforcement action, the Police Authority has been ordered to train and educate its employees further to avoid any future breaches of data protection rules when processing personal data.

It has also been ordered by the IMY to inform people whose privacy was breached, where confidentiality rules allow.

“There are clearly defined rules and regulations on how the Police Authority may process personal data, especially for law enforcement purposes. It is the responsibility of the police to ensure that employees are aware of those rules,” said IMY legal advisor Elena Mazzotti Pallard.

The data protection authority also said it has been unable to determine what has happened to the personal data of those whose images the police shared with Clearview AI, or whether the company still holds the information. The IMY has therefore ordered the police to ensure Clearview AI deletes the data it holds on them by 15 September 2021.

The IMY previously issued a biometrics-related fine to a secondary school in October 2019 under the General Data Protection Regulation (GDPR), after it emerged the school had also failed to carry out a sufficient DPIA before deploying facial-recognition technology in a time-limited test to identify students attending classes.

In July 2020, the UK’s Information Commissioner’s Office (ICO) and the Office of the Australian Information Commissioner (OAIC) announced the agencies would be jointly investigating Clearview AI for its alleged use of scraped data and biometrics of individuals collected from the internet without their permission.

Toni Vitale, partner and head of data protection at JMW Solicitors, told Computer Weekly at the time: “The technology allowing companies to scrape data from the internet and combine it with information about users has been around for many years, but what Clearview appears to have done is to combine personal data with photos of individuals obtained from the internet.

“If the data is used strictly for the purposes of law enforcement, consent of the individuals is not required under either UK or Australian laws, but it does beg the question how transparent Clearview has been about its practices and what it does with the data of unmatched data subjects.”

UK police also struggle with DPIAs

Police forces in the UK have also struggled to produce sufficient DPIAs under the Data Protection Act 2018 (DPA 18), which, like Sweden’s Criminal Data Act, transposes the European Union’s (EU) Law Enforcement Directive and therefore contains similar due diligence requirements.

For example, South Wales Police attracted criticism in an appeal case for its use of automated facial recognition, in which the court ruled: “The DPIA failed properly to assess the risks to the rights and freedoms of data subjects and failed to address the measures envisaged to address the risks arising from the deficiencies we have found.”

Following a freedom of information (FoI) investigation, Computer Weekly revealed in December 2020 that UK police forces were unlawfully processing over a million people’s personal data on the hyperscale public cloud service Microsoft 365 (M365) after failing to conduct data protection checks before their deployments.

Although the M365 roll-out had been under way since September 2017, Computer Weekly’s FoI requests showed that every force involved at that point, with the exception of Kent, had yet to conduct the DPIAs legally required before any new technology deployment.

Computer Weekly also found that police forces had failed to comply with key contractual and processing requirements of the DPA 18, such as the restrictions placed on international transfers, as well as certain activity and consultation logging requirements.

During the investigation, the National Enabling Programme (NEP), which is spearheading the delivery of new cloud-enabled ways of working for UK police, told Computer Weekly that the ICO received a full copy of its M365 DPIA, and that the data protection regulator had “provided detailed comments and feedback on the document.”

Under the DPA 18, it is mandatory to send a DPIA to the ICO when the processing of personal data presents a high risk that cannot be mitigated.

However, when asked by Computer Weekly if it had indeed been consulted on the national DPIA, the ICO initially refused to confirm either way.

When told of the NEP’s claim, an ICO spokesperson directly contradicted it, saying: “We provided informal data protection advice on the National Enabling Programme, but a data protection impact assessment was not formally submitted for consultation with the commissioner.”
