UK data exchanges with EU can continue after adequacy decision - but for how long?
For now, European businesses can continue to send data to the UK without additional safeguards and paperwork. How long will it last?
The European Commission has finally granted the UK data adequacy, only two days before the expiry of the extended transition period for data flows. This means that, for now, European businesses can continue to send data to the UK without additional safeguards and paperwork. This is great news for UK plc, especially the tech sector, but there’s a sting in the tail.
The UK adequacy decision is uniquely vulnerable. It has a sunset clause, so will have to be formally reviewed and presented again for approval in 2025.
It will be under continuous review because the Commission, the European Parliament and the European Data Protection Board (EDPB) all fear that the UK intends to diverge significantly from the approach to data protection enshrined in the General Data Protection Regulation (GDPR).
The Commission has already included an exception – personal data may not be transferred to the UK for immigration control purposes, because the UK Court of Appeal has ruled that the UK’s data protection exemptions for immigration control are unlawful.
Future fears
There are two reasons to be fearful about the future of UK adequacy. Firstly, the EDPB and its constituent supervisory authorities remain focused on transfers of personal data to countries that operate mass surveillance.
The impetus for this enforcement activity comes from the European Court of Justice’s (ECJ) Schrems II judgement in July 2020, which struck down the Privacy Shield adequacy arrangement then in force for the US because of excessive surveillance of foreign citizens.
Moreover, the ECJ found the UK’s equivalent surveillance regime, as enacted in the Investigatory Powers Act, to be unlawful as recently as October 2020. As a consequence, UK adequacy is certain to face legal challenges, and it would be difficult for the ECJ to maintain its position on the US while permitting UK adequacy to stand.
Secondly, the UK continues to signal an intention to diverge substantially from the European consensus on data protection. The narrative runs through the National Data Strategy, launched in September 2020; the very public opinion piece by Oliver Dowden, the secretary of state for digital, culture, media and sport, published in the Financial Times in February of this year; and the report of the Taskforce on Innovation, Growth and Regulatory Reform (TIGRR), led by Iain Duncan Smith, issued in June 2021.
Common threads run throughout the UK’s approach: data protection is an onerous burden; it should be easier to monetise personal data; there should be less regulation of so-called artificial intelligence (AI); and the regulator should support the commercial exploitation of personal data.
All of these are at odds with the European position.
Fundamental rights
Data protection is enshrined in the EU Charter of Fundamental Rights – ironically, of course, itself inspired by the European Convention on Human Rights originally drafted by Britons – as the eighth basic human right. Europe doesn’t see data protection as superfluous bureaucracy, but as a vital part of the balance between citizens, government and private enterprise.
In that European view, rights take precedence over revenue. The UK seems to be aligning itself more closely with the US position, as exemplified by the California Consumer Privacy Act, where the focus is less on protecting citizens from abuse of their data and more on ensuring that they see a slice of the profits from its sale. This is made explicit in the TIGRR report in particular, which not only states (in paragraph 205) that “personal data is currency” but also suggests that some revenue-participation mechanism might be developed by the state.
AI – or machine learning, as it is more correctly known – is seen in Europe as presenting a significant risk to people. There is no shortage of evidence that machine learning can enshrine racial, gender and privilege biases: recent examples include US automated pre-trial risk assessments that are biased against people of colour, Amazon’s AI candidate-selection process that discriminated against women, and the wide-ranging bigotry in the ImageNet database. We’ve also seen European initiatives to strengthen protections against possible harm from automated decision-making, including enacted legislation in Denmark and the proposed EU regulation on AI.
The independence of the data protection regulator is a key consideration in determining adequacy, under Article 45(2)(b) of the GDPR. The effectiveness of the UK’s regulator, the Information Commissioner’s Office, has already been called into question, and Dowden’s suggestion that the next commissioner will be a political appointee charged with ensuring that “people can use data to achieve economic goals” further indicates that the UK is not keen to see either government or the private sector held to account.
If the UK continues down its present road, we can expect to see legal challenges brought by pressure groups across the UK and Europe, with the strong likelihood that the ECJ will rule in their favour.
We can also expect the EDPB and its constituent members to put pressure on the European Commission to review the adequacy decision and either impose additional exceptions or abrogate it entirely.
And above all, we can expect EU businesses and consumers to vote with their feet. Sending data to countries without adequacy is much more onerous than it used to be – why risk committing to transfers to the UK if adequacy is one court case or Commission decision away from being lost?
The UK’s citizens will be the real losers – caught between economic pressure from reduced business with Europe on the one hand and dilution of their fundamental rights to privacy and data protection on the other.
Ben Rapp is a co-founder of Securys, a privacy consultancy that helps enterprises build stakeholder trust by delivering data privacy globally and locally.