Europol gears up to collect big data on European citizens after MEPs vote to expand policing power
The European Parliament has voted to expand Europol’s role, legalising its processing of bulk datasets containing personal information and endorsing research into predictive policing technologies
The European Parliament has voted to give Europol wide powers to collect and process data on individuals, including people not suspected of any crime, in a move that significantly widens the power of the European police agency.
MEPs voted on 4 May to widen Europol’s mandate to collect personal data from tech companies, including telecoms providers, internet service providers and social media firms, and to collect and analyse data from countries outside the European Union (EU).
The proposals also give Europol the go-ahead to develop algorithms and artificial intelligence (AI) systems capable of making automated decisions and developing predictive policing models.
The Parliament’s draft regulation effectively overturns an order issued by the European Data Protection Supervisor (EDPS) in January this year that required Europol to delete swathes of personal data it had collected and processed unlawfully.
MEPs voted by 480 votes to 142, with 20 abstentions, in favour of a significant expansion of Europol’s role, in a move that will extend the agency’s use of big data and artificial intelligence in criminal investigations.
The proposed regulation introduces measures to protect the privacy of individuals, including the appointment of a fundamental rights officer at Europol and independent oversight by the EDPS.
A ‘massive, unchecked expansion’ of powers
But the move has been criticised by civil society groups and some MEPs, who claim that it amounts to a “massive, unchecked expansion” of Europol’s powers and could represent a first step towards creating a European version of GCHQ or the US National Security Agency.
“Europol will be allowed to collect and share data left, right and centre, without much restriction or control,” said Chloé Berthélémy, policy advisor at European Digital Rights (EDRi), a network of civil and human rights organisations in Europe.
The draft legislation is in part a response to growing requests for Europol to analyse increasingly large and complex datasets to identify crimes in multiple countries inside and outside of the European Union.
One example is an operation by the French and the Dutch police in 2020 to hack into the encrypted phone network EncroChat, sweeping up text messages from tens of thousands of phones along with details of users’ contacts, notes, videos and voice messages, their pseudonyms, and unique phone identifiers.
The proposed new legislation will expand the range of data Europol can retain and process beyond its existing mandate, which restricts the police agency to processing data only on individuals who have a clear and established link to criminality.
The draft legislation will also allow Europol to share data – which could include IP addresses, URLs and the content of communications – with companies, including financial institutions and online platforms.
European member states provide Europol with datasets, which are understood to include passenger flight records, mobile phone locations and datasets of open source data which might include social media posts scraped from the internet.
Europol will also be able to receive data from internet service and technology companies, such as Google and Facebook, including subscriber, traffic and content data that may be relevant to criminal investigations.
It will also take on a new role assessing the strategic risks posed by foreign investment in emerging technologies in Europe, particularly technologies used by law enforcement and critical technologies that could be exploited for terrorism.
Artificial intelligence built on ‘dirty data’
The draft law also gives Europol a mandate to research innovative technologies including AI technology and algorithmic decision-making, which, for example, could be used to predict which individuals are likely to be involved in criminal activity.
Europol will also be able to legally train algorithms on datasets containing the personal information of individuals who are not suspected of any crime, in a move described by lawmakers as necessary to remove bias from algorithms trained purely on criminal data.
Critics point out that the MEPs’ endorsement of Europol’s use of automated decision-making is at odds with the European Parliament’s endorsement of a ban on the use of algorithms for predictive policing.
The Parliament agreed in October 2021 that the AI techniques used today are likely to have a discriminatory impact on racial groups and marginalised communities, children, the elderly and women.
Berthélémy, policy advisor at EDRi, said member states were not paying enough attention to the quality of data they were sending to Europol that will be used to train algorithms.
“Europol will develop and use algorithms based on data received from national police forces, but the nature and origin of these datasets have not been questioned enough,” she said.
“They can be distorted by racial biases or come from corrupt and unlawful practices. This ‘dirty data’ will result in Europol’s technologies overly targeting certain socioeconomic, racial or ethnic groups, ultimately reinforcing structural inequalities,” she added.
Safeguards and supervision not enough
The draft regulation includes “enhanced safeguards, democratic oversight and accountability mechanisms”, according to a report on the proposed regulation.
These are intended to ensure that the activities and tasks of Europol are carried out in full compliance with the European Charter on Fundamental Rights, stated the report by the Committee on Civil Liberties, Justice and Home Affairs.
This includes the rights to equality before the law, to non-discrimination, and to an effective remedy before the competent national court.
Processing of personal data “is limited to what is strictly necessary and proportionate and subject to clear conditions, strict requirements and effective supervision” by the European Data Protection Supervisor.
However, civil society groups claim that the proposed data protection safeguards do not go far enough.
EDRi advisor Berthélémy said the plans for Europol to internally appoint a fundamental rights officer (FRO) to protect the privacy rights of data subjects, coupled with the limited powers of the EDPS, fall far short of the independent regulation the policing agency requires.
The FRO model was taken from the European border and coast guard agency, Frontex, where it has already proved ineffective, she said.
“They are just reproducing a mechanism that hasn’t really proven to be efficient from another agency involved in human rights violations at the borders of the EU,” said Berthélémy.
Laure Baudrihaye-Gérard, legal director in Europe for the campaign group Fair Trials, said increased powers for Europol should come with an increase in oversight.
“MEPs should have taken the chance to stand up for people’s rights instead of paving the way for an unaccountable model of policing that sends a worrying signal to all police forces in Europe,” she said.
There are also concerns that there is a lack of adequate safeguards and transparency over the quality and accuracy of data shared by member states with Europol.
Unreliable data can be “laundered” through Europol, shared with member states and used as evidence in criminal proceedings, said Berthélémy.
That would make it impossible to identify errors or determine whether the evidence was collected legally by the member state that supplied it.
“Europol is concluding more and more agreements with third countries, many of which have a bad record on human rights,” she said. “We have called Europol a data laundering machine, because we are not sure how trustworthy the data is, whether it’s good quality and whether it’s been validated by a judge or an independent judicial authority.”
Berthélémy said Europol was also encouraging private organisations to share data voluntarily, which it passes on to member states, in a move that could breach European Union treaties.
Baudrihaye-Gérard said courts cannot exercise meaningful judicial oversight over whether data collected and analysed by Europol is accurate.
Defence lawyers do not have access to the information they need to prepare a defence. “We are really facing a complete emptying of the right to a fair trial when Europol is concerned,” she said.
Europol’s move into big data raises compliance concerns
Europol began offering European member states services to forensically analyse data in 2002. Demand for these services has grown drastically over the years as police agencies across Europe have sought to exploit the power of “big data”.
By 2019, the European Data Protection Supervisor reported that Europol was processing increasing amounts of untargeted data for intelligence and investigation purposes – including datasets containing information on innocent people.
In April that year, Europol’s executive director, Catherine De Bolle, reported major compliance concerns to the EDPS, sparking a formal enquiry.
The following April, Europol began receiving huge quantities of data from a novel hacking operation conducted by the French police into encrypted phone network EncroChat.
The agency received data from 120 million EncroChat messages, analysed it to identify the country of origin, and passed it to police forces in Germany, France, Sweden, the UK and other countries, which have since made thousands of arrests of members of organised crime groups.
In September 2020, the EDPS admonished Europol – without naming specific investigations – for putting data subjects’ privacy rights at risk by continuing to store large volumes of personal data, in some cases for years, without assessing whether the individuals had any link to criminality.
“The processing of data about individuals in an EU law enforcement database can have severe consequences on those involved,” the supervisor, Wojciech Wiewiórowski, wrote at the time.
“Without putting in place the safeguards provided in the Europol regulation, individuals run the risk of being wrongly linked to criminal activity across the EU, with all the potential damage to their private and professional lives that that entails,” he added.
Normalising widespread data collection
The EDPS and Europol failed to reach an agreement during subsequent negotiations and, in January this year, the EDPS ordered Europol to delete all data it held on individuals with “no established link to criminality”.
The order required Europol to erase datasets older than six months in cases where it had failed to categorise the data subjects and could not be certain that the datasets were free of information about people with no established link to crime.
It was possible, the EDPS found, that data on innocent individuals may also have been extracted and shared with third parties.
The regulator ordered Europol to notify third parties to delete any data wrongly handed over.
EDRi’s Berthélémy told Computer Weekly that a major goal of the latest reform was to legalise Europol’s processing of datasets that contain personal data on individuals who are not suspected of any crime.
Under Europol’s 2016 mandate, the police agency is limited to processing data, defined in a list known as Annex 2, that relates to people with connections to terrorism, drug trafficking and other cross-border organised crime.
The latest proposals allow Europol to derogate from the restrictions in Annex 2, with no requirement to inform the European Data Protection Supervisor until after a criminal investigation has been completed – which could take years.
One effect of the change, said Berthélémy, would be to legalise Europol’s mass collection of data during the operation against EncroChat, which gathered messages, photos, videos and other data, not all of which may have been linked to criminal activity.
“It is really about normalising this massive data collection,” she said.
Patrick Breyer, MEP for the Pirate Party and a member of the Joint Parliamentary Scrutiny Group, which monitors Europol, said that despite concerns from civil society groups and a rebuke from the European Data Protection Supervisor, Europol was to be allowed to collect and analyse “massive amounts” of data on individuals who are not suspected of crime.
“In consequence, innocent citizens run the risk of being wrongfully suspected of a crime just because they were in the wrong place at the wrong time,” he said.
According to Breyer, Europol’s plans to train “error-prone” algorithms on data from real citizens threaten “false positives” and discrimination.
He said Europol should be regulated more effectively to ensure it did not breach the law further. “The supervisory mechanisms, which have been superficial so far, have not been given the necessary teeth to detect and stop illegal practices by the authority,” he said.
Schengen reforms
A separate proposal amending the Schengen Information System (SIS) regulation to enable Europol to make third country information available to frontline officers has also been agreed, but in a modified form.
Europol will not be able to submit its own SIS alerts, as originally intended, but, in a compromise agreement to meet the sensitivities of member states, it will be allowed to request member states to add alerts on its behalf.
In a statement after the European Parliamentary vote, rapporteur Javier Zarzalejos said: “This regulation, and the new mandate for Europol, mark a substantial leap forward in the capabilities of the agency, in its ability to support member states, in its governance framework and, last but certainly not least, in the enhanced system of safeguards we have put in place.”
The legal text will now head to the Council of the European Union for formal adoption before it comes into force.
The Schengen proposal is due to be discussed in the European Parliament plenary between 6 and 9 June.
What you need to know about Europol’s proposed expanded mandate
Key measures
- Europol to advise on trends and threats in serious organised crime and terrorism.
- Europol to screen risks posed by foreign investments into emerging technologies, particularly those used by law enforcement and critical technology that could be used for terrorism.
- Europol will be able to request member states to investigate crimes that do not cross borders if they impact the “common interest” of the European Union (EU).
Big data
- Europol to “pre-assess” whether personal data it already holds can be legally processed and is relevant and necessary.
- Europol will be able to process large and complex datasets to detect links with crimes in member states and process the data for the duration of an investigation.
- Europol should be able to process data supplied by third countries for use in investigations in EU member states, provided a number of safeguards are met.
- Europol should verify whether it is “necessary and proportionate” to process personal data that falls outside its 2016 legal framework which restricts personal data to those suspected of a crime. The move paves the way for Europol to process data on those with no criminal connections.
AI and technology research
- Europol should support research projects to develop technologies to identify, secure and analyse data needed to investigate crimes.
- The processing of sensitive categories of personal data for research should only be allowed where strictly necessary, and should use pseudonymised data.
- Artificial intelligence (AI) technology used for algorithmic decision-making should be trained on representative datasets to avoid bias. Europol should be allowed to train technology on data that falls outside data allowed in its 2016 legal framework, for example data on people not suspected of crimes, where strictly required to prevent bias.
- Europol should keep logs of all personal data processing used in research projects to allow the European Data Protection Supervisor (EDPS) to carry out audits to ensure that AI does not breach human rights and is not discriminatory.
- Europol should promote the development of ethical, trustworthy and human-centric AI, subject to robust safeguards, transparency and human rights.
Compliance
- EU states should ensure they comply with national and EU law before submitting personal data to Europol.
- Europol should store investigative case files until criminal proceedings in a member state are concluded to ensure the veracity, reliability and traceability of criminal intelligence.
- Europol should log all evidence and the methods by which it has been obtained to allow for effective scrutiny of evidence by defence lawyers.
Sharing data with tech companies
- Europol should act as a contact point for service providers to voluntarily share datasets of personal data covering multiple jurisdictions with Europol.
- Europol should be able to receive information from technology companies relevant to serious crime and terrorism. In exceptional circumstances Europol can also exchange personal data with technology companies and private organisations.
Sharing data overseas
- Europol should be able to process datasets supplied to it to assess their relevance to law enforcement in member states, and to forward them to those states or to third countries with which it has cooperation or data protection adequacy agreements.
- Europol may transfer personal data to companies outside the EU where there are cooperation agreements or data protection adequacy agreements in place.
Terrorism and child abuse
- Europol should be able to share relevant personal data, including hashes, IP addresses or URLs related to terrorism or child abuse content, with technology companies inside the EU – or outside the EU where there are agreements in place – to allow its removal or prevent its dissemination.
Data protection and regulation
- Regulation (EU) 2018/1725, which lays down data protection obligations for EU institutions, should now be made applicable to Europol.
- Expanding the mandate of Europol and the scope of its data processing will require additional financial and human resources to be given to the EDPS.
- Europol should provide a Joint Parliamentary Scrutiny Group (JPSG) with a detailed annual report on its research projects and activities. Two members of the JPSG should be invited to Europol management board meetings.
- Europol should appoint a fundamental rights officer to safeguard the rights of people when their data is used by Europol, to work alongside its data protection officer.