
Facebook could face ICO fine of up to £500,000

Facebook could face the maximum fine allowed under the UK’s previous data protection law for a lack of transparency and for security failings relating to the harvesting of data, the UK’s privacy watchdog has revealed

Facebook is guilty of two contraventions of the UK Data Protection Act 1998 and could be liable for a monetary penalty of up to £500,000, the maximum allowed under that law, according to the Information Commissioner’s Office (ICO).

This is one of the key findings in the latest update published by the ICO on its investigation, announced in March 2017, into how personal information is used in modern political campaigns.

Publication of the information is aimed at meeting the ICO’s commitment to provide Parliament’s Digital, Culture, Media and Sport (DCMS) Select Committee with a progress update on the investigation, to inform its work on “fake news”, which includes the use of personal information in political campaigns.

The ICO expressed concern about what it called “invisible processing”, referring to the “behind the scenes” algorithms, analysis, data matching and profiling that involve people’s personal information.

In May 2017, the ICO announced it was launching a formal investigation into the use of data analytics for political purposes after allegations were made about the ‘invisible processing’ of people’s personal data and the micro-targeting of political adverts during the EU Referendum.

Nearly a year later, the ICO revealed that Facebook and London-based data mining firm Cambridge Analytica were among 30 organisations under investigation for misusing personal data and analytics for political and other purposes.

The announcement came just two weeks after news emerged of the data exploitation scandal involving Facebook and Cambridge Analytica, which affected the data of 87 million users, including nearly 1.1 million Britons.


Profile data was extracted by a quiz app downloaded by just 305,000 people, but the app was designed to pull in the data of their Facebook friends as well. Facebook calculated the number of people affected based on the maximum number of friend connections for each of the users who downloaded the quiz app.
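Facebook has not published the exact formula behind its 87 million estimate. As a rough sketch of how such an upper-bound calculation might work, the snippet below simply adds each installer’s maximum friend count to a running total, with no de-duplication of shared friends; the 285-friend average is a made-up figure chosen only so the example lands near the cited total.

```python
# Hypothetical illustration only: Facebook has not published its exact methodology.
# This sketch assumes a simple upper bound: each installer's maximum friend count
# is added to the total, with no de-duplication of friends shared between installers.

def estimate_affected_users(installer_friend_counts):
    """Return an upper-bound estimate of users whose data could have been pulled in.

    installer_friend_counts: list of the maximum number of friend connections
    recorded for each person who installed the quiz app.
    """
    installers = len(installer_friend_counts)
    friends_reachable = sum(installer_friend_counts)
    return installers + friends_reachable

# Example with made-up numbers: 305,000 installers averaging 285 friends each
# gives an estimate in the region of the 87 million figure cited by Facebook.
print(estimate_affected_users([285] * 305_000))
```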

Cambridge Analytica allegedly used the data for Donald Trump’s 2016 presidential election campaign. Another company named in the investigation was AggregateIQ, a Canadian data company that reportedly played a role in the Vote Leave campaign in the UK referendum on leaving the European Union.

After more than a year’s work on what has been described as the “biggest and most far-reaching” investigation the ICO has ever conducted, information commissioner Elizabeth Denham has published a detailed update on the investigation.

The update reveals that Facebook and Cambridge Analytica have been the focus of the investigation since evidence emerged that an app had been used to harvest the data of millions of Facebook users around the world.

The ICO’s investigation, said Denham, concluded that Facebook contravened the law by failing to safeguard people’s information. It also found that the company failed to be transparent about how people’s data was harvested by others.

However, Facebook has a chance to respond to the commissioner’s Notice of Intent later this month, after which a final decision will be made.

Escaping a greater fine

Facebook escaped a much greater fine because of the timing of the breaches, which meant the ICO was unable to levy penalties under the EU’s General Data Protection Regulation (GDPR). The GDPR allows fines of up to €20m (£17m) or 4% of global turnover, which in Facebook’s case would have meant a potential fine of up to $1.6bn (£1.2bn).

“Our policy on Communicating Regulatory Actions makes clear that while we would not normally publish a Notice of Intent, we may do so where there is an overriding public interest. In this case we consider that the overriding public interest and the commitment to update the DCMS committee so it can progress its work mean that we decided in favour of publishing the Notice,” the ICO said.

Other regulatory action set out in the report includes warning letters to 11 political parties and notices compelling them to agree to audits of their data protection practices.

The ICO has issued an Enforcement Notice to Cambridge Analytica’s parent company SCL Elections Ltd to compel it to deal properly with a subject access request from Professor David Carroll, and is pursuing a criminal prosecution against SCL Elections Ltd for failing to deal properly with the ICO’s Enforcement Notice.

The ICO has also issued an Enforcement Notice requiring AggregateIQ to stop processing retained data belonging to UK citizens, issued a Notice of Intent to take regulatory action against data broker Emma’s Diary, and ordered audits of the main credit reference companies and Cambridge University’s Psychometric Centre.

The ICO also recommends that Universities UK work with all universities to consider the risks arising from the use of personal data by academics in a university research capacity, and where they work with their own private companies or other third parties.

Appropriate safeguards

As part of its investigation, the ICO is considering whether Cambridge University has sufficient systems and processes in place to ensure that data collected by academics for research is appropriately safeguarded in its use and not re-used for commercial work.

The ICO is investigating allegations that Eldon Insurance Services Limited shared customer data obtained for insurance purposes with Leave.EU and that the data was then used for political campaign purposes during the EU Referendum.

The investigation is also looking at whether Eldon Insurance Services Limited’s call centre staff used customer databases to make calls on behalf of Leave.EU in contravention of the Privacy and Electronic Communications Regulations 2003, and is examining allegations that insurance customer data was sent to the US, in particular to the University of Mississippi.

“We are at a crossroads. Trust and confidence in the integrity of our democratic processes risk being disrupted because the average voter has little idea of what is going on behind the scenes,” said Denham.

“New technologies that use data analytics to micro-target people give campaign groups the ability to connect with individual voters. But this cannot be at the expense of transparency, fairness and compliance with the law,” she said.

Restoring trust

Denham added that fines and prosecutions punish the bad actors, but she said the ICO’s real goal is to effect change and to restore trust and confidence in the democratic system.

According to the investigation report, political parties use many data sources, including data brokers, to build a fuller picture of potential voters that includes details such as their political affiliation, education level, occupation, gender and home ownership.

The information collected is used to create a personal profile, which is then used to target individual voters through social media, with analytics making predictions or assumptions about them.
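The report describes this matching and profiling only in general terms. As a rough illustration of the kind of data matching involved, the sketch below merges records from several hypothetical sources into a single profile per voter; the field names, the voter_id matching key and the sample records are assumptions for illustration, not details drawn from the ICO’s findings.

```python
# Generic illustration of data matching across sources; not the parties' actual systems.
from collections import defaultdict

def build_profiles(sources):
    """Merge records from several data sources into one combined profile per voter.

    sources: iterable of lists of dicts, each dict containing a shared
    'voter_id' key plus attributes such as occupation or home ownership.
    """
    profiles = defaultdict(dict)
    for source in sources:
        for record in source:
            profiles[record["voter_id"]].update(record)
    return profiles

# Hypothetical sample records keyed by the same voter_id.
electoral_roll = [{"voter_id": 1, "name": "A. Voter", "constituency": "Anytown"}]
broker_data = [{"voter_id": 1, "occupation": "teacher", "home_owner": True}]

# Each voter ends up with a single combined profile that could then be used
# to make predictions or assumptions for targeted advertising.
print(build_profiles([electoral_roll, broker_data])[1])
```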

A second, partner report, titled Democracy Disrupted? Personal information and political influence, sets out findings and recommendations arising out of the 14-month investigation.

Among the ten recommendations are a call for the UK government to introduce a statutory Code of Practice for the use of personal data in political campaigns, and a call on all of the platforms covered in the report to roll out planned transparency features for political advertising to the UK.

One of the most concerning findings from the investigation was a significant shortfall in transparency and provision of fair processing information, the ICO said.

An ethical pause

In response, Denham has called for “an ethical pause” to allow government, parliament, regulators, political parties, online platforms and the public to reflect on their responsibilities in the era of big data before there is a greater expansion in the use of new technologies.

“People cannot have control over their own data if they don’t know or understand how it is being used. That’s why greater and genuine transparency about the use of data analytics is vital,” she said.

In addition, the ICO has published research commissioned from the Centre for the Analysis of Social Media at the independent think tank Demos. The research report examines current and emerging trends in how data is used in political campaigns, how use of technology is changing and how it may evolve in the next two to five years. 

The ICO investigation, one of the largest of its kind by a data protection authority, remains ongoing. According to the ICO, the 40-strong investigation team is pursuing active lines of enquiry and reviewing a considerable amount of material retrieved from servers and equipment.

The next phase of the ICO’s work is expected to be concluded by the end of October 2018.
