GDPR: An overview of the latest data protection legislation
The forthcoming General Data Protection Regulation (GDPR) will have wide-ranging implications for every organisation that processes personal data. This comprehensive overview tells you what you need to know
Most people associate data protection with legislation, most recently the Data Protection Act 1998 and the General Data Protection Regulation 2016 (GDPR). However, it should be appreciated that this legislation is merely one method, albeit an important one, by which an individual or company can seek to protect data.
Other methods, which are sometimes relevant, include intellectual property rights, such as the law of confidence, which seeks to protect confidential commercial information; patent rights, which may prevent the use of certain information without the permission of the patent owner; and database rights. Under database rights legislation, which applies to databases created on or after 1 January 1998, the creator of a database has the right to prevent the extraction or re-utilisation of the whole or a substantial part of the contents of the database.
Where does data protection legislation come from?
In the UK, the first iteration of data protection legislation – the Data Protection Act 1984 – stems not from membership of the European Union (EU), but from the UK being a signatory to the European Convention on Human Rights, under which there is an inherent right of an individual to personal privacy.
The more recent Data Protection Act 1998 stems from a European Community directive of 1995. In contrast, the GDPR, being a regulation, does not require transposition into UK law and applies automatically and directly, with full effect from 25 May 2018.
The purpose of the Data Protection Directive and the GDPR is to provide a “common market” throughout the European Union for data protection. By abiding by the legislation, the “benefit” is that no member state can prevent data flowing into or out of its territory to another EU member state on the grounds that the other country’s data protection is not as strong as its own.
In practice, large differences exist between various member states. For example, in Scandinavia, criminal prosecutions for breaching data protection legislation are far more common than in the UK where they are a rarity. Some of these differences will be maintained after the GDPR comes into force.
Overview of the GDPR
No realistic explanation of the GDPR can be given in anything less than a textbook. The legislation itself exceeds 92,000 words. To put that into context, the length of a typical novel is between 75,000 and 100,000 words.
Neither is there a magic bullet to understanding the breadth and significance of the GDPR. The focus of the previous legislation was on registration with the information commissioner. Through that registration, an organisation would indirectly inform individuals as to what that organisation did with personal data – where and how it was collected, how it was processed, and to whom it was given.
Fundamentally, the GDPR changes that, so an organisation must now directly inform individuals of what it does with their personal data – where and how it collects the data, how it is processed, and to whom it is given. The organisation has duties to make this communication when it first collects data about individuals and when it changes how it will use that data in the future.
The GDPR, as with the earlier data protection legislation, deals only with data about living individuals, not business data. Data from which individuals cannot be identified, for example anonymised data, is not subject to the restrictions of the legislation.
The legislation primarily applies to a data controller – the organisation that “determines the purposes and means of processing personal data”. Some obligations are also placed on a data processor – an organisation that “processes personal data on behalf of the controller”.
Data processing justifications
Any processing of personal data by a data controller must be justified. There are a number of gateways by which that justification can be shown, most importantly:
- The data subject has given consent;
- The processing is necessary for the performance of a contract (or a potential contract) to which the data subject is party;
- The processing is necessary so that the controller can comply with a legal obligation;
- The processing is necessary to protect the vital interests of an individual;
- The processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject.
Data protection principles
Data protection has always provided for a number of principles that must be complied with. These principles have been modified by the GDPR. More importantly, the GDPR increases the penalties for non-compliance with these principles (see “enforcement by the information commissioner” section below).
The first principle, previously a requirement that data be processed fairly and lawfully, has been extended to require that data be processed lawfully, fairly and in a transparent manner.
Other data protection principles include:
- Personal data shall be collected for specified, explicit and legitimate purposes, and not further processed in a manner that is incompatible with those purposes;
- Personal data shall be adequate, relevant and limited to what is necessary in relation to the purposes for which it is processed (data minimisation);
- Personal data shall be accurate and, where necessary, kept up to date;
- Personal data shall remain identifiable for no longer than is necessary;
- Personal data shall be processed in a manner that ensures its appropriate security.
There is also a new obligation to take appropriate organisational and technical measures to ensure that the system by which data is collected and organised is in accordance with the principle of “privacy by design”. This means data should only be collected and organised in a manner which meets these principles and only processed where “necessary for [the purposes] of the processing”.
This principle should be adhered to both at the time the system is first designed and at the time the processing takes place. As has been noted, however, it is not an absolute requirement, rather it is an obligation to take “appropriate technical and organisational measures”. Factors affecting what is appropriate include the size of the company undertaking the processing, its financial resources, as well as the nature of the data being processed.
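By way of illustration only, the short Python sketch below shows one way “privacy by design” might be reflected in a data model: the record type holds only the fields needed for a stated purpose and carries an explicit retention deadline. The OrderContact type, its fields and the one-year retention period are all invented for this example.

```python
from dataclasses import dataclass
from datetime import date, timedelta


@dataclass
class OrderContact:
    """Illustrative record designed around a single purpose: order fulfilment."""
    name: str              # needed to address the delivery
    delivery_address: str  # needed to fulfil the order
    email: str             # needed for the order confirmation
    collected_on: date
    # Deliberately absent: date of birth, marketing preferences and so on;
    # nothing is collected "just in case".

    def retention_deadline(self, retention_days: int = 365) -> date:
        """Date after which this record should be erased or anonymised."""
        return self.collected_on + timedelta(days=retention_days)


contact = OrderContact("Jane Doe", "1 High Street, Leeds", "jane@example.com", date.today())
print(contact.retention_deadline())
```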
Appropriate security
A full discussion of the data protection principles is outside the scope of this article. However, the most important from a cyber security perspective is the obligation to ensure appropriate security of personal data.
The full principle is that the data controller – the company determining how the personal data is to be processed – must ensure that the personal data is “processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures”.
While there is no guidance on the interpretation of this principle under the GDPR, the principle under the previous legislation was to be interpreted having regard to:
- The state of technological development;
- The cost of implementing any measures;
- The harm that might result from such unauthorised or unlawful processing or accidental loss, destruction or damage;
- The nature of the data to be protected.
The overriding principle is that the security measures adopted must be “appropriate”.
In the context of cyber security, some obvious conclusions can be drawn:
- The security of a system should be regularly reviewed. LinkedIn is but one example of a well-known company caught out by having out-of-date security;
- More security should be offered where the loss of data might cause greater damage, for example as with credit card or identity data;
- More security should be offered where the data is more sensitive, for example in respect of health data;
- Just because your system is infiltrated by a professional hacking group such as Anonymous does not mean that you have inadequate security.
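As a purely illustrative sketch of one “appropriate technical measure”, the snippet below encrypts a personal data field at rest using the open source Python cryptography package. Key management, access control and the wider security regime sit outside the example, and the field shown is invented.

```python
# Illustrative only: encrypting a personal data field at rest with the
# "cryptography" package (pip install cryptography). In a real system the
# key would be held in a key management service, never alongside the data.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # assumption: key stored securely elsewhere
cipher = Fernet(key)

email = "jane@example.com"
token = cipher.encrypt(email.encode("utf-8"))  # value written to disk or database
print(token)

# Only code holding the key can recover the original value.
print(cipher.decrypt(token).decode("utf-8"))
```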
Data protection rights
The GDPR provides an extensive number of rights for individuals in respect of their data, which are chiefly:
- The right to object to data profiling and direct marketing;
- The right to be forgotten;
- The right to obtain a copy of their personal data;
- The right to have inaccurate data corrected;
- The right to data portability;
- The right not to be subject to a decision based solely on automated processing.
These rights are discussed below.
The right to object to data profiling and direct marketing constitutes two separate rights. There is a right to object to data profiling where the justification for data processing (see “data processing justifications” section above) is on the grounds that it “is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject”. The data processing must stop unless the data controller can show “compelling legitimate grounds for the processing which override the interests, rights and freedoms of the data subject or for the establishment, exercise or defence of legal claims”.
The right to object to direct marketing is even more extensive, since Article 21 states: “Where personal data are processed for direct marketing purposes, the data subject shall have the right to object at any time to processing of personal data concerning him or her for such marketing, which includes profiling to the extent that it is related to such direct marketing.”
The right to be forgotten has attracted a lot of publicity in the general press. More correctly it is known as the right to data erasure. The most important trigger for this right is where the data subject withdraws his consent to marketing and that is the justification (see “data processing justifications” section above) being used by the controller. However, in any event, the data controller is expected to delete data where “the personal data are no longer necessary in relation to the purposes for which they were collected or otherwise processed”. The obligation, where triggered, is for the controller to delete the data “without undue delay”.
Subject access request
The right to obtain a copy of personal data is not new. It was included in the 1995 Directive and is commonly known as a subject access request. Indeed, subject access requests were often the only interaction between many companies and the pre-GDPR data protection legislation – usually where in the course of a dispute with an ex-employee, the ex-employee would make such a request to try to obtain additional “ammunition” to further their case.
In the GDPR, this right is known as the right of access by the data subject, and is much more extensive than in the past. As well as having the right to receive a copy of the personal data, the data subject also must be given the following information:
- Confirmation that the processing is taking place;
- The purposes of the processing;
- The categories of personal data held;
- The recipients, or categories of recipients, of that data;
- Whether the data is sent to third countries;
- The expected period for which the data will be stored.
In addition, when responding to the request for information, the data subject should be told that they have the right to ask the data controller to rectify or erase personal data (see “data protection rights” section above) and the right to complain to the Information Commissioner’s Office (ICO). Further, if the data controller is using an automated decision-making process, the data subject must be informed about that process.
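As a hedged illustration of what a response to a subject access request might contain, the sketch below packages the items listed above into a single structured document. The field names, purposes and recipients are hypothetical placeholders, not prescribed by the GDPR.

```python
import json
from datetime import date


def build_sar_response(subject_id: str, records: list[dict]) -> str:
    """Hypothetical sketch: package the information a data subject is entitled
    to receive under the right of access into one JSON document."""
    response = {
        "data_subject": subject_id,
        "processing_confirmed": bool(records),
        "purposes": ["order fulfilment", "customer support"],      # illustrative
        "categories_of_data": sorted({k for r in records for k in r}),
        "recipients": ["payment processor", "delivery partner"],   # or categories
        "transfers_outside_eea": False,
        "expected_storage_period": "6 years from last transaction",
        "personal_data": records,
        "your_rights": [
            "request rectification or erasure",
            "complain to the Information Commissioner's Office (ICO)",
        ],
        "generated_on": date.today().isoformat(),
    }
    return json.dumps(response, indent=2)


print(build_sar_response("subject-123", [{"name": "Jane Doe", "email": "jane@example.com"}]))
```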
Further data protection rights
If the data subject discovers something inaccurate, they can have it corrected. The right in the GDPR is succinct, stating: “The data subject shall have the right to obtain from the controller without undue delay the rectification of inaccurate personal data concerning him or her. Taking into account the purposes of the processing, the data subject shall have the right to have incomplete personal data completed, including by means of providing a supplementary statement.”
The right to data portability is just as onerous for data controllers. It states that “the data subject shall have the right to receive the personal data concerning him or her, which he or she has provided to a controller, in a structured, commonly used and machine-readable format, and have the right to transmit those data to another controller”. Normally, the data should be transmitted electronically. The right exists where the justification for data processing is based on consent or a contract (see “data processing justifications” section above).
This right has far-reaching ramifications. For example, a data subject can require a gas or electricity supplier to pass data about them (and their gas or electricity consumption) to a rival utility company. Similarly, a hotel chain with a loyalty scheme could be required to pass data about a guest to a rival hotel chain.
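A minimal sketch of what “a structured, commonly used and machine-readable format” could look like in practice is shown below, exporting invented consumption records as CSV. The schema and figures are assumptions for illustration only.

```python
import csv
import io

# Hypothetical consumption records held by a utility company.
readings = [
    {"date": "2018-01-31", "kwh_used": 312.4},
    {"date": "2018-02-28", "kwh_used": 287.9},
]


def export_portable(records: list[dict]) -> str:
    """Serialise the data the customer provided into CSV, a structured,
    commonly used and machine-readable format another controller can ingest."""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=["date", "kwh_used"])
    writer.writeheader()
    writer.writerows(records)
    return buffer.getvalue()


print(export_portable(readings))
```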
The right not to be subject to a decision based solely on automated processing does not arise where the decision is further to a contract with the data subject or is undertaken with the consent of the data subject. This removes from its scope many commercial circumstances – whereas certain functions carried out by, for example, local authorities would be fully within the ambit of this right.
However, even in circumstances where the data processing is further to a contract or undertaken with consent, the legislation states that the data controller shall provide “the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision”.
Breach notification
There is a new obligation under the GDPR to notify a “personal data breach” to the information commissioner. A “personal data breach” is defined in Article 4 as a “breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed”. This is a wide definition. There is an exception which applies where the personal data breach is “unlikely to result in a risk to the rights and freedoms of” individuals.
This exception is difficult to analyse, not least because in all circumstances of data loss there will be some risk – defined in international standards as the “effect of uncertainty on objectives” – to individuals. It is simply that in many circumstances the data loss is unlikely to result in harm to individuals. But the legislation doesn’t refer to “harm”, it refers to “risk”. A working party committee, formed of representatives of the various information commissioners throughout Europe, has issued guidance on breach notification, but it fails to clarify this matter.
Arguably, all breaches should therefore be reported to the information commissioner. In practice, though, only those personal data breaches which are likely to harm individuals should be reported. A full analysis of the realities of the circumstances in which breaches should be reported is beyond the scope of this article.
However, one quick point can be made: it is clear that the information commissioner isn’t expecting most breaches to be reported when one of her deputies has publicly said she expects about 30,000 breach notifications a year. Given how many breaches actually take place a year, 30,000 isn’t a lot. According to the government’s own Cyber security breaches survey 2017, 46% of all businesses were aware of a breach having occurred in the past year.
A notifiable breach must be reported “without undue delay” and in any event, where feasible, within 72 hours. If later, the notification should be accompanied by reasons why it is late. In most security breaches, it is likely that the scope of the breach and the reason for the breach will not be immediately apparent. The legislation is intended to operate in such a way that when the controller is aware of the breach, it should notify. As and when further information comes to light, more information should then be notified.
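To make the timing rule concrete, the short sketch below computes the 72-hour notification deadline from the moment the controller becomes aware of a breach. The awareness timestamp is an invented example; in practice the deadline sits alongside the duty to supplement the notification as further information comes to light.

```python
from datetime import datetime, timedelta, timezone

NOTIFICATION_WINDOW = timedelta(hours=72)


def notification_deadline(became_aware_at: datetime) -> datetime:
    """Latest time by which the information commissioner should normally be
    notified of a notifiable personal data breach."""
    return became_aware_at + NOTIFICATION_WINDOW


# Illustration: the controller becomes aware of a breach mid-afternoon.
aware = datetime(2018, 5, 28, 15, 30, tzinfo=timezone.utc)
deadline = notification_deadline(aware)
print(f"Notify the ICO by {deadline.isoformat()} or explain the delay.")
```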
There is a corresponding obligation to notify affected individuals of a personal data breach where the breach “is likely to result in a high risk to the rights and freedoms of natural persons”. Again, to make sense of this, it must be read as “is likely to result in a high risk of harm to the rights and freedoms of natural persons”. This notification must be made in “clear and plain language” without undue delay.
Export of data
Personal data may not be transferred to a country or territory outside the European Economic Area unless that country or territory ensures an adequate level of protection for the rights and freedoms of data subjects in relation to the processing of personal data.
The level of protection must be adequate in all the circumstances, having regard in particular to:
- Whether the rule of law prevails in that country;
- What rights exist for data subjects in that country;
- The international obligations of that country or territory.
It is a European Union body, the European Commission, that determines whether a country or territory ensures such a level of protection. In addition, for the purposes of data protection, the relevant European area includes Norway, Liechtenstein and Iceland.
There is a further, European Union-approved, limited “exemption” for certain US companies, under what is known as the Privacy Shield regime which replaced the Safe Harbour principle. US companies can voluntarily accede to the regime and then must comply with equivalent data protection principles. The list of companies that have voluntarily signed up to the Privacy Shield regime includes many of the larger outsourcing companies such as Amazon, IBM and Microsoft.
However, many cloud service providers based in India and other low-cost countries often process data in breach of data protection legislation. This is because those countries do not have equivalent data protection legislation of their own, recognised as such by the European Commission. Although in theory it is possible to adopt a model contract to ensure compliance with the data protection legislation, even when appointing a data processor in such a low-cost country, that model contract is rarely used in practice. Having been drafted by a committee, it is simply too complex and burdensome.
Another, equally expensive, method of allowing the export of data is the use of binding corporate rules. These apply within a group of companies and are a binding set of rules formally approved by a European information commissioner.
How important is the data protection legislation?
Most breaches of the Data Protection Act go unnoticed. When they are noticed, there is often a public outcry, but little else changes.
In particular, there is little practical means for financial redress for the person who suffers from a breach of the data protection legislation. This is because the primary means of enforcement of the legislation is by means of a civil court case. However, the cost of bringing such a case in the UK vastly exceeds the amount of damages that would be obtained as a result of court enforcement.
Take, for example, the case of HFC Bank (then a subsidiary of HSBC Bank), which in September 2004 emailed 2,600 people. However, instead of blind copying the recipients, it accidentally carbon copied all the recipients, so all 2,600 could see one another’s email addresses. The matter was further compounded because some customers had their automatic “out of office” responses on, which responded to all 2,600, giving them further personal details.
HFC Bank immediately apologised and credited the affected customers’ accounts with £50 compensation. There was still an outcry by disgruntled customers. In fact, £50 was at the time almost certainly reasonable compensation for the wrongful publication of the email addresses. Certainly, there are no reported cases of any customers subsequently suing HFC Bank for greater damages.
To the general statement that the existing legislation is largely toothless, there are exceptions, and it is these exceptions that can be considered the real drivers behind compliance with data protection legislation. They are enforcement by the Financial Conduct Authority (FCA) and enforcement by the information commissioner, and each is dealt with in turn below.
Enforcement by the Financial Conduct Authority
Where a company is regulated by the FCA, then the FCA has power to apply an unlimited fine for breaches of the data protection legislation.
The predecessor to the FCA, the Financial Services Authority (FSA), had even let it be known that, in appropriate cases, it would consider imposing criminal sanctions against individuals within organisations, where they were shown to be in breach.
The FSA had a history of enforcing the data protection legislation. For example, in February 2007, Nationwide Building Society was fined £980,000 for issues arising out of a stolen laptop. In December 2007, Norwich Union Life was fined £1.26m for failings in its systems and controls that put customer data at risk of fraud. But its successor, the FCA, has yet to show much appetite to do so, having concentrated more on the direct regulation and policing of the financial markets.
Enforcement by the information commissioner
The information commissioner is responsible for the enforcement of the data protection legislation. The previous Data Protection Act 1998 did, in fact, provide for some criminal enforcement. The information commissioner from time to time prosecuted those who failed to register, such as second-hand car dealers, financial advisers and even plumbers. However, the information commissioner has a very limited budget, and is unable to enforce the legislation comprehensively through prosecutions.
Rather, the information commissioner must cajole people into complying with the legislation through the issue of good practice guides. Since 6 April 2010, the information commissioner has had the power to levy civil fines of up to £500,000. To date, the information commissioner has fined a couple of hundred organisations, the majority of them public bodies such as local authorities and NHS trusts. One reason for this is that public bodies feel obliged to report security breaches, whereas there has been no general duty for private companies to do so. What the information commissioner does not know about, she cannot punish.
In this regard, the GDPR introduces a major change. Indeed, many people unnecessarily fear the new powers of the information commissioner. The maximum fine which can be levied has, for most breaches of the legislation, been increased to the higher of €20,000,000 or 4% of group annual turnover. A breach of the obligation to notify (see “breach notification” section above) attracts a lower maximum penalty of the higher of €10,000,000 and 2% of group annual turnover.
For the overwhelming majority of companies, which are not part of a group of companies with turnover in excess of half a billion euros, that means a maximum fine of €20,000,000. The primary offences for which this fine can be levied include:
- Failure to follow the revised data protection principles (see “data protection principles” section above);
- Failure properly to justify consent to processing (see “data processing justifications” section above);
- Failure to comply with the rights of data subjects (see “data protection rights” section above);
- Wrongful transfer of personal data to a third country (see “export of data” section above).
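To illustrate the fine arithmetic described above, the sketch below computes the maximum penalty for the two tiers of breach; the turnover figures are invented.

```python
def max_fine(group_turnover_eur: float, notification_breach: bool = False) -> float:
    """Maximum GDPR fine: the higher of a fixed amount and a percentage of
    group annual turnover (EUR 20m / 4% for most breaches,
    EUR 10m / 2% for a failure to notify)."""
    if notification_breach:
        return max(10_000_000, 0.02 * group_turnover_eur)
    return max(20_000_000, 0.04 * group_turnover_eur)


# A group with EUR 2bn turnover: 4% (EUR 80m) exceeds the EUR 20m floor.
print(max_fine(2_000_000_000))   # 80000000.0
# A small company: the EUR 20m floor applies.
print(max_fine(5_000_000))       # 20000000.0
```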
These offences can be committed as easily in cyberspace as in the physical world. Where a company has a data system which collects personal data about European Union citizens, the data controller should comply with the GDPR.
However, if the data controller fails to do so, the probability of the data controller being prosecuted is minimal where its assets are overseas. If a group of companies has assets within the jurisdiction (ie, in the European Union), the legislation can be enforced. Where a company has no assets in the European Union, in practice, the legislation cannot be enforced because any fine, were it not to be paid, could not be collected.
In most circumstances, the information commissioner would not even attempt to prosecute a “difficult” case where a company has no assets in the UK but does have assets elsewhere in Europe – where at least in theory she could enforce a judgment for a data protection breach – let alone one where the company has no assets in Europe at all. She will simply seek an easier target.
In practice, the information commissioner has insufficient resources to police cyberspace to any meaningful degree. The exceptions are those situations where there is political or public pressure, which forces her to do so. An example is the data security breach by Sony.
In April 2011, due to ineffective security measures in its PlayStation Network platform, Sony allowed hackers to obtain access to the personal information of millions of customers, including their names, addresses, dates of birth, account passwords and, potentially, payment card details. Some reports suggested as many as 77 million PlayStation users worldwide were affected. In January 2013, the information commissioner imposed a £250,000 fine. While Sony initially said that it would appeal the fine, it has since stated that it will not, since to do so would reveal sensitive details about its network security.
Despite the fine imposed on Sony, relatively few fines have been issued against purely commercial organisations. A freedom of information request showed that of 29 fines issued from 2011 to 2013, only five were against commercial organisations. Why? The reason is that the information commissioner, in common with most underfunded, overworked public enforcement bodies, concentrates on easy targets.
In this context, public authorities and NHS trusts are easy targets since they are unlikely to mount a robust defence. Also, the type of data that the ICO has been concerned with in these organisations – data about vulnerable children, medical records and criminal convictions – is the sort of data which is clearly sensitive and ought to be securely protected.
The most recent statistics, for 2016, show that while the information commissioner received some 18,300 complaints and was notified of 1,950 breaches – almost all reported voluntarily by public authorities, such as local authorities and health trusts – she issued only 16 fines. She does not have the resources to do otherwise. While there is no doubt that with additional resources under the GDPR she will issue more fines, the percentage of notifications that result in fines will fall below the current 0.82% simply because of the volume generated by mandatory breach notification.
Historically, when it comes to private companies with real clout, the UK information commissioner has been much more cautious. So, to use another cyber security example, despite Google’s wrongful acquisition of Wi-Fi data when it sent cars out to roam British streets collecting data for Google Maps, the UK information commissioner has not taken any robust action. In comparison, other countries, such as France, Switzerland and Germany, have all taken significant enforcement action.