What the EU’s decision on Facebook means for social media
Recent ruling by the Court of Justice of the European Union will have global implications for social media companies and any organisations that host online content
On 3 October 2019, the Court of Justice of the European Union delivered a preliminary ruling in the Eva Glawischnig-Piesczek v Facebook Ireland case. It ruled that social media platforms can be ordered to take down illegal content throughout the European Union (EU) and potentially beyond.
Austrian politician Eva Glawischnig-Piesczek had requested that Facebook remove disparaging posts about her that were made public by a user of the platform. Facebook declined, so Glawischnig-Piesczek approached the Vienna Commercial Court, which agreed with her and ordered Facebook to remove the offending material, as well as “identical” and “equivalent” posts. Facebook complied by removing the original post for users based in Austria.
Subsequent discussions about the proactive monitoring and removal of similar posts reached an impasse, which both parties asked the Austrian Supreme Court to resolve. It, in turn, referred the case to the EU’s Court of Justice (ECJ).
The ruling states that Article 15 of the EU’s e-commerce directive does not prevent EU member states from ordering injunctions against platforms, for example to take down offending material. The court held that such injunctions can cover a wide variety of material, including reposts and “equivalent” posts, can apply to platforms other than Facebook, and can be enforced worldwide, subject to the appropriate treaties and international agreements.
As this is the ECJ’s clarification of the existing e-commerce directive, which has already been transposed into the law of every EU member state, it should be considered part of EU-wide law. All organisations that host online content are therefore subject to this clarification of the existing rules.
“We note the ECJ’s judgment and are considering its implications,” said a spokesperson from the UK’s Department for Digital, Culture, Media & Sport. “Britain is leading the world in developing a comprehensive regulatory regime to keep people safe online, protect children and other vulnerable users and ensure there are no safe spaces for terrorists online.”
The preliminary ruling states that Article 15(1) of the directive on electronic commerce does not preclude a court of a member state from:
- Ordering a host provider to remove information which it stores, the content of which is identical to the content of information which was previously declared to be unlawful, or to block access to that information, irrespective of who requested storage of that information.
- Ordering a host provider to remove information which it stores, the content of which is equivalent to the content of information which was previously declared to be unlawful, or to block access to that information, provided that the monitoring of, and search for, the information concerned by such an injunction is limited to information conveying a message, the content of which remains essentially unchanged compared with the content which gave rise to the finding of illegality and containing the elements specified in the injunction, and provided that the differences in the wording of that equivalent content, compared with the wording characterising the information which was previously declared to be illegal, are not such as to require the host provider to carry out an independent assessment of that content.
- Ordering a host provider to remove information covered by the injunction or to block access to that information worldwide within the framework of the relevant international law.
Although the subject of the case was Facebook, the wording refers to host providers in general, which means the ruling will affect all social media companies and online platforms operating within the EU that host user-generated content. Also, because it has been issued by the Court of Justice, the EU’s highest court, there is no recourse for appeal.
In this context, illegal content is content hosted by an online platform that a court within the EU has declared to be illegal, provided it has been viewed widely enough to cause reputational harm.
“It has to be a substantial number of members of the public, sufficient to damage your reputation,” says Peter Adediran, an expert in digital media law and founder of PAIL Solicitors. “The more views, the more you can show that there is potentially damage to your reputation. Those views have to come from within the EU.”
Illegal content explained
Illegal content includes not just defamatory content, but also breaches of privacy and malicious falsehoods.
A breach of privacy is typically the public disclosure of private information or the unauthorised use of someone’s name or picture. Defamatory content is a false and unprivileged statement of fact that is harmful to someone’s reputation. A malicious falsehood is similar to defamatory content, but is typically a lie uttered with malice – in this case, the defendant knew the statement was not true or did not take proper care to check.
As the ruling states that a court may order a host provider to “block access to that information worldwide within the framework of the relevant international law”, such an injunction will apply across all EU member states, as well as in countries outside the EU that have the appropriate international treaties with an EU member state.
This essentially allows the court of an EU member state to declare a post illegal and request that the offending post, and any equivalent content, be blocked from being viewed on the online platform within that member state, the other EU member states, and any country outside the EU that has the appropriate treaty with an EU member state.
One area open to interpretation is the second point of the declaration, which refers to “content of which is equivalent to the content of information which was previously declared to be unlawful”. This is designed to stop people from resharing information that has been declared illegal by altering it so that it is no longer exactly the same. Precisely what counts as “equivalent”, however, remains to be defined, as the illustrative sketch below shows.
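The practical difficulty for platforms is that “identical” reposts can be detected mechanically, whereas “equivalent” posts require some notion of similarity. The Python sketch below is purely illustrative: the normalisation step, the similarity threshold and the sample posts are all assumptions made for the example, and the ruling does not prescribe any particular matching technique.

```python
# Illustrative sketch only: one possible way a platform might distinguish an
# "identical" repost from a loosely "equivalent" one. The threshold and
# normalisation are arbitrary assumptions, not anything set out by the court.
import re
from difflib import SequenceMatcher


def normalise(text: str) -> str:
    """Lower-case the text, strip punctuation and collapse whitespace,
    so trivial edits do not defeat a match."""
    text = re.sub(r"[^\w\s]", "", text.lower())
    return re.sub(r"\s+", " ", text).strip()


def is_identical(candidate: str, banned: str) -> bool:
    """An exact repost of the post already declared unlawful."""
    return candidate == banned


def is_equivalent(candidate: str, banned: str, threshold: float = 0.9) -> bool:
    """A post whose message is 'essentially unchanged' -- approximated here by
    a similarity ratio over normalised text. Where to set the threshold is
    exactly the kind of judgment the ruling leaves open."""
    ratio = SequenceMatcher(None, normalise(candidate), normalise(banned)).ratio()
    return ratio >= threshold


# Hypothetical example posts
banned_post = "Politician X is a corrupt traitor."
print(is_identical("Politician X is a corrupt traitor.", banned_post))          # True
print(is_equivalent("politician x is a CORRUPT traitor!!", banned_post))        # True
print(is_equivalent("I disagree with Politician X's policies.", banned_post))   # False
```

A real platform would face much harder cases than this sketch handles, such as paraphrases, translations and images, which is precisely why Facebook and others are pressing the courts to define “identical” and “equivalent” clearly.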
A Facebook spokesperson says: “This judgment raises critical questions around freedom of expression and the role that internet companies should play in monitoring, interpreting and removing speech that might be illegal in any particular country.
“At Facebook, we already have community standards which outline what people can and cannot share on our platform, and we have a process in place to restrict content if and when it violates local laws.
“This ruling goes much further. It undermines the long-standing principle that one country does not have the right to impose its laws on speech on another country. It also opens the door to obligations being imposed on internet companies to proactively monitor content and then interpret if it is ‘equivalent’ to content that has been found to be illegal.
“To get this right, national courts will have to set out very clear definitions on what ‘identical’ and ‘equivalent’ mean in practice. We hope the courts take a proportionate and measured approach, to avoid having a chilling effect on freedom of expression.”
Ultimately, any online platform wanting to operate within the EU must now abide by any requests made by the EU member nations to block or remove illegal content. This is regardless of where the organisation’s headquarters are located or where the content was posted. It only needs to be viewable, and found illegal, within the EU.
In the coming months, as content is declared to be illegal, online platforms can expect court orders requiring them to take down or block the offending material, as well as any “equivalent” content, so that it is not viewable within the EU or in any countries that have the appropriate treaties with EU member states in place.
The past few years have seen a fundamental shift in the various EU governments’ approach to the internet. Previously, there had been a lack of regulation of the web, but there is now an increasing trend for governments and the EU to want the internet to become a more regulated space.
For example, Article 13 of the forthcoming EU copyright directive promises to make online platforms more responsible for copyright-infringing material being hosted on their websites. “The judgment is really saying: get ready for more EU regulation,” says Adediran.
Brexit impact unlikely
Despite the uncertain nature of the UK’s future relationship with the EU, it is unlikely that Brexit will have any impact on this. The UK has been proactive in harmonising its laws with those of the EU. For example, the Information Commissioner’s Office has previously stated: “The government has made clear that the General Data Protection Regulation will be absorbed into UK law at the point of exit, so there will be no substantive change to the rules that most organisations need to follow.”
However, this increased regulation will bring increased costs for the online platforms expected to comply with it. Although this is less of an issue for larger companies, smaller online platforms and startups may struggle to meet the demands of greater regulation, which could stifle innovation.
James Klymowsky, CEO of Peddler, says: “The regulators didn’t have any malicious intent, but were ultimately trying to protect the privacy of the users. They didn’t really consider how much money it can cost the startup or scaleup to comply with some of the regulations being put out there.”
Ultimately, maintaining an awareness of the shifting judicial landscape, especially the regulatory requirements of each region in which they operate, will help ensure organisations are not taken by surprise.
Adediran adds: “There is going to be a much greater burden on companies, both large and small, to effectively police their content and make sure they act expeditiously when they are informed of defamatory comments.”