Christina, incidentally, is a researcher with the Institute for Information Law (IViR) of the University of Amsterdam. She is currently writing her PhD thesis on intermediary liability in the EU. This is what she writes:
Defamation v Freedom of Expression: The ECHR Grand Chamber Hands Down Judgment in Delfi v Estonia
On 16 July 2015, the European Court of Human Rights (ECHR) handed down its much-awaited judgment in Delfi AS v Estonia. The judgment, at 88 pages, is uncommonly lengthy, containing two concurring opinions and one dissenting opinion. The last of these contains a scorching condemnation of the majority opinion, which probably goes a long way to explaining the extended wait leading up to the judgment’s release. The case concerned a possible clash between two competing human rights: the freedom to impart information of the applicant internet news portal (protected by Article 10 of the Convention) and the right to privacy of the victims of the unlawful speech of its users (protected by Article 8 of the Convention). The possibility of a clash arose when the Supreme Court of Estonia held the news portal liable for that speech. The ECHR found that no violation of freedom of expression had occurred and that therefore no clash had taken place. If the reaction to the previous Chamber judgment of the Court, delivered in 2013, as well as a brief scan of Twitter and the blogosphere are any indication, the decision can be expected to cause great consternation among commentators.
Reasoning of the Court
The Court confirmed its approach in previous rulings according to which, where there is a clash between competing fundamental rights, a fair balance must be struck between them that retains the essence of each right. This balancing exercise is positioned within the usual three-pronged examination of any infringement of Article 10 of the ECHR. According to this, “an interference with the applicant company’s right to freedom of expression must be ‘prescribed by law’, have one or more legitimate aims in the light of paragraph 2 of Article 10, and be ‘necessary in a democratic society’.”
(a) ‘Prescribed by Law’. The Court confirmed that the interference had been “prescribed by law”: the Estonian Supreme Court applied the national Obligations Act to impose fault-based liability on the news portal. This outcome should have been foreseeable to Delfi, as it is a professional publisher that ought to have been familiar with relevant legislation and case-law and could have sought legal advice. The Court emphasised that “it is not its task to take the place of the domestic courts” in the identification and application of national law, but only to determine whether the methods adopted and the effects they entail are in conformity with the Convention.
This reasoning can be disputed, and indeed was so by dissenting Judges Sajó and Tsotsoria. In their Opinion, the judges observed that the EU’s E-Commerce Directive – in particular the safe harbour provisions of Articles 12 to 14 and the prohibition of general monitoring obligations in Article 15 – would indicate a very different outcome, as would its Estonian implementation (the Information Society Services Act). These should arguably be viewed as lex specialis. This casts doubt on the foreseeability of the Estonian courts’ approach to the case:
“A legal adviser could not have informed Delfi with sufficient certainty that the Directive on legal aspects of information society services did not apply. The applicable law was not obvious, to the extent that even in 2013 a court in Cyprus found it necessary to ask the Court of Justice of the European Union for a preliminary ruling in a related matter, namely the liability of news portal publishers (see Case C-291/13, Papasavvas, CJEU). If there was uncertainty in 2013 in the European Union on a similar but less complicated matter, which was clarified in 2014, how could learned counsel have been sufficiently certain in 2006?”

The dissenting judges concluded that “[o]nly divine legal reasoning” could have provided the operator with any indication that their acts would result in liability. “Vaguely worded, ambiguous and therefore unforeseeable laws have a chilling effect on freedom of expression”, they underlined.
(b) “Legitimate Aim”. The second step of a “legitimate aim” was provided by the protection of the reputation and rights of others. This was undisputed by the parties and accepted as given by the Court. At the same time, it is worth noting in this regard the concerns of the dissenting judges, who observed that a troubling lack of specificity surrounds the legal characterisation of the comments posted by the end-users: both the Estonian Supreme Court and the ECHR merely note that the illegality of the comments in question is “manifest”, yet they characterise that illegality in a variety of different ways, sometimes talking of disrespecting the victim’s honour and good name, sometimes of degrading their human dignity, sometimes of hate speech, sometimes of incitement to violence and sometimes of an attack on the victim’s reputation.
Are you calling me a rascal?
Individual examination of the 20 comments in question confirms the uncertainty: as the dissenting judges note, some of the comments are indeed clearly racist. Naturally, “[r]acism and constraining others to live in an environment full of hatred and real threats cannot take refuge in freedom of expression.” At the same time, other comments – such as the description of the plaintiff as a “rascal” – are perhaps less “manifestly” unlawful. This is also true of the more unsavoury examples: does stating on the internet a wish to see someone harmed amount to a real threat? A detailed breakdown of the illegal nature of each comment is entirely, and unaccountably, missing. Greater precision would have been very welcome in this regard. It is telling that illegality contended to be so manifest raises questions as to its precise nature among the judges of one of the highest courts in Europe. The fact that, on this undefined basis, the Court concludes not only that there is a “legitimate aim” for interference with the freedom to provide information of the platform, but also that the authors of the comments do not enjoy freedom of expression, makes this absence all the more troubling.
(c) “Necessary in a Democratic Society”. As is usually the case, the bulk of the analysis concerned the final question of whether the interference was “necessary in a democratic society”. This is where the principle of a “fair balance” enters the equation. A “fair balance” is the applicable solution, because Articles 8 and 10 deserve “equal respect”. The Court observes that in the execution of the balancing exercise the national authorities enjoy a wide margin of appreciation.
The question of a “fair balance” was equated with the notion of “proportionality”. According to the established case law of the ECHR, in order to assess the proportionality of the measure, it is necessary to examine “whether the domestic courts’ finding of liability on the part of the applicant company was based on relevant and sufficient grounds in the particular circumstances of the case.” The factors identified in the Chamber decision of 2013 were confirmed as relevant. These were:
* the context of the comments;
* the liability of the actual authors of the comments as an alternative to the applicant company’s liability;
* the measures applied by the applicant company in order to prevent or remove defamatory comments;
* the consequences of the domestic proceedings for the applicant company.
Each of these was examined in turn by the Court.
With regard to the context of the comment, the Court emphasised that Delfi was “a professionally managed Internet news portal run on a commercial basis which sought to attract a large number of comments on news articles published by it”, while it also had a substantial degree of control over those comments and an economic interest in their posting. With regard to the liability of the authors of the comments, the Court noted that, although the release of information regarding their identity might be ordered, such measures would be of uncertain efficacy, while Delfi had not put instruments in place that could help in this regard. With regard to the measures taken by Delfi to prevent or remove illegal comments, the Court agreed that, in view of the disclaimer posted warning users against posting unlawful comments, the installation of an automatic filtering system for the deletion of comments containing the stems of certain vulgar words, the notice-and-take-down system in place and the occasional human moderation, Delfi could not be said to have wholly neglected its duty to avoid causing harm to third parties. However, the Court observed, these mechanisms proved insufficient in the event and accordingly Estonia was entitled to impose liability. The dissenting judges take especial issue with this reasoning:
“It was decisive for the Court that the filtering mechanism failed. There is no review of the adequacy of the filtering mechanism (was it state-of-the-art; can there be a duty to apply state-of-the-art systems; is there any reason for being held liable with a state-of-the-art filtering system?). The Court itself finds that filtering must have been a simple task and that the system failed. No expert opinion, no cross-examination. We are simply assured that setting up a dedicated team of moderators is not ‘private censorship’.”
Finally, with regard to the consequences attached to the finding of liability, the majority emphasised that damages amounting to only EUR 320 were imposed. They also opined that the applicant’s business model had not been affected, a strange conclusion in view of the fact that, as the Court itself observes, following the events of the case Delfi set up a team of dedicated moderators, i.e. precisely adjusted its business model.
As the dissenting judges note, the selection of the criteria for the balancing exercise is rather arbitrary: “If one applies a balancing approach, then the other side of the balance must also be considered.” They suggest another set of factors that offer offsetting considerations: the importance of press and journalism to democratic societies, the chilling effects created by the punishment of journalists for the dissemination of the speech of others, the importance of the provision of fora for the expression of views concerning public matters and the relevance of the comments to the public interest.
Scope of the decision
The Court tried very carefully to hedge the effect of the judgment: in addition to pushing aside the question of the freedom of expression of the authors of the comments by refusing to examine each comment individually and bypassing consideration of the supremacy of European Union law, at its very beginning the majority opinion stresses that the “present case relates to a large professionally managed Internet news portal run on a commercial basis which published news articles of its own and invited its readers to comment on them.” Accordingly:
“the case does not concern other fora on the Internet where third-party comments can be disseminated, for example an Internet discussion forum or a bulletin board where users can freely set out their ideas on any topics without the discussion being channelled by any input from the forum’s manager; or a social media platform where the platform provider does not offer any content and where the content provider may be a private person running the website or a blog as a hobby.”

In this way the Court appears to be suggesting that the effects of the decision will not be that great. Once again however, the dissenting judges remain unimpressed: “Freedom of expression cannot be a matter of a hobby.”
What about the basis on which liability was sought? In a last-ditch attempt to account for the realities of the internet, in their concurring opinion, Judges Karakas, De Gaetano and Kjølbro attempt to draw a line between liability for “preventing” unlawful comments from being published and liability for not subsequently “removing” the comments. They assert that Delfi was found liable as it did neither, and claim that this avoids the difficult question of the possible liability of a news portal for not having “prevented” unlawful user-generated comments from being published. The concurring judges suggest that this makes a difference, as the imposition of an obligation on a news portal to prevent the publication of unlawful comments would require “pre-monitoring” that would entail the risk of an excessive burden amounting to a disproportionate interference with the platform’s freedom of expression. Troublingly, this suggests a lack of clear understanding by the majority judges of the question at issue. Once again, the dissenting judges hit the nail on the head: “[c]ontrol”, they emphasise, “requires knowledge” and such knowledge does not automatically accrue simply because a comment has been published on a news portal: while the three concurring judges maintain that “not being aware of such clearly unlawful comments for such an extended period of time almost amounts to wilful ignorance”, in reality the publication of comments by end-users normally occurs without the decision of the portal and without its knowledge.
More importantly, such knowledge cannot, even after publication, arise without either notification – which Delfi allowed through its notice-and-take-down platform, but which the majority deemed insufficient – or active monitoring: “[t]he duty to remove offensive comments without actual knowledge of their existence and immediately after they are published means that the active intermediary has to provide supervision 24/7.” It is unclear why “pre-monitoring” would be a more excessive burden than “post-monitoring”, particularly if the “post-monitoring” has to be, as the Estonian courts and the majority opinion agree, “without delay”.
Regardless, it is true that the case concerns only the assessment of the Estonian rulings in the context of the Convention and does not create a strict liability of internet news portals for the statements of their users. Yet its effects should not be underestimated. As the dissenting judges again astutely observe, the Court’s approach is an “invitation to self-censorship”: even in less demanding legal regimes, the threat of liability alone causes intermediaries to err on the side of caution and remove content even without authoritative confirmation of its illegality. If neither specifically introduced EU-level safe harbours nor their national transpositions can serve to shield internet platforms from the threat of general tort law, then they might be excused for concluding that nothing but immediate deletion at the slightest hint of possible third-party illegality can protect them: “[t]o avoid trouble, for active intermediaries the safe harbour will simply be to disable comments”. In addition, the very monitoring of all content that the Court advocates would in itself also create chilling effects for the user, who must become conscious of being constantly watched by an intermediary acting as an agent of the State, while her speech is assessed by overcautious non-legal professionals.
Hopefully, at least within the EU, Member States will realise that, even if they are permitted to demand monitoring under the ECHR, EU law prohibits it: the judgments of the CJEU in Google France [on which see Katpost here] and L’Oréal v eBay [on which see Katpost here, here and here] have made this clear for a while now. As a result, more than anything else, the Estonian courts have created a problem for themselves by misapplying EU law and refusing to submit a preliminary question to the CJEU. The resulting problem of how to bring their law into line with EU requirements should now also be limited to them, not the rest of the EU. What the effect will be for European countries outside of the EU is a more troubling consideration.
But let us not be too pessimistic: things could always be worse. Indeed, this is nicely illustrated by the concurring opinion of Judge Zupančič. Rather bizarrely ignoring the lowest common denominator approach traditionally taken by the Court in dealing with cases where there is no pan-European consensus (and with particularly questionable timing in view of both the CJEU's recent rejection of the draft agreement on an EU accession to the ECHR and the British government’s ambiguity with regard to the Convention), the judge concludes that the common law’s caution towards privacy rights is incorrect and should be supplanted by the recognition of strong personality rights along the Germanic tradition. Thus, the judge rejects even a balancing of privacy with freedom of expression, suggesting instead that the limits to freedom of expression lie precisely “at the point where somebody else’s freedom and personal integrity is negatively affected”. On this basis, Judge Zupančič goes even further, condemning anonymity on the internet and suggesting that internet portals should be prohibited from publishing anonymous content. Whether such a ruling by the Court would have served to suppress free speech online or simply undermine the authority of the Court can only be speculated upon. Either way, it appears a bullet has been dodged.