Also The IPKat has a Facebook page (not used for libellous and defamatory comments, alas)
When it comes to content removal in the context of an injunction, how is this to be done in order to comply with the prohibition of a general monitoring obligation, as per Article 15 of the E-commerce Directive?
This, in a nutshell, is the issue at stake in Facebook, C-18/18, a referral for a preliminary ruling from the Austrian Supreme Court made in the context of national proceedings concerning defamatory comments published on Facebook.
Yesterday, Advocate General (AG) Szpunar delivered his Opinion, which opens with a quote from The Social Network (the film about the beginning of Facebook): “The internet’s not written in pencil, it’s written in ink”. Indeed, as the AG effectively summed up, this case concerns:
whether a host which operates an online social network platform may be required to delete, with the help of a metaphorical ink eraser, certain content placed online by users of that platform.
The Opinion in brief
In his Opinion, AG Szpunar advised the Court of Justice of the European Union (CJEU) to rule that Article 15 of the E-commerce Directive does not preclude a host provider from being ordered – by means of an injunction – to seek and identify, among all the information disseminated by users of its service, information identical to the information that the court issuing the injunction has found to be illegal. With regard to equivalent information, the host provider's duty to seek and identify such information extends only to the information disseminated by the user who disseminated the illegal information. To this end, the effects of the relevant injunction must be clear, precise and foreseeable, and the authority issuing the injunction must take into account and balance the different fundamental rights at stake, as well as comply with the principle of proportionality.
In relation to the territorial scope of an injunction, the AG noted that the E-commerce Directive is silent on this point. Hence, an injunction can also impose removal of information on a worldwide basis.
Let’s see a bit more in detail what the case is about and how the AG reasoned.
Background
The background national proceedings relate to an application for an injunction that an Austrian politician sought against Facebook, following the latter's failure to remove information posted by a user containing disparaging comments relating to the politician. The Vienna Commercial Court issued the requested injunction and ordered Facebook to remove the relevant content. Facebook complied with the injunction, but only disabled access to the content in Austria.
On appeal, the Vienna Higher Regional Court upheld the order made at first instance as regards identical allegations, and dismissed Facebook's request that the injunction be limited to Austria. However, that court ordered that the removal of equivalent content be carried out only in relation to content notified to Facebook by the applicant.
Both parties appealed to the Austrian Supreme Court, which decided to seek guidance from the CJEU regarding both the content (identical and equivalent information) and scope (territorially limited or worldwide) of an injunction like the one at issue in this case. These are the questions referred to the highest EU court:
(1) Does Article 15(1) of Directive [2000/31] generally preclude any of the obligations listed below of a host provider which has not expeditiously removed illegal information, specifically not just this illegal information within the meaning of Article 14(1)(a) of [that] directive, but also other identically worded items of information:
(a) worldwide?
(b) in the relevant Member State?
(c) of the relevant user worldwide?
(d) of the relevant user in the relevant Member State?
(2) In so far as Question 1 is answered in the negative: Does this also apply in each case for information with an equivalent meaning?
(3) Does this also apply for information with an equivalent meaning as soon as the operator has become aware of this circumstance?
AG Maciej Szpunar
The analysis of AG Szpunar
The AG began his analysis by noting that, “[i]rrespective of the doubts that one might have in that regard”, Facebook is a host provider within the meaning of Article 14 of the E-commerce Directive potentially eligible for the safe harbour protection offered therein. This conclusion, in any case, does not remove the possibility for such a subject to be the addressee of an injunction.
It is true that the conditions and detailed procedures applicable to injunctions against intermediaries are a matter of national law. However, the rules set by individual Member States must comply with the requirements laid down under EU law, in particular in the E-commerce Directive.
Prohibition of general monitoring
In this sense, this piece of EU legislation (Article 15(1), read in combination with Article 14(3)) prohibits the imposition of a general monitoring obligation on, inter alia, host providers. If a general monitoring obligation were imposed, this would also remove the safe harbour, as the host provider targeted by such an injunction would no longer be neutral:
the activity of that host provider would no longer retain its technical, automatic and passive nature, which would imply that the host provider would be aware of the information stored and would monitor that information. Consequently, the implementation of a general monitoring obligation, imposed on a host provider in the context of an injunction authorised, prima facie, under Article 14(3) of Directive 2000/31, could render Article 14 of that directive inapplicable to that host provider.
Possibility of specific (targeted) monitoring and need for a global assessment
The E-commerce Directive (Recital 47 and Articles 14(3) and 18) does not however prohibit monitoring obligations in a specific case: a host provider might well be ordered to prevent an infringement, and not just to bring an existing infringement to an end. The CJEU has confirmed this in its case law, including the seminal decision in L’Oréal v eBay (which also concerned Article 11 of the Enforcement Directive), in which it held that an obligation for the future must concern infringements of the same nature by the same recipient of the same rights.
The AG also recalled that in his Opinion in Mc Fadden [Katpost here], he had advised the CJEU to rule that, in order for a monitoring obligation to be considered applicable to a specific case it must, in particular, be limited in terms of the subject and the duration of the monitoring. In relation to the latter, the AG observed that a permanent monitoring obligation would be difficult to reconcile with the concept of an obligation applicable ‘in a specific case’.
In any event, a ‘global assessment’ is required in order to answer whether a certain injunction is compatible with Article 15(1).
Bearing all this in mind, the AG considered the specific case at issue, also with regard to proportionality and the need to balance different fundamental rights, and concluded that Facebook may be ordered to:
- Seek and remove information identical to the information characterized as illegal when this is also disseminated by other users of the platform;
- Seek and remove information equivalent to the information characterized as illegal only when this is disseminated by the user who disseminated said information. Holding otherwise, and extending Facebook's obligation to information disseminated by other users, would entail general monitoring on the part of the provider (which could no longer be considered neutral), and would also fail to achieve a fair balance between the different rights and interests.
It's a Katworld
Removal worldwide
Turning to the question of the geographic scope of the removal of content found illegal under the law of a certain Member State – and given that it is technically possible for a provider like Facebook to remove content worldwide – the AG considered that the assessment should differ depending on the right at issue.
The legal situation at issue here (defamation) is one for which there has been no substantial harmonization. This makes the assessment different from that concerning a substantially harmonized right (as is the case for the right to be forgotten at issue in Google v CNIL). In his Opinion in that case, AG Szpunar advised that de-referencing should be carried out only in relation to the EU, albeit there might be situations warranting application of the provisions of Directive 95/46 beyond the EU.
With regard to the present case, the AG deemed it necessary to address, first, who would have jurisdiction to adjudicate on an injunction with extraterritorial effects and, second, what (geographic) scope such injunction should have.
As regards the former, the AG noted that the E-commerce Directive does not contain jurisdiction rules and that the rules of the Brussels I Regulation recast would apply. In eDate Advertising, the CJEU found that, when personality rights are infringed by content disseminated online, the person concerned may sue in the Member State where they have their centre of interests, and the court seised has jurisdiction to adjudicate on all the damage suffered. From this it follows that:
the court of a Member State may, as a general rule, adjudicate on the removal of content outside the territory of that Member State, as the territorial extent of its jurisdiction is universal. A court of a Member State may be prevented from adjudicating on a removal worldwide not because of a question of jurisdiction but, possibly, because of a question of substance.
Turning to the issue of scope of the removal obligation, the AG noted that the E-commerce Directive is silent in this respect, and that the applicant in the background proceedings did not rely on EU law. As such, the situation would be different from the one at issue in Google v CNIL. According to the AG, EU law does not prevent the imposition on a host provider of an obligation to remove content worldwide, also because international law does not prevent an injunction from having extraterritorial effects. This said,
the court of a Member State may, in theory, adjudicate on the removal worldwide of information disseminated via the internet. However, owing to the differences between, on the one hand, national laws and, on the other, the protection of the private life and personality rights provided for in those laws, and in order to respect the widely recognised fundamental rights, such a court must, rather, adopt an approach of self-limitation. Therefore, in the interest of international comity, to which the Portuguese Government refers, that court should, as far as possible, limit the extraterritorial effects of its injunctions concerning harm to private life and personality rights. The implementation of a removal obligation should not go beyond what is necessary to achieve the protection of the injured person. Thus, instead of removing the content, that court might, in an appropriate case, order that access to that information be disabled with the help of geo-blocking. Those considerations cannot be called into question by the applicant’s argument that the geo-blocking of the illegal information could be easily circumvented by a proxy server or by other means.
The face you make when you have perfectly understood what you need to do
Comment
This is yet another thoughtful Opinion by AG Szpunar whose echo will be heard in the future and in different fields, including in IP and outside the realm of intermediary injunctions.
Starting from the latter (injunctions), the AG reminded us that when it comes to these measures, different rights are at issue and need to be carefully balanced. Also, a proportionality scrutiny and, more generally, a 'global assessment' need to be undertaken. In all this, injunctions can entail specific monitoring obligations.
None of the above is new, although over time some commentators have attempted to propose a narrow interpretation of the scope - both substantial and territorial - of injunctions against intermediaries.
Turning to something else, what is particularly interesting in this Opinion is the relationship between Article 15 and intermediaries' obligations in relation to monitoring and filtering. In particular, the Opinion of AG Szpunar may also shed some light on something that, of course, is not at issue in this case: Article 17 of the DSM Directive [here].
As readers know, the text eventually adopted includes a specific mention that "the obligations established in this Directive should not lead to Member States imposing a general monitoring obligation" (Article 17(8) and Recital 66).
Yet, it is unclear where the line is to be drawn between specific and general monitoring, and what actual obligations online content sharing service providers (a category which is likely to include Facebook) are subject to. In this sense, the different treatment of identical and equivalent content, as proposed by AG Szpunar, might find its application also in the context of Article 17 obligations.
Let's now wait and see what the CJEU decides. Stay tuned!
I'm just curious, is the AG's opinion positive, neutral or negative? :)
In what respect? I think it's pretty multifaceted. In case you're interested, here's a longer take on it: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3438102