[Guest post] The Implementation of Article 17 CDSMD in EU Member States and the Evolution of the Digital Services Act: Why the Ban on General Monitoring Obligations Must Not Be Underestimated

The IPKat is pleased to host the following guest post by Martin Senftleben and Christina Angelopoulos, focusing on the forthcoming national transpositions of Article 17 of Directive 2019/790 (CDSMD) [Katposts here] and what lies ahead for EU policy and law.

Here's what Martin and Christina write:

The Implementation of Article 17 CDSMD in EU Member States and the Evolution of the Digital Services Act: Why the Ban on General Monitoring Obligations Must Not Be Underestimated

by Martin Senftleben and Christina Angelopoulos

Filtering in practice
Article 17 of the Directive on Copyright in the Digital Single Market (‘CDSMD’) is an inexhaustible source of debate. Setting forth new obligations to prevent the appearance of illegal user uploads on online content-sharing platforms (Article 17(4)(b) and (c)), it has triggered a never-ending controversy on content filtering and potential overblocking. With national legislators in EU Member States trying to find the right implementation strategy at the domestic level and the European Commission working on a new architecture for internet service provider liability in the Digital Services Act (‘DSA’), the debate has reached a new peak. How far-reaching are the new content moderation obligations in the field of copyright law? Does it make sense to rely on Article 17 CDSMD as a template for regulating online platform services in the DSA?

Important new insights into these questions follow from a recent research project that focused on the prohibition of general monitoring obligations in EU law. This research shows clearly that, in the current debate on implementation options, Article 17(8) CDSMD offers important guidelines that must not be overlooked. This provision contains the traditional prohibition of general monitoring obligations known from Article 15(1) of the E-Commerce Directive (‘ECD’): providers of online content platforms may not be obliged to monitor their service in a general manner in order to detect and prevent the illegal activity of their users. The content moderation duties arising from Article 17(4)(b) and (c) CDSMD, thus, must be reconciled with the general monitoring ban laid down in Article 17(8) CDSMD.

This, of course, leaves open the possibility of monitoring ‘in a specific case’ (see also Recital 47 ECD). In the current debate, however, a misunderstanding of the difference between monitoring specific content and monitoring FOR specific content is a recurrent theme. Rightly understood, a prohibited general monitoring obligation in the sense of Article 17(8) CDSMD arises whenever content – no matter how specifically it is defined in rightholder notifications received under Article 17(4)(b) or (c) – must be identified among the totality of the content on a platform. The moment platform content must be screened in its entirety, the monitoring obligation acquires an excessive, general nature. Against this background, the research concludes that a content moderation duty can only be deemed permissible if it is specific in respect of both the protected subject matter and potential infringers.

More concretely, the requirement of ‘double specificity’ means that rightholder notifications under Article 17(4)(b) and (c) are required to be specific not only in respect of works and other protected subject matter, but also in respect of the circle of potential infringers belonging to the audience of the content platform at issue. To the extent that filtering is employed as a way of abiding by the requirements of Article 17(4)(b) and (c), it should therefore be confined to the uploads of those circles. In any case, Article 17(4)(b) and (c) should not be interpreted as requiring intermediaries to engage with notifications from rightholders containing long lists of protected works that have not been tailored to a specific sub-group of the platform audience. Otherwise, individual notifications may accumulate to such a volume that, taken together, the notified works give rise to a filtering duty which de facto amounts to a prohibited general monitoring obligation.

The decisions of the Court of Justice (‘CJEU’) in L’Oréal v eBay, Scarlet v SABAM, SABAM v Netlog and McFadden all point towards the CJEU embracing a definition of general monitoring in the area of copyright that bans the monitoring of all or most of the content handled by an intermediary on behalf of all or most of its users, even when this is carried out to identify infringements of a specific right. While the CJEU adopted an alternative approach in Eva Glawischnig-Piesczek for the area of defamation, this does not change the equation. By definition, defamation cases are more context-specific than copyright cases. They differ substantially from copyright scenarios because of the absence of holders of large right portfolios who could trigger a snowball effect by notifying long lists of protected subject matter. Filtering requests in copyright cases concern content that is fixed in advance of publication. In defamation cases, by contrast, the legitimacy of a filtering request depends on the specific – defamatory – nature of the uploaded content. Considering these substantial differences, the CJEU decision in Eva Glawischnig-Piesczek fails to provide guidance for the assessment of content moderation measures relating to literary and artistic works.

In addition, the requirement of ‘double specificity’ is indispensable in the field of copyright to prevent encroachments upon fundamental rights. The aforementioned CJEU jurisprudence has shed light on several aspects of the general monitoring ban, including its anchorage in primary EU law, in particular the right to the protection of personal data, the freedom of expression and information, the freedom to conduct a business (Articles 8, 11 and 16 of the Charter) and the free movement of goods and services in the internal market. Due to their higher rank in the norm hierarchy, these legal guarantees constitute common ground for the application of the general monitoring prohibition in secondary EU legislation, namely Article 15(1) ECD and Article 17(8) CDSMD. Accordingly, Article 15(1) ECD and Article 17(8) CDSMD can be regarded as exponents of fundamental rights and freedoms and the accompanying principle of proportionality.

Against the background of these insights, it is important that national legislators implementing Article 17 CDSMD note that the wording of Article 17 CDSMD itself leaves room for the introduction of a requirement of double specificity in the context of the new content filtering obligations. Article 17(4)(b) obliges rightholders to furnish ‘the relevant and necessary information’ for ensuring the unavailability of notified works. Similarly, Article 17(4)(c) requires a ‘sufficiently substantiated notice’ of an existing infringement. In the light of the need to safeguard fundamental rights and freedoms, a rightholder notifying only specific works, but failing to notify specific infringers, cannot be considered to have provided all ‘relevant and necessary information’ or a ‘sufficiently substantiated notice’. As a result, the notification will be incomplete and incapable of imposing a valid filtering obligation on providers of online content-sharing platforms.

The insights of this research on the general monitoring ban are also of particular importance with regard to the DSA, which is expected to take concrete shape in December. From the perspective of fundamental rights, three points follow:
  • any further manifestation of the general monitoring ban in the DSA would have to be construed and applied – in the light of applicable CJEU case law – as a safeguard against encroachments upon fundamental rights and freedoms. To the extent that filtering obligations fail to provide adequate protection to the rights guaranteed by Articles 8, 11 and 16 of the Charter or the principle of free movement of goods and services in the internal market, they cannot be imposed on intermediaries;
  • even if new legislation were to set forth obligations to monitor platform content generally, such monitoring would have to be made sufficiently ‘specific’, as a result of the need to strike a ‘fair balance’ between all protected fundamental rights;
  • if the final text of the DSA does not contain a reiteration of the prohibition of general monitoring obligations known from Article 15(1) ECD and Article 17(8) CDSMD, the regulation of internet service provider liability, duties of care and injunctions would still have to avoid inroads into the aforementioned fundamental rights and freedoms and observe the principle of proportionality.