[Guest post] ‘Upload filters’ and human rights: implementing Article 17 of the Directive on Copyright in the Digital Single Market

The IPKat is happy to host a guest contribution by Felipe Romero Moreno (University of Hertfordshire), based on a recent scholarly article available here, on the national transpositions of Article 17 of the Directive on copyright in the Digital Single Market [Katposts here], which the European Parliament adopted a year ago today and which individual EU Member States will have time to transpose into their own legal systems by 7 June 2021.

Here's what Felipe writes:

‘Upload filters’ and human rights: implementing Article 17 of the Directive on Copyright in the Digital Single Market
by Felipe Romero Moreno

A careful balance
The EU Directive on Copyright in the Digital Single Market (the CDSM Directive) requires the European Commission, in cooperation with copyright holders, services, user groups and others, to organise stakeholder dialogues to discuss practical solutions for the implementation of Article 17 therein (the provision concerning online content sharing service providers (OCSSPs)’ obligations in relation to the making available of user uploaded content (UUC), also known in jargon as the ‘value gap’ provision). 

Article 17 also requires OCSSPs to enter into licensing agreements with rightholders for the making available of copyright content uploaded by users of their service. If a licence is not concluded, these services must make ‘best efforts’ to prevent the making available of infringing content. 

However, as of March 2020, it remains to be seen whether domestic implementations of Article 17 will respect the fundamental rights to a fair trial, privacy and freedom of expression under the European Convention on Human Rights, the EU Charter, the E-Commerce Directive 2000/31 and the General Data Protection Regulation (GDPR). Let me break this down for you.

Compatibility of Article 17 with the right to fair trial, privacy and freedom of expression under ECHR

According to the Commission, uploading GIFs, memes or similar UUC is expressly permitted, as Article 17(7) states that the CDSM Directive must in no way impact lawful uses [see however the IPKat’s take on the lawfulness of GIFs and memes as such in Europe here], thereby allowing users to rely on copyright exceptions for the purposes of quotation, criticism, review, caricature, parody or pastiche. However, there’s just one problem. This is to be read vis-à-vis Ofcom’s warning that the lawfulness of these types of UUC cannot easily be determined using content analysis alone, but rather requires knowledge of the surrounding context.
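Ofcom’s point can be illustrated with a deliberately simplified toy sketch (not any real filtering system, and the names and matching method are purely hypothetical): a fingerprint-based filter compares each upload against a rightsholder database, so a verbatim infringement and a lawful parody reusing the same clip look identical to it — the surrounding context never enters the decision.

```python
# Toy illustration of a context-blind upload filter. All names here are
# hypothetical; real systems use perceptual fingerprinting, not plain hashes.
import hashlib

# A mock rightsholder reference database of content fingerprints.
RIGHTSHOLDER_DB = {hashlib.sha256(b"protected-clip").hexdigest()}

def upload_filter(content: bytes) -> str:
    """Blocks any upload whose fingerprint matches the reference database."""
    fingerprint = hashlib.sha256(content).hexdigest()
    return "blocked" if fingerprint in RIGHTSHOLDER_DB else "allowed"

# A parody that embeds the protected clip is treated exactly like piracy:
infringing_upload = b"protected-clip"
parody_upload = b"protected-clip"  # same bytes, different (lawful) purpose

print(upload_filter(infringing_upload))  # blocked
print(upload_filter(parody_upload))      # also blocked - context is invisible
```

The filter sees only the content itself, never the purpose of the upload, which is why automated matching alone cannot distinguish infringement from uses covered by the Article 17(7) exceptions.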

Compatibility of Article 17 with general monitoring obligations under Article 15 E-Commerce Directive

But it gets better. Article 17(8) CDSM Directive expressly prohibits Member States from implementing upload filters in a way which could lead to the imposition on service providers of a general monitoring obligation of UUC. Pursuant to Article 15 E-Commerce Directive, Member States are prohibited from imposing on these services a general obligation to monitor user information. Recital 47 E-Commerce Directive also allows Member States to require services to perform a monitoring obligation in a specifically targeted situation. Moreover, as per Recital 48 E-Commerce Directive, such services can also adopt ‘duties of care’ to identify and prevent unlawful activities.

Consistent with the CJEU decision in Glawischnig-Piesczek v Facebook, for general monitoring obligations to become lawful ‘duties of care’ and ‘specific’ enough to comply with Recitals 47 and 48 E-Commerce Directive, services should deploy a hierarchical identification technique, which involves less intensive data processing. Moreover, rightholders should create databases of ‘relevant and necessary information’ along with business rules that exclusively tackle commercial-scale copyright infringement.

Compatibility of Article 17 with the GDPR

Under Article 17(9) the implementation of the CDSM Directive must not result in any disclosure of user information or in the processing of personal data, except pursuant to the GDPR and the E-Privacy Directive 2002/58. Under Article 22 GDPR, when deploying upload filters, service providers can only subject users to decisions based solely on automated processing, including profiling, which produce legal or similarly significant effects, where the decision is: (a) necessary for the performance of a contract; (b) authorised by EU law, such as for implementing the obligations laid down in Article 17 CDSM Directive; or (c) based on the user’s explicit consent.

And on top of that, whilst Article 22(3) GDPR compels these services to adopt suitable measures to safeguard users’ rights and freedoms, Article 22(4) GDPR additionally requires that users’ sensitive data, such as content revealing political opinions or religious beliefs, only be processed where necessary on substantial public interest grounds. Considering the CJEU’s Rigas case, does the legitimate interest of services prevail over users’ rights and freedoms? For Article 17 to be GDPR-compliant, users should have the right to be informed through clear just-in-time notices about the gathering and use of their data, and to access those data. In case of wrong identification, users should also be able to exercise their rights to rectification, erasure and restriction of processing, to object to profiling, and to receive compensation.

So what’s the solution?

The European Commission will consider the input received during the six stakeholder dialogue meetings once the Covid-19 crisis allows this. A final meeting would be stakeholders’ chance to offer their views on the substance of the guidance for Article 17 implementation, before the Commission launches a written stakeholder consultation.

The Commission has said that it ‘remains available to engage in discussions and receive and publish position papers by stakeholders not selected for the physical meetings’.

At a time when the stakeholder dialogue is the most carefully thought-out provision laid down in the CDSM Directive, the Commission would be ill-advised not to take on board academic advice and feed this into the guidance for Article 17 implementation (see here for a more detailed analysis).
Reviewed by Merpel McKitten on Thursday, March 26, 2020