[Image: Mind the gap, please]
Following a number of delays, a few days ago the JURI Committee (Legal Affairs) of the European Parliament finally adopted the text of the Report on the proposed Directive on copyright in the Digital Single Market (DSM Directive) as drafted by its Rapporteur, MEP Voss.
This development, which would allow the European Parliament to begin trilogue negotiations (negotiations between the Council and the European Parliament to reach a compromise between their respective versions of the DSM Directive), follows the earlier vote in the Council of the European Union and, with it, the Council's agreed negotiating mandate [here].
This week (5 July) the plenary of the European Parliament will vote on whether the trilogue negotiations, on the basis of the mandate represented by the JURI Committee Report, may begin [see this helpful Politico infographic].
The text of the proposed DSM Directive has attracted significant attention since its release by the EU Commission in September 2016. Among other things, the provision in Article 13 (value gap or transfer of value) has been commented on extensively. According to critics, this proposal – if adopted – would dramatically affect the functioning of the internet and introduce brand-new obligations for online actors (online content sharing service providers).
But would this actually be the case?
The structure and content of Article 13 in the JURI Committee Report
Compared to earlier versions of the value gap provision, the most notable elements of the JURI Committee version are probably those outlined below.
Definition of online content sharing service providers
Online content sharing service providers that would be subject to Article 13 obligations are defined (Recital 37a and Article 2) as ‘information society service providers one of the main purposes of which is to store and give access to the public or to stream copyright protected content uploaded / made available by its users and that optimise content, including amongst others promoting displaying, tagging, curating, sequencing the uploaded works or other subject-matter, irrespective of the means used therefor, and therefore act in an active way.’
Non-commercial service providers (eg online encyclopaedias), providers which allow content to be uploaded with the authorization of all rightholders concerned, providers of private, open source software developing platforms, and online marketplaces whose main activity is online retail of physical goods fall outside the scope of the definition of online content sharing service providers and, therefore, of Article 13 of the DSM Directive.
Nature and obligations of online content sharing service providers
The JURI Committee Report (Article 13 and Recitals 37-39c) states that providers within the definition above:
- are deemed to make acts of communication to the public and are therefore responsible (and potentially liable) for user-uploaded content (UUC) made available through their services;
- if they make acts of communication to the public, are not eligible for the safe harbour within Article 14 of the Ecommerce Directive; in any case, the safe harbour would not apply when a service provider plays an active role, including by optimizing the presentation of the uploaded works or subject-matter or promoting them, irrespective of the nature of the means used therefor;
- are under an obligation to conclude ‘fair and appropriate’ licensing agreements with relevant rightholders (the latter are, however, under no obligation to issue licences); this obligation also applies to information society service providers that automatically reproduce or refer to significant amounts of copyright works and make them available to the public for the purpose of indexing and referencing (Article 13b) (the latter appears to resemble a law adopted in France in 2016);
- benefit from licences that also cover any liability of users of the service for non-commercial UUC, in line with the terms of the relevant licence;
- shall, in cooperation with rightholders, take ‘appropriate and proportionate’ measures to ensure the functioning of the licensing agreements concluded for the use of relevant works on their services;
- are under an obligation to prevent the availability of infringing content by adopting proportionate and effective measures – based on information provided by rightholders – while not preventing the availability of lawful UUC (this obligation subsists even when the safe harbour protection applies and in the absence of licensing agreements);
- are under an obligation of transparency towards rightholders and users alike regarding the use and implementation of the relevant measures.
Obligations of EU Member States
In the JURI Committee version, EU Member States:
- are under an obligation to ensure that the implementation of the measures referred to in Article 13 (i) is proportionate, (ii) strikes an appropriate balance between the different fundamental rights protected by the EU Charter of Fundamental Rights, and (iii) is in accordance with the prohibition of general monitoring within Article 15 of the Ecommerce Directive;
- must ensure that: (i) providers put in place effective and expeditious complaints and redress mechanisms to prevent misuses of, or limitations to, the exercise of relevant exceptions and limitations (any complaint filed under such mechanisms is to be processed without undue delay); (ii) the measures adopted by online content sharing service providers to prevent the availability of infringing content comply with the GDPR and the Directive on privacy and electronic communications, and require neither the identification of individual users nor the processing of their personal data; (iii) users have access to judicial remedies to assert reliance on an exception or limitation; and (iv) authors and performers who do not opt for a non-exclusive usage right for all users free of charge receive fair and proportionate remuneration for the exploitation of their works, including online.
What the law already says
The version of the value gap proposal approved by the JURI Committee proceeds from the idea that online content sharing providers make acts of communication to the public and are ineligible for the safe harbour protection in relation to their own copyright-relevant acts.
Commentaries on the JURI Committee Report have rapidly emerged [eg, here, here, here, here]. Criticisms have focused, among other things, on the following aspects:
- Online content sharing service providers do not make acts of communication to the public;
- Online content sharing service providers are eligible for the safe harbour protection;
- Online content sharing service providers may not be required to implement filtering systems;
- Article 13 of the DSM Directive would seriously impair freedom of expression/information, as well as data protection/privacy, thus breaching users’ fundamental rights.
The current legal framework – as developed at the level of both the Court of Justice of the European Union (CJEU) and national courts – seems already to have moved in the direction envisaged by the value gap proposal. It may indeed be the case that Article 13, even if adopted in the form proposed by the JURI Committee, would not represent a dramatic shift from the way in which the law has developed up to now. In relation to critics’ legitimate and important concerns, guidance in fact appears to be already available under the existing framework.
Responsibility and liability for unauthorized acts of communication to the public
The right of communication to the public within Article 3(1) of the InfoSoc Directive has been progressively construed by the CJEU through its nearly 20 judgments on this point. In more recent cases the Court has focused, in particular, on the ‘indispensable intervention’ of the user/defendant, and has referred to the user’s profit-making intention.
In Filmspeler [here], the Court held that an intervention enabling a direct link to be established between those who make available infringing works and users of such works “is quite different from the mere provision of physical facilities, referred to in recital 27 of [the InfoSoc Directive].” Such an intervention – made “with full knowledge of the consequences” of the conduct – facilitates access to unlicensed content that would otherwise be more difficult to locate, and triggers the liability of the person who makes it.
Consistently with this understanding, in its 2017 judgment in The Pirate Bay [analyzed in more detail here], the CJEU concluded that the operators of an online platform may be liable for unauthorized acts of communication to the public.
The undertaking by the platform operators of indexing, categorization, deletion, or filtering activities – no matter how these are performed – excludes any assimilation to the mere provision of facilities. The making available and management of an online sharing platform must therefore be considered an act of communication for the purposes of Article 3(1) of the InfoSoc Directive.
Although the platform at issue in The Pirate Bay was (and still is) principally devoted to piracy, it appears questionable to hold that the outcome of that case would only be applicable to egregious scenarios like the one at stake there. Acts of communication may also be made by the operators of other, non-piracy-focused, platforms giving access to UUC. National case law has begun to emerge and confirms this point: a court in Austria (in the context of interim proceedings) has recently ruled that YouTube makes acts of communication to the public. The Federal Court of Justice in Germany is also expected to rule in September on whether YouTube may be regarded as primarily responsible (and liable) for acts of communication to the public.
Overall, the trend – also at the judicial level – has been towards acknowledging that certain platforms may have moved away from being pure, passive hosts.
[Image: Thrilled to be in the water ... where's that safe harbour?]
Safe harbour protection
With regard to the unavailability of the hosting safe harbour within Article 14 of the Ecommerce Directive to platforms liable for unauthorized acts of communication to the public, this should not come as a surprise (although in its original proposal the EU Commission suggested that the safe harbour could still be available to platforms making acts of communication).
The safe harbours in the Ecommerce Directive are only available to passive providers. This is clear from the language of the relevant provisions as well as CJEU case law, including Google France and eBay.
Furthermore, the Ecommerce Directive (Recital 44) explicitly excludes the applicability of the safe harbours where mere conduit and caching providers commit direct infringements in collaboration with recipients of their services. It appears fair to assume that the same regime applies to hosting providers. The safe harbour in fact relates to the possible liability of a hosting provider on a secondary basis for third-party infringements, not to direct infringements by the provider itself (Recital 46 and Article 14(2)).
Although some scholars have suggested a different reading of the Ecommerce Directive (holding that the Ecommerce safe harbours would apply irrespective of the form of liability), the CJEU appears to have taken a different direction in The Pirate Bay. Unlike the Opinion of the Advocate General, the decision contains no references to the Ecommerce Directive and envisages broader hypotheses of liability than the Opinion does.
The lack of references to the Ecommerce Directive in that judgment suggests that the exemptions/limitations from liability envisaged therein would be inapplicable in cases of primary infringement by platform operators. This is coherent with the idea that the insulation offered by the safe harbours is only available to information society service providers that act as mere intermediaries.
[Image: Filtering]
Filtering obligations
With regard to the suggestion that EU law prohibits the imposition of monitoring (filtering) obligations, this is true with regard to general monitoring only (as also confirmed in eBay and McFadden).
Some commentators have referred to the twin decisions in Scarlet and Netlog, arguing that the CJEU has clarified that an obligation like the one that would be imposed under Article 13 of the DSM Directive would be absolutely contrary to EU law.
It is suggested that this is an incorrect reading of those judgments, which are narrower in scope than has been claimed. In fact, what the CJEU found incompatible with EU law in those cases (see also the operative part of each decision) was only a filtering system imposed on a provider that would: (1) filter information stored on its servers by its service users; (2) apply indiscriminately to all of those users; (3) as a preventative measure; (4) exclusively at the provider’s expense; and (5) for an unlimited period, and that would be capable of identifying electronic files containing copyright material with a view to preventing those works from being made available to the public without a licence.
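To make the cumulative nature of that holding concrete, here is a purely illustrative sketch in Python (my own schematic, not a legal test encoded anywhere): on this reading, only a measure meeting all five conditions at once falls within the Scarlet/Netlog prohibition, so a narrower, targeted, or cost-shared measure would escape it.

```python
# Purely illustrative: the Scarlet/Netlog incompatibility finding attached
# to a filtering system meeting ALL of the following conditions together.
def matches_scarlet_netlog_profile(
    filters_all_stored_user_content: bool,   # (1)
    applies_to_all_users: bool,              # (2) indiscriminately
    is_preventative: bool,                   # (3)
    at_providers_exclusive_expense: bool,    # (4)
    for_unlimited_period: bool,              # (5)
) -> bool:
    """True only if every condition is met; on this (narrow) reading,
    a measure escaping any one condition falls outside the prohibition."""
    return all([
        filters_all_stored_user_content,
        applies_to_all_users,
        is_preventative,
        at_providers_exclusive_expense,
        for_unlimited_period,
    ])
```

For instance, a filtering obligation whose costs are shared with rightholders, or one limited to a specific notified catalogue, would fail condition (4) or (1) respectively and so – on this reading – would not be caught.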
In The Pirate Bay the CJEU held that liability for unauthorized acts of communication to the public arises in cases of actual and constructive knowledge and – potentially – also in cases in which knowledge is presumed (in a GS Media sense). In this sense, operators of platforms with a profit-making intention would have an ex ante reasonable duty of care and be subject to an ex post notice-and-takedown system, which would also include an obligation to prevent infringements of the same kind, eg by means of re-uploads of the same content. This appears in line with eBay, in which the CJEU clarified the obligations of a ‘diligent economic operator’ and also held that an injunction against an intermediary may be aimed not just at repressing existing infringements but also at preventing new ones from occurring. National courts have also reached similar conclusions regarding the prevention of re-uploads of infringing content, eg in Germany and Italy.
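To illustrate what an obligation to prevent ‘re-uploads of the same content’ might amount to in practice, below is a minimal, hypothetical sketch in Python (nothing in the proposal mandates this design): an exact SHA-256 hash stands in for the perceptual fingerprints real content-recognition systems use, and uploads matching content previously identified in a rightholder’s notice are kept down rather than merely taken down.

```python
import hashlib

# Hypothetical notice-and-staydown sketch. Exact hashes stand in for the
# perceptual fingerprints a real content-recognition system would use.
notified_fingerprints: set[str] = set()

def fingerprint(data: bytes) -> str:
    """Compute a stand-in fingerprint for a file (an exact hash)."""
    return hashlib.sha256(data).hexdigest()

def register_notice(notified_file: bytes) -> None:
    """Record the fingerprint of content identified in a rightholder's notice."""
    notified_fingerprints.add(fingerprint(notified_file))

def screen_upload(uploaded_file: bytes) -> bool:
    """Return True if the upload may go live; False if it matches
    previously notified content and should be blocked or sent to review."""
    return fingerprint(uploaded_file) not in notified_fingerprints

# Once a notice is registered, a byte-identical re-upload is caught:
register_notice(b"bytes of the notified work")
assert screen_upload(b"bytes of the notified work") is False
assert screen_upload(b"bytes of some other work") is True
```

Exact hashing is of course trivially defeated by re-encoding, which is precisely why ‘proportionate and effective’ measures point in practice towards more robust – and more expensive – matching technology: one of the practical questions flagged in the conclusion below.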
This, in substance, is also what the language of the value gap proposal suggests, and it does not appear to be at odds with developments that have already occurred at the CJEU and national case law levels.
Fundamental rights, including freedom of expression/information and data protection/privacy
Finally, some have argued that the value gap proposal would ban memes and GIFs and ‘censor’ the internet. Yet Article 13: (a) requires undertaking a balancing of different rights and interests, and (b) is without prejudice to available exceptions and limitations (which remain optional for Member States to introduce, irrespective of whether the value gap proposal is adopted or not).
Some have also suggested that automated filtering systems are not in a position to determine whether a certain use of a copyright work falls under an available exception. Whilst this might be true (although platforms that already have filtering systems in place remain flooded with parodies, quotations, reviews, etc), the proposal also clarifies that Member States must ensure that systems to prevent misuses of, or undue limitations to, the exercise of relevant exceptions and limitations are in place, together with complaint and redress mechanisms.
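As a rough illustration of the complaint-and-redress mechanism the proposal contemplates – again a hypothetical sketch with invented names, not anything prescribed by the text – an automated match need not be the end of the matter: the uploader can assert an exception and have the block reviewed ‘without undue delay’.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Status(Enum):
    LIVE = "live"
    BLOCKED = "blocked"
    UNDER_REVIEW = "under review"

@dataclass
class Upload:
    content_id: str
    status: Status = Status.LIVE
    asserted_exception: Optional[str] = None  # eg "parody", "quotation"

def apply_filter_match(upload: Upload) -> None:
    """An automated match against notified content blocks the upload."""
    upload.status = Status.BLOCKED

def file_complaint(upload: Upload, exception: str) -> None:
    """The uploader asserts an exception; the complaint is queued for
    review rather than the item staying silently blocked."""
    upload.asserted_exception = exception
    upload.status = Status.UNDER_REVIEW

def resolve_complaint(upload: Upload, exception_applies: bool) -> None:
    """A human reviewer (or, ultimately, a court) decides the complaint."""
    upload.status = Status.LIVE if exception_applies else Status.BLOCKED

# A parody flagged by the filter can thus be reinstated on review:
video = Upload(content_id="clip-001")
apply_filter_match(video)
file_complaint(video, "parody")
resolve_complaint(video, exception_applies=True)
assert video.status is Status.LIVE
```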
If there is a problem with copyright exceptions and limitations in the EU (and with the freedom to make and post GIFs and memes), it has existed for a long time, well before and independently of the release of the proposal for a DSM Directive [The IPKat discussed it here].
It is due to two things: (a) Article 5 of the InfoSoc Directive leaves Member States free to pick and choose which exceptions and limitations to transpose into their legal systems (with the sole exclusion of temporary copies, which is mandatory); and (b) as a matter of fact, national transpositions of the Article 5 exceptions and limitations have differed across the EU, with the result – if one did not consider the work of the CJEU in this area – that there appears to be no real level playing field for copyright exceptions and limitations across the EU.
Finally, some critics have also noted that the obligations imposed on providers to prevent the availability of infringing content would be contrary to data protection/privacy principles and Article 8 of the EU Charter. This concern is difficult to grasp fully in abstracto, as the proposal refers expressly not only to the respect of fundamental rights but also to the GDPR and the Directive on privacy and electronic communications.
Conclusion
While the proposal in Article 13 (read in light of Recitals 38 and 39) raises practical questions (eg different approaches will likely be required depending on the type of content at issue; information regarding content will need to be provided accurately by rightholders to comply with the eBay decision; providers will need to make technical choices and adopt appropriate filtering systems; etc), some of the concerns raised against it relate to issues that have already been addressed within the existing EU framework, as also interpreted at the judicial level.
In this sense, the adoption of the value gap proposal would hardly signal a major departure from the law as it has already developed under existing legislative instruments: it would rather represent a consolidation, and possibly a clarification, thereof. While there is room to improve the text of Article 13 further, its main tenets do not appear at odds with EU law, including fundamental rights.
Comments

Very well written. I could not disagree more, unfortunately. While you are completely correct on a number of the points you discuss (in particular in claiming that some of the elements in the proposal have already been tried and tested in CJEU case law), the most important point lies elsewhere. You suggest that the proposal is not a major departure but a consolidation or clarification. Not so. While post-2001 EU law regulating the Internet in the content layer is liberalizing and permissive, the post-2015 package, this directive included, is built on fear, lobbying and defence from real and imagined dangers. It is as far from the 1997 EU Communication on E-Commerce or the Clinton/Magaziner vision of the Internet as enabler as is possible. This proposal is in open conflict with the main beliefs of the 2001 E-Commerce Directive (and I would not even begin discussing the issue of “platforms” being the subject of regulation rather than ISSs). No amount of subtle manoeuvring around SABAM would ever remove the taste of fear that the Juncker/Ansip digital agenda brings. If I have learned anything in my now 17-year-long career as an Internet lawyer, it is that problems such as these (the scope of copyright law) are always a manifestation of a larger lack of vision and fear of the new. This proposal – and in particular Arts 11 and 13 – needs to be killed, and killed swiftly and permanently, if we are to regain trust in the EU as a digital regulator.
This is all very interesting; the message I take away from it is that determinations on which services need to implement what types of copyright filtering mechanisms will take many years, and millions of Euros in legal fees, to figure out, if they ever get figured out at all, and if it even matters anymore by the time they are figured out. The vagueness and ambiguity here help no one.
The Voss proposal takes a bad idea in EU law and reinforces it: where a website optimizes or promotes infringing content, it becomes liable. But what if the 'optimization' is without knowledge of infringement? Then a fledgling site with 'optimizing' functionality, say displaying or sequencing, could be obliged to pay a huge licence fee for content a user has uploaded and which it has no knowledge of and no desire to host.
ReplyDeleteOn the other hand, an obligation to adopt content-recognition technology makes practical sense so long as it is clear which sites should comply.
Excellent news that the draft directive has been rejected. Now, rather than attempting to consolidate or clarify the existing bad law, they should attempt to draft new law that is good.
If the proposal is ever adopted – either in the form proposed by the Commission or the even worse mess made of that proposal by the Council and by and large supported by JURI – it will not survive a challenge before the CJEU.
The challenge may come from one of the Member States that did not support it in the Council, eg Germany and the Benelux. Or it may take the favoured route of a challenge via judicial review in the Irish courts, and there is no time limit for that. Remember that Digital Rights Ireland was based on a person who bought a phone and then challenged the state for retaining his data. So if one meme gets blocked, it is potentially reviewable. On what grounds? Fundamental rights.
This is what happens when rightsholders push it too far.
Having just listened to Axel Voss (the rapporteur) on Euractiv and his histrionic, ultra-rightwing, divisive speech – 'Europe First' it could have been – had I been an MEP thinking of voting for the law, this would have swayed me to vote against! Who wrote that for him?! I think they are badly misreading the mood here. However, in case you all forget, the EPP did exactly the same thing to the other proposal on introducing country of origin for online transmissions. Big rightsholders opposed it. So that JURI report proposal was also voted down, and yet it had the support of those who voted down this particular file.
So let the horse trading begin.
#anonymous 11.14: it is possible the E-Commerce Directive could be re-opened. That could result in better rules... But tbh aren't the major sites already taking voluntary steps to address illegal UGC? If so, perhaps new rules are unnecessary.
ReplyDeleteSomeone above argued "But what if the 'optimization' is without knowledge of infringement?" --
ReplyDeleteThis is not very important on a practical level when you consider the realities of the situation. What content provider is likely to 'optimise' some random cat video upload or something with similarly minute commercial value?
It's far more likely that what ends up being optimised is something of commercial value, like a pop song. That's not to say that cat videos don't go viral, but rather that there are no copyright issues in such cases, because what is being exploited is an original work owned by the uploader, and not an unauthorised one.
With that in mind, a content provider will almost always have at least constructive knowledge of what it is 'optimising' or 'promoting' or 'communicating' etc via its network.