German ‘hate-speech’ law tries to regulate Facebook and others - will it work?

Caught in the net(work)
In September 2017, a law with the euphonious name ‘Netzwerkdurchsetzungsgesetz’ (‘Network Enforcement Act’, NetzDG) was adopted in Germany. Its goal is to force social networks to remove hate speech and certain other unlawful content upon notification, within 24 hours in obvious cases and otherwise within seven days. The transition period ended on 31 December 2017 - meaning the new rules can now be enforced. Failure to delete content within these deadlines can result in heavy administrative fines of up to EUR 50m.

A short history of the law

A proposal for the new law was first made in March 2017, in reaction to several high-profile cases of fake news going viral and of defamatory speech against migrants on platforms such as Facebook and Twitter. The law was widely criticized for its lack of clear definitions and for its chilling effects, in particular the incentive for ‘overblocking’, i.e. precautionary censorship to avoid fines. Due to the broad criticism, the draft was changed numerous times, until the fifth and final version was passed only three months after the initial proposal.

What is a social network?

A key issue of the new law is the definition of the ‘social networks’ to which the NetzDG applies. Section 1 (1) defines ‘social networks’ as “telemedia providers who operate commercial platforms that are meant to enable users to exchange or share any kind of content with other users or to make such content available to the public”. While only Facebook, YouTube and Twitter were mentioned by name during the law-making process, this very broad definition clearly covers dozens - maybe even hundreds - of other platforms. The initial draft went even further and explicitly mentioned instant messengers (think ‘WhatsApp’) as targets of the law, though this was not included in the final version.

To make this overly broad definition somewhat more workable, platforms offering journalistic content for which the operator takes full responsibility are excluded from the definition. However, some uncertainty remains about what exactly will count as ‘journalism’, and whether the exemption also covers comments on such sites, for which the operator does not take full responsibility.

Platforms that focus on ‘specific topics’ are also excluded from the NetzDG. This provision is meant to exclude sites like LinkedIn, but also online games and online shops. Looking at the wide range of topics regularly discussed on such sites, this line remains blurry, too.

Finally, ‘small’ social networks are excluded by a threshold of 2m registered users. The initial draft only mentioned ‘users’ here, without any definition of how they would have to be counted - page impressions per week, month, year, or since the site’s beginning? While the limitation to registered users does provide some clarity, it still leaves room for uncertainty and strange results. MySpace most likely (still) has more than 2m registered users, but is it still actively used by that many? The NetzDG certainly provides an incentive for platforms to get rid of users that are not active on a regular basis: staying below the 2m-user threshold means exemption from potential EUR 50m fines, from the obligation to provide procedures for the handling of complaints, and from the duty to send regular reports to the responsible authority (the Bundesamt für Justiz, BfJ).

What kind of content is affected?

There is no legal definition of ‘hate speech’ or ‘fake news’ under German law. The original draft instead referred to 24 provisions of the German Criminal Code (StGB), covering a wide range of offences: from defamation of the President of the Federal Republic of Germany (section 90 StGB) and the depiction of violence (section 131 StGB) to forming a criminal organization (section 129 StGB), criminal insult (section 185 StGB) and criminal threats (section 241 StGB). While some offences were dropped and others added in the final version of the law, there are now 21 different criminal offences that are eligible to be treated as ‘hate speech’ under the NetzDG. The full list can be found in Section 1 (3) NetzDG.

Before and after

It needs to be said that Facebook, Twitter and other platforms were already obliged under German law to take down illegal content after receiving a notification. Still, the NetzDG changes the legal landscape significantly, at least for content that falls under one of the 21 criminal offences. Before the NetzDG, a user first had to send a notification to the platform, asking for removal of certain content (‘notice-and-takedown’). If the platform operator chose to leave the content online, the user then had to take civil action against the network - a step few users would take, given the costs and the uncertain outcome of such proceedings.

Under the new law, the process has changed significantly. First, ‘social networks’ have to provide effective and transparent procedures for the handling of complaints under the NetzDG. As part of these obligations, the platforms must designate a German address for service, similar to a designated DMCA agent. As a result, platforms such as Facebook and Twitter can now be served legal documents within Germany for the first time. They have also added ways for users to ‘flag’ content as a violation of the NetzDG.

In case of doubt – delete

Section 3 (2) NetzDG obliges social networks to take note of complaints expeditiously and to process them within a very short timeframe. Content that is ‘obviously unlawful’ under one of the 21 criminal offences has to be removed or blocked within 24 hours of receipt of the complaint. This is quite astonishing: the 24-hour timeframe is already very short, and on top of that it does not depend on actual knowledge of the infringement, but merely on receipt of the complaint. The German legislator, however, saw no conflict with Art. 14 (1) (a) of the E-Commerce Directive, which requires “actual knowledge of illegal activity or information”. No definition of ‘obviously’ unlawful content is provided, leaving legal uncertainty for the social networks and creating an incentive to delete too much content rather than too little. This effect is further strengthened by the fact that there are repercussions for failing to delete content after a notification, but none for deleting lawful content.

Content that is not ‘obviously’ infringing must be removed or blocked expeditiously, as a rule within seven days of receipt of the complaint. While this provides some extra time to look into more complicated cases, the social networks will still have to assess every complaint within the first hours after receipt. Otherwise, complaints about ‘obviously’ unlawful content might be identified too late for removal within 24 hours.
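To make the two clocks concrete, here is a minimal sketch in Python of the deadline logic just described. It is purely illustrative: every name in it is hypothetical, and it is not based on any real complaint-handling system.

```python
from datetime import datetime, timedelta

# Minimal, purely illustrative sketch of the Section 3 (2) NetzDG deadlines
# described above. All names are hypothetical.

OBVIOUS_DEADLINE = timedelta(hours=24)   # 'obviously unlawful' content
REGULAR_DEADLINE = timedelta(days=7)     # all other unlawful content

def removal_deadline(received_at: datetime, obviously_unlawful: bool) -> datetime:
    """Latest point at which content must be removed or blocked.

    Note: the clock starts at *receipt* of the complaint, not at the
    moment the platform gains actual knowledge of the infringement.
    """
    if obviously_unlawful:
        return received_at + OBVIOUS_DEADLINE
    return received_at + REGULAR_DEADLINE

# Because 'obviousness' decides which clock applies, every complaint must
# be triaged well within the first 24 hours after receipt.
received = datetime(2018, 1, 3, 9, 0)
print(removal_deadline(received, obviously_unlawful=True))   # 2018-01-04 09:00:00
print(removal_deadline(received, obviously_unlawful=False))  # 2018-01-10 09:00:00
```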

Failure to remove unlawful content within the given timeframes can result in administrative fines of up to EUR 50m. The original draft stated explicitly that a fine could be triggered by a single violation of Section 3. The final law takes a more liberal approach: according to its rationale, only ‘systematic errors’ in the handling of complaints will lead to fines. This provides some guidance, but still leaves room for interpretation, especially since ‘systematic errors’ are not mentioned in the wording of the law itself. The federal agency in charge of imposing such fines (the BfJ) has yet to release its ‘guidelines for fines’, which are supposed to provide legal certainty for social networks and their users.

Conflict with the E-Commerce Directive

One problem appears to have been overlooked completely by the legislator. Until now, social networks - which are hosting providers within the meaning of Art. 14 of the E-Commerce Directive - had to expeditiously remove any illegal information or activity after obtaining knowledge or awareness of it. Art. 14 has been implemented in section 10 TMG (the German Telemedia Act).

The NetzDG gives social networks the option of letting an external party decide whether certain content is unlawful under one of the 21 criminal offences (Section 3 (3) b): an ‘institution of regulated self-regulation’, a term that sounds as odd in German as it does in translation. The social network has seven days from receipt of the complaint to hand the matter over to one of these institutions. The institution then has another seven days to decide whether the content must be deleted or may remain online. If the institution finds the content to be unlawful, the social network must delete or block it expeditiously. This could severely impair the victim’s legal position: before the NetzDG, platforms had to act expeditiously in order to remain exempt from liability under Art. 14 of the E-Commerce Directive / section 10 TMG, whereas under the NetzDG the deletion can potentially take 14 days or longer. This paradoxical result was clearly not the goal of the NetzDG. Social networks will have to think twice before handing the decision on a complaint’s validity over to an external party: while they may be safe from a fine under the NetzDG, their general liability under civil law remains unchanged.
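A back-of-the-envelope comparison of the outer time limits shows the paradox in numbers - again a purely illustrative Python sketch with hypothetical names, not a statement of how any platform actually operates:

```python
from datetime import timedelta

# Purely illustrative comparison of the two routes' outer time limits,
# as described above. All names are hypothetical.

IN_HOUSE_ROUTE = timedelta(days=7)        # network assesses and removes itself
REFERRAL_WINDOW = timedelta(days=7)       # time to hand the matter over
INSTITUTION_DECISION = timedelta(days=7)  # institution's time to decide
SELF_REGULATION_ROUTE = REFERRAL_WINDOW + INSTITUTION_DECISION

print(SELF_REGULATION_ROUTE)                   # 14 days, 0:00:00
print(SELF_REGULATION_ROUTE - IN_HOUSE_ROUTE)  # 7 days, 0:00:00
# The self-regulation route can thus run for 14 days before removal even
# begins - twice the ordinary seven-day limit - plus whatever
# 'expeditious' deletion takes after the institution's decision.
```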

First visible effects

The NetzDG had been in full force for a mere 24 hours when German politicians raised concerns about its effects. Beatrix von Storch and Alice Weidel [IPKat reported about her before], both politicians of the right-wing AfD, had some of their Tweets and Facebook posts deleted by the social networks. Von Storch’s account was also suspended for several hours, according to German news reports. It appears that her comments were reported for incitement to racial hatred (section 130 StGB) and then removed by the social networks. Von Storch criticized this as a severe limitation of her political work as a member of the German parliament; in her view, letting Facebook, rather than the public prosecutors, decide whether an act constitutes a criminal offence amounts to the end of the rule of law.

While this Kat believes that statement is a bit far-fetched, it is quite possible that the same Tweets would have remained online before the NetzDG came into force, even after a complaint to the network.

Many questions remain: how much (legal) content will be deleted by the social networks? What will the German courts and the CJEU think of the new law? Will there ever be a fine against a social network? How would such a fine against a foreign company be enforced? What can users do to reinstate legal content that was removed under the NetzDG? Is there an actionable right to participation in social networks? Time will tell!