Outsourcing Censorship, Attacking Civil Liberties: Germany’s NetzDG

Germany’s new Network Enforcement Act has been described by the UN Special Rapporteur on freedom of expression as a law that raises grave concerns about freedom of expression and online privacy. MPP Student Subhodeep Jash weighs in on the harsh overreach of an act that confers on private networks the public powers of being judge, jury, and executioner when regulating hate speech, fake news and terrorist propaganda.

“Be conservative in what you send and liberal in what you accept” is how Jon Postel, a pioneering figure in the development of the internet, described the medium. The antithesis of this tenet seems to be the basis of Germany’s new Network Enforcement Act (‘NetzDG’), which came into force three weeks ago, as digital platforms face increased international political pressure to curb hate speech, fake news and terrorist propaganda.

The Act’s raison d’être is “to improve enforcement of the law in social networks” by combating hate speech and fake news through the regulation of unlawful content. Social networks are obliged to remove or block access to such content within a time frame of 24 hours to one week, depending on its gravity. In the case of non-cooperation, the provider of the social network can face fines ranging from €5 million to €50 million. The Act is a harsh overreach, and a surprising one amid a global environment of receding trust in public institutions. The UN Special Rapporteur on freedom of opinion and expression, David Kaye, has aptly summed up the law as one that raises grave concerns about freedom of expression and online privacy.

The legislation has the bearings of a power contestation between commercial logic and state interests. In Foucault’s theory of power, the Act would amount to a form of disciplinary control exercised through continuous surveillance. It is also a disproportionate response to the glaring problem posed by the excessive influence that a small number of private firms exercise over the content that the public creates and consumes online. To the Internet’s devotees, most of whom embrace some variety of libertarianism, the Internet’s structural resistance to censorship, or to any externally imposed filtration, is “not a bug but a feature.”

It is necessary to put into perspective why a law that outsources censorship to private actors and meddles with the architecture of cyberspace is so troubling. First, a system that confers on private networks the public powers of being judge, jury, and executioner, and of determining what constitutes merely unlawful versus ‘manifestly unlawful’ content under the German Criminal Code, skews the balance of power embedded in society. Delegating this context-specific legal function of determining the illegality of content purely to the social networks would have a ‘chilling effect’ on speech, since platforms facing heavy fines have every incentive to err on the side of over-removal. Secondly, the Act applies only to social networks with at least two million registered users, with exceptions for platforms offering journalistic or editorial content (e.g., the online edition of ‘Tagesspiegel’) and platforms designed for individual communication (e.g., WhatsApp or Telegram) or for the dissemination of specific content. It isn’t evident how the user base would be computed: would it be the active user base or an average value aggregated over a period of time? Thirdly, the Act doesn’t indicate how it seeks to address its extraterritorial application. Would platforms only remove content posted by registered users in Germany? What happens to content that originates outside of Germany?

The silver lining, if any, in this legislation is the mandated establishment of a local point of contact within these transnational social network providers to cooperate with law enforcement authorities on takedown requests.

Going forward, what do platforms such as YouTube need to take into account when dealing with such legislation? According to a recent BBC report, YouTube receives 200,000 reports of inappropriate content a day and manages to review 98% of them within 24 hours. But under the NetzDG, even the remaining 2%, roughly 4,000 reports a day, implies a violation of the Act if left unaddressed. Platforms would therefore first need to bolster the tools and resources behind their Community Guidelines enforcement systems. They would need to streamline efforts to build the robust complaint management procedure that the Act envisages by January 1, 2018. The platforms should also seek clarity from the Federal Office of Justice on certain aspects of the law, such as the treatment of extraterritorial user content (i.e., content originating outside of Germany) that is allegedly unlawful, as well as the meaning of “organizational deficiencies” under Section 3(4) of the Act. The platforms could also leverage the European Internet Forum to mount a strong advocacy campaign on the incompatibility of such regulations with the International Covenant on Civil and Political Rights and the proposed EU ePrivacy rules.

It remains to be seen, however, whether the law can withstand a constitutional challenge in court. The enabling rules for the NetzDG are expected shortly, given that the Act’s compliance obligations take full effect on January 1, 2018, and should hopefully provide a patch to the ‘bugs’ in the law. The fact that this legislation has already served as a model template for Russia to enact a similarly repressive law doesn’t speak well for a state expected to play its part as a vanguard defender of civil liberties, offline and online, within the region.

Subhodeep Jash is a Class of 2018 Master of Public Policy candidate. He is presently working on a project in the Global Governance unit at the Berlin Social Science Center. He is also a trivia nerd and is gearing up for a Mindhunter & Stranger Things (Season 2) binge on Netflix.