Germany’s Network Enforcement Act (Netzwerkdurchsetzungsgesetz, or NetzDG) entered into full force on January 1, 2018. Known colloquially as a “hate speech law,” it is arguably the most ambitious attempt by a Western state to hold social media platforms responsible for combating online speech deemed illegal under domestic law. Other countries, such as France, are using NetzDG as a model for proposed legislation, making analysis of NetzDG urgent and important. While NetzDG has encouraged accountability and transparency from large social media platforms, it also raises critical questions about freedom of expression and the potential chilling effects of such legislation.

During a first meeting at Ditchley Park, UK (February 28-March 3, 2019), the Transatlantic Working Group (TWG) analyzed NetzDG on the basis of an earlier draft of this document. The current version has been revised to incorporate crucial insights from those discussions. This updated document introduces NetzDG’s content and political context and discusses its implications for freedom of expression. It closes with next steps and recommendations for further research and transparency, and suggests how to mitigate the troubling elements of the law.

At the outset, it is important to clarify that NetzDG does not create new categories of illegal content. Its purpose is to enforce in the online space 22 statutes that already existed in the German criminal code, and to hold large social media platforms responsible for their enforcement.
The 22 statutes include categories such as “incitement to hatred,” “dissemination of depictions of violence,” “forming terrorist organizations,” and “the use of symbols of unconstitutional organizations.” NetzDG also applies to other categories, such as “distribution of child pornography,” “insult,” “defamation,” “defamation of religions, religious and ideological associations in a manner that is capable of disturbing the public peace,” “violation of intimate privacy by taking photographs,” “threatening the commission of a felony,” and “forgery of data intended to provide proof.”

NetzDG targets large social network platforms with more than 2 million users located in Germany. It requires these platforms to provide a mechanism for users to submit complaints about illegal content. Once they receive a complaint, platforms must investigate whether the content is illegal. If the content is “manifestly unlawful,” platforms must remove it within 24 hours; other illegal content must be taken down within 7 days. Platforms that fail to comply risk fines of up to €50 million.

NetzDG also imposes transparency requirements. A platform that receives more than 100 complaints per year must publish semi-annual reports detailing its content moderation practices, and the act stipulates in some detail what types of information must be included.3 The first round of reports was published in June 2018; the second round appeared in early 2019. Their results are discussed in further detail below.
