
Martin J. Riedl

September 14th, 2020

A primer on Austria’s ‘Communication Platforms Act’ draft law that aims to rein in social media platforms


As governments tighten their regulatory grip on big tech, a reterritorialization of the Internet is taking shape. Laws such as the German Network Enforcement Act (NetzDG) and the French Avia law expand liability regimes for takedowns of unlawful or hateful content. While the EU Commission is expected to harmonize laws across the bloc with the Digital Services Act, the newest kid in town comes from the small Alpine Republic of Austria, which is proposing its own platform liability law: the Communication Platforms Act (KoPl-G). Martin J. Riedl, a PhD researcher from the University of Texas at Austin, explains.

On 3 September, Austrian lawmakers presented a new law for platform accountability in Austria: the Kommunikationsplattformen-Gesetz (KoPl-G), or Communication Platforms Act in English, is a “draft federal act on measures to protect users on communication platforms.” The law is part of a larger package targeting “Hass im Netz” (online hate), amending the Austrian civil and penal codes — as well as media law — well beyond the introduction of the Communication Platforms Act itself.

Yet another NetzDG?

To platform policy enthusiasts, many aspects of the Austrian draft law will look familiar. After all, it is unabashedly modeled after the Network Enforcement Act (NetzDG), passed by the German Federal Parliament in June 2017. The NetzDG requires social media platforms to implement procedures for reporting and the takedown of illegal content within specific time limits. If platforms do not abide by these rules, they risk incurring heavy fines. The NetzDG garnered its share of criticism from civil society organizations as well as the political circuit, so much so that in 2020, the German Federal Ministry of Justice and Consumer Protection proposed substantive amendments to the law.

(Why) should one care about the Austrian Communication Platforms Act?

The Austrian draft law is an interesting case study in at least three respects:

(1) It is slated to preempt the highly anticipated Digital Services Act, a proposal by the EU Commission set to create bloc-wide standards. The Austrian draft law is a litmus test of how both platform operators and the EU Commission might react to (yet) another national solution.

(2) It copies, in many respects, the German NetzDG, but also introduces new ideas in the domain of platform regulation. Some of these are particularly noteworthy, such as indirect financial pressure leveraged against non-compliant platforms via their debtors (by blocking ad revenues), or mandatory reporting requirements on the workforce tasked with carrying out content moderation.

(3) It serves as a vantage point for monitoring debates about the roles and responsibilities that platforms should assume in societies, and as a vehicle for tracking discourse around how platform liability regimes in the EU walk a fine line between maintaining freedom of expression and cracking down on online hate and illegal content.

Key points about the Austrian Communication Platforms Act

  1. It targets platforms with either more than 100,000 users or an annual revenue exceeding €500,000. Notably, there are exemptions for Wikipedia, online news forums, and online marketplaces. (The thresholds and deadlines in this list are summarized in the short illustrative sketch below.)
  2. Platforms must take down certain types of illegal content within 24 hours (the law defines this as content whose “illegality is already evident to a legal layperson”) and otherwise unlawful content within seven days after a complaint has been filed. Taken-down content, along with information identifying the user and the date/time, must be archived for 10 weeks to preserve evidence for possible prosecution.
  3. If platforms don’t comply, they could be subject to fines of up to €10 million, although the regulator would have leeway in setting the fine by factoring in dimensions such as the platform’s size and previous violations.
  4. Platforms must introduce comprehensive due-process reporting systems through which users can report content, contest takedowns, and inquire why something they reported has not been deleted. Appeals are made to the platform, but complaints can be filed with the Austrian Regulatory Authority for Broadcasting and Telecommunications (RTR), whose independent supervisory body is the Austrian Communications Authority (KommAustria).
  5. Platforms must introduce a responsible point of contact, essentially a person who can be served. If platforms fail to provide contact information for such a person, through whom courts and users can reach them, they can ultimately be fined indirectly by forbidding their debtors (e.g., businesses that advertise on the platform) from paying what they owe.
  6. Platforms must publish reports on takedowns — on an annual basis if they have more than 100,000 users, and on a quarterly basis for those exceeding 1 million users. What is truly innovative are reporting requirements on content moderators: platforms have to provide a “[d]escription of the organisation, personnel and technical equipment, technical competence of the staff responsible for processing reports and review procedures, as well as the education, training and supervision of the persons responsible for processing reports and reviews” (German and English versions of the draft law here).
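To make the interplay of these thresholds and deadlines easier to follow, here is a minimal, purely illustrative Python sketch of points 1, 2, and 6. The names (Platform, law_applies, takedown_deadline, reporting_cadence) are invented for this post and do not appear in the draft law, which of course defines its obligations in legal rather than computational terms:

    from dataclasses import dataclass
    from datetime import timedelta

    @dataclass
    class Platform:
        registered_users: int       # users of the platform (point 1)
        annual_revenue_eur: float   # annual revenue in euros (point 1)

    def law_applies(p: Platform) -> bool:
        # Point 1: more than 100,000 users or revenue exceeding EUR 500,000
        # (exemptions such as Wikipedia or online marketplaces are ignored here).
        return p.registered_users > 100_000 or p.annual_revenue_eur > 500_000

    def takedown_deadline(manifestly_illegal: bool) -> timedelta:
        # Point 2: 24 hours where illegality is evident to a legal layperson,
        # otherwise seven days after a complaint has been filed.
        return timedelta(hours=24) if manifestly_illegal else timedelta(days=7)

    def reporting_cadence(p: Platform) -> str:
        # Point 6: quarterly reports above 1 million users, otherwise annual.
        return "quarterly" if p.registered_users > 1_000_000 else "annual"

    # Example: a large platform falls under the law, must act on manifestly
    # illegal content within 24 hours, and must report quarterly.
    big = Platform(registered_users=2_000_000, annual_revenue_eur=50_000_000)
    assert law_applies(big)
    assert takedown_deadline(manifestly_illegal=True) == timedelta(hours=24)
    assert reporting_cadence(big) == "quarterly"

None of this captures the legal nuance of the draft (for instance, how “evident to a legal layperson” would be assessed in practice); it simply restates the numeric thresholds above in executable form.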

Reception of the draft law

Commentators have lauded different aspects of the law, such as the fact that authorities retain the right to define explicitly, by issuing ordinances, what transparency reports should look like, which would improve cross-platform comparability. Austrian legal scholar Hans Peter Lehofer alludes to the symbolic nature of a small country like Austria pursuing national platform governance, thus “testing the Commission’s capacity for suffering” (translated from German). ISPA, the Austrian association of digital service providers that represents companies including Google and Facebook, says it is generally in favor of tackling online hate but critical of national solo efforts in place of European solutions.

Concerns mimic criticism that the German NetzDG received upon its inception, both from civil society organizations and from the political realm. These include fears over the law’s impact on freedom of expression, the proclivity of any law that expands liability to cause so-called “overblocking,” the notion that private companies are rendered arbiters of what speech is acceptable, and economic concerns for small businesses for which compliance with the law may be difficult. Austrian political parties’ responses have been relatively welcoming, except for the far-right Freedom Party, which surmises that the law aims to censor and/or silence political competitors.

Next steps

As the Austrian Communication Platforms Act undergoes a national and European evaluation phase, it will be important to monitor the degree of politicization in the country itself, as well as the kind of response it triggers from the EU Commission. The Commission might, for instance, be interested in putting the brakes on an Austrian law in anticipation of its own Digital Services Act. It is conceivable that the broader impact of the Austrian law, beyond pressuring the Commission, will be minor, especially compared with other important cases (see here, here, here, or here) in the larger domain of platform governance originating in Austria.

This article represents the views of the author, and not the position of the Media@LSE blog, nor of the London School of Economics and Political Science.

About the author

Martin J. Riedl

Martin J. Riedl (MA, Hanover University of Music, Drama and Media; MA, Humboldt University of Berlin, Germany) is a PhD candidate in the School of Journalism and Media at the University of Texas at Austin, where he works as a research associate for the Center for Media Engagement (CME) and for the Technology and Information Policy Institute (TIPI). His interests include journalism, content moderation, media sociology, internet governance, and social media.

