Blog Administrator

May 29th, 2019

Germany proposes Europe’s first diversity rules for social media platforms

Tighter regulation of social media and other online services is now under discussion in several European countries, as well as in the UK, where the government has released a white paper outlining its proposed approach to tackling online harm. Here, Professor Natali Helberger, Paddy Leerssen and Max Van Drunen from the Institute for Information Law at the University of Amsterdam argue that a German proposal to impose diversity obligations on social media platforms’ algorithms deserves more scrutiny.

The proposed law

Germany continues to spearhead the regulation of social media. Last year the country made headlines with the Netzwerkdurchsetzungsgesetz (‘network enforcement law’ or ‘NetzDG’), the most ambitious attempt to regulate platform content moderation processes in Europe to date. Now, the German Broadcasting Authority (Rundfunkkommission) has proposed another law targeting social media platforms, though it has received far less attention than the NetzDG – and far less than it deserves. Perhaps this is due to its even more confounding name: Staatsvertraglicher Neuregelungen zu Rundfunkbegriff / Zulassungspflicht, Plattformregulierung und Intermediäre – or “Medienstaatsvertrag” for short. To our knowledge, this is the first regulatory proposal in Europe, and perhaps the world, to impose binding diversity obligations on social media platforms’ ranking and sorting algorithms.

This decision comes in response to widespread concerns about the dominance of social media platforms such as Facebook and YouTube. While substantial disagreement remains about the precise nature of the problem, it is clear that these platforms have become highly influential in steering audience attention and providing a gateway for users to receive all kinds of information — and that their commercial interests in optimizing algorithms for ad revenue may not always coincide with the democratic ideal of a diverse public sphere. In response, numerous regulators, NGOs and academics have argued that social media platforms should incorporate diversity into their algorithms (e.g. here and here). The Rundfunkkommission, however, is the first to address these concerns with a binding regulatory proposal.

In short, the new law would regulate algorithmic diversity and transparency in two categories of services: (1) “video platforms” such as Netflix and Hulu and (2) so-called “media intermediaries”, which covers a wide range of online services including social media platforms and search engines. This post describes the key features of this regime, and then discusses its significance for online media policy.

Europe’s first ex ante regulation of social media ranking and sorting algorithms

Firstly, the Rundfunkkommission is proposing regulation of “video platforms”, which it defines as any “service that consolidates audiovisual media into a single offering determined by the provider”. This language appears to cover streaming services like Netflix or Hulu, and to exclude user-generated content services such as YouTube, which are instead regulated as ‘media intermediaries’. The law exempts video platforms that reach fewer than 1 million users in Germany per month, as well as e-commerce platforms and services that serve exclusively private or family purposes (53c(1)).

The rules for video platforms are based on Germany’s earlier regulation of PayTV platforms and their Electronic Program Guides. This regime includes requirements for the diversity and findability of content, which would now be applied to online video platforms and their sorting algorithms. In general, video platforms are required to “ensure that the technology they rely on enables a diverse offering” (52(c)(1)). More specific duties are also included:

  • Non-discrimination: video platforms are prohibited from “unfairly hindering” the content they carry, or “treating it differently without a commercially justified reason”, both in terms of the access conditions for content providers and the search and browsing features for users (52(c)(2), 52(e)(2)).
  • Priority for public broadcasting content: public broadcasting content, for those platforms that offer it, should be “especially highlighted and made easy to find” (52(e)(3)).
  • User choice and customization: video platforms must offer users the choice between at least two different types of sorting logics, such as alphabetical, chronological or view-based sorting. More generally, the algorithm must also be customizable by the user (52(e)(2)). A minimal sketch of what this could look like follows this list.
  • Search features: Users must be able to access the video platform’s content through a search function, which must be “discrimination-free”. (52(e)(2))
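
To make the user-choice duty concrete, here is a minimal sketch, in Python, of what a choice between several sorting logics could look like. It is our illustration only: the VideoItem fields, the logic names and the sort_catalogue function are hypothetical, not taken from the draft law or any real platform.

```python
# Hypothetical sketch of user-selectable sorting logics; nothing here is
# specified by the draft law itself.
from dataclasses import dataclass
from datetime import datetime
from typing import Callable

@dataclass
class VideoItem:
    title: str
    published: datetime
    views: int

# Each sorting logic is a plain key function. Exposing at least two of
# them to the user is the kind of choice the draft law seems to require.
SORTING_LOGICS: dict[str, Callable[[VideoItem], object]] = {
    "alphabetical": lambda v: v.title.lower(),
    "chronological": lambda v: v.published,  # oldest first
    "view-based": lambda v: -v.views,        # most-viewed first
}

def sort_catalogue(items: list[VideoItem], logic: str) -> list[VideoItem]:
    """Order the catalogue by the user's chosen logic rather than a
    single provider-fixed ranking."""
    return sorted(items, key=SORTING_LOGICS[logic])
```

The point of the sketch is simply that offering a choice of sorting logics is technically trivial; the harder regulatory question is which logics count as meaningfully different.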

In addition to video platforms, the draft law also creates a new and relatively light regime for so-called ‘media intermediaries’ (Medienintermediär). Media intermediaries are services that “aggregate, select and make journalistic-editorial third party content publicly available without bundling that content into a service of its own” (Art. 13b). The law provides a number of rather wide-ranging examples, including search engines, social media services, app portals, user generated content portals, blogging portals and news aggregators. An interesting question is whether voice-assisted navigators like Apple’s Siri or Microsoft’s Cortana are covered by the definition.

The main requirement for media intermediaries is non-discrimination. The phrasing is similar to the non-discrimination rules for video platforms, albeit narrower: media intermediaries “may not unfairly disadvantage (directly or indirectly) or treat differently providers of journalistic editorial content to the extent that the intermediary has potentially a significant influence on their visibility” (Art. 53(e)). Unlike video platforms, they have no duty to offer a choice of algorithms, customization options, priority for public broadcasting, or search features. In sum, then, the draft law creates relatively detailed rules for “walled garden” video services such as Netflix (‘video platforms’), and a more light-touch regime for user-generated content services such as YouTube or Facebook (‘media intermediaries’).

Transparency obligations

The draft law also imposes new transparency obligations on both video platforms and media intermediaries, in order to inform users, content providers, and regulators.

For users, both video platforms and media intermediaries must disclose the selection criteria that determine the sorting and presentation of content (Art. 52(f) and Art. 53(d)). These disclosures must be made in easily recognisable, directly accessible and constantly available formats. In addition, video platforms must also disclose the way these criteria are weighted and the functioning of the algorithm, and must disclose to content providers the conditions for access to the platform. Users must also be informed of how they can adjust and personalize the sorting, and of the reasons that motivate content recommendations. This is a much more detailed, tailored approach to algorithmic transparency than in the GDPR, which uses open norms to demand an as-yet unspecified level of transparency, and it may therefore have strong added value. The added value would have been stronger still if the obligation to inform users about ways of adjusting the sorting criteria had also extended to media intermediaries.

For content providers, video platforms must also disclose certain information upon request (52(g)(3)), including information about the sorting and presentation of content, the conditions of access, and the use of metadata.

For media regulators, both video platforms and media intermediaries must also disclose “necessary documentation” upon request (53(f), 52(g)(1)). These rules are undoubtedly a response to the growing frustration in government and civil society about the opacity of social media and their proprietary recommendation algorithms. If the vagueness of the phrase ‘necessary documentation’ is a measure of the level of frustration, the frustration must run high. At the same time, this vagueness could be the source of considerable legal uncertainty and will likely be contested in future litigation.

Discussion

Diversity isn’t just about non-discrimination

The wording of Art. 53(e)(1) leaves no doubt that the non-discrimination rules for media intermediaries serve the goal of promoting and protecting diversity. This is remarkable, and makes the German draft law, to our knowledge, the first national regulation in Europe with explicit diversity obligations for social media and news aggregators. Non-discrimination, and the preservation of the so-called “marketplace of ideas”, is arguably one aspect of media diversity. But media diversity is about so much more.

The Council of Europe, for example, has defined media diversity (pluralism) in a recent recommendation as promoting “the availability, findability and accessibility of the broadest possible diversity of media content as well as the representation of the whole diversity of society in the media” (Recommendation on Media Pluralism and Transparency).

In this light, one may wonder whether equating diversity with non-discrimination is not too modest an approach, or even counterproductive. In an environment of information overload and attention scarcity, realizing a diverse offering of information may actually require giving visibility to particular types of content – an act of editorial differentiation that goes beyond, or may even conflict with, mere equal treatment. Ironically, from this perspective, a worst-case scenario is that the non-discrimination obligation could actually reduce diversity on media intermediaries, by restricting their ability to curate their rankings based on (discriminatory) editorial standards. Of course, platforms do not exercise editorial control in the same way that traditional media do, nor would this necessarily be desirable. But to purport that Art. 53(e)(1)’s non-discrimination rule ensures diversity on platforms is to simplify the concept of ‘diversity’ beyond recognition.

It is also clear that imposing diversity obligations on intermediaries is a complex endeavor, fraught with fundamental rights concerns; for a regulator to prescribe specific criteria for content ranking could quickly veer into unacceptable forms of censorship or propaganda. The recently adopted Council of Europe Declaration on the financial sustainability of quality journalism attempted to avoid this problem by outsourcing the development of such criteria (including credibility and diversity) to media actors, civil society, and platforms:

“In the exercise of their curatorial or editorial-like functions whereby they categorise, rank or display content, they develop, in collaboration with media actors, civil society and other relevant stakeholders, mechanisms and standards for assessing credibility, relevance and diversity of news and other journalistic content. Content that complies with such standards should be promoted over disinformation and other manipulative, malicious or blatantly false content, notably through improved distribution processes and clear information to the users on how to find and access credible news sources.” (12(b))

An interesting question is whether the German non-discrimination obligation limits platforms’ ability to use these criteria to prioritize credible news over malicious content, as the Council of Europe desired. Similar conflicts could also emerge with the European Commission’s Code of Practice on Disinformation, which demands the prioritization of ‘trustworthy’ content.

This is not to say that the non-discrimination requirement is inconsequential. Platforms are increasingly rolling out their own content services (examples here and here and here), which creates a risk of unfair competition and concentration in content markets. Some of these cases could already be addressed ex post through competition law (analogous to the famous ‘Google Shopping’ case), but such claims involve a relatively high burden of proof — including the difficult question of assessing market power. The Rundfunkkommission’s ex ante rules provide a more expansive and enforceable safeguard, which could help to clamp down on unfair competition in German markets for online media.

An interesting follow-up question is how these provisions will relate to the country’s sector-specific competition law; Germany, unlike some other member states, still has rules on media concentration, and a specialized regulator, the KEK. Another relevant connection is the EU’s proposal for a Regulation on Platform-to-Business Trading Practices, which creates redress and transparency rules for platforms that give ‘differentiated treatment’ to affiliated products (here). The regulation expands on the Rundfunkkommission’s transparency rules by requiring platforms to give reasons when they suspend operators. However, it only requires platforms to be transparent about the criteria they use to differentiate, and does not actually prohibit discrimination. Should it pass and be applied to the media, the regulation (which strives for maximum harmonization) could clash with the more stringent German rules.

Enforcement: civil society need not apply?

Only affected content providers are able to file a complaint with the responsible regulatory authority. This seems like a missed opportunity, since civil society actors could play a valuable role in policing platform algorithms. Academics and NGOs in particular have, in recent years, played an important role in drawing attention to, and scrutinizing, the way social media platforms and search engines exercise selection decisions, and the potential impact this can have on diversity and the health of the public sphere. The Rundfunkkommission could benefit from involving them in monitoring and enforcement processes. This would not only keep the public informed about platforms’ impact on the public sphere, but also give them a way to act on what they have learnt.

Concluding remarks

The German law is only the beginning of a new approach to platform regulation, not the final answer. For a law purporting to promote media diversity, the focus on non-discrimination leaves many questions open. Its most immediate effect may be to limit self- and co-regulatory attempts to curate newsfeeds based on editorial standards, such as the Council of Europe’s and European Commission’s demands to prioritize trusted news sources. Depending on one’s views, this can be seen either as obstruction of the fight against online disinformation, or as a safeguard against illiberal privatized gatekeeping. In any case, the draft law already contains significant diversity rules for proprietary video platforms such as Netflix. More broadly, it signals a much-needed departure from Europe’s light-touch ‘e-commerce’ approach to the regulation of platforms that act as important gateways to public information. Overall, the German draft law is an important national initiative towards greater transparency and societal responsibility for video platforms and media intermediaries, and can be expected to spur further discussion in Europe.

This article represents the views of the authors, and not the position of the LSE Media Policy Project, nor of the London School of Economics and Political Science. 
