
Natali Helberger

July 2nd, 2020

Challenging rabbit holes: towards more diversity in news recommendation systems



Access to diverse sources of news and information is more important than ever in this time of global crisis, yet far too often, people can find themselves diving into ‘rabbit holes’ of information and opinion that are hard to escape. Here, the following authors provide an in-depth assessment of how algorithmic recommendation systems for news can play a more constructive role in a diverse media landscape:

  • Abraham Bernstein, Professor at the Department of Informatics (Institut für Informatik) and Co-Director of the Digital Society Initiative at the University of Zurich
  • Natali Helberger, Distinguished University Professor for law and digital technology, with a special focus on AI at the University of Amsterdam
  • Wolfgang Schulz, Professor of Media Law at the University of Hamburg, and Director of the Leibniz-Institut für Medienforschung │ Hans-Bredow-Institut (HBI)
  • Claes de Vreese, Professor of Political Communication at the Amsterdam School of Communication Research (ASCoR)

The COVID-19 pandemic is also an ‘infodemic.’ Concerned about the flood of disinformation, and in response to the COVID crisis, UNESCO has emphasized once more the critical role of a healthy media ecosystem and media literate users for the resilience of the digital society. In times of crisis, citizens rely more than ever on access to accurate and reliable media, as the European Parliament has also stressed.

An important indicator of the health of the media sector is the presence of diversity of speakers and ideas. Exposure to diverse sources of news supports social cohesion, tolerance and peaceful coexistence of different cultures, ideologies and viewpoints. This is especially true in our COVID-19 times, which have shown to what degree our digital societies have become dependent on online sources and digital tools to help us find and access information and news. In its recent Communication on Tackling Covid-19 disinformation, the European Commission emphasized that “free and plural media is key to address disinformation and inform citizens,” and announced various actions to protect and stimulate media diversity, also in the digital realm.

Algorithmic recommendation systems for news are one of these tools. Driven by data and machine-learning, they can automatically select the content of newsletters, personalise news apps, or populate social media news feeds. Such systems also play an increasingly critical role in digital media markets as they enable news media and citizens alike to filter the abundance of information online. The functioning of social media sites is inextricably linked to the quality of their recommendation algorithms. Mobile aggregators and legacy (online) media are increasingly using algorithms to match content with users. The proliferation of news recommendation algorithms has given rise to lively discussions about the potential negative effects of algorithmic aggregation, ordering, and filtering for the public sphere, with concerns about polarisation, filter bubbles, echo-chambers and mis- and disinformation also figuring prominently in the discussion around the European Union’s eDemocracy Agenda.

We would like to argue, however, that filter bubbles, echo-chambers, and polarisation as a result of mis- and disinformation are not an inevitable consequence of digital technology, but rather the result of bad recommendation systems’ design. Recommendation systems can make or break filter bubbles. They can be instrumental in realising or obstructing public values and freedom of expression in a digital society. Much depends on the design of these systems. Are they merely designed to generate clicks and short-term engagement? Or are they programmed to balance short-term engagement and relevance with the longer-term interests of helping users to discover diverse news and viewpoints, while not missing out on important information?

Finding a way to realise the potential of algorithmic recommendation systems while at the same time promoting public values such as diversity was the challenge that an international group of experts from computer science, AI, political science, media law and theory, and communication science tackled late last year. In mid-November, this multistakeholder group met for a week at the remote Schloss Dagstuhl in Germany to brainstorm ideas and define a joint research agenda addressing the challenges related to the proliferation of algorithmic recommendation systems.

Here we will briefly recount some of the key insights for moving research and the debate about diverse recommendation systems design forward. The full Manifesto, which defines the group’s research agenda, can be read here.

New conceptualisations of media diversity are needed

Diversity of news is deeply ingrained in our understanding of what it means to live in a democratic society – a society that embraces the idea that each member of a democracy is entitled to a set of fundamental rights, including political rights, and is able to participate and have a voice. That said, digital technologies affect not only the way we use media, but also the way public values are translated into technological design choices and automated decision-making routines.

Current definitions of media diversity are still largely informed by the traditional mass media model. New measures and models of diversity are needed, both because existing models typically fail to capture the multi-dimensionality of diversity and because their high level of abstraction and vagueness does not lend itself to informing recommender design. To build diverse recommenders, far more concrete and measurable metrics are needed.
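What might such a measurable metric look like? As a purely illustrative sketch (not a metric proposed here), exposure diversity over a set of recommended articles could be quantified as the normalised Shannon entropy of their topic distribution – though, as the rest of this post argues, topic spread captures only one dimension of diversity:

```python
from collections import Counter
from math import log

def topic_diversity(recommended_topics):
    """Normalised Shannon entropy of topic exposure.

    Returns 0.0 when every recommended item shares one topic, and 1.0
    when exposure is spread perfectly evenly across the topics present.
    """
    counts = Counter(recommended_topics)
    if len(counts) < 2:
        return 0.0
    total = sum(counts.values())
    entropy = -sum((c / total) * log(c / total) for c in counts.values())
    return entropy / log(len(counts))  # normalise to [0, 1]

# A feed dominated by one topic scores lower than an even mix.
skewed = topic_diversity(["politics"] * 8 + ["culture", "science"])
even = topic_diversity(["politics", "culture", "science", "health"])
```

A metric like this is concrete enough to compute over a recommendation slate, but it says nothing about viewpoints, representation or deliberative quality – which is precisely why richer, interdisciplinary metrics are called for.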

A research agenda to achieve diversity in recommendations

Diversity in recommendations is a form of value-by-design and requires finding ways to conceptualise and implement exposure diversity in a way that can inform the design of recommendation algorithms. Given this need, some of our research aims include:

  • Building bridges between research in computer science, social sciences and humanities

Translating a value into concrete design requirements is not a straightforward process and involves at least three steps: conceptualising the value, translating a general value into norms (or metrics), and, ultimately, into specific design requirements. The data and models currently used in both computer and communication science are often too restricted in terms of representativeness, duration, and depth to reflect the complexity of diversity as a societal concept.

What does this mean in practice? In the computer science literature, generating a recommendation is often framed as a reranking problem. Given a set of items, the goal is to present them so that the item the user is most interested in appears at the top, followed by the second-most interesting one, and so forth. To generate more diverse recommendations, an element of novelty or serendipity is typically added: novelty compares the recommended items to what the user has already seen, whereas diversity compares the recommended items to each other (for instance, the distance between their topics).

This approach requires modelling differing opinions and viewpoints about each topic, aspects of representation or inclusiveness (e.g., vis-à-vis cultural minorities), or the extent to which certain mixes of content promote deliberation, tolerance, participation or other goals that diversity, as a normative concept, serves – an activity which needs to be informed by humanities and social science-based considerations.
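One standard formulation of such diversity-aware reranking is maximal marginal relevance (MMR), sketched below. The `distance` function is deliberately left abstract: whether it measures topic distance, viewpoint distance or representational difference is exactly the modelling question raised above.

```python
def mmr_rerank(items, relevance, distance, k=5, trade_off=0.7):
    """Greedy diversity-aware reranking (maximal marginal relevance).

    items     -- candidate article identifiers
    relevance -- dict mapping item -> predicted user interest (higher is better)
    distance  -- function(a, b) -> dissimilarity in [0, 1]
    trade_off -- 1.0 ranks purely by relevance; 0.0 purely by diversity
    """
    selected = []
    candidates = list(items)
    while candidates and len(selected) < k:
        def score(item):
            # Novelty = distance to the closest item already selected.
            novelty = min((distance(item, s) for s in selected), default=1.0)
            return trade_off * relevance[item] + (1 - trade_off) * novelty
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected
```

With `trade_off=1.0` this reduces to sorting by predicted interest; lowering it pushes items that differ from those already selected up the list – a small design knob that embodies the balance between short-term engagement and diverse exposure discussed throughout this post.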

From the perspective of (democratic) media theory, diversity is a ‘concept with a mission’ and considered pivotal in promoting the values that define democratic societies. The media diet that people are exposed to should reflect, in one way or another, the diversity of voices and ideas in a society. Diversity, in turn, is considered instrumental in realising a whole range of goals that we tend to value in a democratic society: from fostering informed citizenship and open-mindedness to tolerance, cultural inclusion and equal opportunities.

Achieving diverse recommendation systems that accommodate this democratic role of a diverse media diet requires a willingness of computer scientists to engage with social theories, as well as an increased readiness of social scientists to learn about technical aspects such as the functionalities of recommendation algorithms.

Addressing these challenges therefore demands collaboration between disciplines and the development of new, interdisciplinary, grounded measures of various diversity types that are aligned with societal goals and are concrete enough to lend themselves to computational implementation.

  • Developing new computational methods and ways of responsible research

The proliferation of news recommendation systems is both a result of and factor in the way in which digital technology is changing media consumption and the relationship between audiences and the media. Understanding how news recommendations systems can make a positive contribution to diversity of exposure requires a deeper understanding of both the impact on users and society and the possible risks generated by the system in question. Designing diverse recommendation systems therefore also requires taking into account the desired and projected effects on society, as well as the causal relationship between the chosen metrics and the types of diversity. These are novel questions that require new (computational) research methods and, first and foremost, access to data.  

Much has been said about the need to give researchers reasonable access to the masses of data that social media companies continue to gather. The central importance of such access for optimising recommendation systems for public values is another argument in favour of researchers’ access to social media data. Creating a safe harbour for academic research with industry data should be a prominent point of attention in the European Commission’s AI strategy and requires responsible behaviour from both researchers and industry. Looking forward, a code of conduct under Article 40 of the General Data Protection Regulation (GDPR) should be created to give this kind of data sharing a solid legal basis. (Such a code can enable the proper application of the GDPR to the needs of researchers and industry in this regard.)

  • Understanding the broader context

So far, many existing ‘diversity-by-design’ projects concentrate on recommendation systems. We cannot, however, see recommendation algorithms as design objects in isolation, but also need to look at the broader societal context in which the algorithm is being implemented, including the cultural context, the affected stakeholders, and how diverse stakeholders negotiate certain values.

Users comprise one such stakeholder grouping. A news recommendation system is very often a dynamic system in which explicit user choices, as well as indirect traces derived from user behaviour, feed into the algorithm and partially define the scope and diversity of future choice options, which in turn lead to new user choices. This highlights the importance of research into interface designs and the adaptive nature of these systems, the dynamic interaction between users, algorithms and news platforms, and how news recommendation algorithms can support the user to become more aware of the system’s inner workings and its influence on their media consumption.

Another important stakeholder that is often overlooked in this debate is government. Realising digital diversity is not a matter that can be solved with technical design alone. It is also the result of how professional users and end-users interact with the technology, the way digitisation and automation change journalistic value chains, how decision-making power is organised and, more generally, what the conditions are for digitising media. These conditions are influenced and shaped by laws and regulation, ranging from copyright and data protection law to the role of law in stimulating innovation, accountability and a diverse (local) media landscape. Legal and media governance research may not immediately figure among the usual suspects in diversity-sensitive design, but research into diversity-by-design also means diversifying research teams beyond those usual suspects and including disciplines and experts able to understand the broader context in which news recommendation systems are evolving.

Including legal and media governance perspectives is therefore pivotal to providing governments with the toolset to regulate the media landscape without stifling innovation and democratic rights.

Concluding reflections

News recommendation systems can be powerful tools to help users find their way in the plethora of available news; they can fight news fatigue, shape public opinion, and serve as a foundation for public cohesion. They are also extensions of the traditional editorial task. Hence, they should not just maximise for clicks and short-term revenue, but, mindful of the democratic function of the media, also optimise for values that align with the overall mission of a news outlet. To achieve that goal, researchers from different disciplines as well as industry and governments have to join forces and together revisit the fundamental functions of diversity in society, and the impact of news recommendation systems on society.

This article represents the views of the authors, and not the position of the Media@LSE blog, nor of the London School of Economics and Political Science.

Featured image: Photo by Obi Onyeador on Unsplash

 

About the author

Natali Helberger

Natali Helberger is Distinguished University Professor for law and digital technology, with a special focus on AI, at the University of Amsterdam.
