
Blog Administrator

May 26th, 2016

Facebook is a news editor: the real issues to be concerned about


Natali Helberger and Damian Trilling, both of the University of Amsterdam and the Institute for Information Law (IViR), write that whilst Facebook’s use of human editors may bring comfort to some, there are wider issues to do with editorial responsibility that need to be addressed.

It’s out. Facebook is not some magic black box news machine. It’s using human editors. It is a bit ironic – for years, scholars and activists have been concerned about the fact that Facebook is automating news distribution by using (very opaque) algorithms to determine what content to feed its users. But now the cat has been put among the pigeons: it has become apparent that Facebook has hired a team of human editors to check, and indeed to edit, the algorithmic output. In other words, Facebook is (and behaves like) a news editor.

But… isn’t that actually good news? Facebook is turning to the wisdom of real human editors to train its algorithm. And Facebook also relies on the expertise of the traditional news media to determine which trending topics are real and which are fake. Admittedly, the reliance on ten or so trusted sources (many US-based, all English language) is a bit meagre, but still – the black box has acquired a human face. So why all the commotion?

First, this revelation by Facebook seems to have dispelled a myth for some commentators: algorithms are neither neutral nor some sort of Delphian oracle for what is going on in society – despite the fact that the Facebook algorithm can tap into probably the most comprehensive database of human insight and behaviour on the globe. Even the Facebook algorithm can make errors: it is not able to distinguish fake news from real news, nor to make an independent judgement of what trending topics actually are. In other words, the algorithm needs human input to learn which characteristics of a news story can predict whether it will be of interest to a given user (for more information about the use of supervised machine learning in the social sciences, see here).

According to the way in which Facebook claims it identifies Trending Topics, it is users who determine what is trending: “Trending shows you topics that have recently become popular on Facebook. The topics you see are based on a number of factors including engagement, timeliness, Pages you’ve liked and your location.” In reality, however, Facebook seems less convinced that users alone can determine Trending Topics. Perhaps counter-intuitively, the wisdom of the masses is not the best reflection of what is trending, and so Facebook takes over, bringing in professional editors and instructing them to keep an eye on traditional media sources to determine what should be trending.

The editorial team’s task is to screen topics and to add background information to news stories where necessary. In doing so, they provide input data that is used when training the algorithm to identify stories that might be of interest to users. It is not too surprising that humans provide such data: an algorithm needs some data to start with, some seed. Based on signals like user clicks, the algorithm can rank a story up or down – but if a potentially interesting story is not offered to at least some users in the first place, it simply cannot be ranked up or down because there is no signal at all. In fact, what Facebook’s editorial team does is essential to supervised machine learning: human annotators attach labels to data so that the system can learn from it.
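To make those mechanics concrete, here is a minimal sketch of such a supervised learning loop. It is purely illustrative and makes no claim about Facebook’s actual systems: the stories, labels, feature choices and the use of scikit-learn’s logistic regression are all assumptions chosen for brevity.

```python
# Illustrative sketch: human annotators provide labelled "seed" stories, a classifier
# learns from them, and fresh stories are ranked by their predicted interest score.
# All data, labels and feature choices here are invented for the example.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# 1. Human-labelled seed data: annotators mark which stories counted as trending-worthy.
seed_stories = [
    "Election results spark nationwide debate",
    "Local bakery wins regional pastry award",
    "New study links sleep to memory consolidation",
    "Celebrity spotted walking a dog in the park",
]
editor_labels = [1, 0, 1, 0]  # 1 = trending-worthy, 0 = not, per the human annotators

# 2. Train a simple text classifier on the labelled seed set.
vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(seed_stories)
model = LogisticRegression().fit(X, editor_labels)

# 3. Score fresh, unlabelled stories; higher scores would rank higher in a feed.
new_stories = [
    "Parliament votes on new election law",
    "Dog park opens next to local bakery",
]
scores = model.predict_proba(vectorizer.transform(new_stories))[:, 1]
for story, score in sorted(zip(new_stories, scores), key=lambda pair: -pair[1]):
    print(f"{score:.2f}  {story}")
```

The step that matters for the argument above is the first one: without human-provided labels, and without a story being shown to at least some users, there is nothing for such a model to learn from or to rank.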

But while the fact that this is happening hardly comes as a surprise from a technical point of view, it has important implications, which lead us to the real issues we should be concerned about. The case makes clear that Facebook is no longer simply a place to “help you connect and share with the people in your life”. Facebook is now aiming higher: it wants to inform people about what is newsworthy and what is worth knowing. Facebook is no longer simply a “platform for all ideas”. Trending topics is about newsworthiness and about setting an agenda for the Facebook community. And here things become tricky, because it is now clear that Facebook has crossed the line that distinguishes content hosts from content editors, and because being an editor comes with legal and ethical responsibilities.

What are the implications of this? Until now, the law has divided the online world into ‘hosts’ and ‘editors’. Editors were the traditional media, namely those who decided which content to make available and how to present it. Hosts were those who stored the online content of third parties, such as users or other editors, without engaging with it in any way. While in legal terms editors are fully responsible for the content they publish (including content posted by users), the responsibility of hosts is typically very limited. Though the legal situation is fragmented across the European Union, intermediaries such as social networks, news aggregators or search engines have in a number of cases successfully asserted the status of a host. As such, these platforms cannot be held principally responsible for (the quality of) the content published by others, the activities of their users, or compliance with the whole range of legal and ethical requirements that apply to the traditional media. But Facebook is no longer a host. Facebook has now clearly and officially crossed the line and shown itself to be an editor. According to the Council of Europe, the “editorial process involves a set of routines and conventions that inform decision making as regards content” – in other words, exactly what Facebook is doing.

What does that mean? It means that Facebook has editorial responsibility for its trending topics, and perhaps also other parts of the platform. It can be held accountable for its editorial policy and for the quality of its recommendations. And when viewed from this perspective, trending topics are bad news – literally.

This is because, if one is to believe a recent report, Facebook editors operate under challenging working conditions. Editors work under immense pressure to focus on quantity rather than quality. For example, US Facebook editors must approve at least 60 trending topics per day (preferably 80) to show in users’ feeds. Why 60? According to Facebook’s leaked editorial guidelines, if there were fewer than 60 topics, the personalised ranking of trending topics would no longer work and everybody would see the same news items. This is an interesting example of how personalisation strategies affect editorial policy.
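The logic behind that threshold can be shown with a toy simulation. This is not Facebook’s ranking system: the ten-item feed, the per-user random scoring and the pool sizes other than 60 are invented purely to illustrate why a small pool of approved topics makes every ‘personalised’ feed look the same.

```python
# Toy simulation: why personalised ranking needs a large enough pool of approved topics.
# Feed length, scoring and pool sizes (other than 60) are invented for illustration.
import random

def personalised_feed(topic_pool, feed_length=10, user_id=0):
    # Stand-in for a personalised ranker: each user gets their own "top" picks.
    rng = random.Random(user_id)  # per-user randomness as a proxy for personal taste
    return frozenset(rng.sample(topic_pool, feed_length))

for pool_size in (10, 20, 60):
    topics = [f"topic_{i}" for i in range(pool_size)]
    feeds = [personalised_feed(topics, user_id=u) for u in range(100)]
    distinct_feeds = len(set(feeds))
    print(f"pool of {pool_size:>2} topics -> {distinct_feeds} distinct feeds among 100 users")
```

As soon as the pool of approved topics is no larger than the number of slots each user sees, every feed is necessarily identical; a larger pool is what gives the personalised ranking something to choose between.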

A closer look at the editorial guidelines paints an even gloomier picture of what Facebook thinks is news. Apparently, the main editorial criterion for editors to consider is whether a topic reflects a real-life event, or whether the event has been invented. In order to understand what is trending, Facebook relies heavily on the wisdom and expertise of the traditional media. All that is left for Facebook’s editors to do is to tweak the headings, provide some context, check the spelling and add some video content. There is no mention of checking for accuracy, encouraging media diversity, respecting privacy and the law, or editorial independence. The interesting story in last week’s news therefore is not that humans are attaching labels to the content that is fed into Facebook’s algorithm, but rather the criteria on which their work is based.

Facebook would like to have the role of an editor without the associated editorial responsibility. This is not the way the game is played. Now that Facebook has crossed the line, it is time that media regulators and academics woke up and took a closer look at what Facebook thinks is news, and the role that it wants to play in the news business.

This blog gives the views of the author and does not represent the position of the LSE Media Policy Project blog, nor of the London School of Economics and Political Science. 
