
Blog Administrator

September 15th, 2016

Facebook is a new breed of editor: a social editor

Prof. dr. N. Helberger (Natali), Professor of Information Law, in particular with regard to the use of information, at the University of Amsterdam. Amsterdam, the Netherlands, 29 October 2013. Photo: Jeroen Oerlemans.

Facebook’s approach to allowing, censoring or prioritising content that appears in the news feed has recently been the focus of much attention from both the media and governments. Professor Natali Helberger of the Institute for Information Law at the University of Amsterdam argues that we need to understand the new kind of editorial role that Facebook is playing in order to know how to tackle the questions it raises.

One moment, Facebook is required to exercise control over the content on its platform: the European Commission pushed Facebook and others to sign a code of conduct in which they commit to removing hate speech from their platforms within 24 hours, and the draft Audiovisual Media Service Directive suggests expanding the responsibility of platforms for controlling hate speech and harmful video content. The next moment, however, when Facebook does exercise control and removes content, a global debate ensues. The most recent incident was Facebook’s removal of allegedly pornographic content that was in reality an iconic documentary photograph from the Vietnam War. In response, the editor-in-chief of the Norwegian daily Aftenposten called Zuckerberg the ‘world’s most powerful editor’. But what kind of editor is Zuckerberg really?

What exactly constitutes Facebook’s extraordinary power to influence the media landscape, news readers and the news? It is clear that Zuckerberg has neither the training, the mission nor the ambition to act as an editor as we know the role – Facebook’s business interest is clicks, not news. In fact, the last true human editors have left Facebook, so that Facebook’s true editor is now its algorithm. It is also worth considering whether it is actually a good idea to officially confer on Facebook the status of editor in a legal sense, with all the editorial power and freedoms that come with it – particularly when it is clear that the current legal framework is not well prepared to deal with the kind of editorial power that Facebook has. Understanding the nature of Facebook’s editorial power is critical to formulating the standards and policy responses that fit the media world we want to live in. In this post I will argue that Facebook is actually a new breed of editor, a social editor, and that it is this socially constructed dimension of editorial control that we should strive to better understand: Facebook controls not only the content, but also the very architecture in which users encounter and engage with news.

Facebook is a ‘social editor’

In an earlier post for this blog, we argued that Facebook has crossed the line from being a mere host of user-created content to functioning as an editor of (professional) media content, at least for certain parts of its website, such as Trending Topics. However, it is also clear that Facebook is not an editor in the traditional sense, and certainly not in the sense that media law and policy are accustomed to. Facebook does not itself produce news, but it does aggregate news, it closes deals with media publishers for Instant Articles, and it even commissions content, for example for Facebook Live. Facebook establishes editorial and community guidelines – guidelines that apply to Facebook’s users, not to Facebook, as Facebook itself has no editorial mission. The social network plays a pivotal role in providing the edited recommendation service ‘Trending Topics’, with its ability to bring important issues to the attention of a wide range of users and to rank other topics into oblivion. Most importantly, social networks like Facebook organise the way in which the public debate around content takes place. They do so by collecting and integrating data from their users into the recommendation process, by calculating popularity and shareability, and by offering an entire architecture of tools for users to engage and share. This makes Facebook first and foremost a social editor: one that exercises control not only over the selection and organisation of content, but also, and importantly, over the way we find, share and engage with that content.

Privately controlled public sphere

A characteristic of social editors such as Facebook is the combination of content distribution, intimate knowledge of their users, and the ability to determine the architecture and terms under which users can engage with content, share it with others, ‘like’ it and comment on it, creating all kinds of social interactions. In so doing, platforms such as Facebook organise not just a single element of communication, but the entire process of political persuasion, understood here as the content, the space or platform to speak and deliberate about politics, and the network of people to speak to. In terms of public deliberation, Facebook is probably best described as nothing less than a privately controlled public sphere. This alone is reason for concern, given the growing importance of Facebook as a (primary) source of information (e.g. for the 28% of young people in the UK who say that Facebook is their main source of news). But what is at least as worrying is that this privately controlled public sphere exhibits a new form of vertical integration, one in which data and control over the user play a central role. With the information Facebook has about its users, and its ability to control every step in the process of deliberation and political persuasion (content, distribution and means of engagement), the social network has unheard-of opportunities to influence not only the diversity of supply and exposure, but also how people engage, the effects of exposure, and even what people think and feel after having been exposed to content by others. This is a very new, data-driven and social form of opinion power – one that traditional media policies have not yet given much thought to. The task of academics, and eventually also of law and policy makers, is to turn a watchful eye to this form of social opinion power: the way it can influence (intentionally or unintentionally) the process of democratic deliberation and affect the equal opportunity of opinions and ideas to reach, and engage with, the audience. But how can this be achieved?

Needed: a new concept of organisational responsibility for social editors

Facebook’s ‘editorial control’ is not so much about controlling the production of content (as in the case of traditional editors). Take the example of media diversity, i.e. the availability of a variety of content and opinions from diverse sources. As a ‘social editor’, Facebook creates and controls the organisational and technical conditions within which its users are exposed to, and engage with, (diverse) content. This is where Facebook’s potential impact on media freedom, pluralism and the marketplace of ideas lies. This is also why applying traditional content regulations and diversity safeguards (such as rules on media concentration, licensing, gatekeeper regulation and programming requirements) to Facebook is not the solution (see also here). Instead, the key to protecting and promoting media values, such as media diversity, on social networks is to better understand the effects of the technical design, the design choices that inform it, and the way these choices affect media diversity as the result of users’ interaction with media content. And if academics, law and policy makers conclude that action must be taken, such action will not consist of imposing the traditional diversity safeguards that we impose on traditional editors. The responsibility of social editors would be of a more organisational, technical nature.

Interestingly, a first step towards conceptualising such a form of alternative, organisational responsibility has been taken with the draft proposal for a revised Audiovisual Media Service Directive. Essentially, this is the idea that certain platforms, even if they have no editorial responsibility in the traditional sense, can be obliged to design their platform in a way that complies with media policy objectives. Under the draft Directive, this means putting in place contractual, organisational and technical measures to protect minors from harmful content, and to protect all citizens from incitement to violence or hatred (Art. 28a of the draft proposal). Arguably, this proposal is a first step towards developing a concept of the social responsibility of platforms in media law, and it is a useful point of departure. The next step would be to explore how intermediary platforms can be organised to enable values such as media diversity, truth in reporting, or the ban on hate speech in this socially constructed dimension of platforms.

To stay with the example of media diversity: platforms should not impose their own standards of diversity on readers (as the Vietnam picture example shows, doing so is doomed to failure because of their lack of editorial expertise). Instead, platforms such as Facebook should provide a choice of recommendation options, offering more serendipitous or diverse alternatives alongside ‘trending topics’, and leave it to the professional media and to users to make their own choices. Platforms should assist users in discovering people with different backgrounds and views, rather than continuing to suggest only the like-minded. They should encourage the development of the growing array of apps that aim to create awareness of users’ media diet (such as Many Angles, Bobble, Balancer, or MIT’s Media Meter). They should take care not to use their knowledge of users to persuade them to view (and like) particular content, but instead make neutrality and independence their mission. And they should open themselves to the scrutiny of others, including journalists and academics, to assess whether they have succeeded. Pluralism and media diversity today are no longer only about the mix of content users are exposed to; they are also about the way users autonomously encounter, share and engage with that content. The task of social editors such as Facebook is to leave these choices intact.

This post gives the views of the author and does not represent the position of the LSE Media Policy Project blog, nor of the London School of Economics and Political Science. 


