Resources for investigative journalism are diminishing. In the digital age, this was a foreseeable evolution: publishers typically regard these pieces as time-consuming and expensive, and the results of the research are often unpredictable and potentially disappointing. In this first of two posts, Pieter-Jan Ombelet of the KU Leuven Interdisciplinary Centre for Law and ICT (ICRI-CIR) analyses automated journalism (also referred to as robotic reporting) as a potential way to combat the decline of investigative journalism, and looks at the potential (positive and negative) impact of automated journalism on media content diversity.
What is automated journalism?
Automated journalism was defined by Matt Carlson as “algorithmic processes that convert data into narrative news texts with limited to no human intervention beyond the initial programming”. Narrative Science and Automated Insights are arguably the biggest companies at the moment specialising in this algorithmic content creation. Once there is core data to work with, the software of these companies can extrapolate complete news stories out of this data. To date, the most common uses of this software have been in the field of sports and financial reporting, often creating niche content that would not exist without the respective software (such as reports on ‘Little League’ games).
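The kind of data-to-narrative conversion Carlson describes can be illustrated with a minimal, hypothetical sketch: a template-based generator that turns a structured box score into a short game recap. This is an illustration only, with invented names and data; it does not represent the actual (far more sophisticated) software of Narrative Science or Automated Insights.

```python
# Hypothetical sketch of template-based data-to-text generation:
# structured game data in, short narrative recap out.

def game_recap(game: dict) -> str:
    """Turn a structured box score into a one-sentence narrative recap."""
    home, away = game["home"], game["away"]
    # Determine winner and loser from the scores.
    winner, loser = (home, away) if home["runs"] > away["runs"] else (away, home)
    margin = abs(home["runs"] - away["runs"])
    # Pick a verb based on the margin of victory, to vary the prose slightly.
    descriptor = "edged" if margin <= 2 else "beat" if margin <= 5 else "routed"
    return (
        f"{winner['name']} {descriptor} {loser['name']} "
        f"{winner['runs']}-{loser['runs']} on {game['date']}. "
        f"{game['top_player']} led the way with {game['top_stat']}."
    )

# Invented example data for a 'Little League'-style niche report.
box_score = {
    "date": "Saturday",
    "home": {"name": "The Hawks", "runs": 7},
    "away": {"name": "The Owls", "runs": 3},
    "top_player": "J. Smith",
    "top_stat": "three hits and two RBIs",
}

print(game_recap(box_score))
# -> The Hawks beat The Owls 7-3 on Saturday. J. Smith led the way with three hits and two RBIs.
```

Even this toy version shows why such systems suit sports and financial reporting: the input is clean, structured data, and the narrative follows from it mechanically.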
Don’t forget the humans!
Once these algorithms are optimised to allow newsrooms to use robotic reporters to write and edit news stories independently, this could have a serious impact on human journalists. Stuart Frankel, CEO of Narrative Science, envisions a media landscape in which “a reporter is off researching his or her story, getting information that’s not captured in some database somewhere, really understanding the story in total, and writing part of that story, but also including a portion of the story that in fact is written by a piece of technology.” In his vision, journalists would not be dismissed. The labour would merely be reallocated, thereby ensuring a higher level of efficiency. Moreover, the portions written by the algorithm would often provide meaningful output from complex data, and be less biased and in that sense more trustworthy than the work of a human journalist.
Other voices have expressed more caution. They emphasise the humanity that is inherent in high-quality journalism. This argument is valid, especially for wholly automated articles, which indeed lack the complexity, originality, authenticity and emotionality that only a human can express. An article written by an algorithm will never intentionally contain new ideas or viewpoints. This generic nature is one of the downsides of automated journalism when it comes to ensuring a diverse media landscape. The media play a crucial role in a representative democracy, characterised by its culture of dissent and argument. Generic news stories do not invigorate this culture.
Still, the evolution towards a media landscape in which algorithms write portions of the story should be embraced. However, there is an important caveat: these pieces should be edited by human journalists or publishers and supplemented by parts written by the human reporters themselves, to guard against a focus on purely quantitative content diversity, i.e. a merely numerical assessment of diversity that takes no account of quality.
Moreover, one must not underestimate the possibility of human journalists simply losing their jobs, or seeing their role reduced to editing algorithmic output. Carlson even highlights the predictions of certain technology analysts, who foresee that “recent developments in computing may mean that some white-collar jobs are more vulnerable to technological change than those of manual workers. Even highly skilled professions, such as law, may not be immune”.
Quality content remains crucial
Indeed, these are possible risks. Still, one should not overestimate these negative side effects and lapse into doom scenarios. People will remain interested in quality content. The reallocation of resources due to converging media value chains has had remarkably interesting consequences that often demonstrate this interest. Original content creation by streaming services such as Netflix and Amazon has had incredible success. Furthermore, the proliferation and popularity of user-generated (journalistic) content and citizen investigative journalism websites (e.g. Bellingcat) have shown that interesting new content is emerging, albeit perhaps in a less traditional form. We should therefore remain hopeful that Frankel’s attractive vision of reporters using technology to enhance the quality of their news stories will have a positive impact on media diversity and pluralism.
Follow-up post coming soon! In the second post on automated journalism and its effects on media pluralism, Pieter-Jan Ombelet will explore the possibilities of smart algorithms creating multiple customised versions of a specific news story for each individual reader. More specifically, the post will focus on the effects of this future usage of robotic reporting on exposure diversity and personal data protection.
This article gives the views of the author, and does not represent the position of the LSE Media Policy Project blog, nor of the London School of Economics and Political Science.