Send in the robots: automated journalism and its potential impact on media pluralism (part 2)

August 17th, 2015

In his previous post, Pieter-Jan Ombelet of the KU Leuven Interdisciplinary Centre for Law and ICT (ICRI-CIR) analysed automated journalism (also referred to as robotic reporting) as a potential solution to combat the diminution of investigative journalism. Here, he focuses on the future possibilities of robotic reporting in personalising specific news stories for each reader and assesses the potential (positive and negative) impact of automated journalism on the diversity of media exposure and personal data protection.

Don’t forget the individual user!

In the future, automated journalism could be used to bring personalised news products to individual users. Paul Bradshaw remains unsure of the added economic value of personalised automated news stories. Yet media personalisation techniques and complex algorithms, such as Google’s PageRank algorithm or Twitter’s Trends list, are already designed to define every user’s profile in order to develop an individualised relationship with them. These filtering techniques are used to customise news services to serve users’ specific needs and interests and to help them sift through the vast stores of online information. Following the same user-centric approach, algorithms could create multiple customised versions of a specific news story to better suit the taste, viewpoints or profile of every individual user.
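
To make the mechanism concrete, here is a minimal, hypothetical sketch (in Python) of content-based personalisation: a user profile is a set of weighted interests, and candidate stories are ranked by how well their tags match it. All names and data are invented for illustration; real systems such as PageRank or Twitter’s Trends are vastly more complex.

```python
# Hypothetical sketch of content-based news personalisation: each user
# profile is a bag of interest weights, and candidate stories are ranked
# by how well their tags match that profile.

from typing import Dict, List, Tuple

def rank_stories(profile: Dict[str, float],
                 stories: List[Tuple[str, List[str]]]) -> List[str]:
    """Return story titles ordered by overlap with the user's interests."""
    def score(tags: List[str]) -> float:
        return sum(profile.get(tag, 0.0) for tag in tags)
    return [title for title, tags in
            sorted(stories, key=lambda s: score(s[1]), reverse=True)]

stories = [
    ("Parliament debates media law", ["politics", "media"]),
    ("Local team wins derby",        ["sport"]),
    ("New smartphone released",      ["technology"]),
]

# Two users, one identical story pool, two different feeds.
print(rank_stories({"politics": 1.0, "media": 0.5}, stories))
print(rank_stories({"sport": 1.0}, stories))
```

The point of the sketch is simply that two users receive differently ordered feeds from an identical pool of stories, which is exactly the filtering dynamic at issue below.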

Evgeny Morozov fears that customisation at the level of the individual reader would discourage him or her from thinking critically, eruditely and unconventionally. Indeed, the personalisation of news items could become worrisome once the news stories automatically produced by algorithms are not merely factual but also include adjustable viewpoints. Luckily, this is not yet the case in practice. Nevertheless, even if the articles remain neutral, Morozov worries that “some people might get stuck in a vicious news circle, consuming nothing but information junk food and having little clue that there is a different, more intelligent world out there.”

Trapped in the filter bubble?

This view resembles the fears expressed by Neil Richards. In his work, Richards coined the term intellectual privacy, defined as “the protection from surveillance or unwanted interference by others when we are engaged in the processes of generating ideas and forming beliefs—when we’re thinking, reading, and speaking with confidants before our ideas are ready for public consumption.” Indeed, once news stories are adjusted for each individual, intellectual privacy is hindered. Trapped in a prison, in a prism of light, the idle audience will concentrate its attention on a narrow array of sources, a filter bubble, focused solely on its very specific needs and interests and containing only like-minded speech. The exposure diversity of the individual user could therefore be threatened by these customised versions of the same news story. In a pluralistic marketplace, where no source can lay claim to ‘absolute truth’, aware citizens, exposed to diverse and complementary sources and content, will have an advantage over oblivious ones. If citizens do not realise that they are reading a different version of the same news story than their neighbour, even critical citizens will partly lose their freedom of choice in composing a diverse media diet.

Additional issues surface when these personalised news stories are assessed legally. Personal data of individual users needs to be processed to properly conduct this far-reaching type of profiling. For example, ad networks use tracking techniques, cookie-based technologies and data-mining software to establish profiles of individual users. Online advertising systems often further classify data subjects into segments, for example into marketing categories such as “gardening” or “cars”. The location of the data subject can further be deduced from the IP addresses of their terminals and WiFi access points.
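
As a rough illustration of the segmentation step, the sketch below (hypothetical, in Python) maps tracked page visits onto coarse marketing categories. Real ad networks combine far more signals, such as cookies, device fingerprints and IP-derived location; this shows only the basic classification idea.

```python
# Hypothetical sketch of ad-network segmentation: tracked page visits
# are mapped onto coarse marketing categories, most frequent first.

from collections import Counter
from typing import List

# Assumed mapping from visited site sections to marketing segments.
SEGMENTS = {
    "/gardening/": "gardening",
    "/motoring/":  "cars",
    "/finance/":   "finance",
}

def segment_user(visited_urls: List[str]) -> List[str]:
    """Return the user's dominant segments, most frequent first."""
    hits = Counter(seg for url in visited_urls
                   for path, seg in SEGMENTS.items() if path in url)
    return [seg for seg, _ in hits.most_common()]

visits = ["https://example.com/gardening/roses",
          "https://example.com/gardening/tools",
          "https://example.com/motoring/reviews"]
print(segment_user(visits))  # ['gardening', 'cars']
```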

The importance of data protection

Along the lines of this example, the personal data processing involved in personalising news stories should be in line with the European privacy and data protection framework. More specifically, the provisions of the e-Privacy Directive (ePD) – or Cookie Directive – and the Data Protection Directive (DPD) should be respected whenever automated journalism involves personal data processing. In order to use personal data to write the story, the robotic reporter will have to obtain the unambiguous consent of the user (Article 5(3) ePD and Article 7 DPD), signifying their agreement to personal data relating to them being processed. Individuals also have a general right not to be subject to solely automated processing of data which evaluates certain personal aspects relating to them (Article 15 DPD). Personal data should furthermore only be collected for specified, explicit and legitimate purposes and not further processed in a way incompatible with those purposes (Article 6(1)(b) DPD). Every new purpose for processing data, such as personalising news items, must have its own particular legal basis: the robotic reporter cannot reuse personal data that was initially acquired or processed for another purpose, e.g. advertising. Moreover, the draft General Data Protection Regulation (GDPR) explicitly grants every natural person the right not to be subject to profiling (Article 20 draft GDPR). The ambiguity of the legal status of profiling – also in the context of personalised news stories – will therefore be removed once the regulation enters into force.
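
Purpose limitation is easy to illustrate in code. The sketch below is a hypothetical example, not an implementation of the directives: it merely shows how a system could record, per user, which processing purposes have an explicit legal basis, so that consent obtained for advertising does not silently carry over to news personalisation.

```python
# Hypothetical sketch of purpose limitation: consent is recorded per
# processing purpose, and personalisation requires its own basis.

from dataclasses import dataclass, field
from typing import Set

@dataclass
class UserConsent:
    user_id: str
    consented_purposes: Set[str] = field(default_factory=set)

def may_personalise(consent: UserConsent) -> bool:
    """Advertising consent does NOT carry over to news personalisation."""
    return "news_personalisation" in consent.consented_purposes

alice = UserConsent("alice", {"advertising"})
print(may_personalise(alice))   # False: advertising consent is not enough
alice.consented_purposes.add("news_personalisation")
print(may_personalise(alice))   # True: explicit, purpose-specific consent
```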

Conclusion: trusting the transparent robot

It is still unclear how sophisticated these news-content-creating algorithms will become. Yet, given existing algorithms that compose music and write poetry comparable to that of human composers and poets, it is never too early to be aware of the remarkable, for some even frightening, possibilities of artificial intelligence. Especially once personalised robotic news stories become reality, informing readers about the processing of their personal data involved in producing these stories will be crucial.

This post was not written by an algorithm.

This article gives the views of the author, and does not represent the position of the LSE Media Policy Project blog, nor of the London School of Economics and Political Science.
