
Charlie Beckett

November 18th, 2020

10 things I have learnt about AI and journalism in 2020


Estimated reading time: 5 minutes


LSE’s Professor Charlie Beckett, Director of POLIS and of the JournalismAI project, reflects on the first anniversary of the publication of the JournalismAI report, New Powers, New Responsibilities.

It has been a year since we published the results of our global survey on what news organisations are doing with and thinking about AI technologies. The big themes that stuck out for me from that research were:

Augmentation: Most use cases were designed to improve the effectiveness and efficiency of the work of human journalists, not to replace them. AI-powered technologies were used to connect content better to the public, rather than to replicate the editorial process.

Knowledge gap: There was a shortage of people skilled in AI technologies and, just as important, a lack of knowledge across news organisations about the potential and pitfalls of those technologies.

Strategy: In a fast-moving, highly pressurised industry, there was a lack of strategic thinking about a set of technologies that can have a systemic impact on all aspects of journalism.

I don’t think the report missed much or got things wrong, but since then we’ve had a year of intense and wide-ranging research and activities with journalists using AI around the world. We’ve created online training courses and innovation workshops to address some of the issues that the report raised. It’s been an exciting and educational time. In December we’ll showcase at our JournalismAI Festival the most interesting results, and discuss the issues that have been raised during the year. But here are some of the things I have learnt since we published the report, or things that were small in the report but now feel much bigger:


1. Despite all the historic bias against women in technology generally, we have worked with many brilliant women at the cutting edge of journalism and AI in the last year. It could be because there are plenty of great women in journalism anyway and they see the potential of AI. It could be that the barriers are coming down slowly. But it also appears that the new interdisciplinary, cross-category ways of working with AI open up avenues for women with the talent and aptitude to work in innovative ways.


2. Collaboration isn’t just a cosy word; it’s essential. It is rare for a news organisation to have all the knowledge, skills and resources to implement AI on its own. Working across departments, between news organisations, or with external partners such as universities, start-ups or tech companies is a great way to accelerate development. Journalists are notoriously competitive, but, inspired by their more collaborative-minded colleagues in other departments, they are starting to see the benefits of networks that support innovation.


3. The inequality between news organisations – and between the news industry and other sectors – is much bigger than I’d realised. Some of it replicates traditional divides, such as between local and national news organisations, or between different language markets. But the danger is that AI could exacerbate the inequality as the pioneers accelerate ahead. However, as our Collab process showed, by sharing insights and working collaboratively it is possible to mitigate those structural inequalities.


4. Implementation is difficult. There are some off-the-shelf tools, but generally it takes a significant amount of time and resources to integrate, review and iterate any new AI process. It takes a strategic approach to understand the editorial and ethical implications of AI and how it fits in with your journalistic mission and production flows. Many are scared of “the robots taking our jobs”, but the real challenge is to prepare for the new roles and skills needed to handle the technology.


5. Bias is a complex problem. There is the well-known issue of gender and race discrimination inherent in data or programming. But there are other potential risks when editorial starts to depend on any technology, including AI. Story selection and sourcing can become influenced by algorithms. The good news is that AI can also be used to understand those biases and even mitigate human or organisational biases. It’s one of the most exciting ideas that emerged from our Collab.


6. So-called ‘filter bubbles’ created by AI, for example in personalising content feeds, are not a problem (yet), but everyone raises them as a potential one. Right now, newsrooms face a much bigger challenge: optimising the experience of time-poor, easily distracted users. It is important that news organisations promote content diversity and serendipity for the public. But AI can also help newsrooms to broaden the public’s engagement, not just to give them ‘what they like’.


7. AI is not that intelligent (yet). Journalism can be complicated and stories can’t always be reduced to data. Most current AI uses are based on processes that are binary and predictive. To work at scale they have to be kept quite simple. It’s fun to speculate about ‘robots’ writing sophisticated articles or reading the news. But for anything out of the routine, humans are still the most efficient and effective journalists.


8. Journalism can improve AI. The news industry certainly needs to learn and get support from other sectors, such as technology researchers and companies. But journalism is (supposed to be) a critical, informed, independent practice with the public interest at its heart. Journalists must better understand these technologies so they can report on them and hold algorithms accountable. But journalists should also help the public better understand the ethical and social impact that AI might have on our lives in general.


9. Workplace cultural barriers to the adoption of AI, as for most new technologies, are still significant. Again, I have been surprised at the persistence of semi-myths such as ‘the robots will take our jobs’. This is understandable at a time when the news media faces so many challenges: shrinking revenue, political pressures, and the impact of the pandemic. Journalists are weary of constant change, and this is a major challenge for leadership teams. They need to get to grips with this technology and discover how it might help meet the problems journalism and journalists face every day.


10. The best thing that I have learnt in 2020 is that, despite all these challenges, so many journalists are incredibly hard-working, talented, resilient, innovative, collaborative, curious and committed. In this pandemic year, hundreds have taken time out of their incredibly busy days to contribute to our project in a variety of ways. Judge the results for yourself at our JournalismAI Festival in December, but for me the last year has been an inspiring journey with some wonderful people. It’s been a tough period for all of us in some way, personally or professionally. But we have all learnt that good journalism is more vital than ever. I am more convinced than before that, properly understood and responsibly deployed, AI can continue to add to journalism’s value and its power.

JournalismAI is a project of POLIS, supported by the Google News Initiative. This post was originally published on the POLIS blog and is reposted with thanks. This article represents the views of the author and not the position of the Media@LSE blog, nor of the London School of Economics and Political Science.

Featured image: Photo by Markus Winkler on Unsplash

About the author

Charlie Beckett

Professor Charlie Beckett is the founding director of Polis, the think-tank for research and debate around international journalism and society in the Department of Media and Communications at LSE.

