AI and journalism is a very technical subject, but we are discussing this technology at a critical historical moment for journalism. I don’t want to over-indulge in hyperbole, but in the midst of this pandemic we can see serious forces coming together that present a remarkable set of challenges to journalism. How can AI help?
[This article is based on a keynote speech by Charlie Beckett, Director of the Polis/LSE JournalismAI project, to a conference at Charles University, Prague: Challenges of Journalism – Automated journalism and AI journalism]
Here in Prague, you know all about the economic and political challenges – one could say threats – that much of global journalism now faces. Add to that the media-specific challenges of a changing information environment. These would be enough for any industry, but what is also clear is that the nature of journalism – its value and its values – is being questioned and reformed. It is vital to see the adoption of AI within that larger framework, even as we dive into the details.
I am a former journalist who has charted the changes in journalism over the last 15 years through my think-tank POLIS at LSE, where I am a professor in the Department of Media and Communications. We’ve seen three waves of technological disruption in the last couple of decades.
The Third Technology Wave
Firstly, the move online, which for me started at the BBC in 1997 with the creation of BBC Online. It was a seismic phase that took newspapers and broadcasters online. Back then, cultural newsroom resistance was a significant brake on progress but, with hindsight, we can also see a huge strategic failure by management, owners and leaders to embrace the profound nature of the shift to digital. We were left trailing in the wake of the more agile, better-funded tech companies. The new landscape was formed by others. I think that is happening again with AI.
Secondly, there was the phase of using social media as a place to gather news or sentiment, to promote journalists’ own content, and to engage with debates and the public. This was more complex and is still ongoing: are you on TikTok? Engaging with social media as part of networked journalism has challenged journalism in a more thoroughgoing way. It is part of a wider series of changes in the structure and context of journalism: disintermediation, competition, consolidation, and the growth of the passion economy.
And this has meant a change in journalism as a practice and in its relationship to the public. Central has been the ‘affective turn’: journalism paying more attention to, and being driven more by, the emotions, identity, feelings and values of the journalist, the subject and the audience. All this within an informational structure dominated by platforms that have provided incredible opportunities for scale, connectivity and innovation, alongside the proliferation of misinformation, the chaos of information overload, and the problems of the attention economy and its hyper-fast, badly-moderated flows of content.
And now, here come the robots.
If artificial intelligence is the third wave of technology change, then perhaps we can learn from the past? The first lesson would be to avoid binary ‘dystopian v techno-utopian’ narratives. I would argue that journalism’s pragmatist, phenomenological praxis philosophy is more helpful.
My understanding of AI is informed by the work we’ve done in the last two years: a global survey of 70 newsrooms; the creation of a network of 1400+ people interested in AI; training courses we’ve created (38,000+ downloads); and most recently, a collaborative experiment, the ‘Collab’, where five groups of international journalists are working on key challenges that AI technologies might help address.
You all know about AI, so I won’t waste too much time on the technological issues, the moral panics and the definitional problems. I am talking about a set of technologies and ideas for using algorithms, automation and data to augment the work of journalists. Some journalistic labour will be replaced by machines, but this is usually in areas where good journalists benefit from boring, repetitive or scaled work being handed over to the computers.
Most of the cases we have seen have been supplementary or additional. They operate across the news production process, including, critically, the relationship with the audience.
This is the contents page for our online collection of case studies. It shows how diverse the applications are. Go have a look. They can make a news organisation more efficient and more effective. But because they are usually systemic, they require a structural, strategic approach that will be different for every use case and every organisation.
According to our 2019 survey, that strategy is usually lacking.
As one respondent to our survey told us: “it’s not surprising we don’t have an AI strategy – we don’t really have a strategy for anything.”
These are the key challenges that newsrooms told us they face with AI adoption. As you can see, they are not always about the technology itself. They are mainly about broader issues: a lack of knowledge (skills, training, awareness); a lack of resources (skilled people, money and time); and the absence of a proper strategy.
So how can AI help? Why bother to make the effort? To give you a sense of its potential look at what our Collab teams are working on at the moment – the results will be published in December. This is what they think is possible with AI – some of it is already happening.
You can find out more by reading our series of blogs about the Collab: they cover everything from improving diversity and mitigating biases to surfacing evergreen content for readers, improving related-content sources for journalists writing new stories, and improving the retention of subscribers.
I am hugely impressed by how these 40+ journalists from different news organisations around the world are coming together to think imaginatively about how these technologies might help improve their journalism. It shows the hunger for innovation, but at the same time, the lack of R&D resources.
There is a real fear of missing out. We are in a world that is increasingly algorithmically powered and data-driven. Journalism needs to know about this technology so that it can understand and report upon it – to explain it and hold it accountable. It needs that knowledge to understand and serve its audiences better. Only then can it find more efficient and effective ways to create and connect good content to people. There is too much journalism and not enough. We are duplicating news, often to an agenda that is narrow and distanced from people’s concerns and in language and formats that are not accessible. AI might be able to help.
There are too many journalists and not enough. There are not enough journalists adding value instead of following the information herd. We saw from the Collab teams what potential there is out there, but there are real inequalities within news organisations, between news organisations and between the news media and other sectors that are adopting AI technologies. There are also, of course, inequalities between different national markets or regions.
One good bit of news surfacing from our work is the increasing desire to collaborate within and between news organisations. We’ve seen how AI can play a role in some of the recent spectacular collaborative global investigations – into banking, for example – but also in other areas of news production, working with organisations such as universities, start-ups, tech companies and NGOs. Our JournalismAI project is a small example of that, and we hope it will catalyse the genuine enthusiasm out there to learn with, and work with, people beyond one’s own newsroom.
Ethical and Practical Risks
Journalism has proven itself remarkably resilient and we should celebrate the successes. But there are huge gaps: at the local level, in many national markets and in key areas of life such as the sciences or in reporting certain demographics. AI can’t solve those problems. In fact, the first step in working out an AI strategy is to work out what it can and can’t help with. It’s an evolving technology and we are still working out its affordances as well as the practical and ethical risks.
These can’t be seen in isolation. Take algorithmic bias. We are all aware that this is a problem for AI in any context from facial recognition to actuarial risk in insurance. We need to understand that problem technically, but also to investigate how it applies to the profession of journalism where the idea of accuracy has always been somewhat provisional. It is possible that machine learning might actually help reduce the human or institutional biases of journalism itself. So as an industry and as researchers, we need to think in a way that synthesises the best of technological insights with the particular demands and aspirations of journalism. We must be critically open-minded but also focused on our mission.
I hope that this conference is another step on the path to helping journalists, and those who research journalism, find pathways to adoption that enhance journalistic work and help reshape it for our complex and challenging times. I am sure all your work will contribute to this – and journalism needs all the help it can get.
The views in this article are those of the author and not necessarily those of POLIS or the LSE.