
Mattia Peretti

December 8th, 2020

Crossing boundaries together to tackle journalism biases

Estimated reading time: 10 minutes

Six months ago, a group of people met in a tiny virtual space with small square windows. We were speaking from San Francisco, Buenos Aires, London, Paris, Nice, Bonn, and Stockholm. We were thousands of miles apart from each other and yet something brought us very close: we were all curious about artificial intelligence and how we could use these new technologies to work towards our joint mission of providing more inclusive journalism to our communities.

Had we met in a park or a bar, we would have spent hours talking about the many recent discoveries in the field of bias and AI, such as MIT researcher Joy Buolamwini’s findings on facial recognition discrimination, or the many hidden biases in algorithms described by Cathy O’Neil in Weapons of Math Destruction. But we had no such welcoming place to meet. So how could a group of journalists with hectic lives, in the middle of a pandemic, still achieve something together in such a complex field?

The JournalismAI Collab has proven to us that distance, travel bans, and language barriers do not kill the spirit of collaboration, and we are very thankful for that. This is one of the major lessons we take away: it is sometimes easier to get things done by bringing together a group of passionate people from across the journalism industry than by trying alone in your own newsroom.

If aligned behind a common goal, even competitors can work together to produce something that will support many in the industry in the future. For all our newsrooms, the topic of bias and diversity is critical. Especially in this “Corona time”, the more we can share costs and resources to solve our shared issues, the better. 

The right mix: when difference becomes crucial

We knew we wanted to use AI to address the biases in the content we produce. But a common objective is not enough if you don’t have the right mix of people to work towards it. We were very lucky to have on the team people with different skills and approaches who could get results together: a developer, three data journalists, one PhD candidate, and several journalists and managers with strategic thinking, from different countries and media markets.

We believe that there are few better ways to tackle biases than bringing together people from different cultures. Working in a group with such diverse perspectives allowed us to quickly review the available resources and draw up an experiment that has delivered real data.

The project took off with a few general ideas and has now landed on a clear methodology outlining how we can use AI technologies to identify gender biases. We have discovered and tested open-source tools both for images and text classification, and we can now use them in our newsrooms to measure progress. 
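To make that concrete, here is a minimal sketch of the kind of measurement such open-source tools enable. The Collab’s actual tooling is documented at aijoproject.com; this example is only an illustrative assumption, using the open-source spaCy library to count gendered pronouns in a text as one rough, automatable signal of gender balance.

```python
# Illustrative sketch only: the Collab's own tools are documented at
# aijoproject.com. This uses spaCy (open source) to count gendered
# pronouns in a text as one rough, automatable signal of gender balance.
# Setup: pip install spacy && python -m spacy download en_core_web_sm
from collections import Counter

import spacy

nlp = spacy.load("en_core_web_sm")

FEMININE = {"she", "her", "hers", "herself"}
MASCULINE = {"he", "him", "his", "himself"}

def gender_pronoun_counts(text: str) -> Counter:
    """Count feminine vs. masculine pronouns in a piece of text."""
    counts = Counter()
    for token in nlp(text):
        word = token.text.lower()
        if word in FEMININE:
            counts["feminine"] += 1
        elif word in MASCULINE:
            counts["masculine"] += 1
    return counts

sample = (
    "She told reporters the decision was hers alone. "
    "He disagreed, saying his team had not been consulted."
)
print(gender_pronoun_counts(sample))
# Counter({'feminine': 2, 'masculine': 2})
```

Aggregated over an archive of articles, counts like these become the kind of regular read-outs described below, while the editorial interpretation stays with humans.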

And now?

Being able to show, with the data at hand, how we can use AI to detect biases gives us real leverage back in our newsrooms. It will be far easier to start working on similar projects internally now that we have our Collab work to refer to. It is also easier to “evangelise” internally when you can argue that other major news organisations – or even your competitors! – are working on the same issues you are advocating for. Humans and machines can now work on this issue together: the machines can give regular read-outs on gender balance and measure progress, while humans focus on editorial decisions and on making sure they have a real impact.

We have discovered first-hand the immense potential of natural language processing (NLP) in the journalism field. One partner in this collaboration now plans to measure the representation of specific minorities in its journalism. Another hopes to apply our findings to analyse the content of a group of European broadcasters. Others hope to explore biases in non-English content, namely Spanish, French, Swedish, and German.

NLP can help uncover gender biases, but also other diversity-related biases. It can be used to investigate large numbers of documents, to help ensure balance when covering elections, or to detect social trends via sentiment analysis. We should not leave AI only in the hands of academics, tech platforms, big corporations, and advertisers. We should take the lead in our own industry. 
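The post does not name a sentiment tool, but as one hedged example of the trend-detection use case, the open-source VADER analyser bundled with NLTK can score large numbers of headlines or paragraphs:

```python
# Hypothetical example: VADER (via NLTK) is one open-source option for
# sentiment analysis; the article does not prescribe a specific tool.
# Setup: pip install nltk
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon")  # one-time lexicon download

analyzer = SentimentIntensityAnalyzer()
headlines = [
    "Local schools celebrate record graduation rates",
    "Flooding devastates riverside neighbourhoods",
]
for headline in headlines:
    # 'compound' ranges from -1 (most negative) to +1 (most positive)
    score = analyzer.polarity_scores(headline)["compound"]
    print(f"{score:+.2f}  {headline}")
```

Tracking such scores over time and across topics is one simple way to surface the social trends mentioned above.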

As Nick Diakopoulos – one of our Collab mentors – writes in his book ‘Automating the News’, journalists can use AI to improve their work but also get new insights into society. AI-empowered journalism is not just about efficiency for news organisations, it can also be a force for social good: the 21st century way of pursuing the people’s right to know. 


This blog was jointly written by the members of ‘Team 1’ of the JournalismAI Collab. It sums up their main lessons learned from working together in 2020 to explore the challenge of how we might leverage AI to understand, identify, and mitigate newsroom biases. Explore their complete work at aijoproject.com.

The JournalismAI Collab is a project of POLIS, supported by the Google News Initiative.
