
Duncan Green

March 17th, 2025

How did one LSE research centre have real world impact?


Duncan Green looks at the unpredictable but fascinating ways that one of LSE’s many research centres influenced policy, practice and world views. To find out, he conducted open-ended interviews with researchers at the Centre for Public Authority and International Development (CPAID). He found that, in addition to the quality of the research itself, relationship networks, highlighting surprising or unexpected findings, and responding to shocks and crises are all crucial to impact.

Research Centres are an important part of academia – a pot of money that allows a group of researchers to focus on a particular topic over a period of years. Recently, I’ve been working with one of them – the LSE’s Centre for Public Authority and International Development (CPAID). One of my jobs was to write a series of blogs documenting the impact of the Centre’s research – I think there are some useful lessons here – see what you think.

Being asked to write impact case studies for the research programme at the Centre for Public Authority and International Development (full list of links at the end of this post) was a lot of fun, and a bit unusual. When discussing impact, whether through the REF or the wider demands of self-promotion, institutions normally pick the research that has had the biggest impact, then try to work out why and how. Of course, this results in a massive selection bias.

This set of case studies has a couple of those (such as those on hunger courts in South Sudan and Ebola in Liberia) but other research projects don’t claim to have had world-changing impact. In that latter group, it was fascinating to see what influence ‘normal’ research has and why.

Here I’ll try to pull together the ‘learnings’ (cue vomit emoji) from the six case studies I wrote. Let’s start with a famous quote from Milton Friedman, an academic who knew a thing or two about impact.

‘Only a crisis – actual or perceived – produces real change. When that crisis occurs, the actions that are taken depend on the ideas that are lying around. That, I believe, is our basic function: to develop alternatives to existing policies, to keep them alive and available until the politically impossible becomes the politically inevitable.’

These CPAID case studies confirm Friedman’s view on the importance of crises in opening policy windows for researchers, but they also add a lot of detail on what else can help beyond having your ideas ‘lying around’.

Firstly, they highlight a string of necessary-but-not-sufficient conditions, or at least factors that improve your chances of having impact with your research:

Relationships: A lot of the impact came from researchers using their networks – a glass of wine with UN types in Juba, or national researchers embedded in local decision-making systems.

Track Record: The longer you have been in a place, the more credibility and contacts you accumulate. This may not be intentional – you go somewhere to research your PhD without any thought of influence, but years later the people you interviewed can open the door when you return with ideas.

Authentic, high quality research: Phew – yes it still matters.

Social Media: If you are on social media, it is much easier to grab one of those windows of opportunity.

Brand: It may well be unfair, even colonial, but it helps if you have LSE after your name.

Findings that are surprising/unexpected: Research that confirms what everyone already thinks or does is, understandably, not likely to get much interest. The academic equivalent of a journalistic ‘man bites dog’ story (such as ‘Ebola death rates were lower in community-run treatment centres than in the official ones’) piques interest and opens doors.

Secondly, you need luck, but not just luck. As Louis Pasteur said of ‘accidental’ discoveries, ‘chance favours the prepared mind’. Crises can be helpful – scandals, shocks and general meltdowns have policymakers scrambling for solutions with suddenly open minds. But what makes them listen to your research rather than someone else’s?

Often, it comes back to relationships. If you’ve met them, know them, even just interviewed them for your research, decision-makers are more likely to pick up the phone or accept the meeting. But an entrepreneurial mindset also helps – someone who’s willing to go beyond their academic comfort zone, talk to a scary international body, and maybe open doors for more junior research staff, as Tim Allen did for Holly Porter at the ICC.

Finally, how should we attribute impact to a piece of research? The short answer is – with great difficulty.

For a start, it’s a lot easier to claim impact for a given researcher than a specific research output. The REF case studies are full of senior academics who have been invited onto advisory groups or ended up as de facto consultants for decision-makers. But as I found in the case of Melissa Parker’s work on Ebola, which seems to have landed her on the UK COVID advisory committee, those kinds of invitations are often based on reputation / a lifetime’s work rather than a specific piece of research. Pity the poor junior researcher just starting out, who has yet to gain that track record but is still required to ‘prove’ impact.

Consultants typically come armed with solutions, however specious. Many academics do not; they highlight problems, things that aren’t working and unwarranted assumptions underpinning policy-making. If they succeed in getting a government or international body to think again, they won’t be able to point to their magic bullet / toolkit to prove impact. Tricky.

My overall impression? 

Impact is important, messy and unpredictable, but there are things you can do to greatly improve your chances of achieving it.

And what about the current massive shock to the aid sector and research funding for things like CPAID? I think the broader message is that issues of power and public authority will not go away – they determine how change happens or is blocked in rich countries and poor. So there ought to be a future for research programmes like CPAID, but preferably detached from the aid world and the increasingly unconvincing narrative of Global North and South.

Catch up on the case studies here:

  1. Supporting early warning systems for famine in South Sudan
  2. How an arts project created real-world impact for refugees and formerly displaced persons
  3. How research into Ebola secured a seat at the table of COVID-19 policymaking
  4. How research impacted the reform of Ugandan refugee camp aid systems
  5. ‘New’ issues in development policy drive research impact on Somali state-building
  6. How research into sexual wrongdoing changed the course of the landmark trial at the ICC

An earlier version of this post appeared on the CPAID website in 2021

About the author

Duncan Green

Duncan Green is a Co-Director (with Tom Kirk) of the LSE's Activism, Change and Influence programme and website. He is a Professor in Practice in the LSE's International Development department. He can be reached at d.j.green@lse.ac.uk, or on @duncangreenlse.bsky.social. He doesn't look at Twitter any more.

