The way in which research impacts and influences policy and society is often thought of as a rational, ordered and linear process. Whilst this might represent a ‘common sense’ understanding of research impact, in this cross-post John Burgoyne reflects on how upending the primacy of data and embracing complexity can lead to a more nuanced and effective understanding of research impact.
Here’s what I used to believe was the best way to drive impact:
1. Data is a powerful tool that is under-utilised in the public and social sector.
2. Powerful insights and knowledge sit in academic journals with rigorous research waiting to be applied by practitioners.
3. We need a better evidence base to understand how to allocate public and social dollars.
4. Measurement frameworks will help service providers better understand and improve their impact.
5. Outcomes should be rigorously measured and used to hold people to account.
These beliefs are consistent with the contemporary canon that’s taught at many universities and which is backed by lots of philanthropic funding. They were hot topics of debate at the conference.
As I gain more experience with social problems, however, I increasingly understand that my original beliefs about impact are flawed. They rest on the assumption that impact can be brought about by a linear chain of causal events. Research what works, invest in some outputs, and over a predictable time span, a quantified set of measurable outcomes will result.
However, when confronted with the complex problems our society faces, this assumption, while compelling in theory, has consistently proven untrue in practice. Working on issues ranging from homelessness to mental health to animal welfare, the only outcome I can predict with any consistency is that unexpected changes will occur and force actors to adapt if they want any chance of achieving impact.
After the excellent debates at the Social Outcomes Conference, I wanted to reflect on what I have learned about those original five beliefs:
1. Increased use of data does not automatically lead to more impact. In fact, requiring data usage can exacerbate unhealthy power dynamics and make things worse. I still believe data can be helpful, but I would rather have a well-trained team with no experience in data analytics than a team of data analysts with no experience in the field. One innovative teaming model comes from Austin’s Homelessness Advisory Council, where those who have experienced homelessness work hand in hand with city government, including data analysts, to shape policies and programs.
2. Powerful insights and knowledge sit in the minds of practitioners waiting to be enabled. While innovative research may attract funding and recognition, it is frontline staff with years of experience tackling social problems who should be invested in and nurtured. If I had $1M to tackle a crisis, I would start by asking those closest to the problem what support they need and what conditions would need to change for them to better serve their residents, just as Huntington, WV has done for its first responders to the opioid epidemic.
3. We have invested loads into building countless evidence bases and I’m not sure anyone has ever calculated the return on those investments. While evidence-based approaches can provide inspiration for a community struggling to tackle a problem, they come with the dual dangers of undervaluing local contexts and overlooking new methods. Rather than earmarking social and public dollars exclusively towards “what works”, we need flexible investment into local communities, who should be trusted to figure out the best way to spend it.
4. Top-down reporting requirements restrict rather than support service providers. A clear theory of change can be helpful for service providers, but too often we see grantees subject to overcomplicated reporting of metrics set by their funders. At best, these metrics help tell a story of impact; at worst, they force grantees to change their models and can lead to decreased motivation among staff. Wigan Council found success in an alternative approach, offering flexible community grants rather than prescriptive funding for services. After introducing this scheme, hundreds of projects were launched, overall satisfaction levels among residents increased 59%, and overall spending decreased dramatically. Donna Hall, the CEO of the council, noted, “we didn’t tell them what they needed; we listened.”
5. Outcomes are emergent properties of complex systems that we simply cannot force through more rigorous measuring. We can create the conditions from which outcomes can emerge, but we should not start assigning blame (or withdrawing funding) when targeted outcomes are not met due to unforeseen circumstances. Adaptive experimentation among a group of trusted partners is much more likely to lead to positive impact than adherence to rigid outcomes set by those in a position of power.
If you can’t tell already, my relationship with social outcomes has grown complicated. It has admittedly been confusing having my view of the world upended by new experiences, but I think it has been a worthwhile pursuit and reflection. I realise my original assumptions resulted less from the hands-on testing of hypotheses I intended, and more from a selective seeking out of evidence that validated my thinking.
I wanted to share how my thinking has changed for two reasons. First, in case others are similarly wrestling with these different views and seeking another perspective. And second, more selfishly, to get out of the old echo chamber, gain diverse perspectives, and ultimately avoid the confirmation bias trap that led me to my original beliefs!
To the second point, if others on the interwebs have experiences and perspectives that challenge or add nuance to my own, please do reach out or leave a comment! I’m especially interested in recommendations for good people to read on these topics.
About the author
John Burgoyne is a senior programme associate at the Centre for Public Impact, a nonprofit organisation dedicated to reimagining government so it works better for everyone.
Note: This article gives the views of the authors, and not the position of the LSE Impact Blog, nor of the London School of Economics. Please review our comments policy if you have any concerns on posting a comment below.