
Sarah Morton

Ailsa Cook

April 28th, 2023

Putting public data to use is a fundamentally human challenge

Estimated reading time: 6 minutes

Using data is often seen as a technical or computational challenge, especially for organisations with multiple streams of data. However, as Sarah Morton and Ailsa Cook discuss, the fundamental challenges faced by organisations using data are largely human. Highlighting the importance of making time for ‘sensemaking’, they argue that careful analysis and discussion of data have multifaceted benefits.


We live in a data-rich world. At first pass, this seems like a great opportunity for organisations and decision-makers, who have lots of evidence, information and feedback at their fingertips to inform better decision-making. However, access to data on its own doesn’t solve problems unless there is time and space to consider and make sense of it. This ‘sensemaking’ process can be squeezed out when budgets are tight. Yet it is important to recognise that people are at the heart of the evidence-use process, and to make sure there are spaces for them to use evidence well. Without this, it is hard to track and navigate change in complex public service settings.

Our argument is based on our learning from many years spent supporting hundreds of public service organisations in the public and voluntary sectors to understand the change they want to see in the world and to get the data and evidence they need to track it. These are issues we discuss at length in our recent book, How do you know if you are making a difference?

It’s not surprising that people running public services need and hold lots of different kinds of evidence. They need it for several purposes:

  • To understand how to tackle problems, what kinds of initiative might help, and how to plan for them.
  • To learn about the work they are delivering now – what is going well, what can be improved.
  • To know about the most effective interventions, and where to focus resources.
  • To demonstrate that initiatives are making the difference they hoped.

By evidence, we mean all the feedback, data (quantitative and qualitative), formal evidence and reflections from practice that go into understanding complex initiatives in the real world. Public service organisations hold large (sometimes overwhelming) amounts of evidence, largely because getting hold of it has become much easier. They have their own systems and ways of generating data about the populations they serve, with quick ways of running surveys or getting other feedback electronically.

This often results in a sense that ‘we should do something’ with all this data.

But where to start? If only there was a quick fix!

Added to the problem of the sheer quantity of data, there is often a lack of consistency in the data that organisations collect. Sometimes there has been a permissive data culture that has enabled lots of different approaches to collecting data and feedback – a thousand flowers have bloomed! Other times the way data has been collected has changed over time. Often clients reflect that they have lots of data, but are unsure where it all is, and don’t have a system to bring it together.

Using mixed data is a well-recognised academic challenge. Different people have the training and skills to work with quantitative as opposed to qualitative data, and making sense of statistics alongside examples of people’s experience can be complicated. This is exacerbated in real-world settings, where people often lack the tools, knowledge, capacity and skills in analysis and reporting.

How can these challenges be tackled? It starts and ends with people. It can be easy to see evidence as a technical thing – data as a neutral resource, in ever increasing quantities waiting for clever tech to come and analyse it. It is important to remember that it is people who use data. Data doesn’t speak for itself. It is animated, analysed and understood by the people who need it. It isn’t neutral and doesn’t hold any power – except the power we give it.

If people are at the heart of the process of evidence use and are to do it well, they need time, space, resources and skills for sensemaking. And they need safe interpersonal spaces for it: conversation about the evidence might be as important as the data being considered. The space needs to be non-judgmental and good for learning. Support from an expert or knowledge broker to ‘hold’ this space can also be important.

Sensemaking in action

What is going on when people have this time and space for sensemaking? It might include:

  • making sense of what data they have, its strengths and weaknesses, and any gaps
  • gaining insights into the work in hand and what the data is telling them, especially where there are multiple data sources of different types
  • reaching a collective understanding of the available evidence and the key themes
  • building consensus about what the evidence tells them about their work
  • learning what is going well and deciding where improvement is needed
  • action learning cycles: improving, collecting more data and reflecting

When public services come under the pressure of meeting growing challenges with shrinking budgets, this time to think can be seen as expendable. But the benefits of doing this sensemaking work are huge.

Sensemaking brings many benefits for organisations who prioritise this kind of activity. They are able to make better decisions, in a clearer way, using the best evidence they have. When they get external demands for data, they know how to respond – either to supply data they have, or to explain why that kind of data is not suitable for them. Working in this way can boost staff wellbeing by giving people time to think and to really see what difference their work is making to people. It also develops staff capability to collect, collate and analyse evidence, which can help on the journey to becoming a learning organisation. Ultimately, working in this way means that initiatives stay focused on what matters and make the biggest difference they can. It also means they can make the case for continued funding and have more influence, with a clearer story about the impact of the work.

It is everyone’s job to try to protect sensemaking spaces: to make sure people have time to think and are not caught up in the constant and exhausting ‘doing’ of delivery. This can help effectiveness, efficiency and, critically, staff morale. It should not be seen as a ‘nice to have’.

 


The content generated on this blog is for information purposes only. This article gives the views and opinions of the authors and does not reflect the views and opinions of the Impact of Social Science blog (the blog), nor of the London School of Economics and Political Science. Please review our comments policy if you have any concerns on posting a comment below.

Image Credits: Featured image, Killian Cartignies via Unsplash; in-text image, Sensemaking in Action, reproduced with permission of the authors.


 


About the author

Sarah Morton

Dr Sarah Morton is internationally recognised for her work in developing innovative approaches to ensure that decision-makers have access to the best evidence for taking organisations, policies and practices forward. She has pioneered a participatory approach to using contribution analysis to track research impact and is co-founder of Matter of Focus.

Ailsa Cook

Dr Ailsa Cook is a leader in the field of outcomes. Through her research and work with policy and practice, Ailsa has made a significant contribution to shifting the focus of public services in Scotland towards the outcomes that are important to citizens and communities. Ailsa is also co-founder of Matter of Focus.

Posted In: AI Data and Society
