
Colette Einfeld

Sarah Allen

November 11th, 2022

Navigating co-design and nudge: Evidence and expertise in practice


In a previous blogpost, Colette Einfeld and Emma Blomkamp argued that bringing together nudges and co-design in practice illuminates fundamental differences underlying these approaches. Reflecting on a project to improve healthy food choices in a hospital setting, Colette Einfeld and Sarah Allen explore how these tensions emerge in practice and suggest how this might inform the work of policy practitioners developing ‘nudge-style’ interventions.


Last year, Emma Blomkamp and I (Colette) published a post on this blog about nudge and co-design. It was based on our more extensively argued paper in Policy Studies, which argued that nudge and co-design have fundamental differences in their underlying expertise, methodologies and philosophies, even though they are increasingly merged in practice. The article and blogpost generated healthy discussion among academics and practitioners. One of those practitioners was Sarah Allen, an applied behavioural scientist and designer at the Innovation Lab (iLab), part of the Northern Ireland Civil Service. Established in 2014, the iLab has expertise in service design, behavioural science and system dynamics modelling, and is an example of a Policy Innovation Lab with a strong background in co-design.

By defining both nudge and co-design more rigidly, the paper articulated some of the niggling issues Sarah had been experiencing in her work, even though her projects might not always adhere strictly to either method. Returning to the subject here, we use a project Sarah previously worked on at iLab to consider how the tensions raised in the paper play out in practice. The theoretical issues raised in the paper provide a framework for understanding why issues may ‘niggle’ or feel uncomfortable in practice, and the project illustrates some of these issues in trying to bring together the different types of evidence and expertise that nudge and co-design embody.

The project

iLab worked with a client on a project aimed at improving the healthy food choices of staff in hospital restaurants. The project started with a literature review of randomised controlled trials (RCTs) conducted in canteen settings, where the goal was to prompt the choice of healthier options. From the outset there were time constraints, so iLab planned to evaluate just one intervention through an RCT in a local hospital restaurant.

A survey was sent to staff in one hospital, and two follow-up focus groups were conducted to explore the barriers and facilitators to choosing healthier options. Staff reported that key barriers to making healthier choices included the availability (or lack) of healthy options, time pressure that led to quickly choosing something that was often less healthy, and price.

The research was brought together in a one-day workshop with stakeholders from the hospital food system and different government departments. Participants were asked to map an imagined journey through the canteen for personas reflecting different goals, frustrations and self-reported influences on food choices, noting where each persona might struggle to make a healthier decision.


iLab led the project while working closely with the client. The team facilitated all of the workshops, did the research and wrote the recommendations, which raised questions about power imbalances. Is it possible to both be an impartial facilitator and have a point of view with direct input into solutions? This niggling issue reflects a point made in the paper about the roles of experts in co-design and nudge, who must simultaneously see themselves as both expert and non-expert. While academic research notes the importance of de-centred expertise, in practice this dual role remains a problematic niggle.

In the workshop, participants were also given small cards containing key pieces of ‘behavioural’ information: some carried summary points from the literature review, others more general ‘behavioural insights’. This was to help participants consider the abundant evidence from the academic literature alongside the ‘lived experience’ of the personas in the particular hospital setting. It quickly became clear that expecting stakeholders to absorb this volume of research in a one-day workshop was unrealistic. But how do we decide which evidence to highlight? And who should decide? How do we explain the impact on behaviour of concepts such as social norms to people who may undervalue or dismiss their influence? The range of biases that affect people also influences the ways in which they use and interpret evidence. As noted in our paper, people are often unaware of their own biases, and this can influence their input into designs.


At the end of the workshop, participants had generated and ranked over 100 ideas designed to address the concerns the personas faced around healthy eating. Yet, despite gathering qualitative and quantitative data from users of the hospital restaurant, and having ideas from the workshop, Sarah and the team ultimately selected a nudge from the literature, taste-focused labelling, to trial in the hospital. This was partly because it had the potential to make a large impact on healthy food choices with minimal effort and without structural or price changes, and partly because the intention of the project was to test one intervention through an RCT. All workshop ideas were included in the report, and many were highlighted as recommendations. In hindsight, however, this process of selecting the nudge raised issues of trust, as some ideas from users weren’t progressed. Limiting the scope to testing one intervention meant greater pressure to identify something that would ‘work’ and could be tested in context, rather than trying something new and untested.

Reflections for practitioners

Although it was not possible for the team to run an RCT in the end, owing to challenges with data validity, stakeholders were provided with a carefully considered list of recommendations for improving healthy choices, based on both the academic and user research. Sarah has thought about this project a lot over the years, particularly how to introduce and weigh different types of evidence. The volume and complexity of the insights were impossible to digest in one sitting. Labels (calorie, traffic light, taste), food positioning, education, pricing: they all have an influence. How do facilitators decide what to focus people’s attention on? How do they balance the academic evidence with the experiences staff shared in the focus groups and survey? With limited time or budget, how do practitioners prioritise what to implement and test?

We hope that, by writing openly about cases like this one, communities of practitioners will continue to share their experiences of developing innovative problem-solving approaches. Reflecting on the practice of bringing together different approaches, this case reveals how the tensions we discuss in our article are experienced in practice. It suggests that theory has an important role in understanding why ‘niggles’ may be felt in policy making. Surfacing, or making explicit, the underlying contradictions between different approaches can be a productive process: it makes space for discussion, such as on the roles of evidence and expertise, when designing and implementing policy interventions.

 


The content generated on this blog is for information purposes only. This article gives the views and opinions of the authors and does not reflect the views and opinions of the Impact of Social Science blog (the blog), nor of the London School of Economics and Political Science. Please review our comments policy if you have any concerns about posting a comment below.

Image Credit: Adapted from Sven Mieke via Unsplash. 


 


About the authors

Colette Einfeld

Colette is a research and evaluation specialist with fifteen years’ experience working with government, universities and not-for-profits. Colette is a Research Fellow at The Australian National University and has a PhD in Public Policy. Her academic research explores the use of knowledge, evidence and expertise in policy making, and community and government engagement in energy resources. Find Colette on Twitter @ColetteEinfeld.

Sarah Allen

Sarah works as an applied behavioural scientist in the Northern Ireland Civil Service. She has a BSc in Psychology from Furman University and an MA in Cognition and Culture from Queen’s University Belfast. She has worked on topics such as antimicrobial resistance, recycling, COVID-19, and supporting employees back to work after illness. She is interested in how behavioural science can support the design of public services.

