Clare Wilkinson and Emma Weitkamp from the University of the West of England, Bristol, offer support for researchers looking to track and evidence the unique, creative and often qualitative outcomes of public engagement and communication activities. Rather than treating evaluation as an add-on to the research, it may be possible to embed it within the research project itself.
As science communication researchers and practitioners, we’ve been keenly interested in the increasing emphasis on ‘impact’ within the UK university sphere. How do we measure ‘impact’ in a public engagement context? What might impact from communication look like? These questions keep emerging in our partnerships with researchers seeking to communicate their work, on training courses we run and in the evaluation roles we’ve held on a variety of projects. Again and again we’re asked about the benefits of critically evaluating engagement approaches, how to collate evidence, and how to find the time to do so. This has given us ample opportunity to reflect on the ways that researchers might be equipped to evidence the impact of their public engagement activities.
The challenges of measuring research impact have of course been well discussed, including within this very blog. We recognise many of these wider concerns: there is a danger that metrics may emphasise or narrow certain definitions of impact, place an overemphasis on accountability, and be an uncertain process for many researchers. Meanwhile, analyses of the preparation and submission of Impact Case Studies to REF 2014 suggest that mapping impact from public engagement was less common and may have been seen as ‘risky’ to include (King’s College London and Digital Science, 2015; Manville et al., 2015).
Nevertheless, researchers who communicate and engage around their research now face the challenge of evidencing its impact. Rather than reduce public engagement to a ‘one size fits all’ approach, we hope to support researchers to evidence the unique, creative and often qualitative outcomes of such efforts.
In our new book Creative Research Communication: Theory and Practice, we talk researchers through evaluation approaches which can be used to track the outcomes and impacts of research communication activities on a range of participants, including the researcher, as well as ways in which you might adapt evaluation to be more creative in itself. We are interested both in how evaluation of public engagement can be designed to capture outcomes and impacts, and in how evaluation of research communication activities itself provides evidence of the impact of the research; but for now we will focus on three key issues.
Beneficiaries
When collating data on the experience of public engagement it is easy to be driven by funder or broader institutional and governance requirements, meaning many researchers may lose sight of the other benefits of such processes. Evaluation can be a space for reflection, learning and improvement, to promote innovation, change and development, and allow both participants and researchers to identify what they have taken from such processes. We argue that it is easy to see evaluation as an ‘add-on’ or an extra, but in many cases it may be possible to embed the appreciation of what a public engagement activity has achieved within the research project itself. Within the book we draw on a case study from the Centre for Appearance Research at UWE, Bristol. Collaborating with the Dove Self-Esteem Project has allowed this group of researchers not only to engage around their work on young people and body image, but also to develop sophisticated monitoring and evaluation tools which are built into their projects. Yet for a variety of reasons many researchers may not, as yet, be connecting their research activities and public engagement activities in a way which allows a consistent narrative to be formed. Seeing public engagement as part of a more holistic research process may help researchers to do so.
Evidence
It is well recognised within the field of evaluation that tracking impact beyond the immediate effects poses considerable difficulties, but there are a variety of techniques and resources on which to draw. There exist a range of frameworks and materials which can be utilised, from handbooks on project evaluation, to those which are more akin to a recipe, or which seek to empower people within the process. Drawing on such expertise and advice can help researchers to move away from the typical starting point of a questionnaire (though one might be perfectly appropriate in certain settings), and to use a framework of tried and tested methods to introduce more creative or innovative aspects to their delivery. Designing a festival activity? Why not carry out some short qualitative interviews? Creating a summer school for young people? Could you develop an image-based evaluation using their photographs and videos? Involved in a dialogue activity? Is there time to build in a focus group? There are a variety of approaches to use, but they often require thought, consideration and planning, which links us to the next challenge.
Time
Time is always a challenge for researchers, both in finding the time to communicate and engage (a recent report by TNS-BMRB shows this remains a considerable barrier to engagement), and then additionally to consider any impact from that work. It is not a problem that can be easily resolved. However, putting in some groundwork to find out whether your institution provides any support around the collation of such data, considering whether there are existing data sources that might be drawn on to supplement your evaluation, or planning upfront how to collate your engagement evidence (perhaps an email from a local school, or a testimonial from an advisory group stakeholder) may help to make the process a little more efficient. Of course, if your activity is supported by external funding, consider building in time for your evaluation and/or the support of an external evaluator. A variety of funders expect you to plan a pathway to impact, or want to see you acknowledge how any learning from a public engagement activity might be shared. Use this opportunity and build in a credible, authentic and achievable evaluation plan.
Our communication and engagement practice and research highlights a wide variety of benefits to participants from well-planned engagement, but it is important to recognise the multiple goals and outcomes participants may have. There is an implicit assumption in much of the language used in relation to impact that change is needed, whether that be in behaviour, learning, attitude or awareness, and that people will be ‘improved’ in some way by participating. Yet participants bring multiple motives to public engagement, and a narrow focus on presumed benefits may neglect the varied and mutual outcomes engagement can have for all involved.
Creative Research Communication is written for researchers with an interest in engaging the public with their research and postgraduate students exploring the practical aspects of research communication.
Featured image credit: Map by Alex Mihis (CC0, pexels)
Note: This article gives the views of the authors, and not the position of the LSE Impact blog, nor of the London School of Economics. Please review our Comments Policy if you have any concerns on posting a comment below.
Clare Wilkinson and Emma Weitkamp are Associate Professors based at the Science Communication Unit, University of the West of England (UWE), Bristol. They have recently authored the book Creative Research Communication: Theory and Practice published by Manchester University Press. They can be found on Twitter at @clarewilk4 and @e_weitkamp