Within the social sciences, translating and sharing new knowledge is now common practice amongst many researchers and institutions across academia. From evidence briefings and summaries of literature to online blogs and presentations, a wide range of research evidence aims to engage policy and practitioner audiences so they can more easily access and use the evidence. Raj Patel questions whether it is adequate to simply communicate findings, and proposes a model for adding value to research in a way that is more likely to generate impact.

Social science research can help to unpick and understand the complexity of problems, explore behaviours in relation to particular events or stimuli, and critically examine the extent to which policies or other factors determine outcomes. Such research can add to our collective knowledge or be instrumental in shaping a particular policy. However, a 2013 Carnegie Trust and Joseph Rowntree Foundation survey of social policy evidence users found that though university research was named as the most trustworthy source of evidence (with 68% of respondents “always or usually” trusting of it) – ahead of UK Government (60%) and devolved Governments (57%) – it was not the most frequently used, with only a third of those surveyed able to access it frequently.

With scientific papers often described as "impenetrable" by non-academic audiences, there is broad acknowledgement that communicating scholarly research is an important ingredient in knowledge sharing and a first step in generating impact. But even where impenetrability, paywalls, and the sheer volume of scientific papers (estimated to be growing by 3% to 3.5% annually worldwide, with acceleration in recent years) are not an issue, time constraints remain a real barrier. "Straight-talking" research therefore matters in driving up use – whether that research feeds a two-way communications and engagement process or a one-to-many communications effort aimed at a broad audience.

In their discussion paper "Using Evidence, What Works?", informed by a systematic review of research on evidence-based policy, Breckon and Dodson identify the communication of, and access to, evidence as a key factor influencing its use in decision-making. However, they argue that more attention needs to be paid to how evidence is communicated and, importantly, to audience segmentation, tailored messaging, and user-friendly design. Framing the evidence (e.g. in terms of gains or losses), crafting a persuasive narrative, or promoting the evidence through a user-friendly online platform are just some of the factors that could influence the uptake of research.

There has undoubtedly been considerable progress in closing the gap between research production and uptake/use since the advent of the impact agenda. But complex ideas and research evidence compete for attention in a “marketplace” saturated with facts, statistics, and analyses, and often with other forms of non-academic evidence too. Unsurprisingly, the need to be creative in this space and look for opportunities to innovate has never been greater – notwithstanding that innovations may not always travel easily across audiences, sectors, disciplines, and contexts.

The Economic and Social Research Council, for example, encourages researchers to answer the “so what?” question when communicating findings. Its evidence briefings set out policy implications up front before explaining the findings. Of course, where such policy recommendations are highly generalised, “unrealistic” within a given context, or do not directly flow from clear and precise research findings, they can distract from the strength of evidence itself. Indeed, in some cases uncertainty of research means that it is not always feasible to make strong recommendations. As one knowledge director recently commented: “I am not interested in half-baked policy ideas – stick to the findings [if you don’t know]”.

Understanding Society, the UK Household Longitudinal Study, has been innovating with findings through its annual Insights Report, containing selected policy-relevant research based on the Study's longitudinal data. The data are publicly available, which means a wide range of social, economic, health, and methodological research is produced annually. The Insights Report brings together new findings with commentaries from external policy players or practitioners in key sectors, including former ministers. It uses aids such as infographics to visualise and communicate key trends. Meanwhile, the Parliamentary Office for Science and Technology produces POST Notes for parliamentary use; "four-page summaries of public policy issues based on reviews of the research literature and interviews with stakeholders from across academia, industry, government and the third sector".

This raises a question about what value users place on different kinds of translated research, and how best to add value to social science research. Answering it requires demand-side research into how decision-makers acquire and use knowledge – a topic I hope to revisit. We already know that the current incentive system rewards original findings over synthesis, which brings together findings from different types of studies – a longer-term barrier that needs to be addressed.

In her excellent ethnographic case study of civil servants in England’s Department of Health, Jo Maybin provides interesting insights into the how and why of knowledge use. Mid-ranking civil servants were at pains to emphasise that they tend to be engaged in “making policy happen” and consistently objected to being referred to as “policymakers”. To quote:

“The key planks of policies were established by ministers, special committees and selected outsiders (think tanks, respected academics and prominent ‘expert’ individuals from particular industries or sectors). Their task was not to come up with the solution to some societal problem; it was to instead take ideas or proposals brought by others, and turn them into workable policies.”

In this particular context, knowledge offered in person by individual researchers – up-to-date, synthesised, candid, and editorialised – was seen as better suited to the needs of civil servants than formal research documents.

But the culture of evidence use can vary enormously between different types of institution, and indeed between government departments. If researchers are not to waste incalculable hours on communications and engagement, the question of how to add value to research becomes critical. The concept of added value is well recognised in the private sector, though usually as applied to products and services. Applied to academia and knowledge sharing, thinking about what one can create that produces the most value for users is far more empowering, and builds long-term trust with them (Reed, 2016, The Research Impact Handbook). The framework below attempts to model how researchers can create better value from good research.

Figure 1: A model for adding value to research. Source: significantly adapted from Dolan, A., HM Revenue and Customs

Whilst presented as stages, there is no presumption that each stage is sequential or that this is a task purely carried out by researchers – some aspects could benefit from collaboration. It is simply a way of thinking about how to more systematically move from data collection to possible solutions in tackling societal challenges, whilst looking for opportunities to add value.

The context in which the model is applied is important, and it comes with three important caveats:

  • Detailed and considered proposals carry more weight. So, for example, understanding the problems with the current delivery system may be as important as the policy choices. One way to address this is by bringing together different kinds of stakeholders, including practitioners, in more structured ways to tease out implications and ideas.
  • Other actors in the world of policy, such as think tanks, often pay more attention to problem definition, argument, language, persuasion, and timing. But the solution is not simply to copy the dissemination approaches of others. Rather, the objective should be to become part of advocacy coalitions, alliances, or networks which offer longer-term insights into policy influencing processes.
  • Research shows that Britain is prone to ad hoc policymaking, continual policy reinvention, and short-termism – often driven by a ministerial desire to demonstrate action – which have led to poor performance in a number of sectors (Institute for Government, 2017). Simply generating more policy ideas does not necessarily lead to longer-lasting solutions when a more radical course of action might be needed.

Note: This article gives the views of the author, and not the position of the LSE Impact Blog, nor of the London School of Economics.

About the author

Raj Patel works at the Institute for Social and Economic Research, University of Essex, where he is the Impact Fellow at Understanding Society. Previously Raj was a Director at the Learning and Skills Network. His career includes heading up Research and Development at the Neighbourhood Renewal Unit in the Office of the Deputy Prime Minister (now the Department of Communities and Local Government). An economist and policy analyst by profession, he has worked extensively on national public campaigns, policy research, local and regional development and designing new ways to tackle social issues. Get in touch with Raj at rajpatel@essex.ac.uk.
