Jonathan Breckon

April 27th, 2022

Quick, but not dirty – Can rapid evidence reviews reliably inform policy?

Estimated reading time: 7 minutes

The COVID-19 pandemic created an unprecedented and time-critical demand for policy-relevant evidence syntheses and, in doing so, demonstrated how timely evidence reviews can shape policymaking. As the policy crisis of COVID-19 recedes, research is underway to assess how these methods could be applied to other policy areas. In this post, Jonathan Breckon considers how rapid evidence reviews have been used, weighs the potential pitfalls of rapid research methods, and invites readers to contribute to work being carried out by the Parliamentary Office of Science and Technology, the International Public Policy Observatory, and Capabilities in Academic Policy Engagement into how rapid reviews can be deployed in future.


With the explosion in the quantity of published research, it has become ever harder to make sense of any given research area, especially for policymakers. A torrent of new studies is published daily: during COVID-19, it was estimated that over 100,000 pandemic-related papers were published in 2020 alone. Not only is there a deluge of supply, but also of demand from policymakers, who need this information at pace. An exhaustive systematic review of available research can take, on average, 15 months to finish. Few policymakers can wait that long. Is it possible to balance the need for speed with academic quality?

Research Maps

One way to keep up with the volume of information is through dynamic ‘living’ reviews of evidence. The International Public Policy Observatory regularly updates its living map of systematic reviews of social sciences relating to COVID-19. Evidence and gap maps are also useful snapshots of what we know – and what we don’t know.

Fig.1: Centre for Homelessness Impact/Campbell Collaboration Evidence and Gap Map

Evidence maps can be impressive in their range: this review of interventions to prevent children becoming involved in violence, for example, covered 1,569 impact evaluations, 302 process evaluations, and 268 systematic reviews. But they are blunt instruments. They show what evidence is out there, not what the evidence says. To find meaning you need other tools.

Enter the rapid review

For an analysis of what the research says, rapid evidence assessments offer an accelerated version of an exhaustive systematic review. Rapid reviews were pioneered in health policy, but are increasingly used in social policy. In both instances, they are informed by systematic review methodology (see Fig.2): defining questions, agreeing inclusion criteria from the start, identifying all existing studies, trawling through research databases, checking quality, and then, finally, bringing it all together by transparently analysing and synthesising what has been found.

Fig.2: Example flow diagram of the stages in a systematic review

What systematic reviews are NOT is literature reviews. They avoid cherry-picking individual studies – consciously or not – that fit a preconceived idea or narrative. This attempt at comprehensiveness matters for policy, where research summaries on issues of national importance should avoid bias towards some studies and the exclusion of inconvenient research.

Choose your shortcuts wisely

Systematic reviews are resource-intensive. Can they be safely sped up? Rapid reviews fast-track the systematic review process with a range of ‘shortcuts’, striking a balance between rigour and rapidity while minimising bias and optimising transparency. But there is little agreement about where to expedite the process: one review found wide variation in rapid methods. There isn’t even agreement on what to call them – a scoping review in health research found over 20 different names, the most frequent term being ‘rapid review’.

One option is to cut back on the breadth of literature reviewed – reducing the number of databases searched, limiting the inclusion criteria, or simply narrowing the years and geographies covered. However, only covering recent research could omit crucial data: a 2021 UK POST rapid review for parliament on water fluoridation needed to refer to a seminal study from the 1970s. Another danger of narrowing the range is missing ‘grey’ literature – think-tank reports or industry studies – which can provide valuable insights for policymakers. In balancing rapidity with rigour, there is a need to choose shortcuts wisely.

Examples of shortcuts for a rapid evidence assessment:

1. Focusing on a more targeted research question / reducing scope
2. Reducing the list of sources searched, including limiting these to specialised sources (e.g. of systematic reviews, economic evaluations)
3. Narrowing the timeframe and searching only recently published literature
4. Searching only English-language studies
5. Avoiding unpublished ‘grey’ literature
6. Using only one reviewer for study selection and/or data extraction
7. Not publishing a formal protocol or using the PRISMA reporting framework
8. Using just one database, not a wide selection
9. No ‘hand searching’ to manually find research
10. Using only a narrative to synthesise findings
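These shortcuts are choices to be declared, not hidden. Purely as an illustration – not an established tool or standard, and with all names and fields invented – the sketch below shows one way such choices could be written down explicitly before a review starts, so the trade-offs stay transparent:

```python
from dataclasses import dataclass

@dataclass
class RapidReviewProtocol:
    """Hypothetical record of the shortcuts a rapid review takes.

    Writing the choices down before searching begins mirrors the role
    of a published protocol and keeps the trade-offs transparent.
    """
    research_question: str
    databases: list[str]                   # shortcut 8: one database vs. a wide selection
    year_range: tuple[int, int]            # shortcut 3: publication years covered
    english_only: bool = True              # shortcut 4
    include_grey_literature: bool = False  # shortcut 5
    reviewers_per_study: int = 1           # shortcut 6: single vs. double screening
    synthesis: str = "narrative"           # shortcut 10

    def declared_shortcuts(self) -> list[str]:
        """List the corners this protocol cuts, for the methods section."""
        cuts = []
        if len(self.databases) == 1:
            cuts.append("single database searched")
        if self.english_only:
            cuts.append("English-language studies only")
        if not self.include_grey_literature:
            cuts.append("grey literature excluded")
        if self.reviewers_per_study < 2:
            cuts.append("single-reviewer screening")
        return cuts

# Example: a deliberately narrow one-week review
protocol = RapidReviewProtocol(
    research_question="Do school closures reduce community transmission?",
    databases=["PubMed"],
    year_range=(2020, 2022),
)
print(protocol.declared_shortcuts())
```

The point is not the code but the discipline: every toggle is a documented deviation from a full systematic review, which is what separates a rapid review from an ad hoc literature scan.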

A reasonable compromise or compromised research?

The danger is that shortcuts also cut academic quality and may still fail to be fast. Reviewers at the EPPI Centre warned nearly a decade ago that there is a danger of failing to ‘satisfy either the requirements for rigour or the requirement for timeliness’. Rapid approaches to questions of social policy may be particularly problematic, in part because of the complexity of the topics, diversity of studies, and contested nature of policy.

Such concerns are understandable, as universities have to protect their reputations for scholarly rigour. Yet there is a hunger for fast reviews for policy, and it should be met by being quick, but not dirty – or, as Voltaire more eloquently put it, quoting an Italian proverb, ‘the best can be the enemy of the good’. If we only sanction the best of systematic reviews, we leave policymakers to rely on partial and unreliable literature reviews.

Other ways to fast-track?

The COVID-19 pandemic showed us that it is possible to move at speed without cutting quality, on topics such as the impact of school closures or estimates of vaccine refusals, even if the quality and coverage were sometimes variable. It may be hard to repeat that sense of purpose, but other fast-tracking review techniques are possible. You can find existing overlapping systematic syntheses through sources such as the Cochrane and Campbell Collaborations. COVID-19 also reminds us of the value of automation and machine learning, such as the splendidly named L-OVE (Living OVerview of Evidence) and the automated text mining and screening at the EPPI Centre.
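The machinery behind such systems is sophisticated, but the underlying idea of priority screening is easy to sketch. The toy example below – emphatically not the EPPI Centre's actual pipeline, which uses trained classifiers rather than keyword counts – ranks candidate records so that human reviewers read the most promising ones first:

```python
# Toy priority-screening sketch: rank titles by how many
# review-relevant terms they contain, so human reviewers see the
# most promising records first. Real systems use trained classifiers
# and active learning, not raw keyword counts.

RELEVANT_TERMS = {"school closure", "transmission", "covid", "children"}

def relevance_score(text: str) -> int:
    """Count how many relevant terms appear in the text."""
    text = text.lower()
    return sum(term in text for term in RELEVANT_TERMS)

records = [
    "Effects of school closures on COVID-19 transmission in children",
    "A history of rural land use in nineteenth-century France",
    "Household transmission of SARS-CoV-2: a living systematic review",
]

# Screen in descending order of estimated relevance
for record in sorted(records, key=relevance_score, reverse=True):
    print(relevance_score(record), record)
```

Even this crude ordering illustrates the pay-off: reviewer time, the scarcest resource in a rapid review, is spent where relevant evidence is most likely to be found.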

It may not need whizzy and expensive technology, just focus. Gavin Stewart at Newcastle University worked with N8 AgriFood to deliver rapid reviews with doctoral students and early career researchers – by keeping them undistracted in a hotel for a week. With residential training, they produced rapid reviews and policy briefs at a rough cost of £2,336 per review (systematic reviews can cost around £100k). The students also gained valuable skills in evidence synthesis and writing for policymakers.

Share your insights

Rapid review methods are growing and the demand is not going away. The UK Parliamentary Office of Science and Technology is currently running a pilot project exploring different ways in which rapid reviews can inform select committee work. As part of this, we also want to learn from others developing rapid reviews for policymakers. Please get in touch if you have experiences to share on developing your own methods. Balancing rigour with speed is challenging; sharing insights is one way we can all get better at it. A lot of this work is happening in silos (notably the health and systematic review research communities) and we need more knowledge exchange across boundaries. For our project, we intend to capture what we have learnt, warts and all, in the hope that other policy and research communities can continue to take forward the important work of rapid, rigorous synthesis.

 


Readers interested in sharing their own expertise can contact the Parliamentary Knowledge Exchange Unit via email: keu@parliament.uk.

The content generated on this blog is for information purposes only. This article gives the views and opinions of the authors and does not reflect the views and opinions of the Impact of Social Science blog (the blog), nor of the London School of Economics and Political Science. Please review our comments policy if you have any concerns on posting a comment below.

Image Credit: James Donovan via Unsplash. 


 


About the author

Jonathan Breckon

Jonathan Breckon is a consultant and independent advisor at CAPE. He previously worked inside Whitehall on the Open Innovation Team, running rapid reviews and deep dives of research and expertise for policymakers. He has been a knowledge broker for the last 20 years. He led the Alliance for Useful Evidence for nine years and was a founding board member of What Works for Children’s Social Care. For light relief, he is partway through a part-time PhD on evidence-based practice and the professions. He is a Senior Associate at Transforming Evidence, a Visiting Fellow at the Campbell Collaboration UK and Ireland, and a Fellow of the Academy of Social Sciences.

Posted In: Evidence for Policy
