
Blog Admin

April 20th, 2011

In Whitehall, academic research is far more likely to be used if it fits with the story already being told


After a six-month spell as a Whitehall policy adviser, Professor Alex Stevens finds there is plenty of evidence on which to base policy – but this does not make for ‘evidence-based’ policy.

Does the coalition government still believe in “evidence-based policy”? Ministers from Andrew Lansley to Baroness Neville-Jones have claimed that it does. In a recent Lords debate, Neville-Jones invoked this notoriously Blairite phrase to justify the new strategy on illicit drugs.

The past two governments have both used claims to evidence to brush off accusations of ideological policy making. The apparent gaps between these claims and the resulting policies have led to great scepticism. Some have stated that policy makers ignore research, and so create strategies that are “evidence-free”. Others suggest that they deliberately set out to use only the studies that suit their arguments, and so create “policy-based evidence”. In a recent article in the Journal of Social Policy I describe how both these claims fail to capture the complex reality of how civil servants use evidence.

In 2009, I spent six months working on an “out of academic life” placement as a policy adviser in a team at the heart of Whitehall. Surrounded by bright, industrious, fast-tracked young men (and a few similarly impressive women), I responded to requests for information, worked up policy proposals and generally took part in the policy-making process.

I soon found out that this was a world saturated with evidence. Huge volumes of government-collected data, independent reviews, think tank papers and academic articles – alongside media stories and examples from popular TV programmes (The Wire was a contemporary favourite) – were folded into our daily discussions on how to improve governmental action.

The problem was not a lack of evidence, but the sheer quantity of data and opinions that was available. And most of it was unsuitable for answering policy questions. Policy-makers want to know what the costs and effects of a policy option will be, and on whom they will fall. It is rare for research to provide definitive answers to these questions. A colleague spoke of a “depressingly similar pattern” of looking for high-quality evidence, but then ending up with some anecdotes and “what you can garner through a few field visits”.

Faced with a lack of conclusive evidence and an ongoing need to keep churning out the policy papers, my colleagues and I engaged in a process of “selling” policies. To be recognised by our peers and seniors for being useful and productive we had to get these policies accepted by all the departments who shared an interest in them. To achieve this, we paid great attention to narrative. What story is the policy telling? How can it be made more persuasive?

Given that most of these tales were being told in the form of PowerPoint packs, the best way to increase their impact was to include “killer charts”. These graphs typically reduced the world to no more than two variables, aiming to make the policy implications self-evident. They excluded the possibility of uncertainty. Uncertainty – as my colleagues let me know whenever I tried to insert academic caveats – is the enemy of policy-making.

Even where the evidence did lead to a particular conclusion, this did not mean that it would feed into policy. In a meeting on sentencing policy, I observed a well-informed debate using research on the effects of increased imprisonment. My colleagues eventually agreed that it was not the best use of public money, but they ruefully concluded that this evidence would not be welcomed by the ministers and special advisers of the day. So they left it out of their recommendations.

I found many examples where the scope of the policy debate was constrained by my colleagues’ anticipation of the kinds of knowledge that would be of use to their bosses. If civil servants hope for swift career progression, they have to develop useful links to people who are more senior in the hierarchy. To do this, they have to prove themselves to be useful.

Civil servants learn from their colleagues that certainty is more useful than accuracy, and action is better than contradiction. This means that – for the team I worked with at least – evidence was far more likely to be used if it fitted with the story that was already being told; a story that usually emerged from a complex interaction of the evidence with the interests of the politicians, special advisers and civil servants who were its joint authors.

This suggests that policy is not evidence-free. Neither is the chosen evidence simply policy-based. But there is systematic distortion in the use of evidence, in ways that suit the interests of the powerful social groups that constitute the British state. And this is certainly not “evidence-based policy”.

Alex Stevens is Professor in Criminal Justice at the University of Kent and author of Drugs, Crime and Public Health (Routledge, 2011).

This article was first published in Public Servant magazine.


Posted In: Academic communication | Government
