The development of the “What Works Centres” demonstrates how providing evidence to help determine effective policies is far from being as straightforward as it may sound. Complex questions rarely yield simple solutions and getting those in power to act on findings remains an uphill struggle, explains Dan Corry.
Ten years ago, the government began setting up a family of “What Works Centres” to feed expert advice based on sound academic evidence into government and other decision-making. These centres aimed to bring together all the evidence in a particular area, and to commission more research where evidence was lacking. The centres were modelled on the National Institute for Health and Care Excellence (NICE), the equivalent body for the medical world, which aims to prevent taxpayers’ money being spent on ineffective or poor-value treatments.
Over the past decade, new What Works Centres have sprung up in areas as varied as education, policing, and local economic development. But has this network of centres, now numbering nine – plus several more loosely associated ones – worked? And if not, why not?
Not an exact science
Some problems are inherent when you move away from hard medicine and hard science into social issues. Social issues are about people, and people are complicated. The data and evidence often turn out to be ropey, hard to interpret and unsuited to unequivocal rankings of “what works”. Critics are right to ask how replicable the programmes tested are, even when they have supposedly been given the green light by high-end evaluations such as randomised controlled trials (RCTs). In the social sphere, a programme to help young people with mental health issues or to regenerate a town may do brilliantly at a given time and place but fail dismally in another context.
Second, many centres struggle with funding. Doing all this well is not cheap: NICE’s annual expenditure is some £75 million, while some of the newer What Works Centres – like those for wellbeing or early intervention – get by on a few million pounds at most. Top-end RCT approaches are notoriously expensive, which is why we should be making much more use of the administrative data that government holds to undertake rapid, relatively low-cost, high-quality evaluations – as the Ministry of Justice’s Justice Data Lab has already demonstrated and the Department for Work and Pensions’ Employment Data Lab is now beginning to do.
Some What Works Centres are decently funded through an endowment via government or quasi-government bodies (like those for Education, Ageing and Youth), but many others struggle to make ends meet. Few taxpayers would put “What Works Centres” on their priority list, yet in times of tight spending constraints there is a strong argument that these centres should receive more investment so that money is not wasted on ineffective programmes.
We need to be careful here in terms of governance. The government might want to lean a bit more on those centres with fewer resources whose impact is unclear, but while centres should always be looking at their impact and cost effectiveness, requiring them to demonstrate to some central bureaucracy that they have delivered “expected standards” or risk being struck out of the What Works network would surely be a mistake. The minute these centres do not look 100 per cent independent from government in what they investigate and say, the game is up.
Third, even where What Works Centres do learn what works, that doesn’t mean politicians will listen to them. The era of populism and fake news was hardly the atmosphere that we evidence-based policy wonks hoped to engender. There is a fundamental issue here about the degree to which any of us make evidence-based decisions, let alone politicians seeking re-election. If the evidence says one thing and the opinion polls another, which would you choose?
More charitably, cash-strapped decision-makers do not – and cannot – act on the evidence What Works Centres churn out when it points to an approach that costs more in the short term than they can afford, whatever its longer-term impact. This is why front-line leaders – headteachers, doctors, social workers – often fail to take their messages on board, as Jonathan Breckon, founder of the now-defunct Alliance for Useful Evidence, argues in a recent podcast for New Philanthropy Capital (NPC).
Fourth, the What Works movement has won only marginal gains which, though useful, are not game-changing. So, while the College of Policing has been taught to send speeding drivers letters with data on how many children are run over by speeding vehicles, and the education centre has found good and bad ways to use teaching assistants, the “mega” questions remain unanswered.
A worthwhile investment
This is partly because big questions are difficult to answer. Are academy schools really working? Can economic regeneration work without serious demand for more jobs? Would more tech in schools be better than more teachers? How do we avoid locking people up on short sentences when we know they do not reduce reoffending? How can we improve jobs and housing to boost wellbeing? But it is also because these questions inevitably challenge the government of the day. What Works Centres are rarely bold enough to say loudly that a major government policy is not supported by the evidence at all, which is arguably why more fully independent bodies like the Institute for Fiscal Studies (IFS) have greater impact.
This does not render What Works Centres a pointless experiment. While they rarely tell us something nobody ever suspected, they do help codify what works and help us avoid the false stories of ideologues and snake-oil sellers. The small wins matter: the finding that how we use teaching assistants really matters means those millions will be spent much better, a gain that probably far outweighs the cost of all the What Works Centres. And if the Treasury insisted in the next spending round that every penny of expenditure had to be supported by evidence, this would lead to much better resource allocation.
So what’s the verdict? What Works Centres have definitely proved their value, even if they have not been revolutionary. If in the next period they are better funded, braver in challenging how we spend taxpayers’ money, and shout from the rooftops when anybody – public, private or voluntary sector – does things that go against the evidence, then we will have reached a new and positive era.
Note: A government event to celebrate 10 years of What Works was held on April 17th attended by government ministers, Chair of the What Works Network Ian Diamond, and some centre leaders.
All articles posted on this blog give the views of the author(s), and not the position of LSE British Politics and Policy, nor of the London School of Economics and Political Science.
Image credit: UX Indonesia via Unsplash.