Academic researchers – not just media pundits – should have their say in holding policy promises to account. Jonathan Breckon charts the various activities around the country aimed at providing a rigorous evidence base in the run-up to the UK’s General Election. A whole range of economists, statisticians, social scientists and others are fact-checking what politicians and pundits say so that they don’t get away with iffy promises or sound-bites. But the challenge amid the boom in fact-checkers is getting the quality right.
You may be despairing of some of the current evidence-free promises made by Government and Opposition. But this week an Alliance for Useful Evidence policy seminar discussed ways for academic social scientists to rebut dodgy statements in the lead-up to May. The panel and audience explored the role of evidence in the general election and saw the launch of the Alliance for Useful Evidence-backed Manifesto Check, run by The Conversation UK. Manifesto Check will use university-based academic experts (I’m afraid researchers from think-tanks and campaigning groups are excluded). To control for experts’ biases, The Conversation is also planning some blind peer review – where social scientists evaluate the quality of other researchers’ work without knowing each other’s identities.
During the election, Manifesto Check will be one of many voices; we will see a whole range of economists, statisticians, social scientists and others fact-checking what politicians and pundits say. The UK’s Full Fact will be checking claims around health, immigration, education and other areas of social policy. They’ve set up a ‘war room’ of staff and volunteers who will work 18 hours a day in the run-up to voting day (do ask any interested colleagues and students to sign up too).
Image credit: BurnAway (Flickr CC BY 2.0)
The fact-checking trend kick-started in the 2012 American presidential election, with organisations like FactCheck.org and the Pulitzer Prize-winning PolitiFact. These bodies publicised false claims by Barack Obama’s and Mitt Romney’s teams, which had to employ full-time staff to respond to the fact-checkers. It’s now a global movement with 80 fact-checking organisations. The challenge in this boom of fact-checkers is getting the quality right. Any major media outlet may set something up in the weeks before the Election. But quality may be compromised if it’s done from a ‘standing start’, warned Will Moy, Director of Full Fact, at our planning roundtable last year.
Hopefully getting academics involved in the process will encourage high standards. But experts are all fallible, and robust quality controls are needed. The Channel 4 News FactCheck blog crowd-sources some of its expertise to make sure the public are feeding in and nothing is missed. Others are taking a more campaigning approach. Based at Sense about Science, Evidence Matters is asking researchers – as members of the public and as voters – to hold candidates to account through their Ask for Evidence campaign. They will celebrate politicians who get it right, expose those who don’t, and have a public debate about the uncertainty in between.
Some of the well-established research outfits will also get into election mode. The macroeconomic implications of parties’ fiscal plans have been looked at by the National Institute of Economic and Social Research – the only independent analysis so far of this important topic. Despite the oncoming purdah, the Government’s Office for National Statistics is not shying away from getting data out to voters, producing weekly releases on UK subjects likely to be relevant to the election campaign. And if you want a Coalition post-mortem, the Institute for Fiscal Studies has launched its new election website, funded by the Nuffield Foundation (who also fund Full Fact’s election work), with analysis of what has happened over this parliament – and the implications of the different parties’ fiscal policies.
The push for smarter use of evidence continues after the election. The Evidence Information Service plans to do some rapid match-making between MPs and thousands of academics. The fresh intake of MPs will be given training in evidence by bodies like the Parliamentary Office of Science and Technology (POST), which has a dedicated Social Science Section. POST is closely involved in House of Commons-wide efforts around induction and continuing professional development, to ensure that evidence has an appropriate profile across a range of activity. For instance, to help with MPs’ scrutiny of policy, POST is working with the Parliamentary Scrutiny Unit, the Commons Library and external partners to develop an interactive session, open to all MPs, on using research evidence. POST is also piloting ‘on-demand’ briefings in key policy areas, and will be running horizon scanning on trends in society, technology, the environment, economics and politics.
To improve advice to the newly-minted ministers in the next Parliament, the Campaign for Science and Engineering – and its sister body, the Campaign for Social Science – have been making a strong case for bolstering academic advice in each Government department. The social scientists recently launched their own election demands in The Business of People: The Significance of Social Science Over the Next Decade, asking that their disciplines be embedded in any strategy for science and innovation worth its name. The Royal Statistical Society has a data manifesto for those next in power. They are campaigning around the issue of pre-release access, whereby some Ministers and officials can see statistics before everyone else. New research from NatCen shows the majority of the public think that everyone should see the stats at the same time.
But how much can social science evidence really help in the highly contested realms of social policy? Will, say, systemic reforms to welfare policy have research to back them up? Will tinkering with how schools are organised do any good for children? There are no easy answers, and sometimes the best we can do is to provide caveats and warnings of complexity. A message of ‘voters beware’.
Yet academic researchers – not just media pundits – should have their say. British social scientists have an outstanding record of studying elections. They have also helped inspire new policies and write manifestos, from sociologist Michael Young’s drafting of Labour’s Let Us Face the Future in 1945 to political scientist Andrew Dobson’s authoring of the 2010 Green Manifesto. As well as such direct involvement, researchers now have the means to inform the public about what we know – and don’t know. It’s an opportunity to get off the sidelines – so that politicians or journalists don’t get away with iffy promises or sound-bites.
Note: This article was originally published on the LSE’s Impact of Social Sciences blog and gives the views of the author, and not the position of the British Politics and Policy blog, nor of the London School of Economics. Please review our Comments Policy if you have any concerns on posting a comment below.
Jonathan Breckon is Head of the Alliance for Useful Evidence. The Alliance champions the use of evidence in social policy and practice. It is an open-access network of over 2,000 individuals from across government, universities, charities, business and local authorities. It is funded by ESRC, Nesta and the Big Lottery Fund. @A4UEvidence