Open data and transparency have long been heralded as welcome innovations by policymakers and politicians, and the current Government has made them a priority at both national and local level. But when it comes to the latter, how effective has the policy been, and how much have citizens made use of it? Mark Frank argues that local authorities’ continued use of the ‘passive transparency’ model threatens to limit the potential of open data and transparency.

This piece originally appeared on Democratic Audit.

Over the course of the last decade governments around the world have been extolling the virtues of open data – publishing a wide range of government data on the web for anyone to view and reuse without financial, legal or technical restrictions. Open data differs from Freedom of Information legislation because it is proactive rather than reactive: data is published without being asked for. The coalition government supported this policy enthusiastically, and the UK is ranked number one in the world for prevalence and impact of open data according to the Open Data Barometer.

Chart 1: Country Rankings according to the Open Data Barometer 2013.


Many benefits have been claimed for governments publishing open data. One of the most important is that it increases transparency and thus makes government more accountable and efficient (a concept which goes back at least to Jeremy Bentham). Clearly this message appeals to a government committed to reducing public expenditure and the role of the public sector. As David Cameron put it shortly after coming to power in 2010:

“With a whole army of effective armchair auditors looking over the books, ministers in this government are not going to be able to get away with all the waste, the expensive vanity projects and pointless schemes that we’ve had in the past.”

This policy applied equally to local government. A few weeks later Eric Pickles, then the Secretary of State for Communities and Local Government (DCLG), announced a list of “recommended” datasets for local authorities to publish, known as the Local Government Transparency Code. Almost every primary and secondary authority in the country conformed to the recommendation. In 2014 the list was expanded and authorities were required by law to publish the expanded list.

So has publishing open data made authorities more transparent? It depends on what you mean by transparency. In 2011 Antti Halonen of the Finnish Institute in London surveyed local authorities on the new policy, with results that he summarised as “in theory yes, but in practice no”. The main problem was that the data were largely ignored:

“It depends on how you define successful. While we have produced the data, there appears to be minimal interest in it.”

In 2013 Ben Worthy of Birkbeck College, London conducted another survey of local authorities which confirmed this conclusion: 60% of the authorities reported usage as low or very low.

This message is further confirmed by my own research. In 2014 I explored several local authorities’ use of open data, interviewing politicians, officers and users. One authority in particular clearly illustrated the situation. It was among the first to publish open data and helped set the standards for the transparency code. There was no doubting the commitment to transparency throughout the organisation. Yet usage was low, as one officer explained:

“There are people like the armchair auditor … but they are pretty far and few between at the moment”

There are a couple of reasons why this is not surprising. One is the nature of the data. The transparency code emphasises data on expenditure (invoices over £500, senior officers’ salaries, details of contracts, etc). This type of data provides what David Heald calls input transparency – information about the resources the authority uses to provide services. While input transparency may appeal to a government cracking down on public expenditure and waste, most citizens are more interested in what Heald calls output and outcome transparency, i.e. what services the authority delivers and what effect those services have.

Parents care more about the lessons their children receive, and how well educated they are as a result, than about how much the authority spends on teachers’ salaries. Even those citizens who are keen to monitor inputs will presumably want to relate those inputs to outputs and outcomes. As chart 2 below shows, there is only one dataset in the transparency code that could reasonably be described as output transparency (the number of parking places) and none that could be described as outcome transparency.

Chart 2: Mapping the Data Transparency Code Against Heald’s Types of Transparency


The other reason is that authorities publish the data but make little effort to promote it. It is not easy to find the data on most authorities’ websites – much less understand it. As one officer explained:

“….I wouldn’t say we have necessarily been overt about publicising it either. So I think it has largely been driven by trying to make our lives easier in terms of requests rather than being overt about publicising that things are there”

Many authorities make a public commitment to transparency and, at least for this case study, the commitment was genuine. Councillors would regularly challenge officers to publish data unless there was good reason not to. As a result the authority published many more datasets than were required under the code (over 100 at the last count). Nevertheless the vast majority of datasets were input data, and there was little effort to promote them. I concluded that this arises from their concept of transparency – what Frederick Schauer calls passive transparency and Richard Oliver calls old transparency. Passive transparency is the view that transparency means making information about government available: government should not concern itself with who uses that information or for what purpose, and it is neither required nor appropriate for government to promote or interpret it. The metric of passive transparency is the amount of information made available – more is better. As the leader of the authority said when asked about the low usage of open data:

“there are books in libraries that don’t get borrowed for years and years, but then someone will want it someday”

Active transparency, by contrast, regards transparency as an act of communication to an audience. A good example is targeted transparency programmes, such as requiring restaurants to display stickers with the results of food hygiene inspections. Active transparency involves defining a problem to be addressed, identifying an audience to be informed, selecting the information to be communicated, and putting in place the socio-technical systems to communicate successfully and allow the audience to act on the information. Clearly open data can be part of active transparency, but it requires a lot more than making data available. It may even require creating the data needed to address the problem, e.g. through a survey or an inspection programme.

Open data is closely linked to passive transparency. Success is typically measured by the amount of data published – governments and departments make much of the number of datasets – and the golden rule is that all data should be published unless there is good reason not to. Many writers strongly resist the idea that government should select or interpret the data. It is not surprising that the result is mostly input transparency: input data is on the whole easier to obtain and publish than output or outcome data. It is easier to track how much you have spent on teachers than the number of lessons they have taught, or how well educated children are as a result. There are signs that this is changing. The National Information Infrastructure is a government initiative to select datasets, based on users’ needs, that are key to the nation and demand special attention. But the passive transparency model remains dominant, and while it does, open data is unlikely to have a significant political impact.

Featured image credit: LeRamz, CC BY NC SA 2.0

Note: This article gives the views of the author, and not the position of the Impact of Social Science blog, nor of the London School of Economics. Please review our Comments Policy if you have any concerns on posting a comment below.

About the Author

Mark Frank is a PhD student at the University of Southampton. His Southampton web page can be found here.
