The eventual replication of the Reinhart-Rogoff paper on the 90% debt/GDP threshold has sparked vibrant discussion about the impact of error-ridden research on austerity policies around the world. Velichka Dimitrova argues that this controversy highlights the importance of opening up economics datasets. Coding errors happen, yet the greater research problem was not allowing other researchers to review and replicate the results by making the data openly available as early as possible.
Another economics scandal made the news last week. Harvard Kennedy School professor Carmen Reinhart and Harvard University professor Kenneth Rogoff argued in their 2010 NBER paper that economic growth slows down when the debt/GDP ratio exceeds the threshold of 90 percent of GDP. These results were also published in one of the most prestigious economics journals – the American Economic Review (AER) – and had a powerful resonance in a period of serious economic and public policy turmoil when governments around the world slashed spending in order to decrease the public deficit and stimulate economic growth.
Yet they were proven wrong. Thomas Herndon, Michael Ash and Robert Pollin from the University of Massachusetts (UMass) tried to replicate the results of Reinhart and Rogoff and criticised them on three grounds:
- Coding errors: due to a spreadsheet error, five countries were excluded entirely from the sample, producing a significant error in the average real GDP growth and the debt/GDP ratio in several categories
- Selective exclusion of available data and data gaps: Reinhart and Rogoff exclude Australia (1946-1950), New Zealand (1946-1949) and Canada (1946-1950). This exclusion alone is responsible for a significant reduction of the estimated real GDP growth in the highest public debt/GDP category
- Unconventional weighting of summary statistics: the authors do not discuss their decision to weight equally by country rather than by country-year, a choice which could be arbitrary and which ignores the issue of serial correlation (see the sketch after this list).
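To make the weighting point concrete, here is a minimal Python sketch using entirely made-up growth figures (not the actual Reinhart-Rogoff data). Weighting equally by country lets a country with a single observed year count as much as a country with many:

```python
# Hypothetical example: how weighting by country vs. by country-year
# changes the average real GDP growth reported for a debt category.
# The country names and numbers below are invented for illustration.

growth = {
    "Country A": [3.0, 2.5, 2.8],  # three country-years in the high-debt bin
    "Country B": [-0.5],           # a single country-year
}

# Equal weight per country: average each country's own mean.
by_country = sum(sum(v) / len(v) for v in growth.values()) / len(growth)

# Equal weight per country-year: pool all observations.
pooled = [g for years in growth.values() for g in years]
by_country_year = sum(pooled) / len(pooled)

print(f"weighted by country:      {by_country:.2f}%")       # 1.13%
print(f"weighted by country-year: {by_country_year:.2f}%")  # 1.95%
```

Even with these toy numbers the two schemes differ by almost a percentage point; in a real panel where countries contribute very different numbers of years, the choice of weighting can move the headline average substantially.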
These results imply that countries with high levels of public debt experience only “modestly diminished” average GDP growth rates; as the UMass authors show, there is a wide range of GDP growth performances at every level of public debt among the twenty advanced economies in Reinhart and Rogoff’s sample. Even though the negative trend is still visible in the UMass results, the data fit the trend very poorly: “low debt and poor growth, and high debt and strong growth, are both reasonably common outcomes.”
What makes the news even more compelling is that it is all a tale from the state of Massachusetts: distinguished Harvard professors (the #1 university in the US) challenged by empiricists from the lesser-known UMass (#97 in the US). Moreover, despite the excellent AER data availability policy – which acts as a role model for other journals in economics – the AER failed to enforce it and to make Reinhart and Rogoff’s data and code available to other researchers.
Coding errors happen, yet the greater research problem was not allowing other researchers to review and replicate the results by making the data openly available. If the data and code had been available upon publication in 2010, it might not have taken three years to prove these results wrong – results which may have influenced the direction of public policy around the world towards stricter austerity measures. Sharing research data makes replication and discussion possible, enabling the scrutiny of research findings as well as the improvement and validation of research methods through more scientific enquiry and debate.
The Open Economics Working Group of the Open Knowledge Foundation advocates the release of datasets and code along with published academic articles and provides practical assistance to researchers who would like to do so. Get in touch if you would like to learn more by writing to us at economics [at] okfn.org and signing up for our mailing list.
Link to paper
Link to data and code
Note: This article gives the views of the author, and not the position of the Impact of Social Science blog, nor of the London School of Economics.
Velichka Dimitrova is project coordinator of Open Economics at the Open Knowledge Foundation. She is based in London, a graduate of economics (Humboldt Universität zu Berlin) and environmental policy (University of Cambridge) and a fellow of the Heinrich Böll Foundation. She can be found on Twitter at @vdimitrova
Has anyone taken a lab course in secondary or tertiary education, such as chemistry? Was it a standard requirement that you hand in your data notebook with the data included?
It seems the classroom standard is higher than the current custom in academic publishing.
Today, the data for almost any paper can be made available through the internet.
There are many discussions on the web about requiring data to accompany all research publications. This is a very good idea. Why isn’t making data public a more common or uniform practice, just as it is in the classroom?
I remember reading R&R’s work months ago. At that time professor Reinhart had the data freely available at this address: http://www.carmenreinhart.com/data/ I did not look at the data then, but noted the link in case I wanted to look into their work further.
Sharing data publicly could have additional benefits beyond “Revisiting Reinhart-Rogoff”. Of course, the data made available by professor Reinhart at the link provided allows others to make discoveries and run checks on the data.
I was impressed that professor Reinhart made data freely available.
(I have never taken any classes from professors Reinhart or Rogoff, nor had any contact with them other than looking at their work on the internet.)
Good article, Velichka Dimitrova.
There is a detectable rise in interest – and concern – about the robustness, competence and transparency of much of what has been and is being passed off as ‘research’. That this is now being articulated in all manner of fora and channels (such as this excellent piece) has to be all to the greater good.
We need, however, to beware of how transparency from the outset can carry other costs with it. For example, the less-than-competent and seemingly ideology-driven responses from right-wing sources to ‘The Spirit Level’ by Wilkinson & Pickett were instructive. Wilkinson & Pickett were clearly being perceived as heretics against what was then still the dominant right-wing neo-liberal worldview (for which the Reinhart-Rogoff paper was taken as part of the contemporary basis).
Some of the attempted critiques of The Spirit Level came from odd sources (for example, a somewhat opaque body, ‘The Taxpayers Alliance’). I believe that in the end the authors had to decline to engage with anyone other than researchers who had published peer-reviewed work.
We’ve discussed this on the Yahoo group of the European Spreadsheet Risk Interest Group, of which I’m the current chair. Our next conference is in Greenwich on 4-5 July.
I agree with her main point:
1) The authors did not catch the error, and only when they released the spreadsheet did others find it.
Lesson: review by a different pair of eyes can catch errors before they cause loss or embarrassment.
“Peer review” is the gold standard in academic research, and the best self-protection for business users of spreadsheets.
Here’s a story from Edward Krudy of Reuters, quoted in the Toronto Star: the person who checked the R&R study also had his own finding checked:
http://www.thestar.com/business/2013/04/18/student_finds_glaring_spreadsheet_errors_in_study_used_to_justify_budget_slashing.html
Student finds glaring spreadsheet errors in study used to justify budget slashing
“I almost didn’t believe my eyes when I saw just the basic spreadsheet error,” said Herndon, 28. “I was like, am I just looking at this wrong? There has to be some other explanation. So I asked my girlfriend, ‘Am I seeing this wrong?’” His girlfriend, Kyla Walters, replied: “I don’t think so, Thomas.”
2) Like most spreadsheet creators, they possibly did not realise at the start how important the spreadsheet would become; in their case, it was frequently cited in support of austerity and arguably caused more grief to the world at large than other spreadsheet problems that have merely cost millions. (See http://www.eusprig.org/horror-stories.htm )
Lesson: The more important something is, the more care you have to take with it.
3) Technically, the error could have been spotted by simple tests, such as pressing Ctrl+[ on a formula to show what cells feed into the total.
There are many software tools, such as XLTEST, to point out structural flaws.
There is much guidance on safer spreadsheet construction in books such as “Spreadsheet Check and Control”, “Spreadsheet Safe”, etc.
In the business world, common checks include cross-total balances and reconciliations.
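To show the cross-total idea outside a spreadsheet, here is a minimal Python sketch (the table and figures are made up): the grand total is computed two independent ways, and a dropped row or column makes the two disagree – the same class of error as in the R&R spreadsheet.

```python
# Cross-total (reconciliation) check on a small table of invented figures.
# Row totals and column totals must yield the same grand total; if a row
# or column is accidentally omitted, the two grand totals diverge.

table = {
    "Country A": [3.0, 2.5, 2.8],
    "Country B": [-0.5, 1.2, 0.9],
    "Country C": [2.1, 1.8, 2.4],
}

row_totals = [sum(rows) for rows in table.values()]
col_totals = [sum(col) for col in zip(*table.values())]  # transpose, then sum

# Reconcile the two grand totals (tolerance covers float rounding).
assert abs(sum(row_totals) - sum(col_totals)) < 1e-9, "cross-totals disagree"
print("cross-totals reconcile:", round(sum(row_totals), 2))
```

The same pattern scales to real spreadsheets: one grand total computed down the rows, another computed across the columns, and a cell that flags any difference between them.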
Eusprig papers are published at
http://www.eusprig.org/conference-abstracts.htm