
Sierra Williams

April 23rd, 2014

Are 90% of academic papers really never cited? Reviewing the literature on academic citations.


It is widely accepted that academic papers are rarely cited or even read. But what kind of data lies behind these assertions? Dahlia Remler takes a look at the academic research on citation practices and finds that whilst it is clear citation rates are low, much confusion remains over the precise figures and the methods used to produce them. In her investigation, Remler asks whether academic research itself can answer these questions, and finds that expert evaluation has indeed correctly discredited an overblown claim that resulted from embellished journalism.

“90% of papers published in academic journals are never cited.” This damning statistic from a 2007 overview of citation analysis recently darted about cyberspace. A similar statistic had made the rounds in 2010, but that time it was about 60% of social and natural science articles that were said to be uncited. Neither statistic came with a link to supporting academic research papers.

That lack of support was a problem for me. I did not doubt the basic truth that many academic papers are uncited. But to be sure 90% was not an urban legend, and to learn the context and caveats, I needed to find the original research paper. I was not the only one who wanted the supporting evidence. So I dove into Google Scholar, searching the disparaged academic literature for articles on academic citation rates.

What’s the truth?

Many academic articles are never cited, although I could not find any study with a result as high as 90%. Non-citation rates vary enormously by field. “Only” 12% of medicine articles are not cited, compared to about 82% (!) for the humanities. It’s 27% for natural sciences and 32% for social sciences (cite). For everything except humanities, those numbers are far from 90% but they are still high: One third of social science articles go uncited! Ten points for academia’s critics. Before we slash humanities departments, though, remember that much of their most prestigious research is published in books. On the other hand, at least in literature, many books are rarely cited too.

The uncited rate is also sensitive to other factors: how long a window is used to check for citations (e.g., five years); when the article whose citations are being counted was published (2000s or 1990s); and what counts as a citation. The uncited rates I gave as “the” rates are really five-year rates for journals indexed in Thomson Reuters’ Web of Science, which is not comprehensive. The details of whether to include self-citations, non-academic articles, and so on, also matter.
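To make the window effect concrete, here is a small Python sketch with entirely hypothetical numbers (the lag data and the helper function are my illustration, not taken from any of the studies discussed here):

```python
# Hypothetical data: for each article, years elapsed between publication
# and its first citation; None means the article has never been cited.
first_citation_lags = [0, 1, 1, 2, 3, 4, 7, None, None, None]

def uncited_rate(lags, window_years):
    """Share of articles with no citation within `window_years` of publication."""
    uncited = sum(1 for lag in lags if lag is None or lag >= window_years)
    return uncited / len(lags)

for window in (2, 5, 10):
    print(f"{window:>2}-year window: {uncited_rate(first_citation_lags, window):.0%} uncited")
# Output:
#  2-year window: 70% uncited
#  5-year window: 40% uncited
# 10-year window: 30% uncited
```

The same ten articles come out anywhere from 30% to 70% uncited depending only on the window, which is why headline figures quoted without their measurement window are so hard to compare.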

Image credit: Citation Needed by futureatlas.com (Wikimedia, CC BY)

Another reason for the various uncitedness rates floating around is confusion between the average citation rate of a journal and the citation rates of its individual articles. Within a given journal, some articles have many citations, others have few, and many have zero: citations within a journal are highly skewed. The journal’s average citation rate, its impact factor, is pulled up by the few articles with many citations. Focusing on the impact factor therefore makes it seem like more articles get cited than actually do. Ironically, a Chronicle of Higher Education article bemoaning the low rate of citations understated its case by assuming the average citation rate for journals applied to individual articles.
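A toy calculation (with synthetic citation counts, not real journal data) makes the skew point concrete: a handful of heavily cited articles can give a journal a respectable mean even when most of its articles are never cited.

```python
# Synthetic journal: 100 articles with a heavily skewed citation distribution.
citations = [0] * 60 + [1] * 20 + [5] * 15 + [80] * 5

mean_citations = sum(citations) / len(citations)   # impact-factor-style average
uncited_share = citations.count(0) / len(citations)

print(f"mean citations per article: {mean_citations:.2f}")  # 4.95
print(f"share never cited:          {uncited_share:.0%}")   # 60%
# Treating the mean of ~5 citations as typical would suggest nearly every
# article gets cited; in fact 60% of these articles have zero citations.
```

Inferring article-level citation rates from a journal-level average fails for exactly this reason.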

Clearly, academic articles have a serious problem. But my experiences highlighted another bad thing about academic articles—and a really good thing.

I had a hard time finding the rates at which articles were uncited, because the overwhelming majority of relevant articles were about other things, such as the effect of time windows, different inclusion criteria for citations, whether the Internet has changed citation practices, and so on. Those are all good things to investigate, but in the grand scheme of things, they are not as important as the large share of articles going uncited altogether. Another point for academia’s critics, who contend that academics worry about small things no one else cares about and miss the big things.

But my experience also showed what’s great about academic articles. You get to learn how people reached their conclusions and judge the methods yourself. You also get the assurance that knowledgeable people paid attention to how things were done and the validity of the conclusions. Contrast the accuracy and information in the academic articles that I have linked to with the figures from non-academic outlets that darted around the Internet.

And what about that 90% figure? It came from an article by an expert in citation analysis, Lokman Meho, then at Indiana University, in Physics World, a member magazine of the Institute of Physics. No wonder it got (inaccurately) described in cyberspace as a “study at Indiana University.”

Meho explained the 90% by email, “The first paragraph of the article was written by the editor of the magazine and not me. If I recall correctly, he got the figures from/during a lecture he attended in the UK in which the presenter claimed that 90% of the published papers goes uncited and 50% remains unread other than by authors/referees/editors.” Meho noted that the 90% figure was about right for the humanities but not other fields.

Five points for academia’s supporters. No editor could do anything remotely like that to an academic article. (It’s bad journalism too.) In academic articles, all methods are explained and all claims are supposed to be evaluated by other experts.

Academic publication needs fixing. Even the 12% uncited rate for medicine seems large to me, particularly given what medical research costs. The one-third rate for social science and more than 80% for humanities are really troubling. But whatever we do, let’s preserve somewhere what’s good about academic articles—full descriptions of methods and expert evaluation.

This piece originally appeared on Dahlia Remler’s personal blog and is reposted with the author’s permission.

This post was updated on 1 November 2016 to correct an erroneous reference.

Note: This article gives the views of the author, and not the position of the Impact of Social Science blog, nor of the London School of Economics. Please review our Comments Policy if you have any concerns on posting a comment below.

About the Author

Dahlia Remler is Professor at the School of Public Affairs, Baruch College, and the Department of Economics, Graduate Center, both of the City University of New York. She is also a Research Associate at the National Bureau of Economic Research.

