It is widely accepted that academic papers are rarely cited or even read. But what kind of data lies behind these assertions? Dahlia Remler takes a look at the academic research on citation practices and finds that whilst it is clear citation rates are low, much confusion remains over precise figures and methods for determining accurate citation analysis. In her investigation, Remler wonders whether academics are able to answer these key questions. In the end, expert evaluation did discredit the overblown claim, which turned out to be the product of embellished journalism.
“90% of papers published in academic journals are never cited.” This damning statistic from a 2007 overview of citation analysis recently darted about cyberspace. A similar statistic had made the rounds in 2010 but that time it was about 60% of social and natural science articles that were said to be uncited. Neither statistic came with a link to supporting academic research papers.
That lack of support was a problem for me. I did not doubt the basic truth that many academic papers are uncited. But to be sure 90% was not urban legend and to learn the context and caveats, I needed to find the original research paper. I was not the only one who wanted the supporting evidence. So, I dove into Google Scholar, searching the disparaged academic literature for articles on academic citation rates.
What’s the truth?
Many academic articles are never cited, although I could not find any study with a result as high as 90%. Non-citation rates vary enormously by field. “Only” 12% of medicine articles are not cited, compared to about 82% (!) for the humanities. It’s 27% for natural sciences and 32% for social sciences (cite). For everything except humanities, those numbers are far from 90% but they are still high: One third of social science articles go uncited! Ten points for academia’s critics. Before we slash humanities departments, though, remember that much of their most prestigious research is published in books. On the other hand, at least in literature, many books are rarely cited too.
The uncited rate is also sensitive to other factors: how long a window is used to check for citations (e.g., 5 years); when the article whose cites are being counted was published (2000s or 1990s); and what counts as a citation. The uncited rates I gave as “the” rates are really five-year citation rates in all Thomson’s Web of Science journals, and that is not comprehensive. The details of whether to include self-citations, non-academic articles, and so on, also matter.
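To make the window's effect concrete, here is a minimal sketch in Python. The records, and the `Article` and `uncited_rate` names, are invented for illustration; this is not the method of any of the studies discussed here.

```python
# Minimal sketch: how the measured uncited rate depends on the citation window.
# All records below are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class Article:
    published: int                                        # publication year
    citation_years: list = field(default_factory=list)    # years it was cited

def uncited_rate(articles, window):
    """Share of articles with no citation within `window` years of publication."""
    uncited = sum(
        1 for a in articles
        if not any(a.published <= y <= a.published + window for y in a.citation_years)
    )
    return uncited / len(articles)

articles = [
    Article(2000, [2002, 2004]),
    Article(2000, [2007]),   # first cited 7 years out: a "late bloomer"
    Article(2000),           # never cited at all
]
print(uncited_rate(articles, 5))   # 0.666... (two of three look uncited)
print(uncited_rate(articles, 10))  # 0.333... (the late bloomer now counts)
```

The same three articles yield an uncited rate of two thirds under a five-year window but one third under a ten-year one, which is why the window must always be reported alongside the rate.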
Image credit: futureatlas.com (Wikimedia, CC BY)
Another reason for the various uncitedness rates floating around is confusion between the average citation rates of journals and citation rates of articles. Within a given journal, some articles have many citations while others have few and many have zero—citations within a given journal are skewed. The average rate of citations for a whole journal, the impact factor, is pulled up by the few articles with many citations. Focusing on the impact factor will make it seem like more articles get cited than actually do. Ironically, a Chronicle of Higher Education article bemoaning the low rate of citations understated its case by assuming the average citation rate for journals applied to articles.
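A toy example, with all numbers invented, shows how far apart the journal-level average and the article-level picture can sit when the distribution is skewed:

```python
# Minimal sketch: a journal's mean citation rate (its "impact factor")
# versus the share of its articles that are never cited at all.
# The per-article counts below are invented for illustration.

citations = [0, 0, 0, 0, 0, 1, 1, 2, 3, 43]  # citations per article, heavily skewed

mean = sum(citations) / len(citations)
uncited_share = sum(c == 0 for c in citations) / len(citations)

print(f"mean citations per article: {mean:.1f}")              # 5.0
print(f"share of articles never cited: {uncited_share:.0%}")  # 50%
```

One heavily cited article pulls the mean up to five citations per article even though half the articles were never cited, which is exactly the mistake of reading a journal's average as if it applied to a typical article.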
Clearly, academic articles have a serious problem. But my experiences highlighted another bad thing about academic articles—and a really good thing.
I had a hard time finding the rates at which articles were uncited, because the overwhelming majority of relevant articles were about other things, such as the effect of time windows, different inclusion criteria for citations, whether the Internet has changed citation practices and so on. Those are all good things to investigate, but in the grand scheme of things, they are not as important as the large share of articles going uncited altogether. Another point for academia’s critics, who contend that academics worry about small things no one else cares about and miss the big things.
But my experience also showed what’s great about academic articles. You get to learn how people reached their conclusions and judge the methods yourself. You also get the assurance that knowledgeable people paid attention to how things were done and the validity of the conclusions. Contrast the accuracy and information in the academic articles that I have linked to with the figures from non-academic outlets that darted around the Internet.
And what about that 90% figure? It came from an article by an expert in citation analysis, Lokman Meho, then at Indiana University, in Physics World, a member magazine of the Institute of Physics. No wonder it got (inaccurately) described in cyberspace as a “study at Indiana University.”
Meho explained the 90% by email, “The first paragraph of the article was written by the editor of the magazine and not me. If I recall correctly, he got the figures from/during a lecture he attended in the UK in which the presenter claimed that 90% of the published papers goes uncited and 50% remains unread other than by authors/referees/editors.” Meho noted that the 90% figure was about right for the humanities but not other fields.
Five points for academia’s supporters. No editor could do anything remotely like that to an academic article. (It’s bad journalism too.) In academic articles, all methods are explained and all claims are supposed to be evaluated by other experts.
Academic publication needs fixing. Even the 12% uncited rate for medicine seems large to me, particularly given what medical research costs. The one-third rate for social science and more than 80% for humanities are really troubling. But whatever we do, let’s preserve somewhere what’s good about academic articles—full descriptions of methods and expert evaluation.
This piece originally appeared on Dahlia Remler’s personal blog and is reposted with the author’s permission.
This post was updated on 1 November 2016 to correct an erroneous reference.
Note: This article gives the views of the author, and not the position of the Impact of Social Science blog, nor of the London School of Economics. Please review our Comments Policy if you have any concerns on posting a comment below.
Dahlia Remler is Professor at the School of Public Affairs, Baruch College, and the Department of Economics, Graduate Center, both of the City University of New York. She is also a Research Associate at the National Bureau of Economic Research.
THANK YOU THANK YOU THANK YOU.
I did my own digging back when this bit of misinformation was floating around on Twitter, but stopped short of writing a post or emailing Lokman.
Learning that he did not include the “90% of citations” line made me very, very happy. I knew he was a better scholar than that!
I think looking at citations over a five-year period, with regard to the humanities at least, leads to pretty misleading figures. Unless one is researching a popular topic area of a field, there may be only one or two scholars publishing in a given niche at any given time. Also, humanities publishing tends to work on a much longer cycle than medicine or the hard sciences, which compounds the problem of five years being a short window. In an area where I’ve been working recently, the only other scholars to touch it in the last century are (a) dead or (b) retired.
Really interesting article. Good to know that the 90% figure is not correct, or even close, in most areas.
I’d be interested to know what you think an acceptable non-citation rate for articles would be? There is always going to be a certain amount of attrition. Does too much focus on getting a certain number of citations restrict the type of research that people put out? Would it inhibit creativity and direct research only to areas that are ‘trendy’ and cite-able? Is that what we want?
The author has a good point, but we should also question measuring the impact of research by academic citations. I have been using ResearchGate (http://www.researchgate.net) for some time already, and this service has an interesting feature: it counts how many times our contributions are viewed (as well as how many times they are downloaded, if you use the site as an article repository). Reviewing my own figures, I notice that one of my most viewed articles is a paper published in conference proceedings which will hardly ever get cited.
Agreed. Academic citations presume that someone is interested in studying a related theme and that this person had enough time and resources to write another scientific article about it. Why should the importance of an article be measured only by this metric? My guess is that it is done because it is convenient, not because it is correct.
A couple of folk have touched on what is increasingly being termed “altmetrics”: that is, non-formal citations/accesses/downloads/reads of a given article, which can give a much more accurate view on how something has influenced the field – or the blogosphere / Twittersphere.
Bottom line: citations give only a very narrow window onto the impact of any given written thing.
There’s also an assessment problem, because it’s highly likely that many papers and books are read and influence the reader to one degree or another but are not cited. Some journals even limit the number of citations you can use in your paper. And does anyone really provide citations for all points made? Doing so would, in many cases, lead to every page of a paper having hundreds of citations listed. And each year it would be more and more. Pretty soon you’d be reading a book that contained one paper, with 10 times more citations than text. 🙂
Excellent post on the importance of proper attribution and the use of hyperbole in journalism to attract clicks. At least in medicine, however, I suspect that the rate is higher than 12% if you account, as you mentioned, for self-citations. Most researchers make a point of citing their own papers in future publications, and often that is the extent of the citations some articles get. It would be very interesting to know the number if self-citations were eliminated.
82% of humanities articles are never cited?! What (and why) are you working (on)? Let’s make this fact even more disturbing by reframing the question: how many of the cited articles are actually read? If you are in the humanities (for social science it’s one third of articles uncited) and respond with a “why should I care”, you DO have a problem. And if this seems rather discouraging to you, why don’t you start thinking about how to reach your readership with us? #TheArtOfWriting #SocialMedia #NarrativesMatter Join this discussion at SIEF 2015 http://transformations-blog.com/call-for-papers-engaged-an…/ and, of course, get engaged with us 🙂 http://transformations-blog.com/become-au…/why-write-for-us/
Sorry, one of the links of my last reply was broken. Here you go 🙂 http://transformations-blog.com/call-for-papers-engaged-anthropology-reality-necessity-utopia/
I have to make a case for the humanities. How did you do the research? Most of the articles in my field, which falls into the humanities, are never published online or as a PDF/ebook version, so you could not have them in a searchable database, and I find it hard to imagine researching that in a non-digital way. I think this factor might push the rate for the humanities up a little, but I wouldn’t know how to estimate by how much.
Do the rates include book reviews, editorials and meeting summaries? These are often of great service to a particular field, but they rarely get cited. We have a paradox: our peers (or at worst, editors looking for a credible reviewer) look to us to provide reviews, editorials, etc., but we hurt our own citation record if we do them. Upshot: only an altruist, a fool or someone with an agenda should bother with these sorts of publication.
“Many of the things you can count, don’t count. Many of the things you can’t count really count.” – attributed to A. Einstein
Most citation engines also only work in English, so if you publish in foreign languages, as some humanities scholars do, that work is not included in the citation search. There are also different privacy and copyright standards for putting printed materials on the web in different countries, which will make things more difficult for scholars who publish in different countries.
Thanks for the note above about publishing in a niche (most of the scholars in many such areas are dead or retired, which is why we are working in the area): the works need to be re-edited according to modern standards.
One should also add that reference works are frequently not attributed, whether web or printed. So writing an encyclopedia article, however learned, gets nothing.
Is part of the Humanities result perhaps due to the low numbers of active researchers, when compared with other larger disciplines? Perhaps the Humanities result is skewed because an active Humanities researcher’s annual output of articles is generally lower than that found in other disciplines (where shorter articles often have voluminous bibliographies), hence they have less opportunity for article citations? The temporary fashion that has caused the decline of extensive footnotes in the Humanities may also play a part. All these are just my guesses. But I’d love to see a highly trained statistician look at your article and comment on any need to weight for factors such as the size of the respective researcher populations, and the number of published articles expected each year by each group.
Nice job checking this out! Your last paragraph contains an interesting statement: “Even the 12% uncited rate for medicine seems large to me, particularly given what medical research costs”. I would argue that, while medical research can cost quite a bit in the case of randomized controlled trials (RCTs), that is not always the case. I have a paper that may never be cited (woe is me), but its monetary cost was minimal, as it was a retrospective chart review. The cost to me was substantial, though: I put time aside during graduate school training to complete it, in my “spare time”. I would be interested to know how that 12% breaks down further. My bet is that the majority of the “uncited” material is like my paper, retrospective chart reviews rather than RCTs. Just my thoughts. Thanks again for the article.
Surely the low citation rate is just another bad consequence of the pressure to publish or perish.
People will publish trivia if the alternative is to lose their job and their house.
The citation counts include only citations in other papers. They do not count citations in undergraduate or Master’s dissertations.
I remember hearing quite a few years ago (and certainly before the internet became ubiquitous) that the reason the British Library stopped making a microfiche of every successful PhD thesis upon receipt, and moved to a system whereby it made a microfiche to order should a reader request a copy, was that its records showed that 90% of theses were never requested (and presumably, therefore, never read outside their home institution, where the thesis would be in the institutional library).
Perhaps the number has passed into academic folklore!! 🙂
Would be of interest to see the same analysis excluding self-citations. Self-citation is extensive in some fields and introduces a significant bias.
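Operationally, excluding self-citations usually means dropping any citing paper that shares an author with the cited one. A hypothetical sketch of that rule (invented names and data, not any citation database’s actual method):

```python
# Hypothetical sketch of excluding self-citations: a citation counts
# only if the citing paper shares no author with the cited paper.

def non_self_citation_count(cited_authors, citing_papers):
    """Count citing papers with no author overlap with the cited paper."""
    cited = set(cited_authors)
    return sum(1 for authors in citing_papers if cited.isdisjoint(authors))

citing = [{"Smith", "Lee"}, {"Smith"}, {"Chen"}]
print(non_self_citation_count({"Smith", "Jones"}, citing))  # 1 (only Chen's paper)
```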
Here is the paper that did the research for the 90% stat:
The decline in the concentration of citations, 1900–2007: https://arxiv.org/ftp/arxiv/papers/0809/0809.5250.pdf
Also see “Most Cited versus Uncited Papers. What Do They Tell Us?” published in ACS Energy Letters https://pubs.acs.org/doi/10.1021/acsenergylett.8b01443
Refer to an earlier editorial, “Most Cited versus Uncited Papers. What Do They Tell Us?”, which provides some reasons for papers not receiving any citations: https://pubs.acs.org/doi/10.1021/acsenergylett.8b01443
A few possible reasons behind rarely cited or uncited papers are summarized here.
– Ahead of its time. The disclosure of the research theme is either premature or the research theme currently is not in the mainstream. Such papers, often referred to as Sleeping Beauties, typically have a late bloom and start gathering citations in the later years.
– Readability. The research is presented in such a way that the essence of the research findings is too difficult to grasp. Unless the authors make an effort to reach out to readers with a compelling argument, the readers are not likely to pay attention to the published work.
– Title and graphics. The first entry point for any paper is the title of the paper. If the title is too specialized or long and boring, readers are likely to skip the paper. Similarly, graphics should be aesthetically appealing with scientifically accurate presentation of the data.
– Journal selection. Given the large number of papers that are being published in any discipline, it is not unusual for a good paper to slip through. Hence, the selection of the right journal to present the latest results becomes an important criterion to gain the attention of researchers working in the same discipline.
- One of a kind, such that no one cares. Lastly, the research theme is so outdated or uninteresting that no one cares to follow up on or continue the work.
One issue with the citation metric used is that it does not cover many of the journals in which humanities and social sciences research is published and cited. You should try the Google Scholar version. That covers a much wider range of citation sources, including books, reports and theses. I subscribe to Google Scholar alerts, and they tell me whenever one of my articles or books is cited; this report comes through several times a week. Quite frequently the citations are in doctoral theses rather than articles, and quite often in books and chapters within books. So expanding the metric will give you a fairer idea of who has taken the trouble not only to read your stuff thoroughly but to think sufficiently of it to bother citing it!
nice