Our methods of rewarding research foster an incentive for journal editors to ‘game’ the system, and one in five researchers report being pressured to include citations from the prospective journal before their work is published. Curt Rice outlines how we can put an end to coercive citations.
Over 20 per cent of researchers have been pressured by journal editors to modify their articles in ways that manipulate the reputation of the journal. Journals are ranked by the citation rates of the articles they publish. Editors can manipulate their journal’s ranking by asking authors to include more citations of other articles in that very journal.
An editor of Leukemia wrote to an author whose work was about to be accepted. “You cite Leukemia once in 42 references. Consequently, we kindly ask you to add [more] references of articles published in Leukemia to your present article.”
These data recently appeared in Science, where Allan W. Wilhite and Erica A. Fong dubbed the phenomenon ‘coercive citation’ in their article Coercive Citation in Academic Publishing.
While 80 per cent of researchers say that coercive citation reduces the prestige of a journal in their eyes, 60 per cent nonetheless admit that they would add citations from such a journal to their reference list before submitting their article to it.
This practice can be stopped by changing how we calculate a journal’s impact factor. An impact factor is the average citation rate of a journal’s recent articles — roughly, the number of citations in a given year to articles the journal published in the preceding two years, divided by the number of articles it published in those years. A high impact factor signals that a journal is important in its field.
When we determine impact factors, we should simply exclude citations appearing in the journal at hand. If the impact factor of Leukemia were computed without reference lists from articles in Leukemia itself, nothing could be gained from coercive citation.
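The proposed fix can be sketched in a few lines of code. This is a minimal illustration, not the official Journal Citation Reports methodology: the function name, the list-of-pairs citation data, and the `items_published` counts are all hypothetical stand-ins for a real citation database.

```python
from collections import Counter


def impact_factor(citations, items_published, journal, exclude_self=True):
    """Approximate a journal's impact factor from raw citation records.

    citations       -- list of (citing_journal, cited_journal) pairs for the
                       counting window (illustrative data shape, not a real API)
    items_published -- dict mapping journal name to number of citable items
                       it published in that window
    journal         -- the journal whose impact factor we want

    When exclude_self is True, citations where the citing journal is the
    cited journal are skipped, so coerced self-citations add nothing.
    """
    counts = Counter()
    for citing, cited in citations:
        if exclude_self and citing == cited:
            continue  # drop journal self-citations from the tally
        counts[cited] += 1
    return counts[journal] / items_published[journal]
```

With self-citations excluded, an editor who pressures authors into citing the editor’s own journal gains nothing, since only citations arriving from other journals move the number.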
Would this give a skewed picture of the relative importance of journals? It’s true that curiosity-driven research leads to specialization so narrow that only a few journals would be interested in any particular article. As a result, new findings in some sub-sub-sub-field — which is where researchers work — have very few potential outlets. But this is true for everyone and almost all journals, so it shouldn’t lead to unreasonably skewed citation indices.
Another possible fix is advocated by John G. Lynch, also in Science. Lynch organized several editors of leading journals in his field to write a joint letter to 600 deans, identifying the practice of coercive citation and its potential damage to the field. These editors encouraged deans to evaluate the quality of their faculty members’ papers based on the articles themselves rather than the impact factor of the journals in which they appear.
And, indeed, Lynch is right that evaluation and funding cultures provide the context for coercive citation. When promotions are based on publication in journals with high impact factors, the journal editors have a motivation to get the best impact factor they can because that will let them attract the best articles from up-and-coming researchers. There’s an incentive to game the system.
When governments connect funding for universities to the number of publications in different tiers of journals — as the Norwegian government does — the lure of corruption is introduced.
Universities carry out basic research that takes many years. Elected officials operate on shorter cycles; politicians want to give money to research and then see results during their relatively short period in office. The legitimate priorities of universities and politicians are therefore at times in conflict.
Attempts to resolve the conflict — primarily about how long it takes to get results — give rise to systems based on metrics, on counting. And systems based on counting can be gamed.
The game we learned about from Wilhite and Fong — the game of coercive citation — can be fixed. Doing so will strengthen our confidence in the system.
That way, when we have good results, we can try to publish them in the best possible journal, confident that the quality of the journal reflects the quality of the research others have published there, and not just the vastness of their reference lists.
For more writing on gaming scholarly publication, check out the DrugMonkey blog at scientopia, a post at the scholarly kitchen, and S. Scott Graham’s blog entry on citation coercion.
Note: This article gives the views of the author, and not the position of the Impact of Social Sciences blog, nor of the London School of Economics.
This blog has been republished with the permission of the author. Curt Rice’s personal blog, Thoughts on University Leadership is available here.
Thank you so much for writing this important article. 2011/2012 marks an important year for research and publishing ethics. The world press highlighted numerous data fraud scandals. As was noted in this article, Science published papers and commentaries on the use of coercive citations among journals, and journals faced criticism for engaging in tactics more focused on inflating impact factors than on advancing science per se. At the same time, this period showcased major public dialog on the topic of research ethics among major professional associations, as well as special issues seeking to define the normative ethical practices of authors, reviewers, and editors.
In response, a group of Editors from the fields of Industrial/Organizational Psychology and Management assembled to draft a voluntary Code of Conduct defining some general behaviors we agree are important to maintaining the ethics and integrity of scientific inquiry. We have since circulated this Code to other editors to gain their support, and our list of signatories has been growing. We will continue to circulate the Code in the hope that support for it will keep growing.
Our goal is that these efforts will have a positive impact on the way journals conduct themselves, and on the quality and integrity of organizational research in general. We encourage our fellow Editors and Associate Editors to publicly affirm this code, and we invite anyone to post comments on the code and related issues. We will continue to revise the Code as our open dialog evolves.
For more information, please visit editorethics.uncc.edu.