
July 12th, 2012

How journals manipulate the importance of research and one way to fix it


Our methods of rewarding research foster an incentive for journal editors to ‘game’ the system, and one in five researchers report being pressured to include citations from the prospective journal before their work is published. Curt Rice outlines how we can put an end to coercive citations.

Over 20 per cent of researchers have been pressured by journal editors to modify their articles in ways that manipulate the reputation of the journal. Journals are ranked by the citation rates of the articles they publish. Editors can manipulate their journal’s ranking by asking authors to include more citations of other articles in that very journal.

An editor of Leukemia wrote to an author whose work was about to be accepted. “You cite Leukemia once in 42 references. Consequently, we kindly ask you to add [more] references of articles published in Leukemia to your present article.”

These data recently appeared in Science, where Allen W. Wilhite and Eric A. Fong dubbed the phenomenon “coercive citation” in their article Coercive Citation in Academic Publishing.

While 80 per cent of researchers say that coercive citation reduces the prestige of a journal in their eyes, 60 per cent nonetheless admit that they would add citations from such a journal to their reference list before submitting their article to it.

This practice can be stopped by changing how we calculate a journal’s impact factor. A journal’s impact factor is, roughly, the average number of citations received in a given year by the articles it published in the preceding two years; a high impact factor shows that a journal is important in its field.

When we determine impact factors, we should simply exclude citations appearing in the journal at hand. If the impact factor of Leukemia were computed without reference lists from articles in Leukemia itself, nothing could be gained from coercive citation.
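To make the arithmetic concrete, here is a minimal sketch of that calculation in Python (the data, names, and function are hypothetical illustrations, not from the article or any citation database): the standard two-year impact factor counts every citation to the journal’s recent articles, while the modified version skips citations that originate in the journal itself.

```python
from dataclasses import dataclass

@dataclass
class Citation:
    citing_journal: str   # journal in which the citing article appears
    cited_journal: str    # journal that published the cited article
    cited_pub_year: int   # year the cited article was published

def impact_factor(journal, citations, citable_items, year, exclude_self=False):
    """Two-year impact factor for `journal` in `year`.

    citations:     Citation records observed in `year`.
    citable_items: dict mapping publication year -> number of citable
                   articles the journal published that year.
    If exclude_self is True, citations coming from the journal itself
    are ignored, so coerced self-citations gain nothing.
    """
    window = (year - 1, year - 2)
    counted = sum(
        1
        for c in citations
        if c.cited_journal == journal
        and c.cited_pub_year in window
        and not (exclude_self and c.citing_journal == journal)
    )
    published = sum(citable_items.get(y, 0) for y in window)
    return counted / published if published else 0.0

# Hypothetical numbers: three of the five citations to the journal
# come from the journal itself.
cites = [
    Citation("Leukemia", "Leukemia", 2011),
    Citation("Leukemia", "Leukemia", 2010),
    Citation("Leukemia", "Leukemia", 2011),
    Citation("Blood", "Leukemia", 2010),
    Citation("Haematologica", "Leukemia", 2011),
]
items = {2010: 2, 2011: 3}
print(impact_factor("Leukemia", cites, items, 2012))                     # 1.0
print(impact_factor("Leukemia", cites, items, 2012, exclude_self=True))  # 0.4
```

In this toy example, dropping the journal’s own three self-citations cuts the impact factor from 1.0 to 0.4, which is precisely why the modified measure removes the payoff from coercive citation.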

Would this give a skewed picture of the relative importance of journals? It’s true that curiosity-driven research leads to specialization so narrow that only a few journals would be interested in any particular article. As a result, new findings in some sub-sub-sub-field — which is where researchers work — have very few potential outlets. But this is true for everyone and for almost all journals, so it shouldn’t lead to unreasonably skewed citation indices.

Another possible fix is advocated by John G. Lynch, also in Science. Lynch organized several editors of leading journals in his field to write a joint letter to 600 deans, identifying the practice of coercive citation and its potential damage to the field. These editors encouraged deans to evaluate the quality of their faculty members’ papers based on the articles themselves rather than the impact factor of the journals in which they appear.

And, indeed, Lynch is right that evaluation and funding cultures provide the context for coercive citation. When promotions are based on publication in journals with high impact factors, the journal editors have a motivation to get the best impact factor they can because that will let them attract the best articles from up-and-coming researchers. There’s an incentive to game the system.

When governments connect funding for universities to the number of publications in different tiers of journals — as the Norwegian government does — the lure of corruption is introduced.

Universities carry out basic research that takes many years. Elected officials operate on shorter cycles; politicians want to give money to research and then see results during their relatively short period in office. The legitimate priorities of universities and politicians are therefore at times in conflict.

Attempts to resolve the conflict — primarily about how long it takes to get results — give rise to systems based on metrics, on counting. And systems based on counting can be gamed.

The game we learned about from Wilhite and Fong — the game of coercive citation — can be fixed. Doing so will strengthen our confidence in the system.

That way, when we have good results, we can try to publish them in the best possible journal, confident that the quality of the journal reflects the quality of the research others have published there, and not just the vastness of their reference lists.

For more writing on gaming scholarly publication, check out the DrugMonkey blog at Scientopia, a post at The Scholarly Kitchen, and S. Scott Graham’s blog entry on citation coercion.

Note: This article gives the views of the author, and not the position of the Impact of Social Sciences blog, nor of the London School of Economics.

This post has been republished with the permission of the author. Curt Rice’s personal blog, Thoughts on University Leadership, is available here.

Print Friendly, PDF & Email

About the author

Blog Admin
