
August 22nd, 2017

Collaborative researcher behaviour has not (yet) increased in response to incentive regimes’ performance measures

Estimated reading time: 5 minutes


A somewhat cynical view of researcher motivations suggests that, when faced with new quantitative performance measures as part of their local incentive regimes, researchers will quickly modify their behaviours in an effort to optimise their own performance. Charlotte Wien, Bertil F. Dorch and Asger Væring Larsen set about testing this notion, looking more closely at how their own Danish system incentivises co-authorship of publications among its universities and how researchers have responded. Findings reveal that new performance measures have not, in fact, prompted any noticeable increase in researchers’ collaborative behaviours.

It has been argued by both researchers and policymakers that the introduction of various quantitative performance measures for research publications has increased the quantity of those publications but decreased their quality. The argument is as follows: whenever a new measure is introduced, researchers immediately modify their behaviours in order to optimise their own performance against this new measure. It can further be argued that researchers are becoming increasingly expert at doing this (i.e. researchers and institutions will adapt to and successfully game whatever systems are put in place to measure them).

If this is true, it follows that researchers' most important goal nowadays is to boost their own perceived performance by gaming bibliometric measures, not the pursuit of knowledge and truth. But since none of the researchers we know – ourselves included – are as cynical and calculating as this would suggest, we decided to challenge the assumption that researchers systematically seek to optimise their performance in accordance with their local incentive regime's quantitative performance measures.

Some incentive regimes reward the sharing of authorship between institutions. The reason for this is political: it is considered beneficial for research if researchers cooperate across institutional borders. However, it has been argued that, because of this incentive, the number of authors per journal article has increased at such a rapid pace that the integrity and credibility of research are lost; i.e. the changing pattern is purely due to gaming and represents an artificial effect that has nothing to do with changes in the underlying research processes. We have therefore chosen the trend in the number of authors per journal article in Denmark as our case study here.

Denmark was chosen partially out of convenience – we are all Danish – but also because the Danish Bibliometric Research Indicator (the BFI, a performance-based model for the distribution of a special pool of baseline funding for Danish universities and public research institutions) rewards research outputs published in the most prestigious scientific journals. As it happens, the Danish BFI system is roughly identical to its Norwegian equivalent.

Image credit: Štefan Štefančík, via Unsplash. This work is licensed under a CC0 1.0 license.

The BFI offers researchers an in-built incentive to cooperate across institutions. If authorship of a published article is shared among two or more institutions, that particular publication is awarded a 25 per cent credit bonus, and its total number of credits is shared among all affiliated institutions. If the number of affiliated institutions is high, the number of credits apportioned to each institution is likely to be very low; for this reason a minimum share of credits, set at 10 per cent, is apportioned to each institution irrespective of how many institutions have contributed. The Danish BFI was implemented in 2009, with budgetary effects from 2010.
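
To make this arithmetic concrete, here is a minimal sketch of a BFI-style credit split. Only the 25 per cent cooperation bonus and the 10 per cent minimum share come from the rules described above; the proportional division by author counts, the function name, and the point values are our own illustrative assumptions, and the real BFI's mechanics differ in detail.

```python
def bfi_credits(base_points, authors_per_institution):
    """Illustrative sketch of a BFI-style credit split.

    Assumption: credits are divided in proportion to each institution's
    author count. The 25% cooperation bonus and the 10% minimum share
    follow the rules described above; everything else is hypothetical.
    """
    total_authors = sum(authors_per_institution.values())

    # A single-institution publication earns no bonus and keeps all credits.
    if len(authors_per_institution) == 1:
        (institution,) = authors_per_institution
        return {institution: base_points}

    # Cross-institutional publications are awarded a 25 per cent bonus...
    points = base_points * 1.25

    # ...which is shared among the institutions, each guaranteed at least
    # a 10 per cent share. (In this naive sketch the floored shares can sum
    # to slightly more than the bonus total; a real system would normalise.)
    return {
        institution: points * max(n_authors / total_authors, 0.10)
        for institution, n_authors in authors_per_institution.items()
    }


# A hypothetical 3-point article with two SDU and two external authors:
print(bfi_credits(3.0, {"SDU": 2, "Elsewhere": 2}))
# {'SDU': 1.875, 'Elsewhere': 1.875} -- each receives half of 3.0 * 1.25
```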

So the question is: what kind of behaviour, in terms of the composition of authorships, should we expect from researchers if they are indeed as cynical and calculating in optimising their own credits in the system as the sceptical view suggests? Through careful modelling of the various scenarios we found that if researchers were, in fact, optimising in accordance with this measure, then the optimal authorship panel composition – seen from the perspective of the researchers – would be one or two internal authors sharing authorship with one or two external authors. Remember, due to the way the BFI system divides a publication's credits among its authors and affiliated institutions, too many external authors with only one internal author, for example, can mean the number of credits for your university becomes unfavourably small, as the short comparison below illustrates.
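
Using the same assumed proportional split (and treating all external authors as belonging to a single other institution), a quick comparison shows this dilution at work; the 3-point value is, again, hypothetical:

```python
# Home-institution share under the assumed proportional split, with the
# 25% bonus and 10% floor described above (illustrative numbers only).
def home_share(points, internal, external):
    return points * 1.25 * max(internal / (internal + external), 0.10)

print(round(home_share(3.0, 2, 2), 2))  # 1.88 -- two internal, two external
print(round(home_share(3.0, 1, 6), 2))  # 0.54 -- one internal, six external
```

On this simple model, piling on external co-authors mainly shifts credits away from the home institution, which is why the modelled optimum involves only one or two of them.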

We chose to analyse this a bit further, taking our home institution, the University of Southern Denmark (SDU), as our case. It comprises four main fields: humanities; science and engineering; social sciences; and health sciences. We recorded the publication type, publication year, and number of authors, and calculated various statistical characteristics of the publication data.

We find that the number of authors per journal article is, in fact, increasing, with an average of 6.5, and that this figure is the result of a steady increase from 2010 to 2015 in the number of publications for which one or more authors were affiliated with another institution. However, this average of 6.5 authors is clearly higher than the one or two internal authors combined with one or two external authors we identified as the optimal combination for researchers looking to game the regime. In other words, this result is a first indication that researchers are not systematically gaming the BFI system by optimising their research output, at least not when it comes to authorship.

However, since an average may conceal many outliers and differences, and as we know from previous studies that there are huge differences in publication patterns between research fields, we decided to dig a bit deeper into our data and analyse the composition of author lists across the main fields. In short, we found that the mean number of authors per publication is highest in the health sciences, in which a (to us) surprisingly large share of articles have only one internal author and a relatively modest share have no external authors. In the humanities, single authorship seems to be the rule rather than the exception, with very few articles having more than one author. We also find it hard to argue that science and engineering faculty co-author papers in order to optimise BFI performance. Only within the social sciences does the data indicate that the average article has two internal and one or two external authors. This may constitute an optimisation in accordance with the BFI; however, when we analysed the development over time, this pattern turned out to be more or less stable over the entire period of investigation.

To sum up, we have found some evidence that, despite the more or less perverse incentive regimes that have been created in order to measure and quantify research output, researchers at SDU and their co-authors have not significantly changed their behaviour. It is the inner motivations of researchers, and not performance measures, that continue to guide the way we work.

This blog post is based on the authors’ article, “Contradicting incentives for research collaboration”, published in Scientometrics (DOI: 10.1007/s11192-017-2412-0).

Note: This article gives the views of the authors, and not the position of the LSE Impact Blog, nor of the London School of Economics. Please review our comments policy if you have any concerns on posting a comment below.

About the authors

Charlotte Wien is head of the Research Department at the University Library of Southern Denmark. ORCID: 0000-0002-3257-2084.

Bertil F. Dorch is head of the University Library of Southern Denmark. ORCID: 0000-0003-2594-6778.

Asger Væring Larsen is Senior Consultant at the University Library of Southern Denmark. ORCID: 0000-0002-9967-2637.

Posted In: Academic writing | Higher education | Measuring Research | Research evaluation
