July 15th, 2015

The Management of Metrics: Globally agreed, unique identifiers for academic staff are a step in the right direction.


The Metric Tide report calls for research managers and administrators to champion the use of responsible metrics within their institutions. Simon Kerridge looks in greater detail at specific institutional actions. Signing up to initiatives such as the San Francisco Declaration on Research Assessment (DORA) is a good start. Furthermore, mandating unique and disambiguated identifiers for academic staff, such as ORCID iDs, will make the links between researchers, projects and outputs more robust.


This is part of a series of blog posts on the HEFCE-commissioned report investigating the role of metrics in research assessment. For the full report, supplementary materials, and further reading, visit our HEFCEmetrics section.

If ever two words struck fear into the hearts of academics, then the combination of management and metrics must surely sit high up in the league table of suspicion. The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management is the culmination of over a year’s work looking into just this topic (with a bit of “assessment” thrown in for good measure). To make matters worse, I myself am a “Research Manager and Administrator” (and proud of it), so if you are an academic you might by now feel that this blog post is not for you; but bear with me, all is not lost (no need to commit “metricide”* just yet); common sense has, I think, prevailed.

Whilst the initial remit from government was born of the idea that the Research Excellence Framework could be wholly or partly replaced by metrics, thus saving the sector and treasury a large part of the over £100m that REF2014 cost (this week Times Higher Education put the figure at £250 million), the steering group was free to weigh the evidence and come to its own judgements and conclusions. You will no doubt have read elsewhere about the sector’s involvement in the review, so I’ll cut to the chase.

Notwithstanding the ‘administrator’ in me wanting to be able to boil everything down to a nice simple number, the world is not like that. I and my fellow research management and administration (RMA) professionals understand, from working closely with researchers, that the best way to help and support them is not to pigeon-hole them but to work with them and understand their needs, so as to advise them within the context of the institutional, national and international policy environment. Indeed our professional body, the Association of Research Managers and Administrators (ARMA), has over 2,500 members and aims to “facilitate excellence in research by identifying and establishing best practice in research management and administration”. ARMA is the third largest such association in the world, having grown rapidly over the last ten years or so as more and more research organisations recognise the need to manage and protect research in the UK’s complex research policy environment; but I digress.

By now you will have read the report and understand that there is no suggestion that metrics can (currently, or perhaps ever) replace peer review, but that they might be used to help such review. As such, it is in our interest to (legitimately) maximise such metrics. By this I don’t mean, for example, asking cabals to cite our work; but I do mean making sure that genuine citations do not go uncounted for technical reasons.

The Metric Tide report’s overall tenor is one of “Responsible Metrics”; and the third recommendation that “Research managers and administrators should champion these principles and the use of responsible metrics within their institutions” is one that RMAs will embrace (and I will ensure that ARMA enables the sharing of best practice in this domain). To me there are a number of issues with metrics which can be summarised as:

  1. The accuracy and scope of the underlying data;
  2. The way in which those data are described and understood;
  3. The way in which they are then combined to produce the metrics; and
  4. The way in which the metrics are used (or abused).

Ideally, for 1), we would have globally agreed, unique and disambiguated identifiers for all objects in the digital landscape of research information. In some areas this is already a reality: most journals have ISSNs, and books likewise have ISBNs. For (electronic) journal articles (and other forms of digital object) there are DOIs.

However, for the most important element, academic staff, this is not the case. There is the emerging ORCID iD, which is gaining traction, and the UK now has a framework agreement, brokered by Jisc Collections, enabling academic institutions to access the premium API at a significant reduction (the highest banding is only $3,500 per annum).

Image credit: Jenny Cham via ORCID website
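To make point 1) concrete, an ORCID iD is more than a label: its final character is an ISO 7064 MOD 11-2 check digit, so systems can reject mistyped identifiers before they pollute the underlying data. The short Python sketch below is my own illustration of that check (not ORCID’s code), assuming the usual xxxx-xxxx-xxxx-xxxx formatting.

    def orcid_checksum_ok(orcid_id: str) -> bool:
        """Check the ISO 7064 MOD 11-2 check digit that ends every ORCID iD."""
        chars = orcid_id.replace("-", "").upper()
        if len(chars) != 16 or not chars[:15].isdigit():
            return False
        total = 0
        for ch in chars[:15]:
            total = (total + int(ch)) * 2
        remainder = total % 11
        expected = (12 - remainder) % 11
        check = "X" if expected == 10 else str(expected)
        return chars[15] == check

    # An iD used in ORCID's own examples passes; a one-digit typo does not.
    print(orcid_checksum_ok("0000-0002-1825-0097"))  # True
    print(orcid_checksum_ok("0000-0002-1825-0098"))  # False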

I sincerely hope that this will accelerate the uptake and usage of ORCID iDs in the UK academic sector. I suspect that if recommendation 10 is taken up and HEFCE mandates ORCID iDs for the next REF, then this will galvanise the sector! If RCUK and other research funders also follow the Wellcome Trust’s lead in mandating them for project proposals, then not only will uptake be quicker, but the links between researchers, projects (and outputs) will become more robust.
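Once an iD is in place, those researcher-to-output links can be read straight out of the ORCID registry. As a rough illustration, the sketch below pulls a researcher’s work titles from the public API; the endpoint and response field names reflect my reading of the v3.0 public API and should be checked against the current documentation.

    import requests  # assumes the 'requests' package is installed

    def orcid_work_titles(orcid_id: str) -> list[str]:
        """Fetch the titles of works attached to an ORCID record (public API)."""
        url = f"https://pub.orcid.org/v3.0/{orcid_id}/works"
        resp = requests.get(url, headers={"Accept": "application/json"}, timeout=30)
        resp.raise_for_status()
        titles = []
        for group in resp.json().get("group", []):
            for summary in group.get("work-summary", []):
                title = (summary.get("title") or {}).get("title", {}).get("value")
                if title:
                    titles.append(title)
        return titles

    # e.g. orcid_work_titles("0000-0002-1825-0097")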

Identifiers for other objects (for example organisations and equipment) are still further away. But the sooner we start solving these problems, the sooner we can have a robust set of underlying data.

2) In terms of describing the data, many institutional research management systems already use the euroCRIS CERIF standard; I hope this will become more pervasive. A more complex issue is the semantics of the data. A citation is a citation (although wouldn’t it be nice to know if it were a ‘negative citation’?), but a question like “how many senior lecturers do you have?” is more difficult to answer if the other party happens to call them “associate professors” – is that really the same as “senior lecturers”, and how can you be sure? Some sort of semantic translation is required, and work is going on in this field internationally under the auspices of CASRAI and the like.
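A crude way to picture that semantic problem is as a crosswalk from local job titles to an agreed common vocabulary. The mapping below is entirely hypothetical (the category names are mine, not CASRAI’s), but it shows why unmapped titles need to be surfaced rather than silently dropped.

    # Hypothetical crosswalk from local job titles to a shared vocabulary.
    LOCAL_TO_COMMON = {
        "senior lecturer":     "mid-career academic",
        "associate professor": "mid-career academic",
        "lecturer":            "early-career academic",
        "assistant professor": "early-career academic",
    }

    def to_common_category(local_title: str) -> str:
        # Return "unmapped" so gaps in the crosswalk are visible, not hidden.
        return LOCAL_TO_COMMON.get(local_title.strip().lower(), "unmapped")

    print([to_common_category(t) for t in ["Senior Lecturer", "Associate Professor", "Reader"]])
    # ['mid-career academic', 'mid-career academic', 'unmapped']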

3) Similarly, in terms of the semantics of combining data into compound metrics (for example, project proposal success rates), the Snowball Metrics recipe book provides tight definitions, but they are by no means universally adopted.
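The effect of loose definitions is easy to demonstrate. The toy calculation below (with made-up figures, and field names that are mine rather than the Snowball Metrics recipes’) gives two quite different “success rates” from the same proposals, depending on whether you count by number or by value and whether pending bids are excluded.

    # Illustrative proposal data; statuses and amounts are invented.
    proposals = [
        {"status": "awarded",  "requested": 250_000},
        {"status": "rejected", "requested": 400_000},
        {"status": "awarded",  "requested": 50_000},
        {"status": "pending",  "requested": 300_000},  # excluded below: not yet decided
    ]

    decided = [p for p in proposals if p["status"] in ("awarded", "rejected")]
    awarded = [p for p in decided if p["status"] == "awarded"]

    rate_by_number = len(awarded) / len(decided)
    rate_by_value = sum(p["requested"] for p in awarded) / sum(p["requested"] for p in decided)

    print(f"{rate_by_number:.0%} by number, {rate_by_value:.0%} by value")
    # 67% by number, 43% by value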

Finally, the fourth and possibly most difficult element is the responsible use of metrics. Unfortunately I have run out of space, but initiatives such as the San Francisco Declaration on Research Assessment (DORA), which decries the use of group-level metrics (such as Journal Impact Factors) to assess individual constituents (i.e. journal articles), and the more recent Leiden Manifesto for research metrics are a great start. However, the inappropriate uses of metrics are in some cases ingrained, and rectifying this will take a significant culture change. I sincerely hope that The Metric Tide goes some way to accelerating this process. RMAs will have a large part to play in delivering on this, but until we have eradicated such bad practices there will, I’m sure, be plenty to catch up on in the Responsible Metrics blog.

As an endnote, I would also advocate more “research into research management and administration” to provide a better evidence-base for the decisions around research management and administration more generally.

If the metrics data infrastructure is an area that interests you then there is more detail in section 2.5 of The Metric Tide report.

*Metricide is a term I first heard from Clair Donovan (Brunel University)

Note: This article gives the views of the authors, and not the position of the Impact of Social Science blog, nor of the London School of Economics. Please review our Comments Policy if you have any concerns on posting a comment below.

About the Author

Simon Kerridge, Director of Research Services, University of Kent; and Chair of the Board of Directors, Association of Research Managers and Administrators.
