
Blog Admin

July 15th, 2015

Using REF results to make simple comparisons is not necessarily responsible. Careful interpretation needed.

2 comments

Estimated reading time: 10 minutes


What are the implications of the HEFCE metrics review for the next REF? It is easy to forget that the REF is already all about metrics of research performance. Steven Hill, Head of Research Policy at HEFCE, reiterates that, as with any use of metrics, we need to take great care in how we use and interpret the results of the REF.

This is part of a series of blog posts on the HEFCE-commissioned report investigating the role of metrics in research assessment. For the full report, supplementary materials, and further reading, visit our HEFCEmetrics section.

The metrics review has reported and made very clear recommendations regarding the next Research Excellence Framework (REF). The review urges a cautious approach, suggests some ways in which further quantitative data can be used, and supports a continued place for peer review at the heart of the exercise.

These recommendations are important as we in HEFCE, and the other HE Funding Bodies, consider the future of the exercise. Notwithstanding the cost of the exercise, the evidence does not support the notion that using metrics is a ‘silver bullet’.

We are currently discussing the future REF informally across the sector, in advance of publishing a formal consultation in the autumn. The review is an important part of the evidence picture we have assembled, and now we need everyone to contribute to the debate, and offer their ideas for the future. And it is clear that some of the recommendations of the review concerning future REF will need further work, especially the suggestion that we should consider some standardisation of the way quantitative data are used in impact case studies.

Figure 6: Use of quantitative data in REF2014. Source: Wilsdon, J., et al. (2015). The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management. DOI: 10.13140/RG.2.1.4929.1363

The review has other implications for the REF, though. While we think about the future, it is easy to forget that the REF is already all about metrics of research performance. While there is only limited use of quantitative data as an input to the exercise, the outputs of the exercise, the quality profiles, are themselves metrics of research performance. The exercise could be characterised as the use of expert judgement to develop a quantitative assessment of performance.

Like any metrics, and as recommended by the review, we need to take great care in how we use and interpret the results of the REF. The Funding Bodies only publish the results as profiles, which capture the full nuance of the assessment. There are many ways to ‘collapse’ the profiles into a single number – the Grade Point Average, calculations of research power, and even approaches that take into account the proportion of eligible staff that were submitted. But all of these attempts to create a single-number description of performance inevitably simplify. The same number can represent vastly different profiles, and so different performance.
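As a sketch of why a single number simplifies, consider the Grade Point Average: each star level in a quality profile is weighted by the percentage of activity judged at that level. The two profiles below are invented for illustration, not real REF results, yet they produce an identical GPA despite representing quite different performance.

```python
# Hypothetical illustration: collapsing a REF-style quality profile
# (percentage of activity at each star level, 4* down to unclassified)
# into a Grade Point Average.

def gpa(profile):
    """GPA = sum over star levels of (star rating x percentage) / 100."""
    return sum(stars * pct for stars, pct in profile.items()) / 100

# Two invented profiles: unit A has a broad mid-range strength,
# unit B mixes more world-leading (4*) work with weaker output.
unit_a = {4: 30, 3: 50, 2: 20, 1: 0, 0: 0}
unit_b = {4: 50, 3: 20, 2: 20, 1: 10, 0: 0}

print(gpa(unit_a))  # 3.1
print(gpa(unit_b))  # 3.1 -- the same GPA, from a very different profile
```

A league table built on GPA alone would rank these two units as equals, which is exactly the loss of nuance the profiles were designed to preserve.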

The three elements of the assessment – outputs, impact and environment – also need to be considered separately to get an even more nuanced view of performance. Overall profiles can be made up in different ways from the three elements, and the balance between different elements reveals the particular strengths of institutions and units within them.

All of this means that using the REF results to make simple ‘X is better than Y’ comparisons is not necessarily responsible (to use the language of the metrics review). And if you want to use the results to separate departments or institutions into groups based on performance, great care is needed. It is essential to first determine the purpose of the analysis, and then consider how best to use the data from the profiles to address that purpose.

As with other metrics, the power of numbers to inform and mislead is great in equal measure.

Note: This article gives the views of the authors, and not the position of the Impact of Social Science blog, nor of the London School of Economics. Please review our Comments Policy if you have any concerns on posting a comment below.

About the Author

Steven Hill is Head of Research Policy at HEFCE.

