Head of Research Policy at the Higher Education Funding Council for England, Steven Hill, presents an overview of the work HEFCE is currently commissioning to build a robust evidence base for research assessment. He argues that attention to the costs, benefits, problems and solutions of the REF is an obvious starting point, but that it is also important for the higher education community to consider and inform wider issues beyond the current exercise.
Soon the wait will be over: the results of the Research Excellence Framework (REF) will be released to the world. In the run-up to the big announcement it isn't surprising that there is plenty of commentary about the REF, its perceived strengths and weaknesses, its costs and benefits, the funding outcomes, and of course the future of and planning for the next exercise. An example of the latter is a letter from the Presidents of the British Academy and the Royal Society in this week's THE calling for a thorough evaluation of the costs of the REF to inform future planning.
In my team, we have been thinking about the future of the exercise for close to two years already. We believe an essential starting point is gathering robust evidence about the process we are just completing, as well as up-to-date evidence on wider questions of research assessment. So, working with the other UK funding bodies, we have an extensive and detailed programme of work underway, or due to be commissioned, that will provide a solid foundation for debating the future during 2015.
Image credit: Ivy Dawned (Flickr, CC BY-NC-SA)
While all of these activities are in the public domain, the aim of this post is to present them together, to give a sense of the overall programme.
- One focus of the evaluation is on the impact element of the REF. This element is new, not just for the UK but globally, so it is essential that we subject what we have done to suitable scrutiny. During 2013 and 2014 we commissioned RAND Europe to investigate the costs, benefits and implications for institutions preparing submissions to the exercise. RAND is now carrying out a second phase of the evaluation, working with REF panel members to evaluate the assessment process for impact. Together, these studies will provide an in-depth review of the impact element from all angles.
- At the same time, we have commissioned Digital Science and King's College London to carry out a synthetic analysis of the case studies submitted to the REF, which will provide not only evidence on the impact that research in HEIs makes, but also valuable information on the types of impact that feature in the REF and how impact varies by discipline.
- To further our understanding of the implications of the REF within institutions, at the beginning of this year we also invited feedback from HEIs on the process. More than 100 HEIs (out of the 154 that submitted) responded, providing a wealth of information on costs, benefits, problems and solutions.
- We are also seeking feedback from REF panel members, and are running a series of focus groups with them. These are already providing valuable insight.
- While the impact studies described above are examining the costs associated with that element, we have also commissioned Technopolis to carry out a further study estimating the costs and benefits of the rest of the exercise. A particular focus will be the cost of the provisions made for special staff circumstances. These provisions have been welcomed by institutions and researchers alike, but we know they have had associated cost implications.
- We are about to commission work examining the position of multi- and interdisciplinary research in the REF, as part of wider work, some of it planned jointly with RCUK. We are also looking at the extent to which outputs submitted to the REF reflect collaborations between institutions.
Taken together, these studies will provide a rounded and complete evaluation of the REF. But it is also important that we consider wider issues beyond the current exercise. A central part of this is the independent review of the use of metrics in research assessment being led by Professor James Wilsdon. While this study is broader than the use of metrics in national research assessments alone, it will provide important evidence to inform thinking about future REF exercises.
Once this solid foundation of evidence is in place during 2015, we can begin the debate about the future, with researchers and HEIs, of course, at its heart.
The title of the original post is Time for REFlection. © HEFCE copyright material is reproduced with the permission of the Higher Education Funding Council for England (HEFCE) and may be accessed in its original form here. Full terms and conditions of the above material can be found here.
Note: This article gives the views of the author, and not the position of the Impact of Social Sciences blog, nor of the London School of Economics. Please review our Comments Policy if you have any concerns about posting a comment below.
Steven Hill is Head of Research Policy at HEFCE.