
Josh Brown

Wolfgang Kaltenbrunner

Michaela Strinzel

Sarah de Rijcke

Michael Hill

April 5th, 2022

Imperfect, boring, headed for change? 10 Ways to improve academic CV assessments



Academic CVs play a major role in research assessment and in shaping academic fields by sorting and selecting promising researchers. Their role in structuring and prioritising information is therefore significant, and has recently been criticised for facilitating judgements based predominantly on narrow quantitative measures. In this blogpost, Josh Brown, Wolfgang Kaltenbrunner, Michaela Strinzel, Sarah de Rijcke and Michael Hill assess the changing landscape of research CVs and offer ten recommendations for how they can be used more effectively in research assessment.


It is fair to say that almost nobody actually likes CVs. Whether you are writing one, updating one, or reading a pile of them, they are a time-consuming, boring, and in many ways imperfect way to assess either a person or their career.

Yet in the world of academic research, these usually static documents are enjoying an unusually dynamic moment. Looking back a few years, the trend seemed to go in a very different direction, namely towards more standardized, but otherwise traditionally structured CVs. Models like the Canadian Common CV, or the Spanish CV Normalizado, took standard elements of a CV and were designed to output them in the appropriate format for their end use (such as a specific funder requirement). The idea was to save researchers time and effort, and increase the re-use of data.

Yet in recent years, in line with growing awareness of the downsides of overreliance on indicators and quantifiable information such as publication counts in research evaluation, a novel development is to supplement CV formats with narrative elements. For instance, UKRI have announced their Résumé for Research and Innovation (R4RI), which builds on the concepts of predecessors such as the Royal Society’s Résumé for Researchers and pushes the CV in a more narrative, contextualised direction. Major funders in the Netherlands, Ireland, Switzerland, Luxembourg, and the USA are similarly experimenting with, or have already adopted, variants of narrative CVs. The focus of these novel formats is on giving individuals space to describe the impact they have had, rather than simply listing jobs, publications and acquired grants. An important aim is also to discourage reviewers from reducing complex evaluative decisions to simplistic quantitative comparisons, thereby diversifying the criteria by which research projects and careers are assessed.

As was perhaps to be expected with such an ambitious intervention, early experiences with narrative CVs have already highlighted challenges, for both applicants and evaluators.

For applicants, it may not be straightforward to work out exactly how to tell the contextualising stories the narrative format affords, or how to select biographical information that best supplements more traditional CV data. For evaluators, the structure can be challenging too. A recent report from the University of Glasgow’s Lab for Academic Culture, in partnership with the UK Reproducibility Network (UKRN), found that “reviewers often struggled to locate the relevant information” and that narrative CVs could create a significant additional workload for reviewers and researchers alike.

While these early experiences indicate the need both for refinements to narrative CV formats and for training for those creating or evaluating them, we, a group of funders, evaluation experts, and research information providers gathered as the ‘CV Harmonisation Group’ (originally part of the ORCID Reducing Burden and Improving Transparency (ORBIT) project), are excited about this first wave of narrative CV projects. In many ways they resonate with recommendations for the evaluation of academic CVs that we have recently published as an article. We recommend that evaluators should:

  1. Provide clear instructions for researchers and evaluators – For each section of a CV, make sure the person filling it in and the one evaluating it understand exactly what it should contain and how it should be evaluated by providing both with the same guidelines
  2. Prioritise actual achievements over the recognition endowed upon them – Confounding achievements with rewards or lending too much weight to the latter obscures evaluation and propagates the Matthew effect
  3. Focus on more recent achievements over historical information – This prevents established researchers from being perpetually rewarded for the same achievements and gives up-and-coming researchers a fair starting point
  4. Focus on activities and outputs that are relevant – Focusing on only a few outputs saves researcher and evaluator resources, discourages salami slicing of results, improves comparison between early- and late-career researchers, and makes publication gaps resulting from career breaks less conspicuous
  5. Acknowledge and encourage a broad range of contributions – Recognise and reward a wide range of outputs to foster a more diverse and inclusive research culture
  6. Balance and control incentives – Many laudable policies and incentives bear unintended hidden risks if they are not carefully balanced and implemented. For example, only allowing open access publications on CVs submitted to funding organisations may have opportunity costs if equally valuable aspects such as advancing teaching, diversity and equality, collaboration, sustainability, reproducibility or public engagement are not explicitly required to obtain funding
  7. Use the academic age, not the biological age, of researchers – Academic age can additionally provide some protection from discrimination against unconventional career paths or researchers with childcare or other responsibilities. Measuring academic age in a precise number of years can be overly specific; using age ranges instead may be more useful
  8. Encourage narratives instead of lists – Incorporate a narrative section into a CV. Listed items can provide a quick and well-structured overview but they also entice evaluators to count instead of read, which in turn can favour snap judgements over in-depth evaluation. In free-form narratives the researcher can contextualise their achievements and explain what they are working towards
  9. Use metrics cautiously – Implement metrics carefully to avoid misuse and make sure evaluators are well aware of their respective definitions and limitations
  10. Use established open and interoperable data standards and systems – CV data is valuable as it allows for the analysis of an institution’s policy and performance as well as researcher career statistics. Using data from reliable, well-defined, and established sources without requiring extensive rekeying of basic information reduces errors in the data provided and offers a means of automated validation

We think that the openness and energetic discussion around recent narrative CV formats are very healthy. Rather than a ready-made solution, we see them as informed by an awareness of the limitations of established evaluation conventions, and as a learning opportunity to further improve our thinking about and practices of research evaluation.

There is now, more than at any time to date, an infrastructure to support a ‘living document’ model for CVs. This moment represents a valuable chance to revisit both what we expect from a CV and what form it should take.

 


The content generated on this blog is for information purposes only. This Article gives the views and opinions of the authors and does not reflect the views and opinions of the Impact of Social Science blog (the blog), nor of the London School of Economics and Political Science. Please review our comments policy if you have any concerns on posting a comment below.

Image Credit: Adapted from Mohamed_Hassan via Pixabay.



About the author

Josh Brown

Josh Brown is a consultant, co-founder of, and research and strategy lead for, MoreBrains Cooperative. With a background in librarianship and information science, his interests include research information management and infrastructures, evaluation, and information ethics. He has a particular focus on equity and sustainability in open research and how organisational models and policy can help, or hinder, those goals.

Wolfgang Kaltenbrunner

Wolfgang Kaltenbrunner is a Senior Researcher at the Centre for Science and Technology Studies (CWTS) at Leiden University. His research interests include the politics of quantification in academic evaluation processes, academic curricula vitae as a specific scholarly communication technology, and practices of publishing, reviewing, and editing scholarly literature. Wolfgang is also an investigator at the Research on Research Institute (RoRI).

Michaela Strinzel

Michaela Strinzel is a Scientific Officer in the Strategy Division of the Swiss National Science Foundation (SNSF). She co-led the CV harmonisation group and has been involved in the development of SciCV, the SNSF’s new CV format. Her interests revolve around innovation and best practice in research funding and evaluation.

Sarah de Rijcke

Sarah de Rijcke is Professor in Science, Technology, and Innovation Studies & Scientific Director at the Centre for Science and Technology Studies (CWTS) in Leiden. She is also co-chair of the Research on Research Institute (RoRI). Sarah specialises in social studies of research evaluation, and the relations between quality control mechanisms and knowledge production in different fields.

Michael Hill

Michael Hill is deputy head of strategy and as of April 2022 head of Grant Management at the Swiss National Science Foundation and a member of the Steering Committee of DORA and Europe PMC. He has wide-ranging expertise in research evaluation, peer review and the development of narrative CVs. Among many other projects, he was in charge of developing the Swiss National Science Foundation’s SciCV as well as the evaluation procedures of the Swiss Science Prize Marcel Benoist.

Posted In: Research evaluation
