In little more than a year, a number of peer reviewer training programmes have launched, promising to help early-career researchers learn how to do peer review, review more efficiently, and connect with editors at top journals. This follows an expressed need from graduate students and postdocs for precisely this sort of training. But can these new programmes deliver? And with many providers hinting at a move towards a subscription-based model, are they worth paying for, whether by individuals or by institutions? Shaun Khoo examines the evidence base and finds little to suggest that peer reviewer training programmes actually improve the quality of article reviews.
Peer reviewer training for graduate students and postdocs is pretty trendy right now. As the number of submissions to academic journals grows, publishers are interested in expanding their reviewer pools. Over the last year we have seen the launch of the Publons Academy, ACS Reviewer Lab, Nature Masterclasses’ Focus on Peer Review, and JNeurosci’s Reviewer Mentoring Program. These training programmes promise to help researchers learn how to do peer review, review more efficiently, and connect with editors at top journals. They also fill a gap in researcher training, as over 90% of early-career researchers express interest in peer review training but few receive any formal training during their PhDs. But can these new training programmes deliver? And if training providers were to make their programmes subscription-based, would it be worth the investment?
What are the training programmes like?
Each training course has its own distinct and useful features. In general, programmes like the ACS Reviewer Lab, Publons Academy, and Nature Masterclass feature text and short video segments on how to do peer review, what to focus on, what not to focus on, and ethical dilemmas. Some also include online formative assessments, such as multiple-choice questions at the end of each unit.
In the Publons Academy, videos cover the basics of peer review, and trainees write three reviews of published papers as they progress. The 8-12 hour time commitment culminates in the reviews being submitted to a supervisor (either a current supervisor or a volunteer assigned by Publons). Supervisors then give feedback to trainees, who may have to refine their reviews, and provide an endorsement that is visible on their Publons profile.
In contrast, Nature Masterclasses covers a fairly broad array of topics that are not necessarily covered in other courses, such as “Peer reviewing a review paper”. The course is fairly short, requiring only 2-3 hours to complete, but manages to pack in a range of Nature editors who provide their views on each unit in well-made and succinct videos. Trainees also receive an automatically generated PDF certificate, but I’m not sure what value these certificates have.
Society reviewer training programmes provide a much clearer path for interested researchers to join the reviewer pool and could even provide valuable networking opportunities. For example, the ACS Reviewer Lab can be linked to an ORCID that will allow ACS editors to identify reviewers who have been through the programme, view their CV, and then invite them to review. The JNeurosci programme involves trainees (who must be Society for Neuroscience members) working with a senior mentor to write and refine a review on a bioRxiv preprint. After completion, they are added to a database of trained reviewers who may be invited to review.
Image credit: kreatikar, via Pixabay (licensed under a CC0 1.0 license).
Training worth paying for?
I completed the Publons Academy and Nature Masterclass and found them helpful. They reinforced what I learnt from doing reviews with my supervisor and gave me a chance to get some extra perspectives. But I was intrigued to see that Nature Masterclasses uses its peer review course to advertise its subscription training courses, and that the Publons Academy's evaluation asks whether you would pay for the course or recommend that your institution do so.
Paying might be worth it if training improves peer review, but unfortunately there is not much of an evidence base for peer reviewer training. Conducting a workshop in person has no effect on editor ratings of review quality. Providing a second workshop does not improve the situation, even though participants rate the training as helpful. Training provides a temporary and, at best, minor improvement in error detection. Receiving feedback on reviews from an editor has no effect on subsequent peer review quality, nor does pairing new peer reviewers with a senior mentor, as the Publons Academy and JNeurosci programmes do. Formal training in peer review, like many other reviewer characteristics, does not correlate with review quality. The evidence is so thin that a 2007 Cochrane review found nothing in support of peer reviewer training, and even a 2014 call for such training acknowledged that studies of it have “shown little impact”.
Surveys of reviewers have found that the vast majority of academics want formal training, but perhaps the findings of a small Australian study – that 75% of academics would not support formal training – are more consistent with the evidence on peer review training efficacy. Australian academics reflecting on their experience of learning to peer review acknowledged that it was difficult to start with, but that formal training was unnecessary and impractical. One interviewee said: “reviewers are sensible, intelligent people and they can interpret that piece of paper without further training. It’s just a waste of time”.
A waste of time?
Peer reviewer training is not going to save peer review or populate publisher databases with high-quality reviewers. Training probably adds little to the benefits of involving early-career researchers, who already possess one of the few characteristics consistently associated with review quality – youth. To this end, training can help by assuring editors that a particular early-career researcher has some basic familiarity with the process. There’s no evidence training will improve review quality, but it may help new reviewers feel more confident or give them a chance to network within their discipline. I wouldn’t recommend spending money on peer reviewer training, but investing a few spare hours would not be a terrible waste.
This article gives the views of the author, and not the position of the LSE Impact Blog, nor of the London School of Economics. Please review our comments policy if you have any concerns on posting a comment below.
About the author
Shaun Khoo is a Horizon Postdoctoral Fellow at Concordia University. His research focuses on the neural mechanisms in addiction and appetitive motivation for alcohol and sugar. During his PhD, he worked on the orexin system and its role in motivation for nicotine and palatable food. He also has an interest in laboratory animal welfare and ethics.
Comments

This might be a case of ‘sending the wrong message’: “There’s no evidence training will improve review quality…”

Is this because no evidence has been collected? And whose perspective has been taken into consideration?

Clearly, an outcome is likely to be different under different scenarios: (a) one knows what he/she is doing; (b) one does not and works on assumptions.

I am also tempted to comment on the quote provided: “reviewers are sensible, intelligent people and they can interpret that piece of paper without further training. It’s just a waste of time”.

Given the context, this conversation should not be about PEOPLE and their intelligence, but about reviewers and the intelligence specific to reviewing. Reviewers are not intelligent in this sense by default; this specific intelligence comes with training and experience. Who absorbs the monetary cost of this is a different matter.
Training for editorial peer reviewers is essential. Let me cite the example of Pakistan, where the editors of research journals in the social sciences are in deep trouble because they can find few peer reviewers able to share their comments on a PDF version of a manuscript. Most peer reviewers are not computer literate and do not know how to add comments to the PDF version of a manuscript sent by email. As a result, editors send manuscripts through the traditional mail service.
Jawed Ahmed Khursheed
PhD, Karachi, Pakistan