The recent “sting” article in Science exposed certain predatory journals for publishing clearly erroneous scientific results in exchange for money. Ernesto Priego emphasises the bias in the conclusions drawn from this article and its illegitimate attack on open access publication. But regardless of the business model of the publication, peer review, especially in the humanities, is certainly in need of greater attention and improvements.
By now most people following open access publishing debates will have heard of (or even read) the recent article in Science Magazine by John Bohannon, titled “Who’s Afraid of Peer Review?”. In his response to the article, Open Library of Humanities founder Martin Eve summarises it like this:
“The article details John Bohannon’s submission of over 300 bogus papers to open access journals listed in the DOAJ and on Beall’s list of “predatory open access publishers”. Problematically, 157 of those journals accepted the obviously erroneous manuscript that featured ethical approval problems and clear scientific anomalies. While he never explicitly frames his discourse in terms of comparison to the subscription model, Bohannon’s essential hypothesis seems to be that open access driven by publication charges will be inherently biased towards acceptance.”
Please do read Martin’s full response, as it contains all the key points about this situation that we strongly believe have to be stressed. At SV-POW! there’s a list of other reactions to Bohannon’s article. The Open Access Scholarly Publishers Association (OASPA) has also published a response, which can be read here. We stand by their statement. [Update: don't miss Curt Rice's excellent post either]. If you have time, please keep on reading…
Nothing new to see there
The reason the Science article deserves a vocal and energetic response from those of us working in open access publishing is Science’s reputation as a journal and the visibility and online traffic it commands. One cannot help but question the author’s and his editor’s motivations in publishing what comes across as a biased and ‘non-scientific’ approach to exposing poor standards of scientific publication.
What the piece highlights is bad peer review, but as many other respondents have emphasised, this has no bearing on open access as an alternative to paywalled or subscription journals. The piece merely confirms what was already clearly signposted in Beall’s list of “predatory publishers”, making it the equivalent of a study that sends spoof responses to bank transfer scams in order to prove that they are indeed bank transfer scams.
Because the author explicitly used this list of known bad journals, the results were always going to be on the negative side. As Martin’s response points out, the article’s main caveat is hidden at the very end as a “Coda”: “If I had targeted traditional, subscription-based journals,” [David] Roos told me, “I strongly suspect you would get the same result.”
In between the Roos quotes, the Science author drops the bomb: “But open access has multiplied that underclass of journals, and the number of papers they publish.” There is no data in the article to support this claim, and the article fails to consider the role that traditional academic assessment and publishing systems have played in “multiplying” this “underclass of journals”. I would argue (and this is just an educated guess) that it is instead the “publish-or-perish” scenario that has driven desperate academics to fall prey to these publishers, which, regardless of their access model, are obviously dedicated to taking advantage of them and ripping them off, in the same way that desperate people take out payday loans or naïvely reply to email phishing frauds.
As the OASPA statement asserts,
“Overall, although the data undoubtedly support the view that a substantial number of poor-quality journals exist, and some certainly lack sufficient rigor in their peer review processes, no conclusions can be drawn about how open access journals compare with similar subscription journals, or about the overall prevalence of this phenomenon.”
The big issue: peer review and publication acceptance rates
The speed at which peer review is conducted is not necessarily an indicator of peer review mediocrity, or of its total absence. As Björk and Solomon explain in their recent study of the publishing delays in scholarly journals,
“Publishing in scholarly peer reviewed journals usually entails long delays from submission to publication. In part this is due to the length of the peer review process and in part because of the dominating tradition of publication in issues, earlier a necessity of paper‐based publishing, which creates backlogs of manuscripts waiting in line. The delays slow the dissemination of scholarship and can provide a significant burden on the academic careers of authors.”
Björk and Solomon [PDF] studied average publishing delays in 2700 papers published in 135 journals sampled from Elsevier’s Scopus citation index. They discovered that “the shortest overall delays occur in science technology and medical (STM) fields and the longest in social science, arts/humanities and business/economics. Business/economics with a delay of 18 months took twice as long as chemistry with a 9 month average.”
They also showed that the time from submission to publication was shorter for journals created as open access than for those that weren’t:
“Open access journals, particularly those which were created as OA journals rather than were converted from subscription appear to be able to publish articles considerably more quickly than subscription journals. This in part may reflect the fact they are electronic only and tend to publish articles as they are ready rather than bundling them into issues. Given the small numbers and the fact the OA journals are not evenly distributed across disciplines these finding should be interpreted with a great deal of caution” (11).
The list of journals they studied is on pages 24-25, and reveals that these OA journals are published by the same publishers and scholarly societies that also publish under the subscription-only model. In fact, many of these journals publish articles under both models.
The slow pace of academic publishing is often painful to witness, particularly in the 21st century, when everyone is used to the near-immediate broadcasting of everything. Academic systems of appraisal, promotion and research funding allocation also impose hectic schedules, and the competition between highly qualified candidates is ferocious. Traditional submission-to-publication times in the arts and humanities are increasingly unfit for the purpose of addressing the growing pressure from funders and Higher Education employers for researchers to “perform” and have “impact”. (For scholars working on themes and methodologies that are constantly changing, the slow pace is indeed counterproductive and might jeopardise the quality and relevance of the research once it finally gets published.) Current technologies enable, to a certain extent, faster editorial workflows, but publishing and communication technologies also play a role in increasing the already heavy workload of many academics.
Towards better peer review
At The Comics Grid we are acutely aware that our peer reviewers work for us as volunteers. Their work is the essential foundation, the main structure of the project, and for this we are always thankful. And the guidelines and deadlines a journal can impose on voluntary labour can only take us so far. (For our Quick Guide to our editorial workflow, click here). Peer review is a cultural construct, a mode of professional, intellectual and ethical behaviour, a type of academic culture. If you are in academia, peer review is everyone’s responsibility.
We know that peer review is often conducted on top of other, more urgent responsibilities. It is unpaid (at least directly), and it is rarely considered, explicitly, as a performance indicator. Peer review is a professional activity performed by the most specialised individuals, and yet it is not openly recognised as such. Hence it sits among those things one does for love, or for professional status, or as a way to be included within relevant scholarly publishing and academic networks. In terms of time, this means (another educated guess; I am speaking from empirical experience here) that many academics are doing peer review under circumstances that are far from ideal.
We all want better peer review. “Better” in this case means, for me, peer review that balances openness with professional quality and friendly, encouraging, constructive rigour; peer review that encourages the development of better research and better scholars. In spite of the acceleration brought by electronic publishing platforms, quality editorial control takes time. Better peer review takes longer and costs more. Who will pay for that? And will we all wait? Will readers, funders and employers wait?
Back in 2004, in his article “Promoting Open Access in the Humanities”, Peter Suber indicated that on average humanities journals had higher rejection rates (70-90%) than STM journals (20-40%), suggesting that the cost of peer review per accepted article is higher in the humanities than in the STM fields.
Alas, the Science article did not consider any of this. Predatory journals are an academic problem that is also a social problem. It is frustrating that a high-profile publication such as Science has published this article, which reads to me as authorial and editorial hostility towards legitimate open access publishing through biased reporting. One hopes a positive outcome may be that it helps flush out the bad journals that do not follow strict peer review.
Speed should never replace strict process. Whilst online publishing allows rapid publication after editorial acceptance, the peer review process should remain as stringent as ever. Being open has no bearing on this. For journals that do carry out proper peer review, this may even be a good thing, as it will drive authors towards good reputations. It is a good reminder for us to keep doing things properly, despite the pains that brings.
We are not afraid of peer review. Who’s afraid of open access? That is the question.
This was originally posted at The Comics Grid and is reposted with permission.
Note: This article gives the views of the author, and not the position of the Impact of Social Science blog, nor of the London School of Economics. Please review our Comments Policy if you have any concerns on posting a comment below.
Ernesto Priego is Editor in Chief of The Comics Grid and is a Lecturer in Library Science at the Centre for Information Science at City University London. He did his PhD in Information Studies at University College London.