Although citation counts have well-documented limitations as indicators of scholarly impact, they do offer insight into which articles are read and valued. However, one major disadvantage of citation counts is that they are slow to accumulate. Mike Thelwall has examined reader counts from Mendeley, the academic reference manager, and found them to be a useful source of […]
Once an academic paper is retracted, there is no guarantee it will cease to be cited. Jaime A. Teixeira da Silva, Judit Dobránszki and Helmar Bornemann-Cimenti use three key examples to demonstrate how the continued citation of retracted papers can lead to the proliferation of erroneous literature, mislead young academics and cause confusion among researchers as […]
Measuring the societal impact of research: references to climate change research in relevant policy literature
A new metric offers insight into the societal impact of scholarly research by tracking the mentions of academic publications in policy documents. Lutz Bornmann, Robin Haunschild and Werner Marx have studied the usefulness of this metric, taking climate change research as their example, and found only a low percentage of papers were referenced in the relevant literature. Does this […]
The limitations of simple ‘citation count’ figures are well-known. Chris Carroll argues that the impact of an academic research paper might be better measured by counting the number of times it is cited within citing publications rather than by simply measuring if it has been cited or not. Three or more citations of the key paper arguably represent a […]
Novel breakthroughs in research can have a dramatic impact on scientific discovery but face some distinct disadvantages in gaining wider recognition. Jian Wang, Reinhilde Veugelers and Paula Stephan present an overview of their findings, which suggest an inherent bias in bibliometric measures against novel research. The bias is of particular concern given the increased reliance funding agencies place on classic bibliometric indicators in making funding […]
Peer review and bibliometric indicators just don’t match up according to re-analysis of Italian research evaluation.
The Italian research evaluation agency undertook an extensive analysis comparing the results of peer review and bibliometric indicators for research evaluation, and its findings suggested the two produced similar results. Researchers Alberto Baccini and Giuseppe De Nicolao re-examine these results, find notable disagreements between the two evaluation techniques in the sample, and outline below the major shortcomings in the Italian agency's […]
Drawing on citation data that spans disciplines and time periods, Elliott Green has identified the most cited publications in the social sciences. Here he shares his findings on the 25 most cited books as well as the top ten journal articles. The sheer number of citations for these top cited publications is worth noting as is the fact that […]
The recent UK research assessment exercise, REF2014, attempted to be as fair and transparent as possible. However, Alan Dix, a member of the computing sub-panel, reports how a post-hoc analysis of public domain REF data reveals substantial implicit and emergent bias in terms of discipline sub-areas (theoretical vs applied), institutions (Russell Group vs post-1992), and gender. While metrics are […]
Accounting for Impact? How the Impact Factor is shaping research and what this means for knowledge production.
Why does the impact factor continue to play such a consequential role in academia? Alex Rushforth and Sarah de Rijcke look at how considerations of the metric enter in at every stage, from early research planning through to eventual publication. Even with initiatives against the use of impact factors, scientists themselves will likely err on the side of caution and continue […]
Credit where credit is due: Research parasites and tackling misconceptions about academic data sharing
Benedikt Fecher and Gert G. Wagner look at a recent editorial which faced considerable criticism for typecasting researchers who use or build on previous datasets as “research parasites”. They argue that the authors appear to miss the point, not only of data sharing, but of scientific research more broadly. But as problematic as the editorial may be, it points to […]
A call for inclusive indicators that explore research activities in “peripheral” topics and developing countries.
Science and Technology (S&T) systems all over the world are routinely monitored and assessed with indicators that were created to measure the natural sciences in developed countries. Ismael Ràfols and Jordi Molas-Gallart argue these indicators are often inappropriate in other contexts. They urge S&T analysts to create data and indicators that better reflect research activities and contributions in these “peripheral” […]
It’s time to put our impact data to work to get a better understanding of the value, use and re-use of research.
If published articles and research data are subject to open access and sharing mandates, why not also the data on impact-related activity of research outputs? Liz Allen argues that the curation of an open ‘impact genome project’ could go a long way in remedying our limited understanding of impact. Of course there would be lots of variants in the type of impact […]
Access to ever more publication and citation data offers the potential for more powerful impact measures than traditional bibliometrics. Accounting for more of the context in the relationship between citing and cited publications could provide more subtle and nuanced impact measurement. Ryan Whalen looks at the different ways that scientific content is related, and how these relationships could be […]
Why we need a hub for software in science: Research software developers deserve wider academic recognition.
Jure Triglav looks at how the recognition, crediting and discovery of software is a massive problem facing the scientific community. Though instrumental to scientific innovation, software creation and maintenance is rarely recognised. By building a hub for research software, scientists could shine a spotlight on its developers and show the extent, importance and impact of their work. Enter Depsy – a new research […]
Bringing together bibliometrics research from different disciplines – what can we learn from each other?
Currently, there is little exchange between the different communities interested in the domain of bibliometrics. A recent conference aimed to bridge this gap. Peter Kraker, Katrin Weller, Isabella Peters and Elisabeth Lex report on the multitude of topics and viewpoints covered on the quantitative analysis of scientific research. A key theme was the strong need for more openness and transparency: transparency in research […]