The impact agenda is an international and evolving phenomenon that has undergone numerous iterations. Discussing the development and recent release of the results of the Australian Engagement and Impact Assessment (EIA), Ksenia Sawczak considers the effectiveness of this latest exercise in impact assessment, finding it to provide an inadequate account of the impact of Australian research and ultimately a shaky […]
Higher education and research institutions are increasingly coming to terms with the issue of gender inequality. However, efforts to move in this direction are often isolated and difficult to compare and benchmark against each other. In this post, Caroline Wagner presents a new initiative from the Centre for Science and Technology Studies at Leiden (CWTS) to assess gender inequality […]
The Journal Impact Factor (JIF) – a measure reflecting the average number of citations to recent articles published in a journal – has been widely critiqued as a measure of individual academic performance. However, it is unclear whether these criticisms and high-profile declarations, such as DORA, have led to significant cultural change. In this post, Erin McKiernan, Juan Pablo Alperin […]
The careers of carers – A numerical adjustment cannot level the playing field for researchers who take time off to care for children
Quantitative measures of the effect of caring for children on research outputs (published papers and citations) have been used by some universities as a tool to address gender bias in academic grant and job applications. In this post Adrian Barnett argues that these adjustments fail to capture the real impacts of caring for children and should be replaced with […]
Academics are required to not only find effective ways to communicate their research, but also to increasingly measure and quantify its quality, impact and reach. In Scholarly Communication: What Everyone Needs to Know, Rick Anderson puts us in the picture. And in Measuring Research: What Everyone Needs to Know, Cassidy Sugimoto and Vincent Lariviere critically assess over 20 tools currently available for evaluating the […]
Altmetrics have become an increasingly ubiquitous part of scholarly communication, although the value they indicate is contested. In this post, Lutz Bornmann and Robin Haunschild present evidence from their recent study examining the relationship of peer review, altmetrics, and bibliometric analyses with societal and academic impact. Drawing on evidence from REF2014 submissions, they argue altmetrics may provide evidence for […]
In Hacking Life: Systematized Living and its Discontents, Joseph M. Reagle, Jr. explores the cultural trend of life hacking in its myriad forms as rooted in both the increasing pressures to perform to the maximum of our abilities and technological advances that are enabling us to monitor and quantify the world in unprecedented detail. The book not only lays bare an increasingly […]
Academic hiring and promotion committees and funding bodies often use publication lists as a shortcut to assessing the quality of applications. In this repost, Janet Hering argues that in order to avoid bias towards prestigious titles, plain language statements should become a standard feature of academic assessment.
Let’s start with the obvious. Evaluation and assessment are part and parcel of the […]
Invisible impact and insecure academics: structural barriers to engagement and why we should do it anyway
Participatory Action Research (PAR) is a form of research that involves prolonged and deep engagement with local communities and can produce profound social impacts. In this post, Dr Katrina Raynor describes how current approaches to impact assessment and the structure of the academic labour market impede researchers from engaging with PAR and raise particular challenges for insecurely employed early […]
Introducing the Observatory of International Research: A simple research discovery tool for everyone
Andreas Pacher presents the Observatory of International Research (OOIR), a research tool that provides users with easy-to-use overviews and information for whole fields of social science research. Reflecting on the advantages and limitations of other discovery tools and the potential for information overload, Andreas points to the utility of OOIR in producing search results that are both […]
In The Data Gaze: Capitalism, Power and Perception, David Beer explores how we are being put under the extractive, analytic and predictive lens of a data gaze that seeks to define our world in increasingly granular detail. Critically probing into the data analytics industry and the imaginary that gives it legitimacy, Beer offers a thoroughly readable take on the structures that are constructing […]
Sascha Friesike, Benedikt Fecher and Gert G. Wagner outline three systemic shifts in scholarly communication that render traditional bibliometric measures of impact outdated and call for a renewed debate on how we understand and measure research impact.
New digital research infrastructures and the advent of online distribution channels are changing the realities of scientific knowledge creation and dissemination. Yet, the […]
The growing, high-stakes audit culture within the academy has brought about a different kind of publishing crisis
The spate of high-profile cases of fraudulent publications has revealed a widening replication, or outright deception, crisis in the social sciences. For Marc Spooner, researchers “cooking up” findings and the deliberate faking of science are the result of extreme pressures to publish, brought about by an increasingly pervasive audit culture within the academy.
By now most readers will have heard […]
Developing approaches to research impact assessment and evaluation: lessons from a Canadian health research funder
Assessing research impact is complex and challenging, but essential for understanding the link between research funding investments and outcomes both within and beyond academia. Julia Langton provides an overview of how a Canadian health research funder approaches impact assessment, urging caution in the use of quantitative data, highlighting the importance of organisation-wide capacity-building, and outlining the value of a […]
Considering the future of research assessment, Elizabeth Gadd outlines how she believes research evaluation could be made better, fairer, and more meaningful. The resulting seven guiding principles, neatly framed as hashtags, range from understanding our responsibilities to researchers as people, through to ensuring our evaluations are a more formative process, offering valuable, constructive feedback.
Imperial College recently held an event […]
Using citation metrics as part of academic recruitment decisions leads to an increase in self-citations
The use of citation metrics in academic hiring and promotion decisions was intended as a response to important and legitimate concerns over the meritocracy of recruitment procedures. However, evidence suggests that doing so distorts scientists’ behaviour and increases the risk that these measures become unreliable. Marco Seeber, Mattia Cattaneo, Michele Meoli and Paolo Malighetti investigated the use of citation […]
Despite becoming increasingly institutionalised, there remains a lack of discourse about research metrics among much of academia
The active use of metrics in everyday research activities suggests academics have accepted them as standards of evaluation, that they are “thinking with indicators”. Yet when asked, many academics profess concern about the limitations of evaluative metrics and the extent of their use. Why is there such a discrepancy between principles and practices pertaining to metrics? Lai Ma suggests […]
There is an absence of scientific authority over research assessment as a professional practice, leaving a gap that has been filled by database providers
Research metrics have become more established as a means to assess research performance. This is understandable given research institutions’ and funders’ demand for assessment techniques that are relatively cheap and universally applicable, even though use of such metrics remains strongly contested within scientific communities. But to what extent does the academic research field of evaluative citation analysis confer legitimacy […]
Making visible the impact of researchers working in languages other than English: developing the PLOTE index
As outlined in the Leiden Manifesto, if impact is understood in terms of citations to international publications, a bias is created against research which is regionally focused and engaged with local societal problems. This is particularly critical for researchers working in contexts with languages other than English. Peter Dahler-Larsen has developed the PLOTE index, a new indicator which he hopes […]