Mariana Gkliati calls for a reconsideration of traditional research methods in legal studies and how these methods are communicated. Most legal scholars seek to fit their conceptual analysis into narrow and strictly legal boxes, often relying on tacit knowledge from the field. Drawing on the metaphor of elephant paths – the informal shortcut trails that people wear into the ground between places – and behavioural […]
Fundable, but not funded: How can research funders ensure ‘unlucky’ applications are handled more appropriately?
Having a funding application rejected does not necessarily mean the research is unfundable – the application may simply have been unlucky. There is a significant risk to wider society in the rejection of unlucky but otherwise sound applications: good ideas may slip through the cracks, or be re-worked and dulled down to sound more likely to provide reliable results. Oli Preston looks at […]
Metrics in academia are often an opaque mess, filled with biases and ill-judged assumptions that are used in overly deterministic ways. By getting involved with their design, academics can productively push metrics in a more transparent direction. Chris Elsden, Sebastian Mellor and Rob Comber introduce an example of designing metrics within their own institution. Using the metric of grant income, their tool ResViz […]
What impact evidence was used in REF 2014? Disciplinary differences in how researchers demonstrate and assess impact
A new report produced by the Digital Science team explores the types of evidence used to demonstrate impact in REF2014 and pulls together guidance from leading professionals on good practice. Here Tamar Loach and Martin Szomszor present a broad look at the types of evidence in use in the REF impact case studies and reflect on the association between use of […]
The recent UK research assessment exercise, REF2014, attempted to be as fair and transparent as possible. However, Alan Dix, a member of the computing sub-panel, reports how a post-hoc analysis of public domain REF data reveals substantial implicit and emergent bias in terms of discipline sub-areas (theoretical vs applied), institutions (Russell Group vs post-1992), and gender. While metrics are […]
To fight the slow pace of gender equality in the workplace, attack the root cause: invisible, unconscious bias.
Gender diversity is correlated with better business results and carries enormous economic value. But unconscious bias continues to negatively affect women in the workplace in a number of ways, writes Caroline Turner. Those who manage teams must actively reveal and uproot these biases.
This piece is part of a wider series on Women in Academia and coincides with LSE Women: making history […]
Accounting for Impact? How the Impact Factor is shaping research and what this means for knowledge production.
Why does the impact factor continue to play such a consequential role in academia? Alex Rushforth and Sarah de Rijcke look at how considerations of the metric enter in from early stages of research planning to the later stages of publication. Even with initiatives against the use of impact factors, scientists themselves will likely err on the side of caution and continue […]
In this feature essay, Ninna Meier reflects on the materiality of the writing – and re-writing – process in academic research. She explores the ways in which our ever-accumulating thoughts come to form layers on the material objects in which we write our notes and discusses the pleasures of co-authorship.
This essay originally appeared on LSE Review of Books and is the first in […]
Is it ethical to be passionate in academia? Passion is a central concept for understanding academic labour.
Today we launch a new series of posts from a recent conference about the Accelerated Academy. Pieces over the next few weeks will explore the history, development and structure of audit cultures in Higher Education, digitally mediated measurement and the quantification of scholarship. The first piece in the series is from Fabian Cannizzo. Drawing from his research in Australia, he explores performance management criteria, […]
It’s time to put our impact data to work to get a better understanding of the value, use and re-use of research.
If published articles and research data are subject to open access and sharing mandates, why not also the data on impact-related activity of research outputs? Liz Allen argues that the curation of an open ‘impact genome project’ could go a long way in remedying our limited understanding of impact. Of course there would be lots of variants in the type of impact […]
85% of Health Research is Wasted: How to do great research, get it published, and improve health outcomes.
Trish Groves reflects on the scandal of waste, error, and misconduct in clinical and public health research and describes a new effort to tackle research and publication integrity from both ends. This challenge matters everywhere, but it's especially urgent in low- and middle-income countries. The University of California, San Francisco and BMJ have teamed up to develop an eLearning programme for […]
Putting hypotheses to the test: We must hold ourselves accountable to decisions made before we see the data.
In the daily practice of doing research, it is easy to lose sight of whether a study is exploratory (hypothesis-generating) or confirmatory (hypothesis-testing) research. By defining how a hypothesis or research question will be tested at the outset of research, preregistration eliminates this ambiguity. David Mellor outlines the value of preregistration for […]
Are scientific findings exaggerated? Study finds steady increase of superlatives in PubMed abstracts.
Are scientists using language aimed at convincing editors and reviewers to publish their work? Joeri Tijdink, Christiaan Vinkers and Wim Otte present findings which suggest a rise in potentially exaggerated language. Potentially conflicting with the core values of science, the pressure to publish in high impact publications may be contributing to a paradigm of over-interpretation, overstatement and misreporting of scientific results.
Our perception […]
Towards a critical data science – the complicated relationship between data and the democratic project.
What is driving the rise in data-driven techniques used by politicians and political campaigns to connect with the concerns and needs of citizens? Will a data-driven approach to political campaign messaging disrupt the “echo chamber” effect that is perceived to emerge within online spaces? Jo Bates finds the role of data science in the development of the democratic process is […]
Access to more and more publication and citation data offers the potential for more powerful impact measures than traditional bibliometrics. Accounting for more of the context in the relationship between the citing and cited publications could provide more subtle and nuanced impact measurement. Ryan Whalen looks at the different ways that scientific content is related, and how these relationships could be […]
Bringing together bibliometrics research from different disciplines – what can we learn from each other?
Currently, there is little exchange between the different communities interested in the domain of bibliometrics. A recent conference aimed to bridge this gap. Peter Kraker, Katrin Weller, Isabella Peters and Elisabeth Lex report on the multitude of topics and viewpoints covered on the quantitative analysis of scientific research. A key theme was the strong need for more openness and transparency: transparency in research […]
When are journal metrics useful? A balanced call for the contextualized and transparent use of all publication metrics
The Declaration on Research Assessment (DORA) has yet to achieve widespread institutional support in the UK. Elizabeth Gadd digs further into the slow uptake. Although there is growing acceptance that the Journal Impact Factor is subject to significant limitations, DORA feels rather negative in tone: an anti-journal metric tirade. There may be times when a journal metric, sensibly used, is […]
Blogs are now an established part of the chattersphere/public conversation, especially in international development circles, but Duncan Green finds academic take-up lacking. Here he outlines the major arguments for taking blogging and social media seriously. It doesn’t need to become another onerous time-commitment. Reading a blog should be like listening to the person talk, but with links.
Before I started […]