The new metrics cannot be ignored – we need to implement centralised impact management systems to understand what these numbers mean

March 5th, 2013

By capturing both scholarly and public attention to research outputs on the social web, altmetrics offer a much richer picture than traditional metrics based exclusively on citation database information. Pat Loria compares the new metrics services and argues that, as more systems incorporate altmetrics into their platforms, institutions will benefit from creating an impact management system to interpret these metrics, pulling in information from research managers, ICT and systems staff, and those creating the research impact.

Researchers today are under enormous pressure to demonstrate impact in order to secure the resources they need for the continuation of their work. In a world characterised by boundless ideas but limited means, funding providers use various approaches to determine whom they will support, often including an analysis of impact numbers and narratives. How many items were published? How many citations were received? Research assessment exercises survey numbers of research grants and commercialisation incomes, along with a plethora of applied and esteem measures. For Australia’s Excellence in Research for Australia (ERA) 2012 evaluation exercise, quantitative measures were supplemented by peer review for some fields and explanatory statements for all fields to provide narrative contexts. The Australian National Competitive Grants Program solicits impact numbers and narratives in the Research Opportunity and Performance Evidence (ROPE) sections on application forms (I’m sure the acronym is unintentional).

In recent times, citation metrics have been used, particularly in the science, technology, and medicine (STM) fields, to provide evidence of impact. The h-index has been very popular among STM scholars for this purpose. However, citation metrics are only as good as the coverage or exposure of a field’s research outputs (and therefore impacts) in citation databases. Researchers in fields such as the arts and humanities, or the social sciences, whose research may be published in a wider variety of formats, including books, book chapters and conference papers (not to mention non-traditional outputs such as exhibitions and performances), and whose journal titles are not widely covered in citation databases, appear to have much less impact than their STM colleagues. In addition, traditional citations measure only one aspect of the research impact story, as the following table demonstrates (adapted from Jason Priem’s presentation on Altmetrics and Revolutions).

[Table 1: aspects of research impact captured by traditional citations versus the social web (adapted from Jason Priem’s presentation on Altmetrics and Revolutions)]

As suggested by Table 1, traditional citations do not provide us with indicators of wider impact, as evidenced in scholarly collaboration services, news outlets and social media. In order to redress the “pond mentality” engendered by the inclusion (or exclusion) policies of citation databases, alternative metrics, or altmetrics, were developed to measure the scholarly and public attention received by research outputs on the social web. Indeed, at the time of writing I discovered that a new Altmetric for Scopus application had just become available. This is a fast-moving field!
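
(A brief aside on the h-index mentioned above, since it anchors so many of these discussions: a researcher has an h-index of h when h of their outputs have each received at least h citations. The following is a minimal sketch in Python; the citation counts are invented purely for illustration.)

```python
def h_index(citations):
    """Return the largest h such that h papers have at least h citations each."""
    # Rank citation counts from highest to lowest; h is the number of
    # positions where the count is still at least its (1-based) rank.
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, count in enumerate(ranked, start=1) if count >= rank)

# Invented citation counts for a researcher's seven outputs:
print(h_index([25, 8, 5, 4, 3, 2, 0]))  # -> 4 (four papers with >= 4 citations each)
```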

To give an indication of public and scholarly impact harvested by altmetrics, the following table compares three of the most popular services, Altmetric.com, ImpactStory, and Plum Analytics. This comparison is based on my own analysis and the information available on each service’s website. The list of data sources harvested by these services is ever expanding.

[Table 2: comparison of the data sources harvested by Altmetric.com, ImpactStory and Plum Analytics]
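
To give a flavour of how such data can be retrieved in practice, Altmetric.com offers a free public REST API for looking up the attention data attached to a single DOI. The following is a minimal sketch in Python; the DOI is an arbitrary example, and the endpoint and response field names reflect the API’s documentation at the time of writing, so they should be verified before use:

```python
import json
import urllib.request

# Look up attention data for one DOI via Altmetric's public v1 API
# (no API key is required for basic lookups, but rate limits apply).
doi = "10.1038/news.2011.490"  # example DOI only
url = "https://api.altmetric.com/v1/doi/" + doi

with urllib.request.urlopen(url) as response:
    data = json.load(response)

# A few of the counts the service aggregates from the social web:
print("Altmetric score:", data.get("score"))
print("Tweets:", data.get("cited_by_tweeters_count"))
print("News stories:", data.get("cited_by_msm_count"))
print("Mendeley readers:", data.get("readers", {}).get("mendeley"))
```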

The new metrics cannot be ignored. According to Peter Vinkler, “Citation graph data is like Chekhov’s gun: once on stage, it has to be fired” (cited by Jason Priem in Altmetrics and Revolutions). As more publishers and research information management systems incorporate altmetrics into their platforms, those involved in the doing, management, support and funding of research will need to understand what these numbers represent and, more importantly, what they mean in a world in which the push towards open access to publicly funded research continues to gain momentum. One question arising from recent initiatives, such as FASTR and the OSTP public access policy memorandum in the US, the Finch Report in the UK, and the ARC open access policy in Australia, is how anyone will be able to evaluate whether the public is actually engaging with the content. Altmetrics can provide real evidence of public engagement with open access research outputs.

Other benefits look very familiar:

  • Measuring reach and influence
  • Reviewing the success or otherwise of a research dissemination strategy
  • Supporting grant, employment and promotion applications
  • Providing impact data to research managers and funding agencies.

In addition, altmetrics can assist researchers throughout the stages of the research lifecycle (adapted from PLoS Article-Level Metrics for Researchers):

  • Discover trending topics and potential collaborators
  • Discover existing data and avoid previous mistakes
  • Discover alternative results or interpretations of your results
  • Add presentations and datasets to sharing services and track interest and interactions
  • Track article reach across numerous dissemination channels
  • Evaluate dissemination decisions and share metrics with managers and funders.

For research managers, a discerning use of altmetrics can support two strategic objectives. The first is the desire to increase an institution’s digital research footprint, or visibility, via profiling systems such as VIVO. The second is the need to evaluate programs via research information management systems such as Symplectic. Altmetrics can be harvested by such systems to supplement, and sometimes question, traditional impact data, as well as to communicate the public interest generated by research activities.

In my opinion, what is ultimately required is the implementation of what I call an impact management system (IMS). The IMS will incorporate outputs and impacts, harvesting metadata from human resources systems, research management systems, institutional repositories and impact monitoring services. It will facilitate the manual entry of qualitative data by researchers on policy, industry and community impacts, as well as non-harvestable academic impacts such as applied and esteem measures. The development of an IMS will require a collaborative, institutional approach between research managers, ICT and systems staff, and those creating the impact! Librarians can help, with their data management skills and aptitude for storytelling. The aim of the IMS is to collect research activity data that can be interpreted and woven into variations on a theme for internal, external and public audiences.
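
To make the shape of such a system slightly more concrete, here is a minimal sketch of what a single record in a hypothetical IMS might look like. Every field name below is invented for illustration; the point is simply the combination of identifiers drawn from existing institutional systems, harvested quantitative metrics, and manually entered qualitative impacts:

```python
from dataclasses import dataclass, field

@dataclass
class ImpactRecord:
    """One research output's entry in a hypothetical impact management system."""
    output_id: str        # from the institutional repository
    researcher_id: str    # from the human resources system
    grant_codes: list = field(default_factory=list)  # from the research management system

    # Quantitative metrics harvested from citation and impact monitoring services
    citations: int = 0
    altmetrics: dict = field(default_factory=dict)  # e.g. {"tweets": 12, "mendeley_readers": 40}

    # Qualitative impacts entered manually by the researcher
    policy_impacts: list = field(default_factory=list)
    community_impacts: list = field(default_factory=list)
    esteem_measures: list = field(default_factory=list)

# An illustrative record combining harvested and manually entered data:
record = ImpactRecord(
    output_id="repository:1234",
    researcher_id="staff-0042",
    grant_codes=["ARC-DP130100000"],
    citations=17,
    altmetrics={"tweets": 12, "mendeley_readers": 40},
    policy_impacts=["Cited in a state water policy review"],
)
print(record.altmetrics["tweets"])  # -> 12
```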

In a climate of accountability for public funds, the management and communication of impact has become an essential skill for researchers and institutions. There are many questions yet to be answered in the development of such systems, including the perceived need for a standard impact metadata schema and perhaps a meta-search engine for research profiles. In the meantime, those institutions that are developing local profiling systems will simplify the task of communicating how their research is making a difference. It’s all about telling your research impact story!

Note: This article gives the views of the author, and not the position of the Impact of Social Science blog, nor of the London School of Economics. 

About the author

Pat Loria is the Research Librarian at the University of Southern Queensland in Australia. His Twitter handle is @pat_loria.
