
December 17th, 2012

Altmetrics are the central way of measuring communication in the digital age but what do they miss?

Inspired by the push towards altmetrics, Nick Scott sees great potential to better communicate indicators of academic success. But does this constitute impact? Here, he puts forward questions on media mentions, website page hits and the ‘dark stuff’.

The LSE Future of Impacts conference in London saw a lively debate on numerous issues, but it was the discussion on altmetrics that interested me. Altmetrics is a movement of sorts (it has its own manifesto, so it must be, right?). It aims to complement or replace [there seems to be quite lively discussion about which] traditional bibliographic rankings based on citation analysis of academic journals with a wider set of metrics: tweets on Twitter, Facebook shares, saves on sharing tools like Delicious, and saves in reference managers and research-management tools like Mendeley.

I’m all for this – in fact, even before I’d heard of altmetrics I’d been working on a tool to do very similar things at ODI. We call it ODI CommsStats, and it brings in Twitter and Facebook statistics along with others from Google Scholar and the like (though it doesn’t yet include Mendeley shares or Delicious saves). You can see some of the principles behind it here.
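
To give a flavour of what such a tool involves, here is a minimal sketch of the aggregation pattern. This is not ODI CommsStats itself – every fetcher and field name below is an illustrative assumption, standing in for real service calls:

```python
# Minimal sketch of a CommsStats-style aggregator. The fetchers are
# placeholders for real service calls (Twitter, Facebook, Google Scholar);
# none of this reflects the actual ODI implementation.
from dataclasses import dataclass, field


@dataclass
class OutputMetrics:
    title: str
    url: str
    counts: dict = field(default_factory=dict)  # source name -> count


def fetch_tweet_count(url: str) -> int:
    """Placeholder: a real tool would call a Twitter share-count endpoint."""
    raise NotImplementedError


def fetch_facebook_shares(url: str) -> int:
    """Placeholder: a real tool would call the Facebook Graph API."""
    raise NotImplementedError


SOURCES = {"twitter": fetch_tweet_count, "facebook": fetch_facebook_shares}


def collect(outputs: list) -> list:
    """Fill in one count per source for each output, leaving gaps on failure."""
    for out in outputs:
        for name, fetch in SOURCES.items():
            try:
                out.counts[name] = fetch(out.url)
            except Exception:
                out.counts[name] = None  # service down or not implemented
    return outputs
```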

However, despite my support for altmetrics, I’m a little concerned that the nature of the debate is not helpful. Messaging is too fixated on whether altmetrics are better than traditional forms of assessing impact, such as citation analysis. To me, this misses the point completely. We shouldn’t be arguing that altmetrics are a better way to measure reach; that makes it seem like we merely have an alternative. No, we should be arguing that altmetrics are the central way to measure the more varied forms of scholarly communication of the digital age.

The world of research is changing. A speaker at the conference highlighted that academics spend as much time reading now as they did in the past; what has changed is that they read a wider range of articles than before. This dynamic makes it more important than ever to synthesise and publish findings in more varied formats – giving life to research, as I outlined in my blog on a ‘cradle to grey’ digital strategy for research. Academics need to publish blogs, make videos, distribute executive summaries, and enter findings into Wikipedia.

Coming from the think tank world, I’m quite familiar with the idea that academic success can come from things other than highly-ranked peer-review journals. In this world, the relative amount published in journals seems to have declined as other forms of communication have taken over: more reports, more blogs and, more recently, more multimedia.

Unfortunately, I think the altmetrics initiative is not yet comprehensive enough to tell a story about the reach and impact of scholarly communications in the digital age. In particular, I’m keen to see progress in the following areas:

  1. How do we identify different types of outputs?
    At the moment, two major identifiers seem central. The first is the DOI (Digital Object Identifier). DOIs are already used in the world of journals, but not elsewhere: you can’t assign a DOI to a YouTube video or to a self-published research report very easily (or at all, unless you sign up through a fairly complicated process).
    The other ID format, launched in the last few months, is ORCID. This is an individual ID, so an academic can sign up for one and then add details of all their papers. However, here again they run into trouble, as one of the main methods of identifying your outputs is entering the DOI – so you end up with the same issues as outlined above.
    The obvious other source of identifiers would be URLs. Almost everything has a web address, right? Yes, but not just one – it has two, three, four or more. There isn’t a clear system for parcelling up a whole set of URLs across multiple sites and saying these all refer to one thing, for example one research report (a rough sketch of this idea follows the list). Also, current implementations of URLs in altmetrics (in ImpactStory, for example) seem to cut out things like Mendeley saves and citations, which is a real shame.
  2. What about hits?
    The URLs question is pertinent to the second shortcoming too: one of the best metrics for many research outputs is how many times they have been accessed. However, the altmetrics implementation of web hits and downloads seems limited to journals published on the Public Library of Science (PLOS). Perhaps we need Google Analytics to take the lead here and make the ultimate altmetrics tool?
    But that would still leave the age-old question of how to track downloads (basically, the only reliable way is to use server logs; a sketch follows the list). I’m aware that absolute figures of downloads and page views are not useful, but the breakdown by country, the comparison of different papers, the year-on-year trends – these are all very useful figures.
  3. What about the dark stuff?
    A lot of sharing is ‘dark social’ – here is where the quantitative and digital limitations of altmetrics make their presence felt. Dark social means sharing in places analytics can’t reach: in people’s emails, in bookmarks, or in offline sharing.
    How would you track this? There is no easy answer. But at ODI we’ve instituted an ‘M&E log’ where researchers can forward any example of their work being used, shared or having an impact. This then appears in reports on the impact of their papers online. Perhaps altmetrics could define a standard dataset to record this information (one possible shape is sketched after the list)?
  4. What about media citations?
    Another great integration would be with a media monitoring service, or even Google News Alerts. ImpactStory organises things into examples of where papers have been ‘Saved by scholars’ and ‘Saved by the public’ – perhaps a third rung is ‘Shared by the media’ (a crude way to poll for mentions is sketched after the list). Good research can see massive media pick-up, and it would be wonderful to be able to see this great indicator of reach.
  5. What about combining different outputs?
    Some of the most effective research communications of the digital age are those that take place within a campaign. That is to say, it is not just a single paper being produced, but a paper complemented by numerous other outputs, each telling the same story but perhaps in a tone and style more appropriate for particular audiences. For these campaigns, it would be great to be able to collate all the different metrics and see a full picture of reach, sharing and usage (the final sketch after the list shows the idea).
    ImpactStory goes part of the way, allowing collections of outputs to be created. But it could be improved by also showing a summing up of everything in a collection, with highlights and lowlights, so that different forms of communication can be compared.
    What we’ve seen from doing this at ODI is that there is a clear story: blogs about a research paper are shared more on social networks, which leads to more interest and ultimately higher downloads for the research paper. Having a tool that can tell those stories about different outputs and the different ways they have impact would be incredibly valuable in creating incentives for researchers to think about more variety in communications outputs.
  6. What about real impact?
    Positive metrics of reach are a great indicator of success. But my experience of the world of international development also concentrates my mind on the importance of real impact: change in the world. So I worry about names like ‘ImpactStory’ – to my mind, altmetrics offers metrics, indicators of success, not indicators of real impact.
    Again, the integration of an M&E log dataset of some sort – where people could add more qualitative information about success, or even just a commenting facility to detail how work is used, received and leads to change – would be a great addition to altmetrics tools. Researchers do often hear about the impact of their work, so tracking this somehow within altmetrics systems would be wonderful.
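
To make some of these points concrete, a few rough sketches in Python follow. None of them reflects an existing tool or standard – every name, field and URL is an illustrative assumption. First, point 1’s ‘parcelling up’ problem: one research output, many addresses.

```python
# Sketch for point 1: one research output, many addresses. The fields and
# example URLs are hypothetical; this is not an existing identifier scheme.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class ResearchOutput:
    title: str
    doi: Optional[str] = None    # present for journal articles, often absent elsewhere
    orcid: Optional[str] = None  # identifies the author, not the output itself
    urls: set = field(default_factory=set)  # every address the output lives at

    def owns(self, url: str) -> bool:
        """Is this URL one of the faces of this output?"""
        return url in self.urls


report = ResearchOutput(
    title="Example working paper",
    urls={
        "https://example.org/publications/1234",  # landing page
        "https://example.org/download/1234.pdf",  # direct download
        "https://slides.example.net/decks/1234",  # hypothetical mirror
    },
)
```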
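
On point 2, the server-log route is straightforward in principle. A minimal sketch, assuming an Apache/nginx-style access log (the log path in the usage comment is invented):

```python
# Sketch for point 2: counting PDF downloads per paper from a web server
# access log. Assumes the common/combined log format.
import re
from collections import Counter

LOG_LINE = re.compile(r'^(?P<ip>\S+) \S+ \S+ \[(?P<when>[^\]]+)\] "GET (?P<path>\S+)')


def count_downloads(log_path: str) -> Counter:
    downloads = Counter()
    with open(log_path) as log:
        for line in log:
            m = LOG_LINE.match(line)
            if m and m.group("path").endswith(".pdf"):
                downloads[m.group("path")] += 1
    return downloads


# e.g. count_downloads("/var/log/nginx/access.log").most_common(10)
```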
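
For points 3 and 6, the ‘standard dataset’ could be as simple as one record per reported instance of use or impact. Every field name here is a suggestion, not a spec:

```python
# Sketch for points 3 and 6: a minimal M&E log record. All fields are
# illustrative; no such standard currently exists.
from dataclasses import dataclass
from datetime import date


@dataclass
class ImpactLogEntry:
    output_id: str          # DOI, URL or internal identifier of the work
    reported_on: date
    reported_by: str        # usually the researcher forwarding the example
    kind: str               # e.g. "policy citation", "media mention", "email share"
    description: str        # free-text account of how the work was used
    evidence_url: str = ""  # link to the citation or mention, if one exists
```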
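
For point 4, a crude stand-in for a media monitoring service is to poll a news feed for mentions of a paper’s title. This sketch assumes Google News’s RSS search endpoint (which may change) and the third-party feedparser library:

```python
# Sketch for point 4: polling a news RSS feed for media mentions of a title.
# The endpoint format is an assumption; feedparser is a third-party library
# (pip install feedparser).
from urllib.parse import quote

import feedparser


def media_mentions(title: str):
    feed = feedparser.parse(
        "https://news.google.com/rss/search?q=" + quote(f'"{title}"')
    )
    return [(entry.title, entry.link) for entry in feed.entries]
```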
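
Finally, for point 5, summing a campaign is a small step once per-output counts exist. This sketch assumes records shaped like the hypothetical OutputMetrics above, each with a counts dictionary:

```python
# Sketch for point 5: campaign-level totals and highlights across outputs,
# built on the hypothetical OutputMetrics records sketched earlier.
from collections import Counter


def summarise_campaign(outputs):
    totals = Counter()
    for out in outputs:
        for source, count in out.counts.items():
            totals[source] += count or 0
    # Highlight: the single most-shared output per source.
    highlights = {
        source: max(outputs, key=lambda o: o.counts.get(source) or 0).title
        for source in totals
    }
    return totals, highlights
```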

I’d like to make it very clear that I have not been feverishly following every movement in the altmetrics debate; I’ve kept an eye on developments and tried to keep up to date with any major articles. So much is happening that it is possible (and actually highly likely) that a lot of the above has already been thought about, and that answers to it even exist. If so, I’d love to know – I’d be very interested in retiring the ODI Dashboard in favour of an altmetrics one, but until I am clearer on the above, that is going to be very hard to do.

Note: This article gives the views of the author, and not the position of the Impact of Social Sciences blog, nor of the London School of Economics.

About the author:
Nick Scott is the Digital Manager at the Overseas Development Institute (ODI). He focuses on ways digital tools can support the communications, research and management processes of think tanks and research organisations. He developed ODI’s online communications strategy, awarded Online Strategy of the Year 2012 in the Digital Communications Awards.
