Jason Priem, Judit Bar-Ilan, Stefanie Haustein, Isabella Peters, Hadas Shema, and Jens Terliesner assess how established academics' presence on the Web has become, and how an individual academic's online profile stands up against traditional measures of publication and citation counts.
This article first appeared on the LSE Impact of Social Science blog
Traditionally, scholarly impact and visibility have been measured by counting publications and citations in the scholarly literature. However, increasingly scholars are also visible on the Web, establishing presences in a growing variety of social ecosystems. Examining this broader set of altmetrics could establish a more comprehensive image of influence, uncovering authors’ weight in the informal, invisible college: their “scientific ‘street cred’” [pdf] (Cronin, 2001).
But before we can start to seriously examine scholars' personal altmetrics, we need to get a sense of how wide and established their presence on the social Web is, and how measures of social Web impact relate to their more traditional counterparts. To answer this, we sampled the 57 presenters from the 2010 Leiden STI Conference, gathering publication and citation counts as well as data from the presenters' Web footprints.
Looking just at authors, we found Web presence widespread and diverse: 84 per cent of scholars had homepages, 70 per cent were on LinkedIn, 23 per cent had public Google Scholar profiles, and 16 per cent were on Twitter (this last figure is considerably higher than earlier, more conservative estimates).
We also delved deeper by looking at publications of our sampled scholars. After assembling all 1,136 articles they'd written (subject to some methodological details), we looked at how much activity these articles were attracting in various ecosystems. The coverage of the social reference manager Mendeley was the biggest story here: 82 per cent of our documents had at least one Mendeley bookmark, which compares quite favorably to 85 per cent coverage from Scopus. Interestingly, this is better coverage than Thomson ISI's Web of Science (which uses the same data as the Impact Factor): only 74 per cent of sampled articles were indexed in WoS.
Only 28 per cent of articles were bookmarked in CiteULike, suggesting that Mendeley is cementing its dominance in the online reference manager space. However, CiteULike still has a cool trick up its sleeve: the ability to analyze reader-supplied tags. For example, below are tag clouds for two sampled authors, Loet Leydesdorff (above) and Stevan Harnad; we can see not just their interests but also (from the extent to which a single tag dominates) the degree of focus in their work, or rather the focus their readers perceive.
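The idea behind a tag cloud is simple: count how often readers apply each tag, then scale each tag's display size by its frequency, so a single dominant tag is immediately visible. A minimal sketch in Python, using hypothetical tags rather than either author's actual CiteULike data:

```python
from collections import Counter

# Hypothetical reader-supplied tags attached to one author's papers
tags = ["scientometrics", "citation-analysis", "scientometrics",
        "triple-helix", "scientometrics", "networks"]

counts = Counter(tags)
max_n = max(counts.values())

# Scale font size linearly with frequency (10pt floor, 24pt ceiling);
# the more one tag dominates, the more "focused" the cloud looks.
for tag, n in counts.most_common():
    size = 10 + 14 * n / max_n
    print(f"{tag}: count={n}, font_size={size:.0f}pt")
```

Real tag-cloud tools often use a log scale instead of a linear one, so that one runaway tag doesn't dwarf everything else.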
A potential strength of altmetrics is that they track new forms of impact: forms related, but not identical, to what citation counting shows us. As the table below shows, the data supports this claim. Again, Mendeley is the standout, correlating at .448 with Scopus citation counts.
[Table: Spearman's ρ correlations among citations (Scopus), bookmarks (Mendeley), and bookmarks (CiteULike); N = 1,136. ** Correlation is significant at the 0.01 level (2-tailed).]
Maturing tools like total-impact provide much more article-level data, including citations on blogs, Twitter, and Wikipedia; we weren't able to include these in this preliminary paper. But continued research into these and other altmetric sources has real promise to help build a new "bibliometric spectroscopy" [pdf], expanding and deepening our understanding of scholarly impact. It'll take work to understand and use these new metrics, but they're not going away. Scholars, like the rest of the world, are quickly moving toward a universe of web-native communication.
Note: This article gives the views of the author, and not the position of the British Politics and Policy blog, nor of the London School of Economics. Please read our comments policy before posting.