Blog Admin

February 12th, 2018

Analysing Altmetric data on research citations in policy literature – the case of the University of Sheffield

One of the sources of attention Altmetric.com tracks is the number of times research outputs have been cited in policy literature. Andy Tattersall and Chris Carroll explored the case of the University of Sheffield and what the data says about the impact of its research on national and international policy. The percentage of outputs with at least one policy mention compares favourably with previous studies, while huge variations were found between the time of publication and the time of the first policy citation. However, some problems with the quality of the data were identified, highlighting the need for careful scrutiny and corroboration.

Altmetrics offers all kinds of insights into how a piece of research has been communicated and cited. In 2014 Altmetric.com added policy document tracking to its sources of attention, offering another valuable insight into how research outputs are used post-publication. At the University of Sheffield we thought it would be useful to explore the Altmetric.com data for policy document citations to see what impact our work is having on national and international policy.

We analysed all published research from authors at the University of Sheffield indexed in the Altmetric.com database; a total of 96,550 research outputs, of which we were able to identify 1,463 pieces of published research cited between one and 13 times in policy. This represented 1.52% of our research outputs. Of these 1,463 artefacts, 21 were cited in five or more policy documents, with the vast majority – 1,185 documents – having been cited just once. Our sample compared very well with previous studies by Haunschild and Bornmann, who looked at papers indexed in Web of Science and found 0.5% were cited in policy, and Bornmann, Haunschild and Marx, who found 1.2% of climate change research publications with at least one policy mention. From our sample we found 92 research articles cited in three or more policy documents. Of those 92 we found medicine, dentistry, and health had the greatest policy impact, followed by social science and pure science.
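
As a rough illustration of how such headline figures can be pulled out of an institutional export, the sketch below (in Python, using pandas) computes the share of outputs with at least one policy mention and the counts at different citation thresholds. The file name and the "policy_mentions" column are hypothetical placeholders for whatever an export actually contains, not an official Altmetric.com schema.

# Minimal sketch: headline figures from a hypothetical institutional
# Altmetric.com export; the file name and column name are assumptions.
import pandas as pd

outputs = pd.read_csv("sheffield_altmetric_export.csv")

total = len(outputs)
cited = outputs[outputs["policy_mentions"] >= 1]

print(f"Outputs indexed: {total}")
print(f"With at least one policy mention: {len(cited)} ({len(cited) / total:.2%})")
print(f"Cited in five or more policy documents: {(outputs['policy_mentions'] >= 5).sum()}")
print(f"Cited exactly once: {(outputs['policy_mentions'] == 1).sum()}")
print(f"Cited in three or more policy documents: {(outputs['policy_mentions'] >= 3).sum()}")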

We also wanted to explore the time span between publication and policy citation for research published by the University of Sheffield. We found the time lag ranged from just three months to 31 years. This highlighted a long tail of publications influencing policy, something we would have struggled to identify without manual trawling prior to Altmetric.com. The earliest piece of research from our sample to be cited in policy was published in 1979 and did not receive its first policy citation until 2010. We manually checked the records because many publications recorded with pre-1979 dates turned out to have been published much later, often this century; this is most likely the result of misreported dates in the institutional dataset, and it highlights the need to check such records manually for authenticity. The shortest interval between publication and policy citation was a mere three months: a paper published in November 2016 was first cited in National Institute for Health and Care Excellence (NICE) policy in January 2017.
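
For readers wanting to reproduce the time-lag calculation, a minimal sketch follows. It assumes each record carries a publication date and the date of its first policy citation; the file and column names are illustrative rather than anything exported by Altmetric.com, and the final check flags the kind of implausible dates we had to correct by hand.

# Minimal sketch: lag between publication and first policy citation.
# File and column names are illustrative assumptions.
import pandas as pd

df = pd.read_csv(
    "policy_cited_outputs.csv",
    parse_dates=["pub_date", "first_policy_citation_date"],
)

df["lag_days"] = (df["first_policy_citation_date"] - df["pub_date"]).dt.days
df["lag_years"] = df["lag_days"] / 365.25

print(df["lag_years"].describe())

# Negative lags or suspiciously early publication dates usually signal
# misreported metadata in the institutional dataset and need manual checking.
suspect = df[(df["lag_days"] < 0) | (df["pub_date"].dt.year < 1979)]
print(f"Records needing manual checking: {len(suspect)}")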

The Altmetric.com reports are only as good as the data they analyse, and our research did uncover some errors. Looking at the 21 papers cited in five or more policy documents, we found seven were not fit for inclusion. One such example was identified when we discovered research papers had been attributed to the University of Sheffield when the authors were not, in fact, affiliated to the university. As this data is sourced from our research publications system, we assume this was a mistake made by the author; this can happen when authors incorrectly accept as their own papers suggested to them by the system. While this was almost certainly a genuine error, and may have been rectified later, the system had not yet updated to take account of such corrections. Another of these papers was mistakenly attributed to an author who had no direct involvement in the paper but who was part of a related wider research project. Another publication was excluded because it had not, in fact, been cited in the relevant policy document. One of the papers that was included belonged to an author who was not at Sheffield at the time of publication but has since joined the institution; this showed that Altmetric.com’s regular updates were able to pick up updated institutional information and realign authors with their current employer.

The two most cited papers came from our own department, the School of Health and Related Research (ScHARR), in the field of health economics. Only two of the 14 most cited publications were in a field other than health economics or pure economics; both of these were in environmental studies. In total, the 14 most cited research outputs were cited by 175 policy documents, but we identified 9% (16) of these citations as duplicates. Of those 175 citations, we found that 61% (107) were national, i.e. from the UK, and 39% (68) were international, i.e. from countries other than the UK or from international bodies such as the United Nations or World Health Organization.
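
A similar sketch, again with illustrative file and column names rather than any official schema, shows one way the duplicate citations could be filtered out and the remainder split into national (UK) and international sources.

# Minimal sketch: de-duplicating policy citations and splitting them into
# national (UK) and international. File and column names are assumptions.
import pandas as pd

citations = pd.read_csv("policy_citations_top14.csv")

# Treat repeated (research output, policy document) pairs as duplicates.
deduped = citations.drop_duplicates(subset=["output_doi", "policy_document_id"])

national = deduped[deduped["policy_source_country"] == "GB"]
international = deduped[deduped["policy_source_country"] != "GB"]

print(f"Duplicate citations removed: {len(citations) - len(deduped)}")
print(f"National (UK): {len(national)} ({len(national) / len(deduped):.0%})")
print(f"International: {len(international)} ({len(international) / len(deduped):.0%})")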

Altmetric.com continues to add further policy sources to its database to trawl for citations. As a result, our sample of 1,463 research outputs should grow, not only as fresh policy citations accrue but also as older citations are identified through newly added policy sources. This work also highlights the importance of research outputs having unique identifiers so they can be tracked through altmetrics platforms; more of our research will certainly be cited in policy, but if no unique identifier is attached, especially to older outputs, it is unlikely the Altmetric.com system will pick it up.

Altmetric.com is a very useful indicator of interest in and influence of research within global policy. Yet there are clearly problems with the quality of the underlying data and with how outputs are attributed to institutions and authors. We found that one third of our sample of the 21 most cited research outputs had been erroneously attributed to an institution or author. Whether this is representative of the whole dataset, only further studies will tell. It is therefore essential that any future explorations of research outputs and policy document citations are double-checked and not taken at face value.

This blog post is based on the authors’ article, “What Can Altmetric.com Tell Us About Policy Citations of Research? An Analysis of Altmetric.com Data for Research Articles from the University of Sheffield”, published in Frontiers in Research Metrics and Analytics (DOI: 10.3389/frma.2017.00009).

Featured image credit: Andrea Enríquez Cousiño, via Unsplash (licensed under a CC0 1.0 license).

Note: This article gives the views of the authors, and not the position of the LSE Impact Blog, nor of the London School of Economics. Please review our comments policy if you have any concerns on posting a comment below.

About the authors

Andy Tattersall is an Information Specialist at The School of Health and Related Research (ScHARR) and writes, teaches and gives talks about digital academia, technology, scholarly communications, open research, web and information science, apps, altmetrics, and social media, in particular their applications for research, teaching, learning, knowledge management and collaboration. Andy received a Senate Award from The University of Sheffield for his pioneering work on MOOCs in 2013 and is a Senior Fellow of the Higher Education Academy. He is also Chair of the Chartered Institute of Library and Information Professionals – Multi Media and Information Technology Committee. Andy was listed as one of Jisc’s Top Ten Social Media Superstars for 2017 in Higher Education. He has edited a book on altmetrics for Facet Publishing aimed at researchers and librarians. He tweets @Andy_Tattersall and his ORCID ID is 0000-0002-2842-9576.

Chris Carroll is a Reader in Systematic Review and Evidence Synthesis at the University of Sheffield. His role principally involves synthesising published data to help inform policymaking, particularly in the fields of medicine and health, as well as the development of methods to conduct such work. Chris’ ORCID ID is 0000-0002-6361-6182.

Posted In: Academic publishing | Citations | Evidence for Policy | Evidence-based policy | Impact | Measuring Research
