Developing indicators of the impact of scholarly communication is a massive technical challenge – but it’s also much simpler than that

June 12th, 2013

Conversations on impact tend to revolve around technical issues of measurement and finding appropriate metrics. To widen the conversation, J. Britt Holbrook presents a list of 56 indicators of impact developed by the Center for the Study of Interdisciplinarity to help simplify the question of impact. By moving beyond technical aspects, there is a greater opportunity for academics to embrace and explore other facets of impact.

Perhaps the most frequently asked question regarding impact is logistical: how can we measure the impact of our work? A recent story in the Chronicle of Higher Education suggests: “The larger conversation about how to measure scholarly impact is probably as old as scholarship itself.” Today, the question ranges from the development of article level metrics to building a shared infrastructure for all of scholarly communication.

Why should we develop new ways to measure the impact of our work? Jason Priem et al. and Heather Piwowar answer that current measures of impact don’t work; since traditional approaches to measuring scholarly communication don’t reward impact, we need a way to measure and reward other approaches that do. Similar views are shared by some and contested by others.

What’s most striking about answers to the ‘why’ question is how quickly they turn toward the ‘how’ question. Altmetrics developers are doers and inventors – they take action and try to figure things out. We can’t figure out impact just by thinking about it; we have to do research, warns Piwowar in the post linked above. After noting that the system is broken, Priem and Piwowar quickly ask, “How can we fix it?” Instead of answering the ‘why’ question, we ask a different question: Do altmetrics work?

Should we resist attempts to measure the impact of our work? Philip Moriarty offers sophisticated arguments against the ‘impact agenda’, while others are more demonstrative. Robert Frodeman and I argue that embracing impact in a way that preserves autonomy is a better strategy than mere resistance. Despite contrary answers to the question of resistance, many of us agree that not everything that counts can be counted. Our disagreement rests on different conceptions of freedom. ‘The resistance’ tends toward a negative concept of freedom that sees all forms of interference as evils. Advocates of owning impact, however, embrace a positive view of freedom, emphasizing self-determination as fundamental.

Infrequently asked questions

Although these frequently asked questions are interesting and important, it’s ultimately simpler than that. Kelli Barr and Keith Brown, my colleagues at the Center for the Study of Interdisciplinarity (CSID), and I submitted something simple and quite messy to Nature to see whether they might join us in catalyzing a conversation on the question of impact. Frankly, we doubted it. Nevertheless, we submitted the following list, along with a brief discussion of how and why we generated it.

[Image: CSID’s original list of 56 indicators of impact]

Of course, the editors at Nature declined to publish it. Instead, they asked us to clean up the list to make it more generally applicable and less CSID-specific. We did so, and the officially published version of our correspondence is available here. It can also be viewed free of charge here. The editors of the LSE Impact blog had a similar urge to clean up our mess. This is the table they proposed:

[Table: the LSE Impact blog’s edited version of CSID’s list of indicators]

Comparing the original and edited versions illustrates the value of the simpler questions about impact I propose we begin to ask more frequently. I realize that people have already been asking some of these questions – but not frequently enough.

What do we mean by ‘impact’? Does it make a difference if we speak of ‘scholarly impact’ rather than ‘the impact of scholarly communication’? Does ‘scholarly communication’ include teaching, as well as publishing (whether in traditional or alternative venues)? I’d be interested to develop an account of why we ought to revive one feature of the Humboldtian idea of the university: research and teaching as mutually reinforcing activities essential to what it means to be a professor. Our list includes both research and teaching activities, as well as activities that might fall under the rubric of ‘outreach’ or policy engagement.

Why do we use the term ‘impact’?! The impact horse has left the barn. But we ought to question what sorts of activities fall within its scope, even if we would prefer a different term. Claire Donovan’s reaction to CSID’s list is priceless: “Only one on the list has anything to do with impact.” Before you click on this link to the Twitter exchange, see if you can guess which one Claire picked out. I couldn’t.

No question could be simpler: if we want to have an impact, then who will/should be impacted? The answer to this question should inform and be informed by research on how to measure impact. But asking the question of audience takes us beyond thinking of the ‘how’ question as susceptible only to technical answers.

Not all academics are interested in putting a number on their impact. The more frequently asked question of measurement favors technical answers. Technical answers paradoxically foster expertise in impact measurement and take the question of impact out of the hands of most academics, while placing a ready-made answer into the hands of managers. Technical answers, in other words, foster technocratic domination over the question of impact. Imagine if HEFCE or the US NIH merely had to drag the Altmetric bookmarklet onto their bookmark bars, visit someone’s CV to find an article, then ‘Altmetric it!’ to see its score. There’s a danger that push-button evaluation by non-experts could replace peer review. This could be the end of the data-driven impact story.
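To make the push-button worry concrete, here is a minimal sketch of how little effort such scoring takes, assuming Altmetric’s free public v1 details endpoint (api.altmetric.com/v1/doi/<doi>), which returns a JSON record that includes a headline score. The DOI in the example is a placeholder, not a real article.

```python
# Minimal sketch of push-button impact scoring, assuming Altmetric's
# public v1 details endpoint. The DOI used below is a placeholder.
import json
import urllib.error
import urllib.request

def altmetric_score(doi):
    """Return the headline Altmetric score for a DOI, or None if untracked."""
    url = "https://api.altmetric.com/v1/doi/" + doi
    try:
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
    except urllib.error.HTTPError:
        return None  # the endpoint answers 404 for DOIs it does not track
    return data.get("score")

print(altmetric_score("10.1000/example"))  # hypothetical placeholder DOI
```

A dozen lines reduce an article to a single number, which is precisely the ease that makes technocratic shortcuts so tempting.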

Despite my hyperbole, I’m not arithmophobic. CSID’s list incorporates numbers, and we include things captured by altmetrics. I love altmetrics and spend (too much) time ‘altmetricing’ my own products, using both Altmetric and Impact Story. But, along with the thrill that comes from knowing that people are paying attention to my work, what motivates me to explore altmetrics is a drive to write my own impact story. I don’t want ‘the data’ driving that story for me. Nor do I want someone else at the wheel. I want to tell my own story and to appeal to numbers when I decide they can help correct my course or bolster my impact claims. Numbers themselves – or metrics – are not the danger. The danger is numbers being arbitrarily imposed on us.

This is the simplest, yet most difficult, question of all: where do we want impact to go? For now, we still have some say in the matter. We might even be able to impact the question at a policy level. I’ve offered some of my suggestions, above. I’m very interested to hear yours.

Note: This article gives the views of the author, and not the position of the Impact of Social Science blog, nor of the London School of Economics.  

About the Author

J. Britt Holbrook is Assistant Director of the Center for the Study of Interdisciplinarity at the University of North Texas (UNT), where he has served as Research Assistant Professor within the Department of Philosophy and Religion Studies since 2005. He has also held teaching positions in philosophy at Emory University and at Georgia State University. Holbrook’s current research focuses on interdisciplinarity, peer review, scholarly communication, and the relationship between science, technology, and society. He is especially interested in the incorporation of societal impacts considerations into the peer review process of publicly supported funding agencies. Link to ORCID iD. Follow him on Twitter @jbrittholbrook.

 
