December 5th, 2012

Leading or following: Data and rankings must inform strategic decision making, not drive them

At yesterday’s Future of Impact conference, Cameron Neylon argued that universities must ask how their research is being re-used, and choose to become the most skilled in using available data to inform strategic decision making. It’s time to put down the Impact voodoo doll and stop using rankings blindly.

“Impact” is a word that has gained great power in research policy making despite having no agreed definition. Indeed, arguments over what Research Impact is have become a proxy for underlying tensions about why we do research and how we assess our performance against those goals. Rather than articulate and discuss the benefits of research, and how we configure our systems to deliver them, we have instead attacked, defended, re-defined, and twisted this word as though it were a voodoo doll with the power to affect policy and those who make and enact it.

I want to suggest that we need to return to basics and ask the underlying questions. Why do we, as researchers, institutions, funders, government, business, and taxpayers, support research? What do we want to achieve? What are the underlying values we bring to the decision to spend 1, 2, or 5 per cent of GDP on research? As individuals and as institutions we need to understand those values. As individuals and as institutions we need to articulate a mission that reflects and is informed by those values. And when we define that mission we need to identify the right data and information to help us understand how we are performing against it, and to make strategic decisions on how we distribute resources. The blunt truth is that what we usually do, both as individuals and as institutions, is accept the measures and rankings that are easily available and widely used, and use them blindly. Whether it is journal impact factors or university rankings, these simplistic, and in many cases inappropriate, measures end up driving our mission rather than the other way around. The real mission becomes to publish in the right journal, not to communicate to the right audience; to achieve a ranking, not to deliver value.

Part of the reason for this is data availability. Gaining high-quality data on performance against mission objectives has been hard, and often impossible. But this is changing. Increasingly, both research activity and its exploitation leave traces online. Researchers who re-use research have always left citations behind in the peer-reviewed literature, but now they also leave bookmarks and blog posts. And users beyond the research community (educators, practitioners, community members, and policy creators) are also leaving an online trace through discussions, bookmarks, and links: a trace that can be tracked and measured, and, perhaps most importantly, interrogated to understand what the impact has been. What we have available to us is a new series of proxies for the re-use of research, proxies that probe users and uses of research that have traditionally been very hard to measure. And these proxies are useful in probing different types of impact. If we consider impact through the prism of re-use, then we can integrate traditional forms of research impact, such as citations, with wider forms of impact, such as changes in clinical practice.

But we have to understand that any measure can only ever be a proxy: only ever data to support strategic decision making. Too often the apparent “objectivity” of a numeric measure or ranking is used to avoid the responsibility of taking a decision. How many UK institutions have “to be a top ten university” in their strategy? This is misguided. First, it is bad management. Accepting the idea that there is some sort of scala naturae, with Harvard at the top and someone else at the bottom, means outsourcing our values and our mission. What is more, these measures can only ever be based on data weighted according to views of what is important that look backwards, not forwards. But beyond outsourcing our responsibilities, accepting this idea means inevitable decline. UK institutions will never have the resources of a Harvard or an MIT (or a Stanford, Princeton, or Virginia Tech), and we will soon be unable to compete with the resourcing of institutions in a whole new range of countries. If we accept a single numeric ranking, defined by outsiders, we are choosing to lose the game.

But UK institutions can choose to re-define the game. They can choose to become the most skilled in using the data available to guide and inform strategic decision making; to have clearly articulated values, and missions that speak to the needs of our communities; to take a leadership position on what being a high-performance institution means in the 21st century. Or we can choose to follow, and at best manage the inevitable decline as institutions in China, Brazil, and India take our place. UK institutions used to have the confidence to define what mattered to them and to lead the world by pursuing those goals. We have the opportunity to do so again. If we choose.

Note: This article gives the views of the author(s), and not the position of the Impact of Social Sciences blog, nor of the London School of Economics.

About the author:
Cameron Neylon is a biophysicist and well-known advocate of opening up the process of research. He is Advocacy Director for PLoS and speaks regularly on issues of Open Science, including Open Access publication, Open Data, and Open Source, as well as the wider technical and social issues of applying the opportunities the internet brings to the practice of science. He was named a SPARC Innovator in July 2010 and is a proud recipient of the Blue Obelisk for contributions to open data. He writes regularly at his blog, Science in the Open.

 
