
Christopher Daley

Linda Hantrais

January 17th, 2024

A ranking for interdisciplinarity is a poor measure for the quality of research and teaching in universities


Few academics, students or policymakers would dispute the value of addressing societal problems by setting them in the context of ‘the dynamic relationship’ between different disciplines. Christopher Daley and Linda Hantrais argue that interdisciplinarity, however defined, should not be used as a measure of the quality of research and teaching in global university rankings.


In a world facing multi-faceted societal challenges, agreement has long existed on the need to incentivise interdisciplinary research and teaching, and to share and map interdisciplinary good practice across different countries and institutions. In October 2023, Times Higher Education (THE) and Schmidt Science Fellows announced plans to launch Interdisciplinary Science Rankings (ISR) in 2024 with the broad aim to ‘improve scientific excellence and collaboration among universities’.

The intention of THE and Schmidt Science Fellows to introduce a catch-all concept of interdisciplinarity as a measure of university teaching and research excellence and as a mechanism for creating hierarchical quality rankings is concerning for three reasons:

  • The lack of a scientifically acceptable and operational definition of interdisciplinarity
  • The failure in rating interdisciplinarity within universities to take account of collaborations that cut across institutions, sectors and countries
  • The almost total exclusion of social sciences and humanities.

Identifying interdisciplinary research and teaching

In preparation for the launch of the rankings in 2024, the US-based Schmidt Science Fellows programme – committed to promoting interdisciplinarity in a select group of the world’s leading science and engineering institutions – commissioned Times Higher Education to undertake a feasibility study. In their Product Development Report, they adopt a definition that describes interdisciplinary research in simple terms as: ‘where experts from distinct disciplines come together to research and solve a problem’.

Yet, as long ago as the 1970s, the OECD warned: ‘the very concept of interdisciplinarity, and the allied concepts of multi-, pluri- and transdisciplinarity [are at times] difficult to delimit’. THE and Schmidt do not consider broader debates about definitions, or the extent to which genuine interdisciplinary collaboration takes place, as opposed to multidisciplinary research and teaching that is carried out in parallel and then juxtaposed.


A more expansive and widely adopted definition of interdisciplinarity describes researchers from different disciplines seeking to integrate ‘relevant concepts, theories, and/or methodologies, as well as the results or insights these disciplines generate’ throughout the process.

Applying this narrow definition, the 2023 Product Development Report assesses the quality of the interdisciplinary research and teaching environment in a sample of 1,200 science and engineering institutions across the world. The assessment is based on what the authors classify as interdisciplinary input and output metrics:

  • Quantitative input data representing ‘general reputation’ – funding dedicated to interdisciplinary research, industry funding, and recruitment of interdisciplinary researchers;
  • Qualitative survey data designed to measure encouragement, enablement, reward and tenure – dedicated infrastructures and administrative support, and tenure/promotion specifically for interdisciplinary research; and
  • Output measures reporting successful interdisciplinary research collaborations, derived from bibliometric data from citation indexes.

Overall, participation in the feasibility study was found to be ‘driven by institutions from countries in the Global South’. ‘Valid’ data submissions for the report were very low from the UK, US, Canada and Australia (all outside the top 15 for the number of valid submissions). The largest numbers of responses came from India, Russia and Pakistan. The top country for the proportion of research income dedicated to interdisciplinary science (the main input measure) was Egypt, followed by Uzbekistan and Saudi Arabia.

The output results in the report pointed to conclusions directly opposite to those for inputs and processes. They demonstrated high levels of interdisciplinary science in European countries, with thirteen European nations having over 25% of all research classified as interdisciplinary, implying a significant level of investment in interdisciplinary infrastructure in Europe. In exploring outputs, the report focused primarily on journal articles, which are most prominent within the natural and physical sciences in the English-speaking world and are known to be problematic for interdisciplinary work (especially when it includes the social sciences and humanities) due to the absence of highly rated multidisciplinary journals.

The report draws the debatable conclusion that the difference identified in the presence of interdisciplinarity between the Global North and South suggests the need ‘for greater dedication’ to interdisciplinary research in the Global North and ‘for developing global standard research outputs’ in the Global South.

Interdisciplinary work is rarely confined to single departments or institutions

A major reason for criticising the report and the proposal to use the findings to rank the quality of teaching and research is the narrow focus on universities. While higher education teaching is almost exclusively confined to universities, this approach fails to capture the breadth and scale of interdisciplinary research activity that takes place outside individual universities and internationally, particularly within Europe. None of the data are analysed in relation to the number and type of universities, funding sources, student population size and origins, or the role played by national research institutions and government policy for higher education and research.


These THE league tables are in part intended to assist students in assessing the ‘quality’ of interdisciplinary resources in individual universities. But the ranking as proposed fails to capture the extent to which undergraduate and postgraduate courses routinely include a multidisciplinary or foundational introduction.

The countries with the highest valid response rates happened to be those with very high numbers of universities (over 1,000 in India), often founded on technological disciplines, with wide-ranging governance structures, objectives and funding models (public, private, regionally funded, central government and industry funded), all of which complicates the landscape when trying to understand how institutions record activities that could be classified as ‘interdisciplinary’.

In the UK, interdisciplinary research organisations, such as the Francis Crick and Alan Turing Institutes, operate as independent legal entities yet maintain ongoing partnerships with universities spanning multiple disciplines. In Germany and France, much significant interdisciplinary research takes place in research institutes, such as the French National Centre for Scientific Research or the highly autonomous Max Planck Institutes. The approach proposed by THE therefore does not identify all interdisciplinary activity and may even overlook large-scale and internationally significant projects taking place beyond the walls of universities.

Why are SHAPE disciplines excluded?

The third major concern about this new ranking is the absence of disciplines encompassed by Social Sciences, Humanities and the Arts for People and the Economy (SHAPE). The report refers to a debate raised, but not pursued, as to whether interdisciplinary research can be fully measured if confined to Science, Technology, Engineering and Mathematics (STEM) subjects. For the purposes of the feasibility study, it was determined that research in the ‘sciences was a necessary condition to be included in the ranking’. For Schmidt Science Fellows, science means exclusively the natural sciences, computing, engineering and mathematics.


In providing a simple definition of interdisciplinary research, the report does not say which mixes of disciplines qualify as interdisciplinary. Although THE and Schmidt concede that individuals can work across disciplines, they do not acknowledge the role of hybrid and cognate disciplines. Demography, (human/political/economic) geography, anthropology, public administration, social policy, political economy, town planning, urban studies, transport studies, business studies, socio-legal studies, sociolinguistics, social psychology and clinical psychology are prime examples of disciplinary integration. These fields of research (interdisciplinary by definition) make the social sciences ideal partners in projects tackling global societal challenges such as anti-microbial resistance, climate change, food insecurity, and digital inclusion.

Ultimately, THE’s proposed rankings take the contested, complicated and nebulous area of interdisciplinarity and reduce it to a series of simplistic metrics with a restricted disciplinary focus. THE and Schmidt should be commended for their efforts to give some shape to the interdisciplinary landscape by mapping activity globally and incentivising future collaboration. But by seeking to shoehorn interdisciplinarity into another hierarchical ranking (especially one defined along contentious disciplinary and institutional lines), they risk creating perverse incentives whereby interdisciplinary collaboration is decided, in part, by its impact on rankings rather than by efforts to improve scientific excellence and collaboration. If the ranking proposal is implemented, interdisciplinary research and teaching may become a matter of transactional exchange rather than a bottom-up process driven by organic cross-disciplinary innovation and collaboration.

 


The content generated on this blog is for information purposes only. This Article gives the views and opinions of the authors and does not reflect the views and opinions of the Impact of Social Science blog (the blog), nor of the London School of Economics and Political Science. Please review our comments policy if you have any concerns on posting a comment below.

Image Credit: Robert Keane via Unsplash. 


 


About the author

Christopher Daley

Dr Christopher Daley is Research and Innovation Evidence Manager at the London School of Economics and Political Science. His role involves supporting the delivery of the institution’s Research for the World Strategy through the provision of quantitative and qualitative information and analytics. Christopher is also chair of the meta-research special interest group with the Association of Research Managers and Administrators and an advisory board member for the journal Public Humanities.

Linda Hantrais

Linda Hantrais FAcSS is Visiting Professor at the LSE International Inequalities Institute; Emerita Professor in European Social Policy in the Department of Politics and International Studies at Loughborough University; and Chair of the UK Academy of Social Sciences’ International Advisory Group. Her most recent book-length publication is an edited guide on How to Manage International Multidisciplinary Research Projects (Elgar, 2022).

