Scientist-Led or Mission-Oriented – How much does it cost research funders to lead science?

Across national research systems, decisions about what research is funded are influenced both by researchers themselves and by policymakers with national priorities. However, beyond short-term changes in response to grants, it is unclear how the latter, mission-oriented form of research funding affects research practices over the long term. In this post, Kyle Myers shows that the funding needed to actually alter research trajectories over longer periods of time may in fact be higher than expected.

Governments – we, the taxpayers – fund the majority of basic research conducted worldwide. This is for good reason. It is difficult for firms to capture the full value of their scientific discoveries. So, their incentives for conducting basic research are weaker than we, society as a whole, would like them to be. Thankfully, governments have proven willing investors. And even better, these investments have proven to be valuable. This is particularly true in the biomedical sciences, where empirical studies have connected the dots from governments granting funds to scientists, to those scientists producing new knowledge, to that knowledge being used in the development of new therapies.

But, how should we decide what types of research to fund? What diseases, what populations, what methodologies should we focus on? We could leave this up to the scientists themselves. And in the United States, this has become a popular choice. The “investigator-initiated” grant, where scientists propose their own ideas to be evaluated by their peers, is commonplace. This is especially true at the National Institutes of Health (NIH), the single largest funder of biomedical research in the world.

This investigator-initiated approach makes sense. Scientists are, after all, the people who should know the most about what ideas are the most promising. But the incentive structure of science, with its emphasis on priority and prestige, may not lead scientists to prefer the same things as society would like. And, like the rest of us, scientists may be influenced by certain preferences, biases, or other constraints that could prove misaligned with social goals. It is not surprising then that many countries, and notably the EU’s Horizon Europe research framework, rely on more “top-down” or “mission-oriented” styles where policymakers directly allocate funds to specific topics.

The NIH balances this tradeoff by using a combination of investigator-initiated grants and a number of “targeted” grant mechanisms that solicit proposals for particular types of research. These targeted mechanisms request ideas that focus on a particular disease, methodology, or population, and have become increasingly popular (see fig.1). But the NIH, and most other scientific funding agencies, have long assumed that scientists will be willing to adjust their research trajectories in response to these sorts of targeted grants or mission-oriented policies. However, it has been unclear whether these adjustments actually occur in practice – do scientists do what policymakers ask them to? – and just how costly they are to induce.

Fig.1: Targeted research programs set aside funds for research on specific diseases, populations, or a narrow field of biomedical science. This includes the NIH’s Requests For Applications (RFA), Program Announcements with Set-aside funds (PAS), and Program Announcements with special Receipt, referral, and/or review considerations (PAR). Research grants are defined by the NIH’s criteria.

In a recent study, I investigated a subset of the NIH’s targeted grant mechanisms to evaluate how scientists responded to them and whether they achieved their objectives. In these mechanisms, the NIH sets aside roughly $2-3 million to fund projects on a relatively narrow topic (e.g., opioid abuse therapy, pediatric brain tumors). With the help of the NIH Office of Extramural Research, I assembled a novel dataset of scientists that applied to these and other NIH mechanisms, as well as the scientists’ publications – the knowledge they create – both before and after applying.

The first question I asked was whether scientists who win grants awarded through these mechanisms actually conduct research that they would not have otherwise. To do so, I compared scientists who submitted proposals of equal quality but had different chances of winning because of unrelated budget constraints facing the NIH – this isolates the effect of receiving a grant and avoids conflating it with other underlying differences between applicants. Overall, I found that winners published more than losers, and that the winners’ publications are more similar to the objectives of the targeted mechanism.

However, both of these changes to the scientists’ rate and direction of study are only temporary, fading after four to five years. This suggests that it may take much larger sums of funds to make a persistent change to a scientist’s focus.

Fig.2: The magnitude of change in science refers to how different a scientist’s new project would be compared to their prior work. The shaded area is the range of cost estimates from the analyses.

To get a more general sense of how costly it is to incentivize scientists to change what they study, I combined three data points: the amount of grant funds made available in each targeted competition (the $2-3 million); the similarity between a scientist’s prior work and the objectives of each competition (i.e., if the similarity between the two is high, the scientist would not need to change their research much); and scientists’ decisions on whether or not to enter each competition. Using econometric techniques, I can infer how scientists trade off the possibility of obtaining (valuable) grant funds against making (costly) adjustments to their research direction. As seen in the accompanying figure, I estimate that substantial sums are necessary to induce scientists to make even minor changes to the type of projects they pursue. In order for a scientist to be willing to undertake a 30% change in their science – a magnitude that is commonly observed in the data – they would expect to receive roughly $4 million. This is much larger than the amount of funds the NIH awards under any of its common grant mechanisms.
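The logic of this tradeoff can be illustrated with a stylized sketch. This is not the paper’s actual econometric model, and all parameter values below are hypothetical: it simply assumes a scientist enters a targeted competition only when the expected value of the grant covers a convex cost of redirecting their research, calibrated so that a 30% change costs roughly $4 million, as in the estimate above.

```python
# Stylized entry-decision sketch (hypothetical parameters, not the study's model).

def adjustment_cost(change, scale=44.4, curvature=2.0):
    """Cost, in $ millions, of changing research direction.

    `change` is the fractional distance from the scientist's prior work
    (0 = no change, 1 = a completely new line of research). The cost is
    convex: small pivots are cheap, large ones increasingly expensive.
    `scale` is chosen so that a 30% change costs roughly $4M.
    """
    return scale * change ** curvature

def enters_competition(win_prob, grant_value, change):
    """Enter only if the expected grant value covers the adjustment cost."""
    return win_prob * grant_value >= adjustment_cost(change)

# A $2.5M targeted grant is not enough to induce a 30% pivot,
# even for a scientist certain to win...
print(enters_competition(win_prob=1.0, grant_value=2.5, change=0.3))  # False
# ...but roughly $4M is.
print(enters_competition(win_prob=1.0, grant_value=4.0, change=0.3))  # True
```

Under this toy calibration, the typical $2-3 million set-aside only makes entry worthwhile for scientists whose prior work is already close to the competition’s objectives, which is consistent with the pattern described above.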

This is one of the first investigations into quantifying what may be called the adjustment costs of science. And it appears that it is quite expensive to induce scientists to do what “we” want them to do. This is not entirely surprising given the high degree of specialization modern biomedical scientists undertake in their extensive training.

Understanding these costs is essential for policymakers and managers alike since the vast majority of scientists at public and non-profit institutions still choose their own pursuits with minimal oversight. Again, this system has arisen for good reason. And these large costs do not directly imply that anything is wrong with how scientists and governments interact. But it is rather surprising that virtually zero funds from any major scientific funding agency in the world explicitly subsidize scientific adjustments. For instance, there is no grant I am aware of where the objectives are for scientists to just change the focus of their laboratory – funds are almost always expected to be used for “doing” science, producing knowledge, writing papers. But if these large costs of adjustment are trapping scientists in suboptimal research paths, there may be many scientists choosing funding-friendly but socially inefficient ideas. Hopefully, further work can shed light on whether it might be worth bearing these costs to help scientists find new, fruitful paths of work.


About the author

Kyle Myers Ph.D. is an assistant professor at the Harvard Business School. He studies the economics of innovation, and his research lies at the intersections of science, health care, and public policy. See more of his work here.



This work by LSE Impact of Social Sciences blog is licensed under a Creative Commons Attribution 3.0 Unported.