
July 3rd, 2015

Is withholding your data simply bad science, or should it fall under scientific misconduct?


A recent study sent data requests to 200 authors of economics articles that stated ‘data available upon request’. Most of the authors refused. What does the scientific community think about those withholding their data? Are they guilty of scientific misconduct? Nicole Janz argues that if you don’t share your data, you are breaking professional standards in research and are thus committing scientific misconduct. Classifying data secrecy as misconduct may be harsh, but it is a necessary step.

I recently read a blog post by statistician Andrew Gelman, in which he commented on authors unwilling to share their data: “I’m not accusing [them] of scientific misconduct in not sharing their data.” I immediately remembered telling a group of grad students and post-docs at Berkeley that not sharing your data is not really misconduct, because the authors are neither plagiarizing nor committing fraud.

But was I right in saying that? Is withholding your data simply bad science, or does it – should it – fall under scientific misconduct? This question is crucial because we need to find new ways to fight data secrecy. A study by Krawczyk and Reuben published in 2015 sent data requests to 200 authors of articles in economics journals and to authors of working papers. Only 44% provided the data on request. We are not talking about data that cannot be shared due to confidentiality or privacy concerns – obviously it is fine not to make such data public. In fact, the study did not target authors who had made no promise to share their data; only those who stated that ‘data are available on request’ were contacted. If we can punish data secrecy – and the breaking of promises – by labelling it misconduct, this could send a strong signal to the community.

Definition of scientific misconduct

What is scientific misconduct? Most definitions talk about the extreme cases of data fabrication, manipulation, and plagiarism, e.g. the National Science Foundation:

Research misconduct means fabrication, falsification, or plagiarism … Research misconduct does not include honest error or differences of opinion. (National Science Foundation)

The National Institutes of Health and the American Psychological Association use a very similar definition. And it makes sense to list the worst possible cases first and foremost when talking about misconduct. Fabrication means making up data or results. Falsification means manipulating your materials. Plagiarism means using ideas from others without credit. This is straightforward. However, there are cases when data secrecy should be added to the list of scientific misconduct examples.

Case 1: What if you try to cover up misconduct by hiding your data – is that misconduct in itself?

The UK’s “Concordat to support research integrity” (which is signed by the UK Government, funders and universities) states that misconduct includes:

improper dealing with allegations of misconduct: failing to address possible infringements such as attempts to cover up misconduct and reprisals against whistleblowers

Therefore, withholding data in order to hide fabrication or falsification of your results is itself misconduct. For example, in the case of LaCour’s study on gay marriage that recently fell apart, data were manipulated, and to prevent anyone from finding out, the main author deleted his raw data. Most articles on the scandal treated all of his actions as misconduct. If you cover up data manipulation or fabrication by ‘withholding’ your data, no one would doubt that this is part of the overall misconduct.

But what if you do not try to cover up any misconduct, but simply don’t want to share your data? Reasons for withholding data can include valid concerns such as patient privacy, confidentiality and copyright issues. Savage and Vickers found in a survey of researchers that some authors withhold their data because they want to publish more articles from it. Data collection can be expensive and time-consuming – and some simply want to keep the data exclusively to themselves for that reason. Unfortunately this means that no one can cross-check or replicate their results.

So should we see that as misconduct? Are these authors doing some form of harm to the advancement of knowledge out of self-interest, or are they simply being practical? Again, it depends on how you define misconduct.

Case 2: What if you break professional standards in your field – is that misconduct?

Yes! Some institutions state that it is scientific misconduct when you don’t comply with your field’s professional standards. For example, the National Institutes of Health website lists, after the usual fabrication, falsification and plagiarism problems, another requirement for “making a finding of research misconduct”:

[If] there be a significant departure from accepted practices of the relevant research community. (National Institutes of Health)

Similarly, the UK’s “Concordat to support research integrity” states that research misconduct is the “failure to meet ethical, legal and professional obligations” which includes “behaviour or actions that fall short of the standards of ethics, research and scholarship required to ensure that the integrity of research is upheld.”

Based on such wider definitions that look beyond the usual extreme cases, it would not be far-fetched to say that when you withhold your data you don’t meet professional obligations as a researcher. Of course, this would imply that your research community’s professional standards include transparency and data sharing. And this is exactly the case.

Professional guidelines for political science state that “researchers have an ethical obligation to facilitate the evaluation of their evidence-based knowledge claims through data access, production transparency, and analytic transparency.” The American Psychological Association affirmed the principle that sharing data “promotes scientific progress” and “encourages a culture of openness and accountability in scientific research.” Similar guidelines apply in economics, where one of the top journals states:

It is the policy of the American Economic Review to publish papers only if the data used in the analysis are clearly and precisely documented and are readily available.

If a researcher departs from these professional standards then, according to the wider definitions I presented, scientific misconduct has occurred.

Figure: Data secrecy sits between questionable research practices and scientific misconduct.

My figure shows the scenario proposed here. On the left are the features of good science, with authors providing their data and software code, and in the best cases even using pre-registration of their study and version control for maximum transparency. The grey area in the middle shows questionable research practices, which can include p-hacking, sloppy statistics, peer review abuse and so on. On the right, marked red, is scientific misconduct as commonly defined (falsification, fabrication, plagiarism). Between the grey and the red areas sits data secrecy.

Some may argue that data secrecy is not actually misconduct. I have argued that in two cases it is: (1) when it is used to cover up misconduct; (2) when it deviates significantly from professional standards in your field.

At a time when only a few authors provide their data on request, classifying data secrecy as misconduct may be a harsh but necessary step.

Note: This article gives the views of the author, and not the position of the Impact of Social Science blog, nor of the London School of Economics. Please review our Comments Policy if you have any concerns on posting a comment below.

About the Author

Nicole Janz is a political scientist at Cambridge University and teaches research methods, including a Replication Workshop. She blogs and tweets at @polscireplicate.

