
Many or even most conspiracy theories are demonstrably false. But some, like Watergate, are true. How can we determine which are which? Drawing on his own experiences with conspiracy theorists, Stephan Lewandowsky writes that conspiratorial thinking is not necessarily truth-seeking behavior, but can often be a near-self-destructive form of skepticism. We can use this skepticism, along with conspiracists’ tendency towards pattern-seeking and self-sealing reasoning, to sort out which theories are false and which might be true after all.

9/11 was a false flag operation planned by the US government. That same government sold weapons to Iran in order to fund Central American terrorists, and also created AIDS to exterminate gay people, and the CIA organized a fake vaccination drive in Afghanistan to get Osama bin Laden’s family DNA.

There is no doubt that two of those conspiracies actually happened and were hushed up by the conspirators, whereas the other two are widely dismissed as fantastical conspiracy theories. This is the long-standing dilemma confronting philosophers: conspiracies do occur, and they can seem quite outlandish and unexpected once publicly revealed—who would have thought that Oliver North would sell arms to Iran from the basement of the White House and launder the money to supply arms to Nicaraguan rebels in contravention of explicit legal prohibitions? But by the same token, most conspiracy theories are bunkum—we can be quite certain that the US Government did not create AIDS or fly airliners into the Twin Towers.

What are the differences between conspiracy theories that are almost certainly false and the evidence for actual conspiracies? This is a non-trivial philosophical challenge, but it is an important one to sort out, given that the mere exposure to conspiracy theories can undermine people’s trust in government services and institutions. Conspiracy theories are not harmless fun, especially if they lead people to refuse life-saving vaccination or to fire an assault rifle in a pizza restaurant in Washington.

Conspiracists’ reasoning is often broken

One promising approach to classifying conspiracy theories has been to shift the focus to the people who believe in them, rather than to how those theories are justified. A recent volume edited by Joe Uscinski of the University of Miami brought together a number of contributions under this umbrella, including a chapter on my experiences with people who believe in conspiracy theories.

My argument rests on the premise that, by and large, our cognition is a truth-tracking device.

There is much evidence that people’s cognition is “optimal” in many circumstances. People often conform to Bayes’ theorem, the gold standard for how one should update beliefs in light of new evidence. Even when confronted with esoteric tasks, such as estimating the duration of the reign of Egyptian pharaohs, people are surprisingly well attuned to the actual quantities. And when people get together to form a scientific community, they create an extremely useful and largely rational enterprise that has delivered a stunning amount of reliable knowledge.
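As an illustrative sketch (the hypothesis, numbers, and function here are my own, not from the article), Bayes’ theorem prescribes how a belief should shift when new evidence arrives:

```python
def bayes_update(prior, likelihood_h, likelihood_not_h):
    """Return the posterior P(H|E) via Bayes' theorem:
    P(H|E) = P(E|H) P(H) / [P(E|H) P(H) + P(E|not H) P(not H)]."""
    numerator = likelihood_h * prior
    evidence = numerator + likelihood_not_h * (1.0 - prior)
    return numerator / evidence

# Hypothetical example: a skeptical prior of 0.1 for hypothesis H,
# with evidence twice as likely under H as under not-H, yields a
# modestly raised (not certain) posterior.
posterior = bayes_update(prior=0.1, likelihood_h=0.6, likelihood_not_h=0.3)
print(round(posterior, 3))  # → 0.182
```

The point of contrast with conspiracist cognition is that a Bayesian updater moves belief in proportion to the evidence, in either direction, rather than reinterpreting all evidence as confirmation.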

But sometimes the way people think about things takes suboptimal twists and turns.

Someone who believes that their spouse or friend has been replaced by an impostor—the Capgras delusion—is unlikely to be acting in a rational manner. Likewise, I argue that conspiracist cognition is characterized by certain patterns of reasoning that are less truth-seeking or reliable than “standard” cognition.

People who believe in conspiracy theories typically exhibit an almost nihilistic degree of skepticism, to the point of distrusting more and more knowledge-producing institutions. It is not unusual for climate deniers to distrust the official temperature record based on a long catalogue of presumed improprieties by bureaus of meteorology around the world.


This overriding and immutable suspicion of the “official” account leads to several consequences. It may prevent the person from recognizing that some events occur by accident or are simply trivial. The way that conspiracists think means that they often believe that nothing occurs by accident; any random event is re-interpreted as evidence for the theory. For example, the fact that Timothy McVeigh fled the scene of the Oklahoma City bombing in a car without license plates is interpreted as proof of his innocence and that he was framed by federal agents.

A further consequence of immutable suspicion is that a person may abandon specific hypotheses when they become unsustainable, but those corrections will not compromise the overall abstraction that “something must be wrong” and that the official account is based on deception. At that higher level of abstraction, neither the validity of any particular hypothesis nor the coherence of the theory matter. What matters is that there must be a conspiracy. In consequence, conspiracy theories are often incoherent. It is not uncommon for climate deniers to be equally convinced that global temperature cannot be measured accurately and that there has been global cooling for the last 10 years.

Finally, and perhaps most crucially, conspiracists’ thought processes are inherently self-sealing, such that contrary evidence is re-interpreted as evidence for the theory. This reflects the assumption that the stronger the evidence against a conspiracy (e.g., climate scientists being exonerated of wrong-doing), the more the conspirators must want to hide the truth (i.e., investigations were rigged by George Soros to exonerate the scientists).

Using conspiracy thinking to classify conspiracy theories

What do these criteria for conspiracist cognition—nihilistic skepticism, seeing patterns in randomness, incoherence, self-sealing reasoning, and a few others not mentioned—buy us?

I argue that they help us in at least three ways. First, they can be clearly operationalized. Naïve judges have successfully used those criteria to differentiate between scientific critique and conspiracist discourse. This renders the criteria useful in determining the status of potentially contested material. Second, in another study I found that if participants are trained to detect incoherence in an argument, they subsequently become more resilient to false argumentation that is common in conspiracist rhetoric.

Finally, and perhaps more controversially, I suggest that these criteria may allow us to infer the likely truth value of a conspiracy theory.

One of the reasons we should, in the long run, trust science to deliver truthful insights into the world is because of the way it works. Many (though not all) philosophers of science believe that the way in which knowledge is socially constructed can give us insights into the likely utility of that knowledge. Likewise, a conspiracy that is revealed by conventional cognition—such as investigative journalism or the actions of whistleblowers—has sufficient potential virtue to be taken seriously. Unsurprisingly, many conspiracies that are now widely accepted as true, such as the Iran-Contra scandal, were revealed by conventional sources of information.

The converse, arguably, also holds. For the reasons just outlined, conspiracist cognition is unlikely to be a truth-tracking device. It follows, by the same logic of the social construction of knowledge, that if all evidence for a theory is based on conspiracist cognition, it is likely a conspiracy theory that ought to be dismissed rather than a true conspiracy.

  • This article was inspired by the experiences reported in the chapter: Lewandowsky, S. (2019). In whose hands the future? In J. E. Uscinski (Ed.), Conspiracy theories and the people who believe them (pp. 149-177). Oxford: Oxford University Press.


Note:  This article gives the views of the author, and not the position of USApp– American Politics and Policy, nor of the London School of Economics.

Shortened URL for this post: https://bit.ly/2BYXCCz


About the author

Stephan Lewandowsky – University of Bristol (@STWorg)
Stephan Lewandowsky is a Professor at the School of Psychological Science and Chair of Cognitive Psychology at the University of Bristol. His recent research interest is in exploring the potential conflict between human cognition and the physics of global climate change, which has led him into collaborative research in climate science and climate modeling. More information at http://www.cogsciwa.com.
