During the Cold War, the US intelligence community relied on linear ways of thinking and deductive analytic mental shortcuts. Josh Kerbel writes that the US government’s continuing attachment to using these heuristics in today’s messy global security environment is leading to repeated misperceptions of global challenges and how to tackle them. Only by tempering those analytic heuristics with more “synthetic” ones, he argues, will Washington be able to effectively anticipate and cope with the emergent challenges that a highly complex global environment inevitably generates.
In 1990, the world was on the cusp of major transformation. The divided, static, and hierarchical Cold War era was giving way to the entangled, dynamic, and networked world of today. The cognitive challenges posed by such a complex environment were memorably highlighted that year in Michael Crichton’s novel featuring genetically engineered dinosaurs, Jurassic Park, which referred disparagingly to the scientists who cloned them: “They don’t have intelligence. They have what I call ‘thintelligence.’ They see the immediate situation. They think narrowly and they call it ‘being focused.’ They don’t see the surround. They don’t see the consequences.”
Thinking narrowly in today’s messy global security environment
As a long-time intelligence analyst who spends a lot of time thinking and writing about the craft, that passage still resonates with me. Its characterization of the scientists’ thinking sounds alarmingly like how we in the US national security community—including the Intelligence Community—tend to think. But what Crichton called “thinking narrowly” and “being focused,” we call “analysis.”
Admittedly, analysis (i.e., reductive thought) works well for understanding and predicting the behavior of linear, complicated—discrete and hierarchical—issues. Consequently, it was an effective approach for thinking about the modern national security community’s early and important challenges: the Soviet Union and the Cold War. Indeed, it was analysis that enabled the appropriately modulated, linear behavior that kept the Cold War from going “hot.”
However, analysis does not work particularly well when applied to nonlinear, complex issues like today’s much messier global security environment. That environment is neither discrete nor hierarchical; it is unbounded and networked. Thus, it defies and confounds approaches that are overly analytic.
Our track record over the past 30 years reinforces this conclusion. It’s a history littered with miscalculations and unanticipated outcomes rooted in obsolete linear, analytical mental shortcuts that did not accurately reflect how an ever more complex world works.
Analytic heuristics and cognitive failure
Heuristics are a type of thinking strategy which people use to come to conclusions about situations quickly, often when they have incomplete information. Going back to these now obsolete mental shortcuts, it was the additivity heuristic—the notion that the whole is equal to the sum of its parts—that prompted us to simplistically reduce terrorism to terrorists and the Global War on Terrorism (as that terrible characterization suggests) to a unidimensional military response—a war. The result: we failed to understand terrorism as a broader systemic phenomenon—until Hamas’s surprise attack on Israel in October 2023 so violently reminded us.
Next, our support for China’s entry into the WTO in December 2001 rested erroneously on the linear heuristic of predictable, neat, and identifiable cause and effect: economic growth would inevitably spur political reform and ultimately China’s acceptance of the existing international order.
Then, the 2008 financial crisis surprised us because the repeatability heuristic reinforced our pre-existing expectation that the financial and economic system would simply continue to hum along indefinitely, just as it had before. Until, to our surprise, it didn’t.
Last, and more recently, our cognitive failure during the COVID-19 pandemic was rooted in the proportionality heuristic—the notion that a small (large) input yields a small (large) output—which kept us from thinking about how a pandemic in a hyperconnected world could ripple exponentially across traditional disciplinary boundaries, not to mention the globe.
The increasing primacy of emergent challenges
So, what must we do to address these evident cognitive deficiencies? First, we must understand emergent phenomena.
Emergent phenomena—including terrorism, financial crises, and pandemics as highlighted above—are systemic macro-behaviors that grow organically from complex, interconnected, and interdependent systems. They “emerge” without top-down direction. And a scan of the national security horizon reveals many more emergent phenomena: climate change, the breakdown of our information environment, urbanization, globalization, mass migration, inequality, extremism, etc.
Unfortunately, we in the US struggle mightily with emergent phenomena. No better current example exists than how we are thinking about the China challenge. Viewing China in excessively discrete—yes, analytic—terms, we miss that China, unlike the Soviet Union, is fully enmeshed in today’s hyper-complex global system and thus the challenges it poses are substantially—if counter-intuitively—emergent. Because China is integral to all the emergent challenges, we will not be able to effectively address any of these phenomena—or China itself for that matter—without understanding how China fits into the larger systemic picture.
The need for complementary “synthetic” heuristics
Next, we need to infuse our thinking with new, more holistic—let’s call them “synthetic”—heuristics that are keyed to the behavioral characteristics of nonlinear, complex systems.
First, the whole can be more (or less) than the sum of its parts. The essence of a complex system cannot be discerned from its discrete pieces but rather in the interconnections and interdependencies that make it a systemic whole. To understand any complex system, we must see it in a “big picture” way and think about it synthetically or holistically—not analytically.
Second, cause-and-effect dynamics are often not readily identifiable, even in retrospect. What we often see is correlation, not causation. Any input in a complex system has more than one output—there are always side, second order, or tertiary effects.
Third, a complex system is not repeatable. Although circumstances might resemble what came before, they are not the same. Analogical reasoning is a bedrock practice of traditional intelligence analysis, but this new perspective compels us to think carefully before framing our relationship with China, as many do, as a “new Cold War.”
Fourth, disproportionate dynamics between inputs and outputs are common. Seemingly large inputs can be absorbed and dampened by the system; conversely, small inputs can be amplified. Think of a bold policy measure that ends up having little or no observable effect. Or how the collapse of Lehman Brothers precipitated a global financial meltdown.
These new synthetic heuristics are vital to our ability to anticipate and cope with emergent security challenges. Conversely, even vastly improved analysis that’s not complemented by such synthetic heuristics will not spare us a Jurassic fate—it will merely render us excellent dinosaurs.
- Note: This article gives the views of the author, and not the position of USAPP – American Politics and Policy, nor the London School of Economics.
- Shortened URL for this post: https://wp.me/p3I2YF-eEq