Helena Vieira

April 5th, 2018

Why flying is safer than ever and what we can learn from it

When it was reported that there had been no deaths from commercial passenger jet accidents in 2017, President Trump was quick to claim credit on Twitter: “Since taking office I have been very strict on Commercial Aviation. Good news – it was just reported that there were Zero deaths in 2017, the best and safest year on record!”

But the trend long predates Trump. In the United States, there have been no deaths from commercial airline accidents since 2013. In fact, for decades, there has been a general downward trend in the number of accidents per departure.

Complex systems are prone to failure, but commercial aviation seems to be an exception. Since the 1960s, U.S. commercial aviation has become significantly more complex, and yet flying has become safer. In our book Meltdown, we argue that what lies behind this remarkable trend is a handful of smart approaches to management and design—solutions that hold lessons for all of us. Here are three of them.

1. Teach people to speak up—and to listen

A common factor in airplane accidents used to be the failure of first officers to question the captain’s poor decisions. When the captain was flying the airplane, he (and most often it was a “he”) was hard to challenge, and his mistakes went unchecked.

All this began to change in the late 1970s with a training program known as Crew Resource Management (CRM). The program revolutionised the culture not just of the cockpit but also of the whole industry. It reframed safety as a team issue and put all crew members—from the captain to the cabin crew—on more equal footing. It was no longer disrespectful to question the decisions of a superior; it was required. And CRM taught crew members the language of dissent—detailed scripts for getting another person’s attention, expressing concerns, proposing a solution, and getting an explicit agreement.

The lesson isn’t simply that people lower down in the hierarchy should speak up and higher-ups should listen. What CRM has shown is that people can be taught to speak up and to listen. The ability to express and embrace dissent isn’t hardwired in our personality or cultural background; it’s a skill we can learn.

2. Learn from small failures and close calls

In 1976, the U.S. Federal Aviation Administration created an industry-wide system to collect anonymous safety reports. The Aviation Safety Reporting System (ASRS), run by an independent unit at NASA, collects thousands of reports each month. Pilots receive immunity for the mistakes they report, but beyond that, submitting ASRS reports is a point of pride. They know the reports make air travel safer.

The reports are stored in a searchable database that anyone can access, and NASA highlights safety trends in its monthly newsletter, Callback. One issue, for example, included the story of a crew that received a last-minute runway change, which necessitated an overly aggressive descent to a lower altitude. The crew couldn’t make the altitude in time, so they submitted a report. In response, the FAA changed the approach procedure.

Here, too, there is a broader lesson. Small lapses and near misses can be a rich source of data about what might go wrong in our system. Mistakes shouldn’t be secrets. By openly sharing stories of failures and near failures, we can treat errors as an opportunity to learn—rather than as the impetus for a witch hunt. Implementing such a system, as sociologist Charles Perrow put it, can be “a powerful cognitive ‘wake-up call’; for designers, it offers a database that often generates counterintuitive findings about system flaws; for the organisation, it reinforces the notion that someone is, indeed, trying.”

3. Fight complexity with transparency

Aviation has also taken a smart approach to design. Aircraft engineers realise that sleek, beautiful designs aren’t always better; by increasing transparency, inelegant designs can be safer.

Consider the cockpit of a Boeing 737. In front of each pilot, there is a large W‑shaped control yoke mounted on a three-foot-tall control column. “When we fly the 737, we get these huge control wheels in front of us—and they’re moving when either pilot moves them,” says Ben Berman, an airline captain and accident investigator. “If I pull back hard, then the first officer’s column will also move back and probably bump their knees or poke them in the belly.”

To the casual observer, these controls seem oversized and awkward. But there is something brilliant about them: they make what’s happening clearly visible. There is no confusion about who is doing what. If your copilot—gripped by panic in a crisis—pulls back on the controls when the right move would be to push them forward, you can’t miss the error. It’s literally in your face and likely hitting you in the stomach.

Boeing engineers could certainly implement a more elegant design, such as sleek, high-tech touch screens (which are used in some cars) or small sidesticks (which are used, for example, in the Airbus A330). But the resulting elegance would come at the cost of transparency.

The bigger lesson is that the answer to complexity isn’t simplicity; it is transparency. There is tremendous value in being able to see the state of a system by simply looking at it. Transparent design makes it hard for us to do the wrong thing—and it makes it easier to realise if we have made a mistake.

The success of aviation isn’t due to “very strict” policies by any president. In fact, a strict and punitive approach is likely to be counterproductive. The key to aviation’s success was insight rather than oversight. The industry has embraced a learning-focused approach and developed its own solutions: teaching people how to speak up and listen, enabling everyone to learn from errors, and valuing transparency over elegance. But aviation’s lessons have been written in blood: it took many accidents before the industry began to pay serious attention to these issues. The good news is that these lessons are valuable across fields, and we can adopt them without paying such a high price.

♣♣♣

Notes:


Christopher Clearfield is a former derivatives trader and a licensed commercial pilot. He is the coauthor of Meltdown: Why Our Systems Fail and What We Can Do About It.
András Tilcsik holds the Canada Research Chair in Strategy, Organizations, and Society at the University of Toronto’s Rotman School of Management. He is the coauthor of Meltdown: Why Our Systems Fail and What We Can Do About It.