How We Decide
Jonah Lehrer

There are no shortcuts to this painstaking process; becoming an expert just takes time and practice. But once you've developed expertise in a particular area—once you've made the requisite mistakes—it's important to trust your emotions when making decisions in that domain. It is feelings, after all, and not the prefrontal cortex, that capture the wisdom of experience. Those subtle emotions saying shoot down the radar blip, or go all in with pocket kings, or pass to Troy Brown are the output of a brain that has learned how to read a situation. It can parse the world in practical terms, so that you know what needs to be done. When you overanalyze these expert decisions, you end up like the opera star who couldn't sing.

And yet, this doesn't mean the emotional brain should always be trusted. Sometimes it can be impulsive and short-sighted. Sometimes it can be a little too sensitive to patterns, which is why people lose so much money playing slot machines. However, the one thing you should always be doing is considering your emotions, thinking about why you're feeling what you're feeling. In other words, act like the television executive carefully analyzing the reactions of the focus group. Even when you choose to ignore your emotions, they are still a valuable source of input.

THINK ABOUT THINKING. If you're going to take only one idea away from this book, take this one: Whenever you make a decision, be aware of the kind of decision you are making and the kind of thought process it requires. It doesn't matter if you're choosing between wide receivers or political candidates. You might be playing poker or assessing the results of a television focus group. The best way to make sure that you are using your brain properly is to study your brain at work, to listen to the argument inside your head.

Why is thinking about thinking so important? First, it helps us steer clear of stupid errors. You can't avoid loss aversion unless you know that the mind treats losses differently than gains. And you'll probably overthink the purchase of a house unless you know that too much analysis leads people to buy the wrong property. The mind is full of flaws, but they can be outsmarted. Cut up your credit cards and put your retirement savings in a low-cost index fund. Prevent yourself from paying too much attention to MRI images, and remember to judge a wine before you know how much it costs. There is no secret recipe for decision-making. There is only vigilance, the commitment to avoiding those errors that can be avoided.

Of course, even the most attentive and self-aware minds will still make mistakes. Tom Brady, after the Patriots' perfect 2007 regular season, played poorly in the Super Bowl. Michael Binger, after a long and successful day of poker, always ends up regretting one of his bets. The most accurate political experts in Tetlock's study still made plenty of inaccurate predictions. But the best decision-makers don't despair. Instead, they become students of error, determined to learn from what went wrong. They think about what they could have done differently so that the next time their neurons will know what to do. This is the most astonishing thing about the human brain: it can always improve itself. Tomorrow, we can make better decisions.

Coda

There are certain statistics that seem like they'll never change: the high school dropout rate, the percentage of marriages that end in divorce, the prevalence of tax fraud. The same used to be true of the percentage of plane crashes caused by pilot error. Despite a long list of aviation reforms, from mandatory pilot layovers to increased classroom training, that percentage refused to budge from 1940 to 1990, holding steady at around 65 percent. It didn't matter what type of plane was being flown or where the plane was going. The brute fact remained: most aviation deaths were due to bad decisions in the cockpit.

But then, starting in the early 1990s, the percentage of crashes attributed to pilot error began to decline rapidly. According to the most current statistics, mistakes by the flight crew are responsible for less than 30 percent of all plane accidents, with a 71 percent reduction in the number of accidents caused by poor decision-making. The result is that flying has become safer than ever. According to the National Transportation Safety Board, flying on a commercial plane has a fatality rate of 0.04 per one hundred million passenger miles, making it the least dangerous form of travel by far. (In contrast, driving has a fatality rate of 0.86.) Since 2001, pilot error has caused only one fatal jetliner crash in the United States, even though more than thirty thousand flights take off every day. The most dangerous part of traveling on a commercial airplane is the drive to the airport.

What caused the dramatic reduction in pilot error? The first factor was the introduction in the mid-1980s of realistic flight simulators. For the first time, pilots could practice making decisions. They could refine their reactions to a sudden downdraft in a thunderstorm and practice landing with only one engine. They could learn what it would be like to fly without wing flaps and to land on a tarmac glazed with ice. And they could do all this without leaving the ground.

These simulators revolutionized pilot training. "The old way of teaching pilots was the 'chalk and talk' method," says Jeff Roberts, the group president of civil training at CAE, the largest manufacturer of flight simulators. Before pilots ever entered the cockpit, they were forced to sit through a long series of classroom lectures. They learned all the basic maneuvers of flight while on the ground. They were also taught how to react in the event of various worst-case scenarios. What should you do if the landing gear won't deploy? Or if the plane is struck by lightning? "The problem with this approach," Roberts says, "is that everything was abstract. The pilot has this body of knowledge, but they'd never applied it before."

The benefit of a flight simulator is that it allows pilots to internalize their new knowledge. Instead of memorizing lessons, a pilot can train the emotional brain, preparing the parts of the cortex that will actually make the decision when up in the air. As a result, pilots who are confronted with a potential catastrophe during a real flight—like an engine fire in the air above Tokyo—already know what to do. They don't have to waste critical moments trying to remember what they learned in the classroom. "A plane is traveling four hundred miles per hour," Roberts says. "It's the rare emergency when you've got time to think about what your flight instructor told you. You've got to make the right decision right away."

Simulators also take advantage of the way the brain learns from experience. After pilots complete their "flight," they are forced to endure an exhaustive debriefing. The instructor scrutinizes all of their decisions, so that the pilots think about why, exactly, they decided to gain altitude after the engine fire, or why they chose to land in the hailstorm. "We want pilots to make mistakes in the simulator," Roberts says. "The goal is to learn from those mistakes when they don't count, so that when it really matters, you can make the right decision." This approach targets the dopamine system, which improves itself by studying its errors. As a result, pilots develop accurate sets of flight instincts. Their brains have been prepared in advance.
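This error-driven learning has a standard computational form: a reward-prediction-error update, the model most often used to describe dopamine signaling. The sketch below is purely illustrative; the learning rate and outcome values are made-up numbers:

# A minimal reward-prediction-error update (illustrative numbers only).
prediction = 0.5      # how well the pilot expects the maneuver to go (0 to 1)
learning_rate = 0.3   # how strongly each surprise shifts expectations

for outcome in [0.0, 0.0, 1.0]:           # two simulated failures, then a success
    error = outcome - prediction           # the "surprise": reality minus expectation
    prediction += learning_rate * error    # expectations move toward reality
    print(f"error {error:+.2f} -> new prediction {prediction:.2f}")

Each pass through the loop is one simulator run followed by a debriefing: the gap between what the pilot expected and what actually happened is exactly what drives the update.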

There was one other crucial factor in the dramatic decline of pilot error: the development of a decision-making strategy known as Cockpit Resource Management (CRM). The impetus for CRM came from a large NASA study of pilot error conducted in the 1970s; it concluded that many cockpit mistakes were attributable, at least in part, to the "God-like certainty" of the pilot in command. If other crew members had been consulted, or if the pilot had considered other alternatives, then some of the bad decisions might have been avoided. As a result, the goal of CRM was to create an environment in which a diversity of viewpoints was freely shared.

Unfortunately, it took a tragic crash in the winter of 1978 for airlines to decide to implement this new system. United Flight 173 was a crowded DC-8 bound for Portland, Oregon. About ten miles from the runway, the pilot lowered the landing gear. He noticed that two of his landing-gear indicator lights remained off, suggesting that the front wheels weren't properly deployed. The plane circled around the airport while the crew investigated the problem. New bulbs were put in the dashboard. The autopilot computers were reset. The fuse box was double-checked. But the landing-gear lights still wouldn't turn on.

The plane circled for so long that it began to run out of fuel. Unfortunately, the pilot was too preoccupied with the landing gear to notice. He even ignored the flight engineer's warning about the fuel levels. (One investigator described the pilot as "an arrogant S.O.B.") By the time the pilot looked at his gas gauge, the engines were beginning to shut down. It was too late to save the plane. The DC-8 crash-landed in a sparsely populated Portland suburb, killing ten and seriously wounding twenty-four of the 189 on board. Crash investigators later concluded that there was no problem with the landing gear. The wheels were all properly deployed; it was just a faulty circuit.

After the crash, United trained all of its employees in CRM. The captain was no longer the dictator of the plane. Instead, flight crews were expected to work together and constantly communicate with one another. Everyone was responsible for catching errors. If fuel levels were running low, then it was the job of the flight engineer to make sure the pilot grasped the severity of the situation. If the copilot was convinced that the captain was making a bad decision, then he was obligated to dissent. Flying a plane is an extremely complicated task, and it's essential to make use of every possible resource. The best decisions emerge when a multiplicity of viewpoints is brought to bear on the situation. The wisdom of crowds also applies in the cockpit.

Remember United Flight 232, which lost all hydraulic power? After the crash-landing, the pilots all credited CRM with helping them make the runway. "For most of my career, we kind of worked on the concept that the captain was the authority on the aircraft," says Al Haynes, the captain of Flight 232. "And we lost a few airplanes because of that. Sometimes the captain isn't as smart as we thought he was." Haynes freely admits that he couldn't have saved the plane by himself that day. "We had 103 years of flying experience there in the cockpit [on Flight 232], trying to get that airplane on the ground. If I hadn't used CRM, if we had not had everybody's input, it's a cinch we wouldn't have made it."

In recent years, CRM has moved beyond the cockpit. Many hospitals have realized that the same decision-making techniques that can prevent pilot error can also prevent unnecessary mistakes during surgery. Consider the experience of the Nebraska Medical Center, which began training its surgical teams in CRM in 2005. (To date, more than a thousand hospital employees have undergone the training.) The mantra of the CRM program is "See it, say it, fix it"; all surgical-team members are encouraged to express their concerns freely to the attending surgeon. In addition, team members engage in postoperative debriefings at which everyone involved is supposed to share his or her view of the surgery. What mistakes were made? And how can they be avoided the next time?

The results at the Nebraska Medical Center have been impressive. A 2007 analysis found that after less than six months of CRM training, the percentage of staff members who "felt free to question the decisions of those with more authority" had risen from 29 percent to 86 percent. More important, this increased willingness to point out potential errors led to a dramatic decrease in medical mistakes. Before CRM training, only around 21 percent of all cardiac surgeries and cardiac catheterizations were classified as "uneventful cases," meaning that nothing had gone wrong. After CRM training, however, the number of "uneventful cases" rose to 62 percent.

The reason CRM is so effective is that it encourages flight crews and surgical teams to think together. It deters certainty and stimulates debate. In this sense, CRM creates the ideal atmosphere for good decision-making, in which a diversity of opinions is openly shared. The evidence is looked at from multiple angles, and new alternatives are considered. Such a process not only prevents mistakes but also leads to startling new insights.

TO SIT IN a modern airplane cockpit is to be surrounded by computers. Just above the windshield are the autopilot terminals, which can keep a plane on course without any input from the pilot. Right in front of the thrust levers is a screen relaying information about the state of the plane, from its fuel levels to the hydraulic pressure. Nearby is the computer that monitors the flight path and records the position and speed of the plane. Then there's the GPS panel, a screen for weather updates, and a radar monitor. Sitting in the captain's chair, you can tell why it's called the glass cockpit: everywhere you look there's another glass screen, the digital output of the computers underneath.

These computers are like the emotional brain of the plane. They process a vast amount of information and translate it into a form the pilot can grasp quickly. The computers are also redundant: every plane actually contains multiple autopilot systems, running on different computers and written in different programming languages. Such diversity helps prevent mistakes, since each system is constantly checking itself against the others.
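The cross-checking logic can be sketched in a few lines of code. This is only an illustration, not how any real avionics system works; the readings, tolerance, and function name are all invented for the example:

def cross_check(readings, tolerance=1.0):
    """Return the median of the redundant channels when they agree
    within a tolerance; otherwise raise the disagreement to the crew."""
    ordered = sorted(readings)
    median = ordered[len(ordered) // 2]
    if any(abs(r - median) > tolerance for r in ordered):
        raise RuntimeError(f"channel disagreement: {ordered}")
    return median

# Three independently written channels report altitude in feet.
print(cross_check([35001.2, 35000.8, 35000.9]))  # channels agree: prints 35000.9

The point of the independent implementations is that they are unlikely to share the same bug, so a disagreement is a signal worth surfacing rather than a value worth trusting.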

These computers are so reliable that they perform many of their tasks without any pilot input. If, for example, the autopilot senses a strong headwind, it will instantly increase thrust in order to maintain speed. The pressure in the cabin is seamlessly adjusted to reflect the altitude of the plane. If a pilot is flying too close to another plane, the onboard computers emit loud warning sirens, forcing the flight crew to notice the danger. It's as if the plane has an amygdala.
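A toy version of that headwind response is a proportional controller: measure how far the plane is from its target speed, then nudge the thrust by an amount proportional to the error. The numbers below are invented, and real autothrottle logic is far more elaborate, but the principle is the same:

TARGET_SPEED = 480.0   # desired airspeed, in knots
GAIN = 0.005           # change in thrust fraction per knot of speed error

def adjust_thrust(speed, thrust):
    """Raise thrust when a headwind bleeds off airspeed; lower it when
    the plane runs fast. A proportional correction, clamped to [0, 1]."""
    error = TARGET_SPEED - speed
    return min(1.0, max(0.0, thrust + GAIN * error))

thrust = 0.70                           # current fraction of maximum thrust
thrust = adjust_thrust(465.0, thrust)   # a headwind has cost us 15 knots
print(f"new thrust setting: {thrust:.3f}")  # 0.775: throttled up to compensate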

Pilots are like the plane's prefrontal cortex. Their job is to monitor these onboard computers, to pay close attention to the data on the cockpit screens. If something goes wrong, or if there's a disagreement among the various computers, then it's the responsibility of the flight crew to resolve the problem. The pilots must immediately intervene and, if necessary, take control of the plane. The pilots must also set the headings, supervise the progress of the flight, and deal with the inevitable headaches imposed by air-traffic control. "People who aren't pilots tend to think that when the autopilot is turned on, the pilot can just take a nap," my flight instructor in the simulator says. "But planes don't fly themselves. You can't ever relax in the cockpit. You always have to be watching, making sure everything is going according to plan."
