Of the twenty volunteers who scored low on the PPI, only one reached a verdict within the allotted time. The others were still deliberating. But of the twenty volunteers at the other end of the scale, it was a different story entirely. Without exception, all of them had made their minds up, and the results were unanimous. Holmes was free to walk.
If you’re trying to get your bearings inside this ethical hall of mirrors, don’t panic. The good news is that you’re obviously not a psychopath. In actual fact, on April 23, 1842, ten days after the trial first opened, it took sixteen hours for the jury to return a verdict—almost as long as Holmes had spent in the water. Guilty he might have been—of manslaughter, that is, not murder—but under such psychological g-force that right and wrong had imploded under the pressure, becoming morally indistinguishable from each other. The judge handed Holmes a token six-month sentence, plus a twenty-dollar fine.
In contrast, consider the following case as reported in the Daily Telegraph back in 2007:
Two Police Community Support Officers did not intervene to stop a 10-year-old boy from drowning because they were “not trained” to deal with the incident, a senior police officer said today. The [officers] stood at the edge of a pond at a Wigan beauty spot as Jordon Lyon got into trouble while trying to rescue his eight-year-old step-sister. Two fishermen in their 60s jumped in and managed to save the girl, but the officers, who arrived at the scene shortly afterwards, did not attempt a rescue, deciding to wait until trained officers arrived. At the inquest into his death today the boy’s distraught parents demanded to know why more effort was not made to save their son. [His] stepfather said: “… You don’t have to be trained to jump in after a drowning child.”
At first glance, this case and that of able seaman Alexander Holmes have little in common. In fact, they appear to be polar opposites. The former revolves around an extraordinary reluctance to preserve life; the latter, around a curious ambivalence toward saving it. Yet look a little closer and striking similarities emerge. In both scenarios, for instance, it’s the breaking of rules that’s the problem. In the Jordon Lyon affair, the officers were paralyzed by a code of conduct: an all-consuming requirement to toe the party line. Like performing seals, they had been trained beyond their instincts. Trained, you might say, to eschew any action for which they had not been trained. In the William Brown tragedy, the “rules” were more deeply encoded—they were more functional, and more “ethically hygienic.” Yet they were, one could argue—as some quite vehemently did—no less detrimental to the exigencies of the moment. The seamen, so to speak, were in exactly the same boat as the police officers. Caught in the moral crosshairs on a bleak humanitarian knife-edge, they had to act quickly, decisively, and with manifest disregard for the consequences of their actions. Some did it better than others.
Yet alongside the challenge to our existential comfort zones, these two accounts also conceal, deep within the lining of their tragedies, a rather odd paradox. The fact that conformity is built into our brains is about as nailed down an evolutionary certainty as you can get. When a herd animal is threatened by a predator, what does it do? It huddles closer to the group. As individual salience decreases, chances of survival increase. This is just as true in humans as it is in other species. Streaming behind our supersonic, turbocharged brains are ancient Darwinian vapor trails stretching all the way back to the brutal, blood-soaked killing fields of prehistory.
In an experiment, for instance, that hitched the latest in social networking to its earliest biological origins, social psychologist Vladas Griskevicius, then at Arizona State University, and his coworkers found that when users of an Internet chat room are made to feel under threat, they show signs of “sticking together.” Their views display convergence, and they become more likely to conform to the attitudes and opinions of others in the forum.
But there are clearly times when the opposite is true: when the ability to break free of social convention, to “think outside the group,” can also be a lifesaver—both literally and metaphorically. In 1952, the sociologist William H. Whyte coined the term “groupthink” to conceptualize the mechanism by which tightly knit groups, cut off from outside influence, rapidly converge on normatively “correct” positions, becoming, as they do so, institutionally impervious to criticism: indifferent to out-group opposition, averse to in-group dissent, and ever more confident of their own unimpeachable rectitude.
The psychologist Irving Janis, who conducted much of the empirical work on the phenomenon, describes the process as “a mode of thinking that people engage in when they are deeply involved in a cohesive in-group, when the members’ strivings for unanimity override their motivation to realistically appraise alternative courses of action.” It’s not exactly conducive to good decision making.
As a case in point, take the space shuttle Challenger fiasco. Under considerable political pressure to get things under way (Congress, at the time, was seeking a large slice of revenue in furtherance of the space program, and a series of problems had already delayed the launch), scientists and engineers at NASA appeared systemically immune to concerns raised by a coworker, just twenty-four hours before liftoff, over the O-rings in the booster rockets. Though a string of conference calls had specifically been convened to discuss the problem in detail, the decision, incomprehensible in hindsight, was made to press on. The goal, after all, was to get the show on the road.
In the event, it proved disastrous. Inquests revealed, as the villains of the piece, not just the O-rings, but another, more viral, more insidiously carcinogenic culprit: a musty, asphyxiating psychology. The Rogers Commission, a dedicated task force set up by then President Ronald Reagan to investigate the accident, confirmed the nagging, unspoken fears of social psychologists the world over: that NASA’s organizational culture and decision-making processes had played a significant role in the lead-up to the tragedy. Pressure to conform, discounted warnings, sense of invulnerability. It was all there, plain as day.
So is the capacity to stand alone, to play by one’s own rules outside the normative safe haven of society, also hardwired? There’s evidence to suggest that it is—and that a fearless, untroubled minority has evolved within our midst.
Just how psychopathy got a toehold in the gene pool is an interesting question. If the “disorder” is so maladaptive, then why does its incidence remain stable across time, with an estimated 1 to 2 percent of the population qualifying as psychopathic?
Andrew Colman, professor of psychology at the University of Leicester, has an equally intriguing answer—one, I suspect, that will forever be close to my heart after a recent entanglement with the Newark Airport interchange.
In 1955, the film Rebel Without a Cause made its cinematic debut. Never before had rebellious, misunderstood youth been portrayed so sympathetically on the silver screen. But enough of the armchair criticism. For game theorists at least, one scene towers head and shoulders above the rest: the one in which Jim Stark (played by James Dean) and Buzz Gunderson (played by Corey Allen) hurtle, in a pair of stolen cars, inexorably toward the edge of the cliff in a deadly game of chicken.
Let’s think about that scene for a moment from the point of view of the drivers, says Colman. Or rather, think about a more familiar version of it in which the two protagonists accelerate directly toward each other in an impending head-on collision. Each of them has a choice: adopt the sensible, “non-psychopathic” strategy of swerving to avoid a pile-up, or choose the risky “psychopathic” one of keeping their foot on the gas. These choices, with their differential “payoff points,” constitute a classic you-scratch-my-back-I’ll-scratch-yours-or-then-again-maybe-I-won’t scenario that we can model using game theory—a branch of applied mathematics that seeks to quantify optimal decision-making processes in situations where outcomes depend not on the actions of the individual parties involved, but rather on their interaction (see figure 3.1).
Figure 3.1. A game-theoretical model of the evolution of psychopathy
If Jim and Buzz both go for the sensible option and swerve away from each other, the outcome is a draw with second-best payoffs going to each (3). In contrast, if both are psychopathic and decide to see it through, each risks death—or, at best, serious injury. And thus each receives the very worst payoff (1).
As Colman explains, however, if one driver—let’s say Jim—opts for caution, while Buzz turns out to be “nuts,” a differential suddenly appears. Jim drops points and gets the “chicken” payoff (2), while Buzz lucks out, with a maximum haul (4).
It’s a mathematical microcosm of what rubbing shoulders with psychopaths (and the Newark Airport interchange) is actually like. And biologically it works: when the game is played repeatedly in the lab, by computer programs specifically encoded with predetermined response strategies, something very interesting happens. When the payoffs are converted into units of Darwinian fitness and the assumption is made that those players in receipt of larger payoffs give rise to a greater number of offspring who then adopt precisely the same strategy as their progenitors, the population evolves to a stable equilibrium in which the proportion of individuals consistently behaving psychopathically actually mirrors the observed incidence of the disorder in real life (around 1 to 2 percent).
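Colman’s actual simulations aren’t reproduced here, but the underlying logic is easy to sketch. One caveat: the ordinal payoffs above (1 through 4) won’t by themselves produce a 1 to 2 percent equilibrium; the stable proportion of “psychopaths” depends on how catastrophic a crash really is. Below is a minimal replicator-dynamics sketch in Python, standing in for the kind of evolutionary tournament described, with a hypothetical crash penalty of -100 (an illustrative choice, not a figure from Colman) that happens to pin the equilibrium near 1 percent:

```python
import numpy as np

# Payoff matrix for the game of chicken ("hawk" = keep your foot on the
# gas, "dove" = swerve). Rows: my strategy; columns: opponent's strategy.
# The book gives only ordinal ranks 1-4; the crash cost of -100 below is
# a hypothetical value chosen so the equilibrium lands near the 1-2
# percent incidence reported for psychopathy.
CRASH, WIN, CHICKEN, DRAW = -100.0, 4.0, 2.0, 3.0
payoff = np.array([[CRASH, WIN],      # hawk vs. hawk, hawk vs. dove
                   [CHICKEN, DRAW]])  # dove vs. hawk, dove vs. dove

def replicator(p_hawk=0.5, generations=2000, rate=0.01):
    """Evolve the share of 'hawks' by discrete replicator dynamics:
    strategies earning above-average payoffs leave more offspring."""
    p = p_hawk
    for _ in range(generations):
        pop = np.array([p, 1.0 - p])
        fitness = payoff @ pop                    # expected payoff per strategy
        mean_fit = pop @ fitness                  # population-average payoff
        p += rate * p * (fitness[0] - mean_fit)   # replicator update
        p = min(max(p, 0.0), 1.0)
    return p

print(f"stable share of hawks: {replicator():.3f}")
# With a crash cost of -100, the mixed equilibrium is 1/103, roughly 0.01:
# about 1 percent of the population keeps its foot on the gas.
```

The design choice that matters is the ratio of the crash cost to the rewards: the nastier the collision, the rarer the stable population of drivers who never swerve.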
Whoever keeps their foot on the gas—whoever keeps their nerve—is always going to win: provided, that is, that their opposite number is sane. Behaving “irrationally” might actually sometimes be rational.
In 2010, Hideki Ohira, a psychologist at Nagoya University, and his doctoral student Takahiro Osumi validated Colman’s theory for real. Psychopaths, they discovered, under certain extraordinary circumstances make better financial decisions than the rest of us, for precisely the reason that Colman had so elegantly demonstrated. They behave in a manner that would otherwise appear irrational.
To demonstrate, Ohira and Osumi deployed the ultimatum game—a paradigm widely used in the field of neuroeconomics, which explores, broadly speaking, the way we evaluate primarily monetary, but also certain other types of gain. The game involves two players interacting to decide how a sum of money they are given should be divided. The first player proposes a solution. The second player decides whether or not to accept the offer. If the second player decides to reject it, then both of the protagonists get nothing. But if the second player decides to accept, then the sum is split accordingly.
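The rules are simple enough to spell out in a few lines. Here is a minimal sketch in Python (the $100 stake and the names are illustrative, not taken from the study):

```python
def ultimatum_round(total, offer_to_responder, responder_accepts):
    """One round of the ultimatum game: the proposer offers a split of
    `total`; if the responder rejects, both players walk away with nothing."""
    if not responder_accepts:
        return 0, 0  # rejection punishes both players
    return total - offer_to_responder, offer_to_responder

# Player 1 proposes an unfair 80-20 split of $100; player 2 turns it down.
print(ultimatum_round(100, 20, responder_accepts=False))  # (0, 0)
# Had player 2 swallowed the insult, the split would have stood:
print(ultimatum_round(100, 20, responder_accepts=True))   # (80, 20)
```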
Take a look at figure 3.2 and you’ll notice something interesting about the game. The offer that player 1 puts on the table can either be fair or unfair. They can propose to split the money, say, 50-50. Or alternatively, say, 80-20. Now, usually what happens is this. As proposals start approaching the 70-30 mark (in favor of player 1), player 2 goes into rejection mode.
After all, it’s not just about the money. There’s a principle at stake here, too!
Figure 3.2. The ultimatum game (1 = Player 1; 2 = Player 2; F = fair; U = unfair; A = accept; R = reject)
But psychopaths, Ohira and Osumi discovered, play the game rather differently. Not only do they show greater willingness to accept unfair offers, favoring simple economic utility over the exigencies of punishment and ego preservation, they are much less bothered by inequity. On measures of electrodermal activity (a reliable index of stress based on the autonomic response of our sweat glands), the difference between psychopaths and other volunteers was telling, to say the least. Psychopaths were far less fazed than controls when screwed by their opposite numbers—and at the conclusion of the study, had more in the bank to show for it. A thicker skin had earned them thicker wallets.
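Ohira and Osumi’s data aren’t reproduced here, but the arithmetic behind that last sentence is easy to check with a toy comparison: a “principled” responder who rejects anything much worse than 70-30, versus an “accept anything positive” responder, both facing the same stream of mostly unfair offers. The thresholds and the offer distribution below are hypothetical, chosen only to illustrate the logic:

```python
import random

random.seed(42)
TOTAL = 100

def principled(offer):
    """Reject once the split drifts past roughly 70-30 against you."""
    return offer >= 0.3 * TOTAL

def utilitarian(offer):
    """Accept any nonzero amount: money on the table beats principle."""
    return offer > 0

# A run of mostly unfair proposals (10 to 40 out of 100 to the responder).
offers = [random.randint(10, 40) for _ in range(1000)]

for strategy in (principled, utilitarian):
    banked = sum(offer for offer in offers if strategy(offer))
    print(f"{strategy.__name__:11s} banks {banked}")
# The accept-everything strategy banks at least as much on every round,
# which is the sense in which a thicker skin earns a thicker wallet.
```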