The Better Angels of Our Nature: Why Violence Has Declined
Steven Pinker
Milgram ran his experiments in the 1960s and early 1970s, and as we have seen, many attitudes have changed since then. It’s natural to wonder whether Westerners today would still obey the instructions of an authority figure to brutalize a stranger. The Stanford Prison Experiment is too bizarre to replicate exactly today, but thirty-three years after the last of the obedience studies, the social psychologist Jerry Burger figured out a way to carry out a new one that would pass ethical muster in the world of 2008.[270]
He noticed that in Milgram’s original studies, the 150-volt mark, when the victim first cries out in pain and protest, was a point of no return. If a participant didn’t disobey the experimenter then, 80 percent of the time he or she would continue to the highest shock on the board. So Burger ran Milgram’s procedure but broke off the experiment at the 150-volt mark, immediately explaining the study to the participants and preempting the awful progression in which so many people tortured a stranger over their own misgivings. The question is: after four decades of fashionable rebellion, bumper stickers that advise the reader to Question Authority, and a growing historical consciousness that ridicules the excuse “I was only following orders,” do people still follow the orders of an authority to inflict pain on a stranger? The answer is that they do. Seventy percent of the participants went all the way to 150 volts and so, we have reason to believe, would have continued to fatal levels if the experimenter had permitted it. On the bright side, almost twice as many people disobeyed the experimenter in the 2000s as did in the 1960s (30 percent as compared to 17.5 percent), and the figure might have been even higher if the diverse demographics of the recent study pool had been replaced by the white-bread homogeneity of the earlier ones.[271]
But a majority of people will still hurt a stranger against their own inclinations if they see it as part of a legitimate project in their society.
 
Why do people so often impersonate sheep? It’s not that conformity is inherently irrational.[272] Many heads are better than one, and it’s usually wiser to trust the hard-won wisdom of millions of people in one’s culture than to think that one is a genius who can figure everything out from scratch. Also, conformity can be a virtue in what game theorists call coordination games, where individuals have no rational reason to choose a particular option other than the fact that everyone else has chosen it. Driving on the right or the left side of the road is a classic example: here is a case in which you really don’t want to march to the beat of a different drummer. Paper currency, Internet protocols, and the language of one’s community are other examples.
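To make the logic of a coordination game concrete, here is a minimal sketch in Python (the payoff numbers are illustrative assumptions, not taken from any study): two drivers each pick a side of the road, matching pays both and mismatching pays neither, and the code checks which strategy profiles are Nash equilibria, that is, profiles no driver can improve on by switching alone.

```python
# Minimal sketch of the driving-side coordination game described above.
# Payoffs are illustrative: both drivers earn 1 if they choose the same
# side of the road and 0 (a crash) if they mismatch.

SIDES = ["left", "right"]

def payoff(a: str, b: str) -> tuple[int, int]:
    """Both players gain only by matching; the side itself is arbitrary."""
    return (1, 1) if a == b else (0, 0)

def is_nash(a: str, b: str) -> bool:
    """A profile is a Nash equilibrium if neither driver can do better
    by unilaterally switching sides."""
    pa, pb = payoff(a, b)
    no_better_a = all(payoff(alt, b)[0] <= pa for alt in SIDES)
    no_better_b = all(payoff(a, alt)[1] <= pb for alt in SIDES)
    return no_better_a and no_better_b

equilibria = [(a, b) for a in SIDES for b in SIDES if is_nash(a, b)]
print(equilibria)  # [('left', 'left'), ('right', 'right')]
```

Both all-left and all-right come out as equilibria; nothing about either side makes it intrinsically better, which is why the choice between them is settled by convention rather than by merit.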
But sometimes the advantage of conformity to each individual can lead to pathologies in the group as a whole. A famous example is the way an early technological standard can gain a toehold among a critical mass of users, who use it because so many other people are using it, and thereby lock out superior competitors. According to some theories, these “network externalities” explain the success of English spelling, the QWERTY keyboard, VHS videocassettes, and Microsoft software (though there are doubters in each case). Another example is the unpredictable fortunes of bestsellers, fashions, top-forty singles, and Hollywood blockbusters. The sociologist Duncan Watts set up two versions of a Web site in which users could download garage-band rock music.[273] In one version users could not see how many times a song had already been downloaded. The differences in popularity among songs were slight, and they tended to be stable from one run of the study to another. But in the other version people could see how popular a song had been. These users tended to download the popular songs, making them more popular still, in a runaway positive feedback loop. The amplification of small initial differences led to large chasms between a few smash hits and many duds—and the hits and duds often changed places when the study was rerun.
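The runaway feedback loop is easy to reproduce in a toy model. In the Python sketch below, the number of songs and users, the “appeal” values, and the rule that visible downloads multiply a song’s attractiveness are all illustrative assumptions, not the actual design of the study.

```python
import random

random.seed(1)
N_SONGS, N_USERS = 20, 5000
# Each song gets a modest intrinsic "appeal"; the values are invented.
appeal = [random.uniform(0.5, 1.5) for _ in range(N_SONGS)]

def run(social_influence: bool) -> list[int]:
    downloads = [0] * N_SONGS
    for _ in range(N_USERS):
        # With the popularity display on, a song's attractiveness is
        # multiplied by its visible download count (rich get richer).
        weights = [appeal[s] * ((1 + downloads[s]) if social_influence else 1)
                   for s in range(N_SONGS)]
        song = random.choices(range(N_SONGS), weights=weights)[0]
        downloads[song] += 1
    return downloads

independent = run(social_influence=False)
herd = run(social_influence=True)
print("top song's share, counter hidden: ", max(independent) / N_USERS)
print("top song's share, counter visible:", max(herd) / N_USERS)
# The visible-counter condition reliably produces a few runaway hits
# and many duds, and which songs win changes from seed to seed.
```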
Whether you call it herd behavior, the cultural echo chamber, the rich get richer, or the Matthew Effect, our tendency to go with the crowd can lead to an outcome that is collectively undesirable. But the cultural products in these examples—buggy software, mediocre novels, 1970s fashion—are fairly innocuous. Can the propagation of conformity through social networks actually lead people to sign on to ideologies they don’t find compelling and carry out acts they think are downright wrong? Ever since the rise of Hitler, a debate has raged between two positions that seem equally unacceptable: that Hitler single-handedly duped an innocent nation, and that the Germans would have carried out the Holocaust without him. Careful analyses of social dynamics show that neither explanation is exactly right, but that it’s easier for a fanatical ideology to take over a population than common sense would allow.
There is a maddening phenomenon of social dynamics variously called pluralistic ignorance, the spiral of silence, and the Abilene paradox, after an anecdote in which a Texan family takes an unpleasant trip to Abilene one hot afternoon because each member thinks the others want to go.[274]
People may endorse a practice or opinion they deplore because they mistakenly think that everyone else favors it. A classic example is the value that college students place on drinking till they puke. In many surveys it turns out that every student, questioned privately, thinks that binge drinking is a terrible idea, but each is convinced that his peers think it’s cool. Other surveys have suggested that gay-bashing by young toughs, racial segregation in the American South, honor killings of unchaste women in Islamic societies, and tolerance of the terrorist group ETA among Basque citizens of France and Spain may owe their longevity to spirals of silence.[275]
The supporters of each of these forms of group violence did not think it was a good idea so much as they thought that everyone else thought it was a good idea.
Can pluralistic ignorance explain how extreme ideologies may take root among people who ought to know better? Social psychologists have long known that it can happen with simple judgments of fact. In another hall-of-fame experiment, Solomon Asch placed his participants in a dilemma right out of the movie Gaslight.[276]
Seated around a table with seven other participants (as usual, stooges), they were asked to indicate which of three very different lines had the same length as a target line, an easy call. The six stooges who answered before the participant each gave a patently wrong answer. When their turn came, three-quarters of the real participants defied their own eyeballs and went with the crowd.
But it takes more than the public endorsement of a private falsehood to set off the madness of crowds. Pluralistic ignorance is a house of cards. As the story of the Emperor’s New Clothes makes clear, all it takes is one little boy to break the spiral of silence, and a false consensus will implode. Once the emperor’s nakedness became common knowledge, pluralistic ignorance was no longer possible. The sociologist Michael Macy suggests that for pluralistic ignorance to be robust against little boys and other truth-tellers, it needs an additional ingredient: enforcement.[277]
People not only avow a preposterous belief that they think everyone else avows, but they punish those who fail to avow it, largely out of the belief—also false—that everyone else wants it enforced. Macy and his colleagues speculate that false conformity and false enforcement can reinforce each other, creating a vicious circle that can entrap a population into an ideology that few of them accept individually.
Why would someone punish a heretic who disavows a belief that the person himself or herself rejects? Macy et al. speculate that it’s to prove their sincerity—to show other enforcers that they are not endorsing a party line out of expedience but believe it in their hearts. That shields them from punishments by their fellows—who may, paradoxically, only be punishing heretics out of fear that they will be punished if they don’t.
The suggestion that unsupportable ideologies can levitate in midair by vicious circles of punishment of those who fail to punish has some history behind it. During witch hunts and purges, people get caught up in cycles of preemptive denunciation. Everyone tries to out a hidden heretic before the heretic outs him. Signs of heartfelt conviction become a precious commodity. Solzhenitsyn recounted a party conference in Moscow that ended with a tribute to Stalin. Everyone stood and clapped wildly for three minutes, then four, then five . . . and then no one dared to be the first to stop. After eleven minutes of increasingly stinging palms, a factory director on the platform finally sat down, followed by the rest of the grateful assembly. He was arrested that evening and sent to the gulag for ten years.[278]
People in totalitarian regimes have to cultivate thoroughgoing thought control lest their true feelings betray them. Jung Chang, a former Red Guard and then a historian and memoirist of life under Mao, wrote that on seeing a poster that praised Mao’s mother for giving money to the poor, she found herself quashing the heretical thought that the great leader’s parents had been rich peasants, the kind of people now denounced as class enemies. Years later, when she heard a public announcement that Mao had died, she had to muster every ounce of thespian ability to pretend to cry.[279]
To show that a spiral of insincere enforcement can ensconce an unpopular belief, Macy, together with his collaborators Damon Centola and Robb Willer, first had to show that the theory was not just plausible but mathematically sound. It’s easy to prove that pluralistic ignorance, once it is in place, is a stable equilibrium, because no one has an incentive to be the only deviant in a population of enforcers. The trick is to show how a society can get there from here. Hans Christian Andersen had his readers suspend disbelief in his whimsical premise that an emperor could be hoodwinked into parading around naked; Asch paid his stooges to lie. But how could a false consensus entrench itself in a more realistic world?
The three sociologists used a computer to simulate a little society consisting of two kinds of agents.[280]
There were true believers, who always comply with a norm and denounce noncompliant neighbors if they grow too numerous. And there were private but pusillanimous skeptics, who comply with a norm if a few of their neighbors are enforcing it, and enforce the norm themselves if a lot of their neighbors are enforcing it. If these skeptics aren’t bullied into conforming, they can go the other way and enforce skepticism among their conforming neighbors. Macy and his collaborators found that unpopular norms can become entrenched in some, but not all, patterns of social connectedness. If the true believers are scattered throughout the population and everyone can interact with everyone else, the population is immune to being taken over by an unpopular belief. But if the true believers are clustered within a neighborhood, they can enforce the norm among their more skeptical neighbors, who, overestimating the degree of compliance around them and eager to prove that they do not deserve to be sanctioned, enforce the norm against each other and against their neighbors. This can set off cascades of false compliance and false enforcement that saturate the entire society.
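A minimal agent-based sketch of that result, loosely following the description above, is given below in Python. The ring-lattice layout, neighborhood radius, and the two thresholds are assumptions made for illustration, not the published model’s parameters (and the skeptics’ countermove of enforcing skepticism is omitted for brevity).

```python
import random

random.seed(0)
N, RADIUS = 100, 3             # agents on a ring, each seeing 3 neighbors per side
ENFORCE_AT, COMPLY_AT = 3, 1   # skeptics enforce if >= 3 neighbors enforce,
                               # and comply if >= 1 neighbor enforces

def neighbors(i):
    """Indices of agent i's neighbors on the ring."""
    return [(i + d) % N for d in range(-RADIUS, RADIUS + 1) if d != 0]

def run(clustered, n_believers=10, steps=60):
    """Return the fraction of agents complying with the norm at the end."""
    if clustered:
        believers = set(range(n_believers))                    # one tight block
    else:
        believers = set(random.sample(range(N), n_believers))  # spread thin
    enforcing = set(believers)  # true believers enforce from the start
    for _ in range(steps):
        # Skeptics who feel enough enforcement pressure start enforcing
        # too, to prove they do not deserve to be sanctioned themselves.
        cowed = {i for i in range(N) if i not in believers and
                 sum(j in enforcing for j in neighbors(i)) >= ENFORCE_AT}
        enforcing = believers | cowed
    complying = {i for i in range(N) if i in believers or
                 sum(j in enforcing for j in neighbors(i)) >= COMPLY_AT}
    return len(complying) / N

print("compliance, clustered believers:", run(clustered=True))
print("compliance, scattered believers:", run(clustered=False))
# Clustered believers launch a wave of false enforcement that saturates
# the ring; the same ten believers scattered at random usually cannot.
```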
The analogy to real societies is not far-fetched. James Payne documented a common sequence in the takeover of Germany, Italy, and Japan by fascist ideologies in the 20th century. In each case a small group of fanatics embraced a “naïve, vigorous ideology that justifies extreme measures, including violence,” recruited gangs of thugs willing to carry out the violence, and intimidated growing segments of the rest of the populations into acquiescence.[281]
Macy and his collaborators played with another phenomenon that was first discovered by Milgram: the fact that every member of a large population is connected to everyone else by a short chain of mutual acquaintances—six degrees of separation, according to the popular meme.[282]
They laced their virtual society with a few random long-distance connections, which allowed agents to be in touch with other agents with fewer degrees of separation. Agents could thereby sample the compliance of agents in other neighborhoods, disabuse themselves of a false consensus, and resist the pressure to comply or enforce. The opening up of neighborhoods by long-distance channels dissipated the enforcement of the fanatics and prevented them from intimidating enough conformists into setting off a wave that could swamp the society. One is tempted toward the moral that open societies with freedom of speech and movement and well-developed channels of communication are less likely to fall under the sway of delusional ideologies.
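The lacing of a society with long-distance connections corresponds to the classic small-world network construction. As a rough illustration (assuming the third-party networkx library is available; the sizes and probabilities are invented), rewiring even one percent of the ties in a cliquish ring lattice to random distant nodes collapses the average degrees of separation:

```python
import networkx as nx

N, K = 1000, 6  # 1,000 agents, each tied to its 6 nearest ring neighbors
for p in [0.0, 0.01, 0.1]:
    # Watts-Strogatz construction: rewire each tie to a random distant
    # node with probability p, creating long-distance shortcuts.
    g = nx.connected_watts_strogatz_graph(N, K, p, seed=42)
    # Average shortest path length ~ typical degrees of separation.
    print(f"rewiring prob {p:.2f}: avg separation = "
          f"{nx.average_shortest_path_length(g):.1f}")
# With p = 0 the ring is cliquish and paths are dozens of hops long;
# a sprinkling of shortcuts (p = 0.01) cuts that by roughly an order
# of magnitude, which is the "six degrees" effect Milgram discovered.
```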
Macy, Willer, and Ko Kuwabara then wanted to show the false-consensus effect in real people—that is, to see if people could be cowed into criticizing other people whom they actually agreed with if they feared that everyone else would look down on them for expressing their true beliefs.[283]
The sociologists mischievously chose two domains where they suspected that opinions are shaped more by a terror of appearing unsophisticated than by standards of objective merit: wine-tasting and academic scholarship.
