Why People Believe Weird Things: Pseudoscience, Superstition, and Other Confusions of Our Time

Michael Shermer

6. Locus of Control and Belief

One of the most interesting areas of research on the psychology of belief concerns what psychologists call locus of control. People who measure high on external locus of control tend to believe that circumstances are beyond their control and that things just happen to them. People who measure high on internal locus of control tend to believe they are in control of their circumstances and that they make things happen (Rotter 1966). External locus of control leads to greater anxiety about the world, whereas internal locus of control leads one to be more confident in one's judgment, skeptical of authority, and less compliant and conforming to external influences. In relation to beliefs, studies show that skeptics are high in internal locus of control whereas believers are high in external locus of control (Marshall et al. 1994). A 1983 study by Jerome Tobacyk and Gary Milford of introductory psychology students at Louisiana Tech University, for example, found that those who scored high in external locus of control tended to believe in ESP, witchcraft, spiritualism, reincarnation, and precognition, and to be more superstitious than students who scored high in internal locus of control.

An interesting twist to this effect, however, was found by James McGarry and Benjamin Newberry in a 1977 study of strong believers in and practitioners of ESP and psychic power. Surprisingly, this group scored high in internal locus of control. The authors offered this explanation: "These beliefs [in ESP] may render such a person's problems less difficult and more solvable, lessen the probability of unpredictable occurrences, and offer hope that political and governmental decisions can be influenced." In other words, a deep commitment to belief in ESP, which usually entails believing that one has it, shifts one's locus of control from external to internal.

The effect of locus of control on belief is also mediated by the environment: there is a relationship between the uncertainty of an environment and the level of superstitious belief (as uncertainty goes up, so do superstitions). The anthropologist Bronislaw Malinowski (1954), for example, discovered that among the Trobriand Islanders (off the coast of New Guinea), the farther out to sea they went to fish, the more superstitious rituals they developed. In the calm waters of the inner lagoon there were very few rituals; by the time they reached the dangerous waters of deep-sea fishing, the Trobrianders were also deep into magic. Malinowski concluded that magical thinking derived from environmental conditions, not inherent stupidity: "We find magic wherever the elements of chance and accident, and the emotional play between hope and fear have a wide and extensive range. We do not find magic wherever the pursuit is certain, reliable, and well under the control of rational methods and technological processes. Further, we find magic where the element of danger is conspicuous."

Think of the superstitions of baseball players. Hitting a baseball is exceedingly difficult, with the best hitters succeeding barely more than three out of every ten times at bat, and hitters are known for their extensive reliance on rituals and superstitions that they believe will bring them good luck. These same superstitious players, however, drop the superstitions when they take the field, since most of them succeed in fielding the ball more than 90 percent of the time. Thus, as with the other variables that shape belief and are themselves orthogonal to intelligence, the context of the person and the belief system matters.

7. Influence and Belief

Scholars who study cults (or, as many prefer the less pejorative term, "New Religious Movements") explain that there is no simple answer to the question "Who joins cults?" The only consistent variable seems to be age: young people are more likely to join cults than older people. Beyond that, variables such as family background, intelligence, and gender are orthogonal to belief in and commitment to cults. Research shows that two-thirds of cult members come from normal, functioning families and showed no psychological abnormalities whatsoever when they joined (Singer 1995). Smart people and non-smart people alike readily join cults, and while women are more likely to join such groups as J. Z. Knight's "Ramtha"-based cult (she allegedly channels a 35,000-year-old guru named "Ramtha" who doles out life wisdom and advice, in English with an Indian accent no less!), men are more likely to join militias and other anti-government groups.

Again, although intelligence may be related to how well one justifies one's membership in a group, and gender may be related to which group one chooses, intelligence and gender are unrelated to the general process of joining, the desire for membership in a cult, and belief in the cult's tenets. Psychiatrist Marc Galanter (1999), in fact, suggests that joining such groups is an integral part of the human condition, one to which we are all subject through our common evolutionary heritage. Banding together in closely knit groups was common in our evolutionary history because living among others of one's perceived kind reduced risk and increased the chances of survival. But if the impulse to join is common to most humans, why do some people join while others do not?

The answer lies in the persuasive power of the principles of influence and in the choice of what type of group to join. Cult experts and activists Steve Hassan (1990) and Margaret Singer outline a number of psychological influences, quite independent of intelligence, that shape people's thoughts and behaviors and lead them into more dangerous groups: cognitive dissonance; obedience to authority; group compliance and conformity; and especially the manipulation of rewards, punishments, and experiences with the purpose of controlling behavior, information, thought, and emotion (what Hassan 2000 calls the "BITE" model, after the initials of those four targets of control). Social psychologist Robert Cialdini (1984) demonstrates in his enormously persuasive book on influence that all of us are swayed by a host of social and psychological variables, including physical attractiveness, similarity, repeated contact or exposure, familiarity, diffusion of responsibility, reciprocity, and many others.

Smart Biases in Defending Weird Beliefs

In 1620, English philosopher and scientist Francis Bacon offered his own Easy Answer to the Hard Question:

The human understanding when it has once adopted an opinion (either as being the received opinion or as being agreeable to itself) draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects; in order that by this great and pernicious predetermination the authority of its former conclusions may remain inviolate. ... And such is the way of all superstitions, whether in astrology, dreams, omens, divine judgments, or the like; wherein men, having a delight in such vanities, mark the events where they are fulfilled, but where they fail, although this happened much oftener, neglect and pass them by.

Why do smart people believe weird things? Because, to restate my thesis in light of Bacon's insight, smart people believe weird things because they are skilled at defending beliefs they arrived at for non-smart reasons.

As we have already seen, there is a wealth of scientific evidence in support of this thesis, none more compelling than that for two extremely powerful cognitive biases that make it difficult for any of us to evaluate a claim objectively. These biases, in fact, are wielded especially well by smart people: the Intellectual Attribution Bias and the Confirmation Bias.

Intellectual Attribution Bias.
When Sulloway and I asked our subjects why they believe in God, and why they think other people believe in God (allowing them to provide written answers), we were inundated with thoughtful and lengthy treatises (many stapled multipage, typewritten answers to their surveys), and we discovered that these could be a valuable source of data. Classifying the answers into categories, we found these to be the top reasons given:

WHY PEOPLE BELIEVE IN GOD

1. Arguments based on good design/natural beauty/perfection/complexity of the world or universe. (28.6%)
2. The experience of God in everyday life/a feeling that God is in us. (20.6%)
3. Belief in God is comforting, relieving, consoling, and gives meaning and purpose to life. (10.3%)
4. The Bible says so. (9.8%)
5. Just because/faith/or the need to believe in something. (8.2%)

WHY PEOPLE THINK OTHER PEOPLE BELIEVE IN GOD

1. Belief in God is comforting, relieving, consoling, and gives meaning and purpose to life. (26.3%)
2. Religious people have been raised to believe in God. (22.4%)
3. The experience of God in everyday life/a feeling that God is in us. (16.2%)
4. Just because/faith/or the need to believe in something. (13.0%)
5. People believe because they fear death and the unknown. (9.1%)
6. Arguments based on good design/natural beauty/perfection/complexity of the world or universe. (6.0%)

Note that the intellectually based reasons for belief in God, "good design" and "experience of God," which were in 1st and 2nd place for the first question (why do you believe in God?), dropped to 6th and 3rd place for the second question (why do you think other people believe in God?). Taking their place as the two most common reasons given for why other people believe in God were the emotionally based categories of religion being judged as "comforting" and people having been "raised to believe" in God. Grouping the answers into two general categories, rational reasons and emotional reasons for belief in God, we performed a chi-square test and found the difference to be significant (chi-square[1] = 328.63 [r = .49], N = 1,356, p < .0001). With an odds ratio of 8.8 to 1, we may conclude that people are nearly nine times more likely to attribute their own belief in God to rational reasons than to grant the same to other people, whose belief they attribute instead to emotional reasons.
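For readers curious about the arithmetic behind these figures, here is a minimal Python sketch of how a chi-square test, its phi effect size (the r reported above), and an odds ratio are computed from a 2x2 contingency table. The cell counts below are hypothetical placeholders, since the text reports only percentages and the total N = 1,356 rather than raw counts, so the output will not reproduce the exact statistics above.

# Minimal sketch of the statistics reported above, with HYPOTHETICAL
# cell counts (the raw Shermer-Sulloway data are not given in the text).
# Rows: whose belief is explained; columns: [rational, emotional] reasons.
table = [[450, 228],   # own belief (hypothetical counts)
         [150, 528]]   # other people's belief (hypothetical counts)

a, b = table[0]
c, d = table[1]
n = a + b + c + d

# Expected cell counts under independence: (row total * column total) / n
expected = [[(a + b) * (a + c) / n, (a + b) * (b + d) / n],
            [(c + d) * (a + c) / n, (c + d) * (b + d) / n]]

# Pearson chi-square statistic; a 2x2 table has 1 degree of freedom
chi_square = sum((obs - exp) ** 2 / exp
                 for obs_row, exp_row in zip(table, expected)
                 for obs, exp in zip(obs_row, exp_row))

phi = (chi_square / n) ** 0.5    # effect size, reported as r in the text
odds_ratio = (a / b) / (c / d)   # odds of a rational self-attribution vs.
                                 # a rational other-attribution

print(f"chi-square(1) = {chi_square:.2f}, r (phi) = {phi:.2f}, "
      f"odds ratio = {odds_ratio:.1f} to 1")

With one degree of freedom, any chi-square value of this magnitude corresponds to p < .0001, which is where the significance level reported in the text comes from.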

One explanation for this finding is the attribution bias, or the attribution of causes of our own and others' behaviors to either a situation or a disposition. When we make a situational attribution, we identify the cause in the environment ("my depression is caused by a death in the family"); when we make a dispositional attribution, we identify the cause in the person as an enduring trait ("her depression is caused by a melancholy personality"). Problems in attribution may arise in our haste to accept the first cause that comes to mind (Gilbert et al. 1988). Moreover, social psychologists Carol Tavris and Carole Wade (1997) explain that there is a tendency for people "to take credit for their good actions (a dispositional attribution) and let the situation account for their bad ones." In dealing with others, for example, we might attribute our own success to hard work and intelligence while attributing the other person's success to luck and circumstance (Nisbett and Ross 1980).

We believe we found evidence for an intellectual attribution bias: we consider our own actions to be rationally motivated, whereas we see those of others as more emotionally driven. Our own commitment to a belief is attributed to a rational decision and intellectual choice ("I'm against gun control because statistics show that crime decreases when gun ownership increases"), whereas the other person's belief is attributed to need and emotion ("he's for gun control because he's a bleeding-heart liberal who needs to identify with the victim"). This intellectual attribution bias applies to religion as a belief system and to God as the subject of belief. For pattern-seeking animals like us, the apparent good design of the universe and the perceived action of a higher intelligence in the day-to-day contingencies of our lives are powerful intellectual justifications for belief. Yet we attribute other people's religious beliefs to their emotional needs and upbringing.

Smart people, because they are more intelligent and better educated, are better able to give intellectual reasons justifying beliefs they arrived at for nonintellectual reasons. Yet smart people, like everyone else, recognize that emotional needs and upbringing are how most of us, most of the time, come to our beliefs. The intellectual attribution bias then kicks in, especially in smart people, to justify those beliefs, no matter how weird they may be.

Confirmation Bias.
At the core of the Easy Answer to the Hard Question is the confirmation bias, or the tendency to seek or interpret evidence favorable to already existing beliefs, and to ignore or reinterpret evidence unfavorable to them. Psychologist Raymond Nickerson (1998), in a comprehensive review of the literature on this bias, concluded: "If one were to attempt to identify a single problematic aspect of human reasoning that deserves attention above all others, the confirmation bias would have to be among the candidates for consideration. ... it appears to be sufficiently strong and pervasive that one is led to wonder whether the bias, by itself, might account for a significant fraction of the disputes, altercations, and misunderstandings that occur among individuals, groups, and nations."

Lawyers purposefully employ a type of confirmation bias in the confrontational style of courtroom reasoning, selecting the evidence that best suits their client and ignoring contradictory evidence (winning the case trumps the truth or falsity of the claim); psychologists believe that, in fact, we all do this, usually unconsciously. In a 1989 study, psychologists Bonnie Sherman and Ziva Kunda presented students with evidence that contradicted a belief they held deeply and with evidence that supported that same belief; the students tended to discount the validity of the first set of evidence and accentuate the value of the second. In another 1989 study, Deanna Kuhn exposed children and young adults to evidence inconsistent with a theory they preferred and found that they "either failed to acknowledge discrepant evidence or attended to it in a selective, distorting manner. Identical evidence was interpreted one way in relation to a favored theory and another way in relation to a theory that was not favored." Even in recall after the experiment, subjects could not remember what contradictory evidence had been presented. In a subsequent 1994 study, Kuhn exposed subjects to an audio recording of an actual murder trial and discovered that instead of evaluating the evidence objectively, most subjects first composed a story of what happened and then sorted through the evidence to see what best fit that story. Interestingly, the subjects most focused on finding evidence for a single view of what happened (as opposed to those willing to at least consider an alternative scenario) were the most confident in their decisions.
