A Field Guide to Lies: Critical Thinking in the Information Age

For the various reasons already mentioned—fraud, incompetence, measurement error, interpretation errors—findings and claims become discredited. Individuals who were found guilty in properly conducted trials become exonerated. Vehicle airbags that underwent multiple inspections get recalled. Pundits change their minds. Merely looking at the newness of a site is not enough to ensure that it hasn’t been discredited. New sites pop up almost weekly claiming things that have been thoroughly debunked. There
are many websites dedicated to exposing urban myths, such as Snopes.com, or to collating retractions, such as RetractionWatch.com.

During the fall of 2015 leading up to the 2016 U.S. presidential elections, a number of people referred to fact-checking websites to verify the claims made by politicians. Politicians have been lying at least since Quintus Cicero advised his brother Marcus to do so in 64 B.C.E. What we have that Cicero didn’t is real-time verification. This doesn’t mean that all the verifications are accurate or unbiased, dear reader—you still need to make sure that the verifiers don’t have a bias for or against a particular candidate or party.

Politifact.com, a site operated by the Tampa Bay Times, won a Pulitzer Prize for its reporting. The site monitors and fact-checks speeches, public appearances, and interviews by political figures, and uses a six-point meter to rate statements as True, Mostly True, Half True, Mostly False, False, and—at the extreme end of false—Pants on Fire, reserved for statements that are not merely inaccurate but completely ridiculous (from the children’s playground taunt “Liar, liar, pants on fire”).
The Washington Post also runs a fact-checking site with ratings from one to four Pinocchios, and awards the prized Geppetto Checkmark for statements and claims that “contain the truth, the whole truth, and nothing but the truth.”

As just one example, presidential candidate Donald Trump spoke at a rally on November 21, 2015, in Birmingham, Alabama. To support his position that he would create a Muslim registry in the United States to combat the threat of terrorism from within the country, he recounted watching “thousands and thousands” of Muslims in Jersey City cheering as the World Trade Center came tumbling down on 9/11/2001. ABC News reporter George Stephanopoulos confronted Trump the following day on camera, noting that the Jersey City police denied this happened. Trump responded that he saw it on television, with his own eyes, and that it was very well covered. Politifact and the Washington Post checked all records of television broadcasts and news reports for the three months following the attacks and found no evidence to support Trump’s claim. In fact, Paterson, New Jersey, Muslims had placed a banner on the city’s main street that read “The Muslim Community Does Not Support Terrorism.” Politifact summarized its findings, writing that Trump’s recollection “flies in the face of all evidence we could find. We rate this statement Pants on Fire.” The Washington Post gave it their Four-Pinocchio rating.

During the same campaign, Hillary Clinton claimed “all of my grandparents” were immigrants. According to Politifact (and based on U.S. census records),
only one grandparent was born abroad; three of her four grandparents were born in the United States.

Copied and Pasted, Reposted, Edited?

One way to fool people into thinking that you’re really knowledgeable is to find knowledgeable-sounding things on other people’s Web pages and post them to your own. While you’re at it, why not add your own controversial opinions, which will now be enrobed in the scholarship of someone else, and increase hits to your site? If you’ve got a certain ideological ax to grind, you can do a hatchet job by editing someone else’s carefully supported argument to promote the position opposite of theirs. The burden is on all of us to make sure that we’re reading the original, unadulterated information, not someone’s mash-up of it.

Supporting Information

Unscrupulous hucksters count on the fact that most people don’t bother reading footnotes or tracking down citations. This makes it really easy to lie. Maybe you’d like your website to convince people that your skin cream has been shown to reverse the aging process by ten years. So you write an article and pepper it with footnotes that lead to Web pages that are completely irrelevant to the argument. This will fool a lot of people, because most of them won’t actually follow up. Those who do may go no further than seeing that the URL you point to is a relevant site, such as a peer-reviewed journal on aging or on dermatology, even though the article cited says nothing about your product.

Even more diabolically, the citation may actually be peripherally related, but not relevant. You might claim that your skin cream contains Vitamin X and that Vitamin X has been shown to improve skin health and quality. So far, so good. But how? Are the studies of Vitamin X reporting on people who spread it on their skin or people who took it orally? And at what dosage? Does your skin product even have an adequate amount of Vitamin X?

Terminology Pitfalls

You may read on CDC.gov that the incidence of a particular disease is 1 in 10,000 people. But then you stumble on an article at NIH.gov that says the same disease has a prevalence of 1 in 1,000. Is there a misplaced comma here, a typo? Aren’t incidence and prevalence the same thing? Actually, they’re not. The incidence of a disease is the number of new cases (incidents) that will be reported in a given
period of time, for example, in a year. The prevalence is the number of existing cases—the total number of people who have the disease. (And sometimes, people who are afraid of numbers make the at-a-glance error that 1 in 1,000 is less than 1 in 10,000, focusing on that large number with all the zeros instead of the word “in.”)
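The at-a-glance error is easy to demonstrate. Here is a minimal Python sketch, using the illustrative 1-in-10,000 and 1-in-1,000 figures from the text, that converts “1 in N” rates into probabilities so they can be compared directly:

```python
# Convert "1 in N" figures to probabilities; the larger the N
# after "in", the RARER the condition.

def one_in(n):
    """Probability corresponding to a '1 in n' rate."""
    return 1 / n

incidence = one_in(10_000)   # 1 new case per 10,000 people per year
prevalence = one_in(1_000)   # 1 existing case per 1,000 people

# Despite its smaller-looking denominator, 1 in 1,000 is the
# LARGER probability, ten times more common than 1 in 10,000.
print(prevalence > incidence)   # True
print(prevalence / incidence)   # ratio of the two rates
```

Doing the division, rather than eyeballing the zeros, is what prevents the misreading.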

Take multiple sclerosis (MS), a demyelinating disease of the brain and spinal cord. About 10,400 new cases are diagnosed each year in the United States, leading to an incidence of 10,400/322,000,000, or 3.2 cases per 100,000 people—in other words, a 0.0032 percent chance of contracting it in a given year. Compare that to the total number of people in the United States who already have it, 400,000, leading to a prevalence rate of 400,000/322,000,000, or roughly 120 cases per 100,000—a 0.12 percent chance of having it at any given time.

In addition to incidence and prevalence, a third statistic, mortality, is often quoted—the number of people who die from a disease, typically within a particular period of time.
For coronary heart disease, 1.1 million new cases are diagnosed each year, 15.5 million Americans currently have it, and 375,295 die from it each year. The probability of being diagnosed with heart disease this year is 0.3 percent, about a hundred times more likely than getting MS; the probability of having it right now is nearly 5 percent, and the probability of dying from it in any given year is 0.1 percent. The probability of dying from it at some point in your life is 20 percent. Of course, as we saw in Part One, all of this applies to the aggregate of all Americans. If we know more about a particular person, such as their family history of heart disease, whether or not they smoke, their weight and age, we can make more refined estimates, using conditional probabilities.
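All of these figures are simple ratios of case counts to population. A short Python sketch, using the counts and the 322-million population figure from the text, reproduces the rates (small differences from the text’s figures, such as 120 versus 124 MS cases per 100,000, are rounding):

```python
US_POP = 322_000_000  # U.S. population figure used in the text

def per_100k(cases, population=US_POP):
    """Rate per 100,000 people."""
    return cases / population * 100_000

# Multiple sclerosis
ms_incidence  = per_100k(10_400)      # ~3.2 new cases per 100,000 per year
ms_prevalence = per_100k(400_000)     # ~124 existing cases per 100,000

# Coronary heart disease
chd_incidence  = per_100k(1_100_000)  # ~342 per 100,000 (~0.3% per year)
chd_prevalence = per_100k(15_500_000) # ~4,800 per 100,000 (~4.8%)
chd_mortality  = per_100k(375_295)    # ~117 per 100,000 (~0.1% per year)
```

The same one-line function exposes whether a quoted number is an incidence (new cases over a period), a prevalence (existing cases now), or a mortality figure: the arithmetic is identical, and only the case count being divided changes.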

The incidence rate for a disease can be high while the prevalence and mortality rates can be relatively low. The common cold is an example—there are many millions of people who will get a cold during the year (high incidence), but in almost every case it clears up quickly, and so the prevalence—the number of people who have it at any given time—can be low. Some diseases are relatively rare, chronic, and easily managed, so the incidence can be low (not many cases in a year) but the prevalence high (all those cases add up, and people continue to live with the disease) and the mortality is low.

When evaluating evidence, people often ignore the numbers and axis labels, as we’ve seen, but they often ignore the verbal descriptors, too. Recall the maps of the United States showing “Crude Birth Rate” in Part One. Did you wonder what “crude birth rate” is? You could imagine that a birth rate might be adjusted by several factors, such as whether the birth is live or not, whether the child survives beyond some period of time, and so on. You might think that because the dictionary definition of “crude” is something in a natural or raw state, not yet processed or refined (think crude oil), it must mean the raw, unadulterated, unadjusted number. But it doesn’t. Statisticians use the term crude birth rate to count live births (thus it is an adjusted number that subtracts stillborn infants). In trying to decide whether to open a diaper business, you want the crude birth rate, not the total birth rate (because the total birth rate includes babies who didn’t survive birth).

By the way, a related statistic, the crude death rate, refers to the number of people who die at any age. If you subtract this from the crude birth rate, you get a statistic that public policy makers are (and Thomas Malthus was) very interested in: the RNI, rate of natural increase of a population.
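Because both crude rates are conventionally quoted per 1,000 people per year, the RNI is a simple subtraction. A sketch with hypothetical rates (the numbers below are illustrative, not actual U.S. statistics):

```python
# RNI = crude birth rate - crude death rate,
# both conventionally expressed per 1,000 people per year.
# These rates are hypothetical, for illustration only.

crude_birth_rate = 12.0  # live births per 1,000 people per year
crude_death_rate = 8.0   # deaths per 1,000 people per year

rni = crude_birth_rate - crude_death_rate
print(rni)  # -> 4.0: the population grows by 4 per 1,000 (0.4%) per year
```

Note that the RNI captures only births and deaths; net migration is a separate component of overall population change.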

Overlooked, Undervalued Alternative Explanations

When evaluating a claim or argument, ask yourself if there is another reason—other than the one offered—that could account for the facts or observations that have been reported. There are always alternative explanations; our job is to weigh them against the one(s) offered and determine whether the person drawing the conclusion has drawn the most obvious or likely one.

For example, if you pass a friend in the hall and they don’t return your hello, you might conclude that they’re mad at you. But alternative explanations are that they didn’t see you, were late for a meeting, were preoccupied, were part of a psychology experiment, have taken a vow of silence for an hour, or were temporarily invaded by bodysnatchers. (Or maybe permanently invaded.)

Alternative explanations come up a great deal in pseudoscience and counterknowledge, and they come up often in real science too. Physics researchers at CERN reported that they had discovered neutrinos traveling faster than light. That would have upended a century of Einsteinian theory. It turns out it was just a loose cable in the linear accelerator that caused a measurement error. This underscores the point that a methodological flaw in an extremely complicated experiment is almost always the more likely
explanation than something that would cause us to completely rewrite our understanding of the nature of the universe.

Similarly, if a Web page cites experiments showing that a brand-new, previously unheard-of cocktail of vitamins will boost your IQ by twenty points—and the drug companies don’t want you to know!—you should wonder how likely it is that nobody else has heard of this, and if an alternative explanation for the claim is simply that someone is trying to make money.

Mentalists, fortune-tellers, and psychics make a lot of money performing seemingly impossible feats of mind reading. One explanation is that they have tapped into a secret, hidden force that goes against everything we know about cause and effect and the nature of space-time. An alternative explanation is that they are magicians, using magic tricks, and simply lying about how they do what they do. Lending credence to the latter view is that professional magicians exist, including James Randi, who, so far, has been able to use clever illusions to duplicate every single feat performed by a mentalist. And often, the magicians—in an effort to discredit the self-proclaimed psychics—will tell you how they did the tricks. In fairness, I suppose that it’s possible that it is the magicians who are trying to deceive us—they are really psychics who are afraid to reveal their gifts to us (possibly for fear of exploitation, kidnapping, etc.) and they are only pretending to use clever illusions. But again, look at the two possibilities: One causes us to throw out everything we know about nature and science, and the other doesn’t. Any psychologist, law enforcement officer, businessperson, divorced spouse, foreign service worker, spy, or lawyer can tell you that people lie; they do so for a variety of reasons and with sometimes alarming frequency and alacrity. So if you’re facing a claim that seems unlikely, the more likely (alternative) explanation is that the person telling it to you is lying in one way or another.

People who try to predict the future without using psychic powers—military leaders, economists, business strategists—are often wildly off in their predictions because they fail to consider alternative explanations. This has led to a business practice called scenario planning—considering all possible outcomes, even those that seem unlikely. This can be very difficult to do, and even experts fail.
In 1968, Will and Ariel Durant wrote:

In the United States the lower birth rate of the Anglo-Saxons has lessened their economic and political power; and the higher birth rate of Roman Catholic families suggests that by the year 2000 the Roman Catholic Church will be the dominant force in national as well as in municipal or state governments.

What they failed to consider was that, during those intervening thirty-two years, many Catholics would leave the Church, and many would use birth control in spite of the Church’s prohibitions. Alternative scenarios to their view in 1968 were difficult to imagine.

Social and artistic predictions get upended too: Experts said around the time of the Beatles that “guitar bands are on their way out.” The reviews of Beethoven’s Fifth Symphony on its debut included a number of negative pronouncements that no one would ever want to hear it again. Science also gets upended. Experts said that fast-moving trains would never work because passengers would
die of asphyxiation. Experts thought that light moved through an invisible “ether.” Science and life are not static. All we can do is evaluate the weight of evidence and judge for ourselves, using the best tools we have at our disposal. One of those tools that is underused is employing creative thinking to imagine alternatives to the way we’ve been thinking all along.
