On Immunity: An Inoculation
Eula Biss
“Is the immune system at the heart of a new incarnation of social Darwinism that allows people of different ‘quality’ to be distinguished from each other?” asks the anthropologist Emily Martin. She believes the answer may be yes. Some of the people in her study expressed what she calls “immune machismo,” saying, for instance, that their immune systems “kick ass.” One person suggested, in Martin’s words, that “people without a good living standard need vaccines, whereas vaccines would only clog up the more refined systems of middle-class or upper-class people.” Even assuming that certain immune systems do kick ass, the problem remains that vaccination is most dangerous, in many cases, for people with compromised immune systems. Those of us with impaired immunity depend on people with more functional immune systems to carry immunity and protect us from disease.
“AIDS is everyone’s problem,” the vice president of the Red Cross announced in 1987, though media coverage, the journalist Richard Goldstein observed, positioned the average American as a witness to the epidemic, safe from infection. I have found myself so positioned, invited to regard AIDS as a problem that belongs to gay men and Africa. Disease happens to other people, this thinking implies, people who are not good or clean. That this attitude extends beyond AIDS is evident in the indignation expressed over the vaccination of newborn infants against hep B, another disease passed through blood. The hep B vaccine is often used to illustrate the absurdity of a public health system that would vaccinate a newborn baby against a sexually transmitted disease.
“Why target two and a half million innocent newborns and children?” Barbara Loe Fisher asks of the hep B vaccine. The implication behind the word innocent is that only those who are not innocent need protection from disease. All of us who grew up during the AIDS epidemic were exposed to the idea that AIDS was a punishment for homosexuality, promiscuity, and addiction. But if disease is a punishment for anything, it is only a punishment for being alive.
When I was a child, I asked my father what causes cancer and he paused for a long moment before saying, “Life. Life causes cancer.” I took this as an artful dodge until I read Siddhartha Mukherjee’s history of cancer, in which he argues not only that life causes cancer but that cancer is us. “Down to their innate molecular core,” Mukherjee writes, “cancer cells are hyperactive, survival-endowed, scrappy, fecund, inventive copies of ourselves.” And this, he notes, “is not a metaphor.”
I GAVE MY SON a lavishly illustrated edition of Alice’s Adventures in Wonderland for his fourth birthday, and it did not take very long for me to realize that this was a gift for me, not him. As Alice engaged in repartee with a dodo early in the book, my son became bored. Alice’s bewilderment and disorientation, which I had anticipated might speak to my son’s experience of being a child in an adult’s world, spoke instead to my own experience navigating the world of information. Being lost in Wonderland is what it feels like to learn about an unfamiliar subject, and research is inevitably a rabbit hole. I fell down it, in my investigation of immunization, and fell and fell, finding that it was much deeper than I anticipated. Like Alice, I fell past shelves full of books, more than I could ever read. Like Alice, I arrived at locked doors. “Drink me,” I was commanded by one source. “Eat me,” I was told by another. They had opposite effects—I grew and shrank, I believed and did not believe. I cried and then found myself swimming in my own tears.
Early in my research, I read an article about three cases of potential vaccine injury that had been winding their way through the courts for the past seven years and had finally been decided. These three cases had been selected as the strongest of more than five thousand similar cases brought to a special division of the US Court of Federal Claims popularly known as “Vaccine Court” and were serving as test cases to determine whether autism could be considered a vaccine injury.
The burden of proof in Vaccine Court is relatively light, and cases are heard by specially appointed attorneys who use “more probable than not” as their guideline for judgment, or as one of these special masters put it, “fifty percent plus a feather.” Even so, the evidence that vaccination led to autism was insufficient in all three test cases. And the evidence against it was, as one special master would note, “overwhelming.” In her decision on Colten Snyder v. HHS, Special Master Denise Vowell wrote, “To conclude that Colten’s condition was the result of his MMR vaccine, an objective observer would have to emulate Lewis Carroll’s White Queen and be able to believe six impossible (or at least highly improbable) things before breakfast.”
The problem, of course, is that believing highly improbable things is something we all do before breakfast. What makes science thrilling is the suggestion that improbable things are indeed possible. The idea, for instance, that pus from a sick cow can be scraped into a wound on a person and make that person immune to a deadly disease is almost as hard to believe now as it was in 1796. When we engage with science, we are in Wonderland. This seems as true for scientists as it is for lay people. But the difference for those of us who are not scientists is that, as with other news, what gets reported back to us most often from the land of science is that which supports our existing fears.
In the years since I became pregnant with my son, I have read about studies suggesting a link between autism and a family’s proximity to a freeway, the mother’s use of antidepressants, the father’s age at conception, and the mother’s infection with influenza during pregnancy. But none of these have enjoyed the kind of press devoted to one small, inconclusive study that suggested a link between vaccination and autism. “We live in a media culture,” the writer Maria Popova observes, “that warps seeds of scientific understanding into sensationalist, definitive headlines about the gene for obesity or language or homosexuality and maps where, precisely, love or fear or the appreciation of Jane Austen is located in the brain—even though we know that it isn’t the clinging to answers but the embracing of ignorance that drives science.”
Overwhelmed by information in my research on vaccination, I began to notice that information itself is overwhelmed at times. While searching for the source of a rumor that the H1N1 vaccine contained squalene, I found dozens of websites and blogs with relevant articles, but they were all the same article, “Squalene: The Swine Flu Vaccine’s Dirty Little Secret Exposed,” written by the physician Joseph Mercola and originally self-published on his website. The reproductions of Mercola’s article that proliferated across the Web early in the pandemic were then, and still remain, uncorrected. But by the time I traced them to the version on his website in the fall of 2009, the original article already included a correction in the header clarifying that none of the H1N1 vaccines distributed in the United States contained squalene. This was not a minor point of correction, but the article had gone viral before being corrected. Like a virus, it had replicated itself repeatedly, overwhelming more credible information about the vaccine.
For centuries before the word virus was first used to describe a specific type of microorganism, it was used more generally for anything that spread disease—pus, air, even paper. Now a bit of computer code or the content of a website can be viral. But, as with the kind of virus that infects humans, this content cannot reproduce without hosts.
Misinformation that finds a host enjoys a kind of immortality on the Internet, where it becomes undead. When I asked some other mothers to share with me the information on which they had based their vaccination decisions, one of the first articles I was sent was “Deadly Immunity” by Robert F. Kennedy Jr. It had been published in Rolling Stone magazine and online by Salon, where it had accumulated, by the time I read it, five significant factual corrections. A year later, Salon retracted it entirely. This unusual decision, the editor explained, was made in part because the story was flawed not just in its facts, but also in its logic, which was more difficult to correct. A former editor was critical of the retraction, noting that removing the article from Salon’s site did not make it unavailable—it is hosted on a number of other sites—but erased the only version of the article that had been corrected.
Science is, as scientists like to say, “self-correcting,” meaning that errors in preliminary studies are, ideally, revealed by subsequent studies. One of the primary principles of the scientific method is that the results of a study must be reproducible. Until the results of a small study are duplicated by a larger study, they are little more than a suggestion for further research. Most studies are not especially meaningful on their own, but gain or lose meaning from the work that has been done around them. And, as the medical researcher John Ioannidis has observed, “most published research findings are false.” The reasons for this are many, and include bias, study size, study design, and the very questions the researcher is asking. This does not mean that published research should be disregarded, but that, as Ioannidis concludes, “What matters is the totality of the evidence.”
Thinking of our knowledge as a body suggests the harm that can be done when one part of that body is torn from its context. Quite a bit of this sort of dismemberment goes on in discussions about vaccination, where individual studies are often used to support positions or ideas that are not supported by the body as a whole. “Any science may be likened to a river,” proposes the biologist Carl Swanson. “It has its obscure and unpretentious beginning; its quiet stretches as well as its rapids, its periods of drought as well as of fullness. It gathers momentum with the work of many investigators and as it is fed by other streams of thought; it is deepened and broadened by the concepts and generalizations that are gradually evolved.”
When one is investigating scientific evidence, one must consider the full body of information, or survey the full body of water. And if the body is large, this becomes an impossible task for a single person. A committee of eighteen medical experts took two years, for instance, to examine 12,000 peer-reviewed articles in order to prepare their 2011 report on vaccine side effects for the Institute of Medicine. The committee included a specialist in research methods, an expert on autoimmune disorders, a medical ethicist, an authority on childhood immune responses, a child neurologist, and a researcher dedicated to studying brain development. In addition to confirming the relative safety of vaccines, their report illustrated the kind of collaboration required to navigate the information now available to us. We do not know alone.
Dracula was published in 1897, at a time when education reform in Britain had led to unprecedented literacy rates. Information was moving in new ways, reaching people it had not reached before. This was also a time in which new technologies were rapidly emerging and changing the way people lived. A time, in other words, not unlike our own.
Many new inventions of the day are featured in Dracula, including the typewriter. The setting of the novel is “nineteenth century up-to-date with a vengeance,” as one of the characters observes, before adding ominously, “And yet, unless my senses deceive me, the old centuries had, and have, powers of their own which mere ‘modernity’ cannot kill.” The heroine of Dracula is a working woman who types her own diary and transcribes a number of other documents that become, collectively, the novel. The extent to which its plot depends on the typewriter suggests that this book is, in part, about technologies for reproducing information. Bram Stoker seems optimistic about these technologies, in that they contribute to the triumph of good over evil. But anxieties over the uncertainties of modern life drive the plot of the novel and, as an 1897 review noted, the vampire is ultimately slain in a medieval fashion—an Englishman beheads him while an American plunges a Bowie knife through his heart.
Dracula has no single narrator. The story unfolds through a collection of diary entries, letters, and newspaper articles. Each of these documents records the observations of a person who has witnessed something of Dracula’s doings, and only by putting these observations together does enough evidence emerge for the central characters to conclude that they are dealing with a vampire. Very early in the book a character notes in his diary, after meeting Dracula for the first time, that his hand is cold, “more like the hand of a dead than a living man,” but Dracula will not be exposed as undead until much later. The reader, having access to all the documents, inevitably understands what is going on long before anyone else does.
The vampire hunters refer to their growing collection of documents often, as if their observations would not exist without them. “One can see the insistence throughout the text,” the literary critic Allan Johnson writes, “of the fundamental value of recorded, empirical knowledge in the fight against the mysterious unknown.” Dracula is the unknown, as much as he is disease. The novel asks: How do we know what we know? This is a question intended to unsettle the reader, and over a century later it is still an unsettling question.
Just before he leaves London, Count Dracula takes revenge on his pursuers by throwing their original documents, the diaries and letters and recordings they have been keeping of their observations, into a fire. All that is left is a typewritten copy of those documents, which we are to understand is the book we have just read. Because it is a copy and not an original, it is, as one character observes in his final note at the end of the book, not to be believed. “We could hardly ask anyone, even did we wish to,” he writes, “to accept these as proofs of so wild a story.”
Knowledge is, by its nature, always incomplete. “A scientist is never certain,” the physicist Richard Feynman reminds us. And neither, the poet John Keats would argue, is a poet. “Negative capability” was his term for the ability to dwell in uncertainty. My mother, a poet, has been instilling this ability in me since I was a child. “You have to erase yourself,” she says, meaning abandon what I think I know. Or “live the questions,” as Rainer Maria Rilke writes in his Letters to a Young Poet. This, my mother reminds me, is as essential to mothering as it is to poetry—we must live the questions our children raise for us.