I was, in other words, at risk of exactly what Peter Kramer, in Against Depression, warns about: mistaking mental illness for clarity. Of course, I had good company. Not only Job, with his conviction that a life in which everything you’ve lived for can be taken away so easily is not what it’s cracked up to be, but a whole pantheon of unacknowledged legislators—like William James, who, in his Varieties of Religious Experience, put it this way:

The normal process of life contains moments as bad as any of those which insane melancholy is filled with, moments in which radical evil gets its innings and takes its solid turn. The lunatic’s visions of horror are all drawn from the material of daily fact.
Psychiatrists like Kramer cut through the philosophical question of when the “normal process of life” becomes insanity with their assertion that two weeks of those moments is quite enough. As unsatisfying as this answer is—give me Yahweh in a thunderstorm any day; at least then the fact that the line is arbitrarily drawn by those with the power to do so is obvious—I thought it could work in my favor. When I showed up at Mass General, veteran of the dissatisfactions of a middle-class, middle-aged American life—nothing spectacular, but enough to keep me up nights and make me blue for a couple of weeks at a time on a regular basis—I figured that a diagnosis of minor depression was a sure thing.
I’ll tell you all about how wrong I was, and why, and what happened, in due time. First, though, let me get the credibility problem out of the way. Yes, I went to Mass General with an agenda. I figured my temperament and the American Psychiatric Association’s zeal to create a new disease were made for each other—and added up to a good opportunity to write about a little-explored region of the medical-industrial complex. Clinical trials are the pivot of the depression industry, the venue in which the drug companies get the government to guarantee the public that an antidepressant works and won’t hurt you. But in the bargain, they also confer legitimacy on the disease that the drug purports to treat. Every approval of an antidepressant also ratifies the claim that the disease it treats really exists.
Many diseases don’t need this kind of advertising. Whatever disease means—and this question is far from settled—no one will deny that cancer or malaria deserve the label. But sometimes the public has to be convinced. That’s why GlaxoSmithKline (GSK) once paid a doctor to say that an “uncontrollable urge to move [the] legs, or ‘creepy-crawly’ sensations in the legs…that often leads to sleep disruption” was actually a disease called restless legs syndrome.
RLS, said GSK, causes insomnia, marital discord, and poor job performance. This campaign was a transparent attempt to persuade people to think of their suffering as a disease, and the linchpin of the argument (and, of course, the reason GSK was bothering to make it) was that the drug Requip, which as a treatment for Parkinson’s disease had reaped disappointing profits, relieved RLS. If a medicine makes a problem better, this logic goes, then the problem must have been a disease to begin with, and its sufferers are entitled to all the benefits we bestow upon the sick: sympathy, research money, insurance coverage, and so on.
So a clinical trial of a treatment for a research diagnosis like minor depression is also a trial of the diagnosis itself. That’s why I went to Mass General: to see if I could catch a glimpse of the depression machinery as it cranked up to turn out a new model.
As motives go, mine was less straightforward than, say, wanting to benefit humankind or get myself cured. But neither was my visit to Mass General a repeat of one of the greatest pranks ever perpetrated on psychiatry: a study by David Rosenhan, a sociologist who in 1972 sent a cadre of his graduate students into various emergency rooms complaining, dishonestly, that they were hearing the word “thud” in their heads. The students were hospitalized, most of them for schizophrenia. Once there, they behaved normally, or what passes for normally among graduate sociology students. They read, asked questions, and took extensive notes, all of which was duly noted in their charts as more symptoms. When they were released, it was with the diagnosis of paranoid schizophrenia, in remission.
Rosenhan called the 1973 paper he published in Science about his caper “On Being Sane in Insane Places.” You can imagine how embarrassed psychiatrists were when the story hit the press that the One Flew over the Cuckoo’s Nest nightmare was true. The profession was already under siege, thanks to a society that for many reasons was beginning to suspect that mental illnesses weren’t real, but merely ways of pathologizing nonconformity. A cottage industry sprang up to rebut, denounce, and generally scream at Rosenhan. But no one took issue with his finding that it’s easy to get diagnosed with a mental illness and then much, much harder, if not impossible, to get undiagnosed. That part, as opposed to his allegedly shoddy ethics and research methods, was unassailable.
This was the best part of my going to Mass General: I didn’t have to lie. Not that I hadn’t thought about it. I do know the DSM-IV pretty well, certainly well enough to fake just about any psychiatric illness. But I couldn’t get myself to do it just for the sake of my writing career. So when I found out there was a study I thought I could get into by telling the truth, I jumped at the opportunity.
It wasn’t all ambition, however. The possibility that the trial drug, which was Celexa, might make me feel better—well, I can’t deny this was intriguing. I’ve got nothing against better living through chemistry. I’ve practiced my own amateur version of it for many years, in fact. And I’ve spent a couple of decades listening to patients (and friends) sing the praises of antidepressants and seeing the results up close and personal. It was enough to make me curious, and sometimes envious, especially when I was depressed. Sometimes I’d wonder if it wasn’t just stubbornness that stopped me from visiting a psychiatrist, some point of pride or a fear that I’d be sucked into the Prozac cult or forced to abandon some of my deepest convictions if the drug worked—reluctance that, whatever the explanation, felt insuperable, and, at times, depressing. The clinical trial gave me perfect cover from myself, a way to check out the drugs while maintaining that I was only doing research. Call it the Kinsey approach.
So when I got diagnosed with major depression instead of minor depression, I suppose I was only getting my comeuppance for trying to exploit the system, which was in turn glad to bestow a disease upon me, but not necessarily the one I wanted.
I didn’t get kicked out of Mass General. To the contrary, my doctor immediately gave me five major depression studies to choose from. But it was impossible to ignore the fact that after listening to my answers to his questions, a capable and compassionate doctor told me I had a serious mental illness—something wrong with my brain that was causing the trouble in my mind—and a much worse one than I had thought to begin with. I was the last person in the world that I would have expected to believe this. But as I’ll describe, the idea that my difficulties were an illness caused by biochemical imbalances grew on me during the trial—especially the part about the possibility that I could be cured of what I had long ago come to think of as myself.
But even before that happened, even as I walked to his office for the first time, it had dawned on me that this whole vast apparatus with its towers and pavilions arrayed like the castles of the Magic Kingdom, its maze of bustling streets—the doctors checking their watches, the patients, some wheeling IV stands down the sidewalk (one of them even sneaking a smoke as the liquid dripped into his veins), the family sitting crying on a bench—was a monument to one brilliant and magnificent idea: that our suffering is caused by diseases that can be cured by medicine. Well, actually, those are two ideas—that diseases exist in nature and that we can improve nature by finding the culprit and getting rid of it—and they seem, like all common sense, to be unassailable and timeless. They may even seem not to be ideas, but simple facts.
But they are ideas, invented by people rather than discovered in nature, and much newer ones than you might think. Indeed, the belief that we can turn what ails us into the target for a drug first appeared about 150 years ago and was not widely accepted until the early part of the twentieth century. Neither have illness and cure always been related in the way we usually think they are: that we identify diseases and then look for the remedy. Drug-driven diagnoses are not original to the depression industry, or for that matter to the restless legs syndrome industry. In fact, when it comes to the modern understanding of disease, the drugs have often come first.
Betty Twarog’s mussels weren’t the first mollusks to figure in the history of depression. Murex trunculus and Murex brandaris, two species of sea snails that litter the coasts of Italy and Asia Minor, beat her critters to the punch by a good two thousand years. Both varieties are four or five inches long, with pastel-colored bands spiraling up their shells. M. trunculus looks like an elephant’s head, its spiny shaft like a trunk, while M. brandaris features spikes that stick up like the points of a child’s jack. You might stoop to pick up a few of these specimens on a beachcombing walk, but they’re a little too subtle to be your prize find. Unless, that is, you happen to be the unnamed dog belonging to Heracles that, according to legend, took a mouthful of snails while on a walk with his master. The legend doesn’t say why Heracles was at the beach in Tyre (I suppose that even Greek half-god heroes need a vacation occasionally), but it does tell us that after the dog spit them out, its mouth had turned an extraordinary shade of purple.
It was left to the Phoenicians to figure out how to exploit this discovery by crushing, salting, and boiling the snails until they had extracted a dye that, according to Pliny the Elder, was “exactly the colour of clotted blood, and…of a blackish hue to the sight, but of a shining appearance when held up to the light.” It was laborious to produce—the mucus of thousands of snails was needed to color a single robe—but so glorious that it eventually fetched its weight in silver at ancient markets. Tyrian purple (also known as royal purple) became the color of kings and generals and nabobs.
And, in the late 1850s, of the ladies of Paris. Inspired, some would say inflamed, by the Empress Eugénie, wife of Napoleon III, whose haute couture graced the pages of the fashion magazines just then coming into production, Parisians couldn’t get enough of the scarlet-purple hue known to them as mauve. A little less red than the Phoenician original, mauve was nonetheless a gorgeous color and demand was high. “Mauve Measles,” as Punch called it, spread quickly across the Channel, leaving Englishwomen with a “measly rash of ribbons.”
The Parisian mauve came not from snail mucus but from bat guano and from certain lichens that also could stain fabric purple. These sources were plentiful and easier to refine than the snails, but supplies still had to be found and secured, harvested and processed. Variations in sunlight and soil conditions and other vagaries of nature could affect hue and quantity. None of this would necessarily have been a problem worth solving—after all, wasn’t this kind of inconsistency the way of the natural world?—if it weren’t for the Industrial Revolution, which was imposing a new expectation: that commodities, particularly consumer commodities, should be uniform and easily available and certainly not made out of bat poop or snail snot if it could at all be avoided. It was left to a kid to figure out how to meet this emerging market.
By the time William Perkin entered the City of London School in 1851, at the age of thirteen, he had already considered a number of careers: carpenter (his father’s trade), engineer, painter, musician. At his new school he took a shine to science and sought entry to Michael Faraday’s Saturday lectures about electricity, a request that was granted by the great man himself. But nothing caught his fancy like the twice-weekly chemistry lectures taught by Thomas Hall, his writing master. Soon young Perkin prevailed upon his father to allow him to set up a lab at home where he could explore the principles he was learning from Hall.
Chemistry barely existed as a scientific discipline in mid-nineteenth-century England, where it was associated with apothecaries and other charlatans. But Faraday, along with Prince Albert and other prominent Britons, saw the advances being made by chemists on the Continent, especially in Germany, and rounded up the money for the Royal College of Chemistry, which opened in 1845 with twenty-six students. Hall had attended the first classes there, and in 1853 he urged Perkin to enroll.
Perkin’s mentor at the Royal College was August Wilhelm von Hofmann, who had been recruited from Germany by Prince Albert himself. Hofmann had an interest that was perfect for demonstrating to a skeptical educational establishment the practical value of studying chemistry. He thought that natural substances could be synthesized in the lab so long as you started with materials that contained the elements that chemists were just identifying as the building blocks of the natural world: carbon, oxygen, hydrogen, sulfur, and nitrogen. Nature, according to Hofmann, assembled these atoms into molecules, and then into the substances of daily life, in much the same way that industrialists were assembling raw materials into finished products. Figure out how nature did this, and you could conceivably make anything it could make—and without all the bother of, say, gathering and boiling snails.
Hofmann was lucky to have this insight at a time when a rich supply of those basic elements was just becoming available—and on the cheap, as an unwanted by-product of industrialization: coal tar, the stinky residue of the process by which coal was refined into the gas that fueled London’s lamps.
An enterprising Scotsman, Charles Macintosh, figured out how to smear coal tar on textiles to make a rubbery waterproof cloth, and soon people were wearing macintoshes in the Glasgow rain. But no one knew what else to do with the stuff, so mostly it got dumped into streams, where it killed fish and made washday a nightmare.
Hofmann, however, thought he could extract value from the hydrocarbons, and he set Perkin to work on one of his pet projects: making quinine.