“But I'm saying all this based on what I believe, which is that medicine is a fundamental social good that should be part of the social fabric of our society, and that our society should be providing doctors and universal health care.”
It's late at night in August of 2000, and Jerry and I are sitting in his home in Guilford after we've spent a day together at the hospital and his clinic, only three weeks after he has come back from an international AIDS conference in Durban, South Africa, a city to which he will return in a year, during a sabbatical from Yale, in order to collaborate with health-care workers there in setting up AIDS prevention and treatment programs.
“But everything I'm saying, you know,” he says, “is based on what I've come to see as the central medical issue of our time, whether in our country, or in Africa: the cruel disparities in access to prevention and treatment.”
The traditional emphasis on the individual patient and the individual practitioner did not begin to decline only when my friends started their medical training in the late 1950s and early 1960s, or with the coming of managed care in more recent years; it began to be modified as early as the nineteenth century, at a time when the hospital was assuming new and increasing importance.
With the emergence of hospital medicine, the attention of doctors started to shift from the diagnosis and treatment of complexes of symptoms in individual sufferers to the diagnosis and classification of “cases.” The focus then shifted further, from the “‘sick-man’ as a small world unto himself” to the patient “as a collection of synchronized organs, each with a specialized function.”
Stanley Jackson's summary of subsequent developments helps place current beliefs and practices in perspective. After laboratory medicine “began to take shape in German universities in the latter half of the nineteenth century,” he writes, “the theories and techniques of the physical sciences were introduced into the study of living organisms, and experimental physiology flourished.
Cell theory entered the scene. The microscope and staining techniques brought to histology a new importance. Bacteriological investigations brought new perspectives on disease and led to new modes of therapeutic investigation. And clinical diagnosis was gradually reorganized around various ‘chemical tests of body substances designed to identify morbid physiological processes.’ Attention gradually shifted away from the sick person to the case and then to the cell. Gradually the ‘distance’ between the sick person and the physician increased, even when they were face to face. Or perhaps more accurately, the patient was gradually depersonalized in the doctor-patient relationship, and, all too often, the physician related to him or her more as an object than a sufferer.”
Such changes, by the beginning of the twentieth century, came to be associated with the notion of “scientific medicine,” wherein the “scientific” and “objective” treatment of patients (treatment based on data gathered in exams, from machines, and from laboratory tests, and, later on, from randomized controlled studies and evidence-based medicine) became the respected and prevalent mode for the practicing physician.
“As the scientific mode of gathering information, reaching a diagnosis, and planning a treatment increasingly took center stage in the clinical world,” Jackson writes, “a humanistic mode of knowing patients, relating to them personally, and working with them as suffering persons often became less valued.”
American medical schools, in fact, still seem dominated by reforms recommended nearly one hundred years ago by Abraham Flexner in the report of 1910 that bears his name. After reviewing ways doctors were educated in European universities, Flexner recommended that education in American medical schools start with a strong foundation in basic sciences, followed by the study of clinical medicine in a hospital environment that encouraged critical thinking, and, especially, research.
The appearance, rise, and hegemony of clinical academic departments (a development that began in the late nineteenth century, that came about in large part because of the Flexner Report, and that accelerated after World War II) resulted, between the two world wars, in the emergence of what we know as “clinical science,” wherein experimentation on patients or laboratory animals derived directly from the problems doctors encountered at the patient's bedside.
“In effect,” David Weatherall comments, this “set the scene for the appearance of modern high-technology medical practice.”
“Those who criticize modern methods of teaching doctors,” he continues, “in particular, [a] Cartesian approach to the study of human biology and disease, believe that the organization of clinical departments along Flexner's lines may have done much to concentrate their minds on diseases rather than on those who suffer from them.”
In Time to Heal, the second volume of his two-volume history of American medical education, Kenneth Ludmerer notes how the ascendancy of molecular biology in the 1970s and 1980s transformed biomedical research, especially in the fields of molecular biology and molecular disease, cell biology, immunobiology, and neuroscience, and created “a new theoretical underpinning of medical knowledge” wherein “the gaze of investigators focused on ever smaller particles, such as genes, proteins, viruses, antibodies, and membrane receptors.”
Although the results were often “gratifying in terms of medical discovery,” Ludmerer writes, “for the first time a conspicuous separation of functions occurred between clinical research on one hand and patient care and clinical education on the other.”
Many clinical departments established discrete faculty tracks: an academic track, pursued by “physician-scientists” (formerly called “clinical investigators”), and a “clinician-teacher,” or “clinical-scholar” track, pursued by those whose interests lay primarily in teaching and patient care. The result, according to Ludmerer: “the growing estrangement between medical science and medical practice.”
In addition, he submits, the premium put on speed and high productivity in academic hospitals (called “throughput”), a direct result of fiscal measures derived from managed-care policies, “carried negative implications” for the education of medical students.
“Habits of thoroughness, attentiveness to detail, questioning, listening, thinking, and caring were difficult if not impossible to instill when both patient care and teaching were conducted in an eight- or ten-minute office visit,” Ludmerer explains. Few medical students “were likely to conclude that these sacrosanct qualities were important when they failed to observe them in their teachers and role models.”
In addition, Ludmerer shows, medical education began “to revert to the corporate form it had occupied before the Flexnerian revolution” and “a money standard [started] to replace a university standard.”
The greatest difficulties medical schools experienced in the 1990s were in receiving payment for time, yet “time remained the most fundamental ingredient of the rich educational environment that academic health centers had always been expected to provide,”
Ludmerer explains. “Without time, instructors could not properly teach, students and residents could not effectively learn, and investigators could not study problems.”
“Medicine is still a guild really,” Jerry says, “and it possesses many aspects of a guild. Mentoring and the passing down of traditions and expertise from one generation to the next are central because it's the way you become socialized into the profession. Unfortunately, we don't have enough strong role models these days to exemplify the best traditions in medicine, and this is due in large part, it seems to me, to the fact that so much of medical education is now dominated by basic science and new technologies.”
Like Rich and Phil, Jerry believes that many of the deficiencies in the practice of medicine today derive from the kind of education common to most medical schools: two or three years of basic sciences, followed by pathology, and a further few years in which students acquire clinical skills by working on the wards of large teaching hospitals.
“In the early stages of medical education, you learn the bricks and mortar of physiology, anatomy, biochemistry, and microbiology,” Jerry says, “and all that is very, very important. Given the paths most of our careers will take, however, we probably learn much more than we need to know, and to the neglect of other essential elements of our profession.”
“Is this the best way to train a doctor?” Weatherall asks in his study of medical education, and he focuses on the questions my friends ask: “Do two years spent in the company of cadavers provide the best introduction to a professional lifetime spent communicating with sick people and their families? Does a long course of pathology, with its emphasis on diseased organs, and exposure to the esoteric diseases that fill the wards of many of our teaching hospitals, prepare students for the very different spectrum of illness they will encounter in the real world?
And is the protracted study of the ‘harder’ basic biological sciences, to the detriment of topics like psychology and sociology, the best way to introduce a future doctor to human aspects of clinical practice?”
“Because most of us were trained since World War II in an era of
antibiotics and other interventions,” Jerry explains, “most doctors have come to believe they can cure most diseases. Certainly we were taught to believe that about my own specialty, infectious disease, where we saw that the administration of an appropriately chosen antibiotic could remarkably reverse the course of a virulent illness.
“But it turns out that most serious illness now is the result of chronic and not acute disease, and therefore not amenable to technical interventions.”
Numerous studies validate Jerry's statement. Most of the conditions that afflict us as we age (heart disease, cancer, diabetes, depression, arthritis, stroke, Alzheimer's, and so on) are chronic conditions that require long-term, often lifetime management. But though we now live in an era of chronic disease, our system of medical education, as well as our system of health-care financing and delivery, continues to be based upon an acute disease model, and this fact, I begin to understand, is at the core of many of our health-care problems.
“The contemporary disarray in health affairs in the United States,” argues Daniel Fox, president of the Milbank Memorial Fund, a foundation that engages in analysis, study, and research on issues in health policy, “is a result of history. It is the cumulative result of inattention to chronic disabling illness.
“Contrary to what most peopleâeven most expertsâbelieve,” he continues, “deaths from chronic disease began to exceed deaths from acute infections [more than] three-quarters of a century ago. But U.S. policy, and therefore the institutions of the health sector, failed to respond adequately to that increasing burden.”
Fox explains: “Leaders in government, business, and health affairs remain committed to policy priorities that have long been obsolete. Many of our most vexing problems in health care (soaring hospital and medical costs; limited insurance coverage, or no coverage at all, for managing chronic conditions; and the scarcity of primary care relative to specialized medical services) are the result of this failure to confront unpleasant facts.”
According to a report issued by the Robert Wood Johnson Foundation (Chronic Care in America: A 21st Century Challenge), approximately 105 million Americans now suffer from chronic conditions, and by the year 2030, largely because of the aging of our population, this number will rise to nearly 150 million, 42 million of whom will be limited in their ability to go to school, to work, or to live independently.
The report also notes that the question of how to provide adequately for people with chronic conditions has significant implications not just for our general well-being, but for national health-care expenditures. We currently spend $470 billion (calculated in 1990 dollars) on the direct costs of medical services for people with chronic conditions; by 2030 it is estimated we will be spending $798 billion.
(In 2001, the Institute of Medicine reported that 46 percent of the U.S. population had one or more chronic illnesses, and that 75 percent of direct medical expenses went for the care of patients with chronic illnesses.)
These figures, however, represent only medical services, whereas treatment and care for people with chronic conditions require a multitude of non-medical services, from installing bathtub railings and finding supportive housing, to helping with basic activities such as shopping, cleaning, and cooking. In addition, the report emphasizes, “the best ways to provide these services often are not by medical specialists or in medical institutions. In fact, the services that keep people independent for as long as possible are frequently those that emphasize assistance and caring, not curing.”
For the millions of people who require help with everyday activities, the assistance of family and friends is indispensable. In 1990, for example, 83 percent of persons under age sixty-five with chronic disabilities, and 73 percent of disabled persons over sixty-five, relied exclusively on these informal caregivers. Yet even as the number of people with chronic conditions is rising, the number of caregivers is falling. Whereas in 1970 there were twenty-one “potential caregivers” (defined as people age fifty to sixty-four) for each very elderly person (age eighty-five or older) and in 1990 eleven potential caregivers for each very elderly person, by 2030 there will be only six such potential caregivers for each very elderly person.
Moreover, most doctors work in community settings, not in hospitals or clinics, and in helping their patients manage chronic conditions they rely on the knowledge and experience they acquired in medical school. Yet their medical school experience has taken place almost entirely in hospitals, and has consisted largely of work with patients who suffer from acute conditions.