1493: Uncovering the New World Columbus Created
Charles C. Mann

In time Carolina grew famous as a slave importer, a place where the slave ships arrived from Africa and the captives, dazed and sick, were hustled to auction. But for its first four decades the colony was mainly a slave exporter—the place from which captive Indians were sent to the Caribbean, Virginia, New York, and Massachusetts. Data on Indian shipments are scarce, because colonists, wanting to avoid taxes and regulations, shipped them on small vessels and kept few records. (The big slaving companies in Europe didn’t have this choice.) From the fragmentary evidence, Gallay has estimated that Carolina merchants bought between thirty and fifty thousand captive Indians between 1670 and 1720. Most of these must have been exported, given the much lower number found by the Carolina census. In the same period, ships in Charleston unloaded only 2,450 Africans (though some came overland from Virginia).

Here notice a striking geographical coincidence. By 1700, English colonies were studded along the Atlantic shore from what would become Maine to what would become South Carolina. Northern colonies coexisted with Algonkian-speaking Indian societies that had few slaves and little interest in buying and selling captives; southern colonies coexisted with former Mississippian societies with many slaves and considerable experience in trading them. Roughly speaking, the boundary between these two types of society was Chesapeake Bay, not far from what would become the boundary between slave and non-slave states in the United States. Did the proximity of Indian societies with slaves to sell help grease the skids for what would become African slavery in the South? Was the terrible conflict of the U.S. Civil War a partial reflection of a centuries-old native cultural divide? The implication is speculative, but not, it seems to me, unreasonable.

In any case, the Indian slave trade was immensely profitable—and very short-lived. By 1715 it had almost vanished, a victim in part of its own success. As Carolina’s elite requested more and more slave raids, the Southeast became engulfed in warfare, destabilizing all sides. Victimized Indian groups acquired guns and attacked Carolina in a series of wars that the colony barely survived. Working in groups, Indian slaves proved to be unreliable, even dangerous employees who used their knowledge of the terrain against their owners. Rhode Island denounced the “conspiracies, insurrections, rapes, thefts and other execrable crimes” committed by captive Indian laborers, and banned their import. So did Pennsylvania, Connecticut, Massachusetts, and New Hampshire. The Massachusetts law went out of its way to excoriate the “malicious, surly and revengeful” Indian slaves.

The worst problem, though, was something else. As in Virginia, malaria came to Carolina. At first the English had extolled the colony’s salubrious climate. Carolina, one visitor wrote, has “no Distempers either Epidemical or Mortal”; colonists’ children had “Sound Constitutions, and fresh ruddy Complexions.” The colonists decided to use the warm climate to grow rice, then scarce in England. Soon after came reports of “fevar and ague”—rice paddies are notorious mosquito havens. Falciparum had entered the scene, accompanied a few years later by yellow fever. Cemeteries quickly filled. In some parishes, more than three out of four colonists’ children perished before the age of twenty. As in Virginia, almost half of the deaths occurred in the fall. (One German visitor’s summary: “in the spring a paradise, in the summer a hell, and in the autumn a hospital.”)

Unfortunately, Indians were just as prone to malaria as English indentured servants—and more vulnerable to other diseases. Native people died in ghastly numbers across the entire Southeast. Struck doubly by disease and slave raids, the Chickasaw lost almost half their population between 1685 and 1715. The Quapaw (Arkansas) fell from thousands to fewer than two hundred in about the same period. Other groups vanished completely—the last few dozen Chakchiuma were absorbed by the Choctaw. The Creek grew to power by becoming, in the phrase of one writer, “the receptacle for all distressed tribes.” It was God’s will, Carolina’s former governor observed in 1707, “to send unusual Sicknesses” to the Westo Indians, “to lessen their numbers; so that the English, in comparison to the Spaniard, have but little Indian Blood to answer for.”

Naturally, the colonists looked for a different solution to their labor needs—one less vulnerable to disease than European servants or Indian slaves.

VILLA PLASMODIA

Like other cells, red blood cells are covered by a surface membrane made up of proteins, the long, chain-like molecules that are the principal constituents of our bodies. One of these proteins is the Duffy antigen. (The name comes from the patient on whose blood cells the protein was first discovered; an “antigen” is a substance recognized by the immune system.) The Duffy antigen’s main function is to serve as a “receptor” for several small chemical compounds that direct the actions of the cell. The compounds plug into the receptor—think of a spaceship docking at a space station, scientists say—and use it as a portal to enter the cell.

The Duffy antigen is not especially important to red blood cells. Nonetheless, researchers have written hundreds of papers about it. The reason is that Plasmodium vivax also uses the Duffy antigen as a receptor. Like a burglar with a copy of the front-door key, it inserts itself into the Duffy antigen, fooling the blood cell into thinking it is one of the intended compounds and thereby gaining entrance.

Duffy’s role was discovered in the early 1970s by Louis H. Miller and his collaborators at the National Institutes of Health’s Laboratory of Parasitic Diseases. To nail down the proof, Miller’s team asked seventeen men, all volunteers, to put their arms into boxes full of mosquitoes. The insects were chockablock with Plasmodium vivax. Each man was bitten dozens of times—enough to catch malaria many times over. Twelve of the men came down with the disease. (The researchers quickly treated them.) The other five had not a trace of the parasite in their blood. Their red blood cells lacked the Duffy antigen—they were “Duffy negative,” in the jargon—and the parasite couldn’t find its way inside.

The volunteers included both Caucasians and African Americans. Every Caucasian came down with malaria. Every man who didn’t get malaria was a Duffy-negative African American. This was no coincidence. About 97 percent of the people in West and Central Africa are Duffy negative, and hence immune to vivax malaria.

Duffy negativity is an example of inherited immunity, available only to people with particular genetic makeups. Another, more famous example is sickle-cell anemia, in which a small genetic change ends up deforming the red blood cell, making it unusable to the parasite but also less functional as a blood cell. Sickle-cell is less effective as a preventive than Duffy negativity—it provides partial immunity from falciparum malaria, the deadlier of the two main malaria types, but its disabling of red blood cells also leads many of its carriers to an early grave.

Both types of inherited immunity differ from acquired immunity, which is granted to anyone who survives a bout of malaria, in much the way that children who contract chicken pox or measles are thereafter protected against those diseases. Unlike the acquired immunity to chicken pox, though, acquired malaria immunity is partial; people who survive vivax or falciparum acquire immunity only to a particular strain of vivax or falciparum, and another strain can readily lay them low. The only way to gain widespread immunity is to get sick repeatedly with different strains.

Inherited malaria resistance occurs in many parts of the world, but the peoples of West and Central Africa have more than anyone else—they are almost completely immune to vivax, and (speaking crudely) about half-resistant to falciparum. Add in high levels of acquired resistance from repeated childhood exposure, and adult West and Central Africans were and are less susceptible to malaria than anyone else on earth. Biology enters history when one realizes that almost all of the slaves ferried to the Americas came from West and Central Africa. In vivax-ridden Virginia and Carolina, they were more likely to survive and produce children than English colonists. Biologically speaking, they were fitter, which is another way of saying that in these places they were—loaded words!—genetically superior.

Racial theorists of the last century claimed that genetic superiority led to social superiority. What happened to Africans illustrates, if nothing else, the pitfalls of this glib argument. Rather than gaining an edge from their biological assets, West Africans saw them converted through greed and callousness into social deficits. Their immunity became a wellspring for their enslavement.

How did this happen? Recall that vivax, covertly transported in English bodies, crossed the Atlantic early: certainly by the 1650s, given the many descriptions of tertian fever, and quite possibly before. Recall, too, that by the 1670s Virginia colonists had learned how to improve the odds of survival; seasoning deaths had fallen to 10 percent or lower. But in the next decade the death rate went up again—a sign, according to the historians Darrett and Anita Rutman, of the arrival of falciparum. Falciparum, more temperature-sensitive than vivax, never thrived in England and thus was almost certainly ferried over the ocean inside the first African slaves.

Falciparum created a distinctive pattern. Africans in Chesapeake Bay tended to die more often than Europeans in winter and spring—the result, the Rutmans suggested, of bad nutrition and shelter, as well as unfamiliarity with ice and snow. But the African and European mortality curves crossed between August and November, when malaria, contracted during the high mosquito season of early summer, reaches its apex. During those months masters were much more likely to perish than slaves—so much more that the overall death rate for Europeans was much higher than that for Africans. Much the same occurred in the Carolinas. Africans there, too, died at high rates, battered by tuberculosis, influenza, dysentery, and human brutality. Many fell to malaria, as their fellows brought Plasmodium strains they had not previously encountered. But they did not die as fast as Europeans.

Because no colonies kept accurate records, exact comparative death rates cannot be ascertained. But one can get some idea by looking at another continent with endemic malaria that Europe tried to conquer: Africa. (The idea that one can compare malaria rates in places separated by the Atlantic Ocean is in itself a mark of the era we live in, the Homogenocene.) Philip Curtin, one of slavery’s most important historians, burrowed in British records to find out what happened to British soldiers in places like Nigeria and Namibia. The figures were amazing: nineteenth-century parliamentary reports on British soldiers in West Africa concluded that disease killed between 48 percent and 67 percent of them every year. The rate for African troops in the same place, by contrast, was about 3 percent, an order-of-magnitude difference. African diseases slew so many Europeans, Curtin discovered, that slave ships often lost proportionately more white crewmen than black slaves—this despite the horrendous conditions belowdecks, where slaves were chained in their own excrement. To forestall losses, European slavers hired African crews.

The disparity between European and African death rates in the colonial Americas was smaller, because many diseases killed Europeans in Africa, not just malaria and yellow fever. But a British survey at about the same time as the parliamentary report indicated that African survival rates in the Lesser Antilles (the southern arc of islands in the Caribbean) were more than three times those of Europeans. The comparison may understate the disparity; some of those islands had little malaria. It seems plausible to say that in the American falciparum and yellow fever zone the English were, compared to Africans, somewhere between three and ten times more likely to die in the first year.

For Europeans, the economic logic was hard to ignore. If they wanted to grow tobacco, rice, or sugar, they were better off using African slaves than European indentured servants or Indian slaves. “Assuming that the cost of maintaining each was about equal,” Curtin concluded, “the slave was preferable at anything up to three times the price of the European.”
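Curtin’s three-to-one figure can be read as simple break-even arithmetic. The sketch below is illustrative, not Curtin’s own working: it assumes, as he stipulates, equal maintenance costs, and it plugs in hypothetical first-year survival probabilities drawn from within the three-to-tenfold mortality gap estimated above. A buyer indifferent between the two kinds of laborer should pay prices in proportion to expected surviving labor:

\[
\frac{P_{\text{African}}}{P_{\text{European}}} \approx \frac{p_{\text{African}}}{p_{\text{European}}} = \frac{0.9}{0.3} = 3
\]

where P is the purchase price and p the probability of surviving the first year. On numbers like these, the African slave remained the economical choice at anything up to three times the European’s price—which is Curtin’s conclusion.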

Slavery and falciparum thrived together. Practically speaking, P. falciparum could not establish itself for long in Atlantic City, New Jersey; the average daily minimum temperature there is above 66 degrees, the threshold for the parasite, for only a few weeks per year. But in Washington, D.C., just 120 miles south, slightly warmer temperatures let it become a menace every fall. (Not for nothing is Washington called the most northern of southern cities!) Between these two cities runs the Pennsylvania-Maryland border, famously surveyed by Charles Mason and Jeremiah Dixon in the 1760s. The Mason-Dixon Line roughly split the East Coast into two zones, one in which falciparum malaria was an endemic threat, and one in which it was not. It also marked the border between areas in which African slavery was a dominant institution and areas in which it was not (and, roughly, the division between indigenous slave and non-slave societies). The line delineates a cultural boundary between Yankee and Dixie that is one of the most enduring divisions in American culture. An immediate question is whether all of these are associated with each other.

For decades an influential group of historians argued that southern culture was formed in the cradle of its great plantations—the sweeping estates epitomized, at least for outsiders, by Tara in the movie Gone with the Wind. The plantation, they said, was an archetype, a standard, a template; it was central to the South’s vision of itself. Later historians criticized this view. Big colonial plantations existed in numbers only in the southern Chesapeake Bay and the low country around Charleston. Strikingly, these were the two most malarial areas in the British colonies. Sweeping drainage projects eliminated Virginia’s malaria in the 1920s, but coastal South Carolina had one of the nation’s worst Plasmodium problems for another two decades. From this perspective, the movie’s Tara seems an ideal residence for malaria country: atop a hill, surrounded by wide, smooth, manicured lawns, its tall windows open to the wind. Every element is as if designed to avoid Anopheles quadrimaculatus, which thrives in low, irregular, partly shaded ground and still air. Is the association between malaria and this Villa Plasmodia style a coincidence? It seems foolish to rule out the possibility of a link.
