
 3

Evil Air

“EXTRACTIVE STATES”

In 1985 a bookseller in northeast Spain announced that he had possession of nine letters and reports by Cristóbal Colón, seven of them never seen before, including chronicles of all four of his American voyages. Later that year, Consuelo Varela and Juan Gil, editors of a definitive edition of the admiral’s writings, skeptically inspected the papers. Surprising their colleagues, Varela and Gil concluded that the manuscripts were handwritten copies of actual letters and reports by Colón—copies of the type routinely kept by wealthy people in the days before photocopiers. The Spanish government acquired the papers for an undisclosed sum; a facsimile edition was published in 1989. Nine years after that, an English translation appeared.

Because I am interested in Colón, I bought a copy of the translation when I spotted it in a used-book store. Part of a series the Italian state published to honor the five hundredth anniversary of his first voyage to the Americas, the book is a big, lush, cream-colored object that doesn’t fit on a standard bookshelf. Disappointingly for readers like me, Gil and Varela announced in the introduction that “these previously unknown texts do not present any spectacular revelations” about Colón’s life and character. But halfway through the newly revealed chronicle of the admiral’s second voyage I came across a curious detail—one that wasn’t in the fine biographies by Samuel Eliot Morison and Felipe Fernández-Armesto.

In the translation, Colón explains that after the expedition arrived at La Isabela “all my people went ashore to settle, and everyone realized it rained a lot. They became gravely ill from tertian fever.” Tertian fever, an old-fashioned term, refers to bouts of fever and chills that occur in a regular forty-eight-hour pattern—a day of sickness followed by a day of quiet, then a day of sickness as the pattern repeats (tertian, taken from the Latin for “three days,” derives from the Roman custom of counting time from the beginning of one period to the beginning of the next). Tertian fever is the fingerprint of the most important types of malaria, one of humankind’s most intractable scourges. Taken literally, Colón seemed to be saying that at La Isabela his men contracted malaria. No wonder the colonists didn’t want to work, I thought, and marked the passage with a pencil.

In 2002 Noble David Cook, a historian at Florida International University, in Miami, published an article entitled, alarmingly, “Sickness, Starvation, and Death in Early Hispaniola,” which detailed the island’s catastrophic history after Colón’s landing. Researchers generally agree that human malaria did not exist in the Americas before 1492 (some believe a kind of monkey malaria was present). If Colón’s men contracted malaria, Cook explained, they must have brought the disease with them from Spain, which like much of Europe then was rife with the disease. It was a textbook case of the Columbian Exchange, recorded by its progenitor himself.

Remembering the cream-colored book, I hauled it from my bookshelf and turned to the relevant passage. The original Spanish, printed on the facing page, didn’t use the Spanish words for malaria or tertian fever. Instead Colón wrote that his men had contracted something called çiçiones, a term I had never encountered. Why did Cook and the translator of Colón’s letter think this meant malaria?

Çiçiones is hard to find in modern Spanish dictionaries—I consulted the dozen or so in my local library without success. Google, too, was no help. Nor was Colón himself. He provided no description of the symptoms of çiçiones, perhaps because he believed they were familiar to his readers. All he said about the disease, in fact, was to guess that it was spread by the native women around La Isabela, “who are abundant there; and since they [that is, the women] were immodest and disheveled, it is no wonder that they [that is, the men] had trouble.” To me, this sounded like the admiral thought çiçiones was some kind of venereal disease.

But that doesn’t jibe with other sources, as I learned when I contacted an expert in sixteenth-century Spanish, Scott Sessions of Amherst College. The first dictionary of the Spanish language appeared in 1611, Sessions told me. In it is an entry for çiçiones: “the fever that comes with chills, which is attributed to the cierzo [mistral wind], because it is the most acute, cold and penetrating.” The next authoritative Spanish dictionary, issued in multiple volumes by the Royal Spanish Academy between 1726 and 1739, similarly defines çiçiones as “the fever that starts with chills, which from being acute and penetrating like the mistral wind, as [the first dictionary] says, one derives the word: but it more likely refers to tertian fever”—malaria. Cook and the translator, in other words, were correct: Colón may well have been describing malaria.

The scenario isn’t implausible. Malaria can lie dormant in the body for months, only to reemerge at full strength. The disease is transmitted by mosquitoes, which take in microscopic malaria parasites when they drink blood from infected people and pass them on to the next people they bite. Colón left on his second voyage in September 1493. If one of his crew had a malaria relapse after landing in La Isabela, only one bite from the right type of mosquito would be necessary to spread the disease—and those mosquitoes are abundant on Hispaniola.

All of this is highly speculative, to say the least. Today we know that many different diseases cause chills and fevers, including influenza and pneumonia. But for centuries people couldn’t distinguish one from another; they didn’t understand that malaria was a specific disease. Sessions, the Amherst historian, told me that paludismo, the Spanish word for malaria, didn’t appear in Royal Spanish Academy dictionaries until 1914. Even then, few realized that it was caused by a mosquito-borne parasite—the 1914 dictionary defined paludismo as a “group of deadly phenomena produced by marshy emanations.” (The English word “malaria” comes from the Italian mal aria, evil or bad air.) Colón was using a word that probably indicates malaria, in other words, but he could well have been describing ordinary chills and fever. A single word is not enough to make a diagnosis.

Yet the impossibility of finding definitive answers does not mean historians should stop seeking them—the question is too important. Despite a global eradication program that began in the 1950s, malaria is still responsible for unimaginable suffering: more than three-quarters of a million deaths per annum, the great majority of them children under the age of five. Every year about 225 million people contract the disease, which even with modern medical care can incapacitate for months. In Africa it afflicts so many people so often that economists believe it is a major drag on development; since 1965, according to one widely cited calculation, countries with high rates of malaria have had annual per capita growth rates 1.3 percent less than countries without malaria, enough to ensure that many of the former lost ground to the latter.
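To get a feel for how a 1.3 percent annual handicap compounds, consider a back-of-the-envelope calculation; the forty-five-year horizon, running roughly from 1965 to the time of writing, is my own assumption, as is holding every other factor equal. If the malaria-free country grows at rate $g$ and the malarial country at $g - 0.013$, the ratio of their incomes per person after forty-five years is

$$\left(\frac{1+g}{1+g-0.013}\right)^{45} \;\approx\; (1.013)^{45} \;\approx\; 1.79$$

for small $g$. An otherwise identical country without malaria, in other words, ends the period nearly 80 percent richer per person—ample room for the lost ground the calculation describes.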

As it does today, malaria played a huge role in the past—a role unlike that of other diseases, and arguably larger. When Europeans brought smallpox and influenza to the Americas, they set off epidemics: sudden outbursts that shot through Indian towns and villages, then faded. Malaria, by contrast, became endemic, an ever-present, debilitating presence in the landscape. Socially speaking, malaria—along with another mosquito-borne disease, yellow fever—turned the Americas upside down. Before these maladies arrived, the most thickly inhabited terrain north of Mexico was what is now the southeastern United States, and the wet forests of Mesoamerica and Amazonia held millions of people. After malaria and yellow fever, these previously salubrious areas became inhospitable. Their former inhabitants fled to safer lands; Europeans who moved into the emptied real estate often did not survive a year.

The high European mortality rates had long-lasting impacts, the Harvard and Massachusetts Institute of Technology economists Daron Acemoglu, Simon Johnson, and James A. Robinson have argued. Even today, the places where European colonists couldn’t survive are much poorer than places that Europeans found more healthful. The reason, the researchers said, is that the conquering newcomers established different institutions in disease zones than they did in healthier areas. Unable to create stable, populous colonies in malarial areas, Europeans founded what Acemoglu, Johnson, and Robinson called “extractive states,” the emblematic example being the ghastly Belgian Congo in Joseph Conrad’s Heart of Darkness, where a tiny cohort of high-collared Europeans forces a mass of chained, naked slaves, “shadows of disease and starvation,” to build a railroad to ship ivory from the interior.

Tobacco brought malaria to Virginia, indirectly but ineluctably, and from there it went north, south, and west, until much of North America was in its grip. Sugarcane, another overseas import, similarly brought the disease into the Caribbean and Latin America, along with its companion, yellow fever. Because both diseases killed European workers in American tobacco and sugar plantations, colonists imported labor in the form of captive Africans—the human wing of the Columbian Exchange. In sum: ecological introductions shaped an economic exchange, which in turn had political consequences that have endured to the present.

It would be an exaggeration to say that malaria and yellow fever were responsible for the slave trade, just as it would be an exaggeration to say that they explain why much of Latin America is still poor, or why the antebellum cotton plantations in Gone with the Wind sat atop great, sweeping lawns, or why Scotland joined England to form the United Kingdom, or why the weak, divided thirteen colonies won independence from mighty Great Britain in the Revolutionary War. But it would not be completely wrong, either.

SEASONING

Malaria is caused by the two hundred or so species in the genus Plasmodium, ancient microscopic parasites that plague countless types of reptile, bird, and mammal. Four of those two hundred species target humankind. They are dishearteningly good at their jobs.

Although the parasite consists of but a single cell, its life story is wildly complex; it changes outward appearance with the alacrity of characters in a Shakespearean comedy. From the human point of view, though, the critical fact is that it is injected into our flesh by mosquitoes. Once in the body, the parasite pries open red blood cells and climbs inside. (I am here skipping several intermediate steps.) Floating about the circulatory system like passengers in so many submarines, the parasites reproduce in huge numbers inside the cell. Eventually the burgeoning offspring burst out of the cell and pour into the bloodstream. Most of the new parasites subvert other red blood cells, but a few drift in the blood, waiting to be sucked up by a biting mosquito. When a mosquito takes in Plasmodium, the parasite reproduces yet again inside the insect, taking on a different form. The new parasites squirm into the mosquito’s salivary glands. From there the insect injects them into its next victim, beginning the cycle anew.

In the body, Plasmodium apparently uses biochemical signaling to synchronize its actions: most of the infected red blood cells release their parasites at about the same time. Victims experience these eruptions as huge, coordinated assaults—a single infection can generate ten billion new parasites. Overwhelmed by the deluge, the immune system sets off paroxysms of intense chills and fever. Eventually it beats back the attack, but within days a new assault occurs; some of the previous wave of parasites, which have hidden themselves inside red blood cells, have produced a new generation of Plasmodium, billions strong. The cycle repeats until the immune system at last fights off the parasite. Or seems to—Plasmodium cells can secrete themselves in other corners of the body, from which they emerge a few weeks later. Half a dozen episodes of chills and fever, a bit of respite, then another wave of attacks: the badge of full-blown malaria.

Single-celled Plasmodium parasites burst out of dying red blood cells, beginning the assault on the body that leads to full-blown malaria. (Photo credit 3.1)

If the suffering caused by malaria today is difficult to grasp, it is almost impossible to imagine what it was like when its cause was unknown and no effective treatments existed. One can get a hint by reading the accounts of victims like Samuel Jeake, a seventeenth-century merchant in southeast England, who doggedly recorded every skirmish in his decades-long war with what we now recognize as malaria. To pick an example almost at random, here is Jeake on February 6, 1692, near the end of one six-month bout, stoically recording that he had been “taken ill the Seventh time: with a Tertian Ague [fever]; about 3h p.m. it began, & was of the same nature with my last which I had all January, but this was the worst.”

Feb. 8: A 2d fit which took me earlier & was worse.

Feb. 10: About noon a 3d fit. which shook me about 3h p.m. a very bad fit & violent feaver.…

Feb. 12: Before noon, a 4th fit. with which I shook about 3h p.m. & then went to bed: where had a very violent Feaver; this being the worst fit of all: my breath very short; & delirious.…

Feb. 14: About noon, a 5th fit.…

Feb. 16: About 2h. p.m. a 6th fit, very little, or scarce sensible, but sweat much in the night. And it pleased God that this was the last fit.

The respite lasted just fifteen days.

Mar. 3: About 4h. p.m. Taken ill the Eighth time: of a Tertian ague, succeeded by a Feaver & sweat in the night.…

Mar. 5: About 3h p.m. A 2d fit; worse than the former.

The attacks stopped nine weeks later. But malaria was not done with Jeake. The parasite, a superbly canny creature, can hide in the liver for as long as five years, periodically emerging to produce full-blown malarial relapses. Six months later, Plasmodium again massed in his blood.

Tertian fever of the sort experienced by Jeake is the signature of Plasmodium vivax and Plasmodium falciparum, which cause the two most widespread types of malaria. Despite the similarity of the symptoms, the two Plasmodium species have different effects on the body. After inserting itself inside red blood cells, falciparum, unlike vivax, manages to alter them so that they stick to the walls of the tiny capillaries inside the kidneys, lungs, brain, and other organs. This hides the infected cells from the immune system but slowly cuts off circulation as the cells build up on the capillary walls like layers of paint on an old building. Untreated, the circulation stoppage leads to organ failure, which kills as many as one out of ten falciparum sufferers. Vivax doesn’t destroy organs, and thus is less deadly. But during its attacks sufferers are weak, stuporous, and anemic: ready prey for other diseases. With both species, sufferers are infectious while sick—mosquitoes that bite them can acquire the parasite—and can be sick for months.

Plasmodium, a tropical beast, is exquisitely sensitive to temperature. The speed at which the parasite reproduces and develops in the mosquito depends on the temperature of the mosquito, which in turn depends on the temperature outside (unlike mammals, insects cannot control their own internal temperature). As the days get colder, the parasite needs more and more time to develop, until it takes longer than the mosquito’s lifespan. Falciparum, the most deadly variety of malaria, is also the most temperature sensitive. Around 72°F it hits a threshold; the parasite needs three weeks at this temperature to reproduce, which approaches the life expectancy of its mosquito host; below about 66°F it effectively cannot survive. Vivax, less fussy, has a threshold of about 59°F.
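The interplay between development time and mosquito lifespan is essentially an arithmetic race, and a minimal Python sketch can make it concrete. To be clear about what is assumed: the three-week lifespan, the shape of the curve, and the six-degree gap between the quoted threshold and the cutoff are illustrative numbers chosen only to mirror the figures above—a toy model, not a published epidemiological one.

```python
# Illustrative sketch of the temperature logic described above: malaria can
# spread only if the parasite finishes developing inside the mosquito before
# the mosquito dies. Every number here is an assumption chosen to mirror the
# rough figures quoted in the text, not a measured parameter.

MOSQUITO_LIFESPAN_DAYS = 21  # assumed adult lifespan (~three weeks)

def development_days(temp_f: float, threshold_f: float) -> float:
    """Assumed toy model: development takes ~21 days at the quoted threshold
    temperature and stretches out as the air cools, becoming impossible
    about 6 degrees F below the threshold."""
    cutoff = threshold_f - 6.0  # e.g., falciparum: 72F threshold, ~66F cutoff
    if temp_f <= cutoff:
        return float("inf")  # too cold: the parasite cannot develop at all
    return 21.0 * (threshold_f - cutoff) / (temp_f - cutoff)

def transmissible(temp_f: float, threshold_f: float) -> bool:
    """Can the parasite beat the mosquito's lifespan at this temperature?"""
    return development_days(temp_f, threshold_f) <= MOSQUITO_LIFESPAN_DAYS

for temp in (58, 64, 70, 76, 82):
    print(f"{temp}F  falciparum: {transmissible(temp, 72)!s:5}"
          f"  vivax: {transmissible(temp, 59)}")
```

Run at a spread of temperatures, the sketch reproduces the pattern described in the next paragraph: falciparum transmission switches on only in genuinely warm weather, while vivax’s lower threshold lets it persist in much cooler places.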

Unsurprisingly, falciparum thrives in most of Africa but gained a foothold only in the warmest precincts of Europe: Greece, Italy, southern Spain, and Portugal. Vivax, by contrast, became endemic in much of Europe, including cooler places like the Netherlands, lower Scandinavia, and England. From the American point of view, falciparum came from Africa, and was spread by Africans, whereas vivax came from Europe, and was spread by Europeans—a difference with historic consequences.

Human malaria is transmitted solely by the Anopheles mosquito genus. In Jeake’s part of England the principal “vector,” as the transmitting organism is known, is a clutch of tightly related mosquito species known jointly as Anopheles maculipennis. The mosquito’s habitat centers on the coastal wetlands of the east and southeast: Lincolnshire, Norfolk, Suffolk, Essex, Kent, and Sussex counties. A. maculipennis—and the Plasmodium vivax it carries—seem to have been uncommon in England until the late sixteenth century, when Queen Elizabeth I began encouraging landlords to drain fens, marshes, and moors to create farmland. Much of this low, foggy terrain had been flooded regularly by the North Sea tides, which washed away mosquito larvae. Draining blocked the sea but left the land dotted with pockets of brackish water—perfect habitat for A. maculipennis. Farmers moved into the former marsh, still soggy but now usable. Their homes and barns, heated during cold weather, provided space for the mosquito—and the vivax parasites inside its body—to survive the winter, ready to breed and spread in the following spring.

As the British medical historian Mary Dobson has documented, draining the marshes set off an inferno of vivax malaria. Visitors to maculipennis habitat recoiled at the wretchedness they encountered. An all-too-typical sight, lamented the Kent writer Edward Hasted in 1798, was “a poor man, his wife, and whole family of five or six children, hovering over the fire in their hovel, shaking with an ague [fever], all at the same time.” Curates died in such numbers after being sent to coastal Essex, the writer John Aubrey remarked, that the area was known as “Killpriest.” Natives fared no better; babies born in the marshland, Hasted wrote, seldom “lived to the age of twenty-one.” Dobson recorded baptisms and burials in twenty-four wetland parishes. In the 1570s, before Queen Elizabeth drained the swamps, baptisms exceeded burials by 20 percent—the population was rising. Two decades later draining was in full swing, and burials outnumbered baptisms by almost a factor of two. Population boomed elsewhere in England, but these parishes didn’t return to their earlier growth rates for two centuries.1

“The marshes would have these bursts of mortality,” Dobson told me. “About every ten years they’d have a year in which 10 or 20 percent of the population would die. A few miles away, in higher ground, were some of the healthiest parts of England.” Inured to the cavalcade of suffering, residents viewed their circumstances with fatalistic cheer. (Readers of Charles Dickens will recall the stoicism of the fen-dwelling Gargerys in Great Expectations, raising the child Pip within a short walk of the “five little stone lozenges” that marked the resting places of his “five little brothers.”) Traveling in feverish Essex County, the writer Daniel Defoe met men who claimed to have “from five or six, to fourteen or fifteen wives.” Explaining how this was possible, one “merry fellow” told Defoe that men thereabouts brought in wives from healthier inland precincts.

[W]hen they took the young lasses out of the wholesome and fresh air, they were healthy, fresh and clear, and well; but when they came out of their native air into the marshes among the fogs and damps, there they presently changed their complexion, got an ague or two, and seldom held it above half a year, or a year at the most; and then, said he, we go to the uplands again, and fetch another.

The marshman laughed as he spoke, Defoe wrote, “but the fact, for all that, is certainly true.”

In 1625 the bubonic plague engulfed England. More than fifty thousand people died in London alone. Many of the urban wealthy fled into the malarial eastern marshes, with results later described by the satirical poet George Wither:

In Kent, and (all along) on Essex side

A Troupe of cruell Fevers did reside:…

And, most of them, who had this place [London] forsooke,

Were either slaine by them, or Pris’ners tooke…

As this nineteenth-century copy of a now-lost earlier drawing suggests, malaria was long a constant fear in England’s southeastern marshlands. (Photo credit 3.2)

In the end, Wither explained, “poorest beggers found more pitty here [London], / And lesser griefe, then richer men had there.” The implication is mind-boggling: people who fled to vivax country would have been better off staying home with the bubonic plague.

Data are sketchy and incomplete, but according to the Brandeis University historian David Hackett Fischer, about 60 percent of the first wave of English emigrants came from nine eastern and southeastern counties—the nation’s Plasmodium belt. One example was the hundred-plus colonists who began Jamestown. Fifty-nine of their birthplaces are known, according to Preservation Virginia, the organization that backs Jamestown archaeology; thirty-seven were in malaria-ridden Essex, Huntingdonshire, Kent, Lincolnshire, Suffolk, Sussex, and London. Most of these men, one assumes, set off from higher, inland areas that were less malarial than the coastal wetlands. But many would have come from the marshes. Even those who didn’t come from the malaria zone usually passed through it just before departure, their ships waiting for weeks or months at Sheerness, a Kent harbor town near the mouth of the Thames that was a malaria center. Other ships waited at the almost equally pestilential Blackwall, east of London on the same river.

People in malarial paroxysms would have been unlikely candidates for an arduous sea voyage. But Plasmodium vivax, one recalls, can hide itself inside the apparently healthy. Colonists could board a ship without symptoms, land in Chesapeake Bay tobacco country, and then be struck by the teeth-chattering chills and sweat-bursting fevers of malaria. At which point, alas, they could unknowingly pass the parasite to every mosquito that bit them.

“In theory, one person could have established the parasite in the entire continent,” said Andrew Spielman, a malaria researcher at the Harvard School of Public Health. Almost certainly many of the tassantassas at Jamestown were infectious. At some point one of them was bitten by Anopheles quadrimaculatus, a cluster of five closely related mosquitoes that is the East Coast’s primary malaria vector. “It’s a bit like throwing darts,” Spielman told me before his death in 2006. “Bring enough sick people in contact with enough mosquitoes in suitable conditions, and sooner or later you’ll hit the bull’s-eye—you’ll establish malaria.”

By 1657 the governor of Connecticut colony, John Winthrop, was recording cases of tertian fever in his medical journal. Winthrop, a member of the Royal Society, was one of the most careful scientific observers in New England. “If he said he saw tertian fever, he probably was seeing tertian fever,” said Robert C. Anderson, the genealogist who is transcribing Winthrop’s medical journal. More than that, Anderson told me, the existence of malaria in the 1650s suggests a date of introduction before 1640—after that year, political convulsions in England shut down emigration to New England for decades. “There were few colonists to bring it over,” Anderson said. If Plasmodium vivax had come to Connecticut by, say, 1635, I asked Spielman, could one make any inferences about Virginia? “New England is cold,” he said. “It’s hard to believe that malaria would have established itself there before Virginia.” Could the parasite have invaded Chesapeake Bay as early as the 1620s? “Given that hundreds or thousands of people from malaria zones came into the area, I wouldn’t have trouble believing that,” he said. “Once malaria has a chance to get into a place, it usually gets in fast.”

Tracing the past movement of malaria parasites is difficult—their existence was not discovered until 1880, so all previous data are indirect. By combining health records, estimates of past wetland extent, and early-twentieth-century malaria surveys by the British military, one can see that southeast England must have been seething with malaria. Of the fifty-nine birthplaces of the first Jamestown colonists that have been tracked by the historic-preservation group Preservation Virginia, thirty-five lay in the regions the military identified as “extremely” or “more” favorable to Plasmodium. In addition, all the colonists passed through London and the malarial Thames delta en route. It seems almost certain that some brought the disease with them to Chesapeake Bay.


Indeed, malaria may have come in before 1620. Conditions for the disease were perfect between 1606 and 1612, when tidewater Virginia was struck by drought. (I mentioned the drought in the last chapter.) A. quadrimaculatus is happy when wet areas get dry. “In drought years little tributary streams turn into a series of pools,” explained David Gaines, a public-health entomologist at the Virginia Department of Health. The larvae “thrive in that kind of environment.” Quads, as entomologists call them, prefer to breed in open areas rather than shaded forests. After the peace created by Pocahontas’s marriage in 1614, colonists cleared land for tobacco—making the environment, Gaines told me, “more quad-friendly, because it would have created those little open pockets of water they love.” The tassantassas were issuing “an invitation for malaria,” he said. “In my experience, malaria takes up invitations right away.” If Plasmodium arrived with the first colonists, it could help explain, along with salt poisoning, why they were so often described as listless and apathetic; they had malaria.2

Malaria’s precise date of arrival will always lie in the realm of speculation. What is clear is that malaria rapidly made itself at home in Virginia. It became as inescapable there as it was in the English marshes—a constant, sapping part of life.

When London investors shipped people to Virginia, Governor George Yeardley warned in 1620, they “must be content to have littell service done by new men the ffirst yeare till they be seasoned”—seasoning being the term for the period in which newcomers were expected to battle disease. The prolonged incapacitation of recent migrants was taken for granted. Jamestown minister Hugh Jones wrote a pamphlet in 1724 describing Virginia to Britons. The colony’s climate, he incorrectly explained, causes chills and fever, “a severe Fit of which (called a Seasoning) most expect, some time after their Arrival in that Climate.” Seasoning often was a path to the cemetery; during Jamestown’s first half century, as many as a third of new arrivals died within a year of disembarking. After that, Virginians learned by trial and error to live with vivax, avoiding marshes and staying indoors at dusk; those with acquired immunity carefully tended the sick, most of whom were children as in Africa today. Seasoning deaths fell from 20 or 30 percent around 1650 to 10 percent or lower around 1670—a considerable improvement, but still a level that represented much suffering.

Landon Carter had a prosperous Virginia plantation about sixty miles north of Jamestown. A devoted father, Carter agonized as malaria repeatedly hit his family in the summer and fall of 1757. Worst affected was his infant daughter Sukey, racked by chills and fever in the classic tertian pattern. Like Samuel Jeake, Carter recorded her struggle in a diary:

Dec. 7: Sukey lookt badly all this evening with a quick Pulse.

Dec. 8: ’Tis her usual Period of attack which is now got to every Fortnight.… Seems brisk and talkt cheerfully. Her fever not higher.

Dec. 9: Continues better though very pale.

Dec. 10: Sukey a fever early and very sick at her stomach and head ach. This fever went off in the night.

Dec. 11: The Child no fever to day but I thought her pulse a little quick at night.

Dec. 12: Sukey’s fever rose at 1 in the night.… This Child dangerous ill at 12, dead pale and blue.…

Dec. 13: Sukey’s fever kept wearing away Yesterday till one in the night when she was quite clear.

To live in Virginia, a heartworn Carter wrote that day, “it is necessary that man should be acquainted with affliction, and ’tis certainly nothing short of it to be confined a whole year in tending one’s sick Children. Mine are now never well.”

Sukey died the following April, short of her third birthday.

ABOUT-FACE

Malaria had impacts beyond the immediate sufferings of its victims. It was a historical force that deformed cultures, an insistent nudge that pushed societies to answer questions in ways that today seem cruel and reprehensible. Consider the seventeenth-century English entrepreneurs who wanted to make money in North America. Because Chesapeake Bay had no gold and silver, the best way to profit was to produce something else that could be exported to the home country. In New England, the Pilgrims depended on selling beaver fur. In Chesapeake Bay, the English settled on tobacco, for which there was huge demand. To satisfy that demand, the colonists wanted to expand the plantation area. To do that, they would have to take down huge trees with hand tools; break up soil under the hot sun; hoe, water, and top the growing tobacco plants; cut the heavy, sticky leaves; drape them on racks to dry; and pack them in hogsheads for shipping. All of this would require a lot of labor. Where could the colonists acquire it?

Before answering this question, make the assumption, abundantly justified, that the colonists have few moral scruples about the answer and are concerned only with maximizing ease and profit. From this point of view, they had two possible sources for the required workforce: indentured servants from England and slaves from outside of England (Indians or Africans). Servants or slaves: which, economically speaking, was the best choice?

Indentured servants were contract laborers recruited from England’s throngs of unemployed. Because the poor could not afford the costly journey across the sea, planters paid for the voyage and servants paid off the debt by working for a given period, typically four to seven years. After that, indentured servants were free to claim their own land in the Americas. Slavery is harder to define, because it has existed in many different forms. But its essence is that the owner acquires the right to coerce labor from slaves, and slaves never gain the right to leave; they must work and obey until they die or are freed by their owners. Indentured servants are members of society, though at a low rank. Slaves are usually not considered members of society, either because they were born far away or because they somehow have forfeited their social standing, as in the occasional English practice of turning convicts into slaves.

During the last quarter of the seventeenth century, England chose slaves over servants—indeed, it became the world’s biggest slaver. So well known nowadays is the English embrace of slavery that the idea of another path is hard to grasp. But in many respects the nation’s turn to bondage is baffling—the institution has so many inherent problems that economists have often puzzled over why it exists. More baffling still is the form that bondage took in the Americas: chattel slavery, a regime much harsher than anything seen before in Europe or Africa.

On the simplest level, slaves were more expensive than servants. In a well-known study, Russell R. Menard of the University of Minnesota tallied up the prices in Virginia and Maryland of slaves and servants whose services had to be sold after their masters’ deaths. In the last decades of the seventeenth century, the average price of a prime-age male African slave was £25. Meanwhile, the servants’ contracts typically cost about £10. (Technically, I should say that Menard discovered the price was equivalent to £25 and £10, because coins were scarce and even illegal in colonial Chesapeake Bay, and people paid their bills with tobacco.) At that time, £25 was a substantial sum: about four years’ pay for the typical hired worker in England. The servant was substantially cheaper.

To be sure, servants would eventually be able to leave their master’s employ, lowering their value (attempting to take this into account, Menard looked only at servants with more than four years remaining on their contracts). But the longer period of service one could expect from a slave still would not justify slavery economically, the great economist Adam Smith argued. An inherent flaw with slavery, he maintained, is that slaves made unsatisfactory workers. Because they were usually from distant cultures, they often didn’t speak their owners’ language and could be so unfamiliar with their owners’ societies that they would have to be trained from scratch (Africans, for example, knew only tropical forms of agriculture). Worse, they had every incentive to escape, wreak sabotage, or kill their owners, the people who were depriving them of liberty. Indentured servants, by contrast, spoke the same language, accepted the same social norms, and knew the same farming methods. And their contracts were for a limited time, so they had little reason to run away (unless they thought the planter was going to cheat). Because willing hands are more likely to do their jobs well, Smith reasoned in The Wealth of Nations, “the work done by freemen comes cheaper in the end than that performed by slaves.” All else being equal, he argued, economics suggests that planters should have chosen the cheaper, easier, less threatening alternative: servants from Europe.

Smith, who hated slavery, was trying to prove that something he detested was not only immoral, but foolish economically. Slavery, in his view, was largely the irrational product of humankind’s “love to domineer.” But he also believed that people try to find ways around economic problems that stand in the way of their desires. Just as one would expect, slave owners throughout history have created incentives for their slaves to work efficiently: paths to liberty. Work hard and true, masters in effect said, and you will eventually be allowed to walk away. Often, too, slaves were assigned tasks that brought some satisfaction, as in the case of African or Roman armies made up of captive soldiers—the slaves had switched sides, so to speak, but their lives were unchanged in many ways, and there was always the prospect of earning glory.

Slavery in the Americas, though, was something else: a lifelong sentence, in most cases, to awful work in brutal conditions without hope of winning freedom. Every one of Smith’s disincentives for effective work was present as rarely before; none of the workarounds developed in the past were employed. The regime was so brutal that it should have generated constant shirking, sabotage, and strife—and, indeed, slaveholder records are endless threnodies of complaint and fear. Why did it arise?

Of all the nations in western Europe, moreover, England would be the last that one would expect to take up this especially brutal form of bondage, because opposition to slavery was more common there than in the rest of Europe. If the continent had an antislavery culture, in fact, it was England. This was less a tribute to the nation’s moral advancement than an enraged response to the constant targeting of her ships by Barbary pirates, who from the sixteenth to the eighteenth century enslaved tens of thousands of English sailors, soldiers, and merchants. Based in northwest Africa, these Muslim corsairs prowled as far north as the English Channel, ransacking seaside villages and seizing ships at anchor; in just ten days, the mayor of Plymouth complained in 1625, buccaneers lurking outside the harbor took twenty-seven vessels. (Inviting charges of hypocrisy, England lionized Francis Drake, who terrorized Spanish colonies in a similar fashion.) Most English captives were sent to the galleys; many were forcibly converted to Islam; others disappeared into slave caravans bound across the deserts to Ottoman Egypt or sub-Saharan Africa. In those days Algiers alone often held 1,500 English slaves; the Moroccan town of Salé had 1,500 more. Some were sold to Spain and Portugal. Escapees published lurid memoirs of their years under the lash, inflaming the public; churchmen denounced Muslim slavery from the pulpit and took up collections to ransom captives. Political leaders, Protestant ministers, and legal experts alike vehemently proclaimed freedom as an English birthright and condemned the pagans and papists (Moroccans and Spaniards) who enslaved them.

Slavery had been widespread in England in medieval times, as it was in the rest of Europe. In Spain and Portugal, beset by conflict with Islam and short of labor for sugar plantations, it continued to be a useful enterprise. (I discuss this further in Chapter 8.) In England, though, it became exceptional—not actually illegal, but rare—for political reasons, for the economic reasons described by Smith, and because slavery as an institution had little appeal in a nation aswarm with mobs of unemployed workers. Publicly outraged by bondage and with no domestic slave industry to protect, the English were Europe’s least likely candidates for slavemasters.

In consequence, the English colonies initially turned to indentured servants and largely avoided slaves. Indentured servants comprised between a third and a half of the Europeans who arrived in North America in the first century of colonization. Slaves were rare—only three hundred lived in all of Virginia in 1650. By comparison, the few Dutch in New Amsterdam, the colonial predecessor to New York, had five hundred slaves. As more English ships came to North America, slaves slowly became more common.

Then, between 1680 and 1700, the number of slaves suddenly exploded. Virginia’s slave population rose in those years from three thousand to more than sixteen thousand—and kept soaring thereafter. In the same period the tally of indentured servants shrank dramatically. It was a pivot in world history, the time when English America became a slave society and England became the dominant player in the slave trade.

What accounts for this about-face? Economists and historians have mulled it over for decades. It was not the lure of profits from the trade itself: the slave business was incredibly important as a historical force and moral stain but not all that important as an economic industry. At its height at the end of the eighteenth century, according to the historians David Eltis and Stanley L. Engerman, slave shipments “accounted for less than 1.5 percent of British ships, and much less than 3 percent of British shipping tonnage.” Caribbean sugar, the main slave crop, then accounted for a bit less than 2.5 percent of British GDP, large but not overwhelmingly so; the textile industry, for instance, was more than six times bigger. (Slaves were producing raw materials, not the far more valuable finished industrial goods.)

Some have argued that England changed its collective mind because its American colonies were especially conducive to slavery—they had so much available real estate. Adam Smith predicted in The Wealth of Nations that laborers would see the available land around them and leave their jobs, “in order to become landlords themselves.” They would hire other workers in turn, who would “soon leave them for the same reason they left their first master.” Not for more than a century did other economists fully draw out the implications of Smith’s idea. If employers constantly lost workers to the lure of cheap land, then they would want to restrict their freedom of movement. Bondage was the inevitable end result. Paradoxically enough, America’s wide-open frontier was, from this perspective, an incitement to slavery.

On some level, this notion must be true; slavery wouldn’t exist if employers didn’t want to control workers’ movements. But it doesn’t explain why slavery was uncommon in the English colonies of New England and New York, which had abundant land, and common in the English colonies of Barbados and St. Kitts, Caribbean islands that had little. In consequence, many researchers turned to a second explanation: England’s religious civil war in the mid-seventeenth century, part of the worldwide unrest associated with the Little Ice Age and the uncertainties in the silver trade. The conflict was disastrous; between 1650 and 1680 the country’s population fell almost 10 percent. As economics would predict, the decline in the number of English workers drove up English wages, which inevitably increased the price necessary to lure indentured servants across the Atlantic. Meanwhile, the indentured servants who had finished their terms in Massachusetts, Virginia, and Carolina were establishing new plantations and seeking their own indentured servants, increasing demand, which, as one would expect, further lifted prices.

Again, this explanation must be true; any rise in the cost of indentured servants would necessarily make alternatives seem more attractive. But it doesn’t explain why colonists chose the alternative they did: captive Africans. Planters could have found labor in Scotland—and, to a lesser extent, Ireland—which in different ways also had been thrown into turmoil by the English civil war. The Little Ice Age piled on, making the sea too cold for cod, burying the hills in snow, and walloping the populace with a series of bad harvests. In the worst period, between 1693 and 1700, the Scottish oat harvest failed in every year but one. Desperate Scots fled their homes in huge numbers. Thousands became mercenaries in Russia, Sweden, Norway, and the German principalities; thousands more set up shop in northern Ireland, setting off a cultural collision that endures to this day. Gangs of Scottish refugees roamed London streets, begging for work and food—obvious candidates, or so it would seem, for American colonies. English farmers had employed indigent Scots for centuries. Yet at the very time the supply of desperate Scots was increasing, the colonists turned to captive Africans—people who couldn’t speak the language, had no wish to cooperate, and cost more to transport. Why?

One way to examine the question would be to evaluate the fortunes of the biggest group of Scots who traveled to the Americas in those years: the Scottish colony in Panama. Organized by an ambitious huckster named William Paterson, the scheme proposed using Panama’s strategic location to break Spain’s near monopoly on the silk and silver trades. “Seated between the two vast oceans of the universe,” Paterson rhapsodized, the colony would control “at least two-thirds of what both Indies [that is, silk-rich Asia and the silver-rich Americas] yield to Christendom.” Scottish Panama, he promised, would become “arbitrator of the commercial world,” a financial perpetual-motion machine that would endlessly spew riches as it demonstrated that “trade is capable of increasing trade, and money of begetting money to the end of the world.”

Dazzled by this vision, more than 1,400 Scots subscribed to a joint-stock company, pledging what has been estimated at between a quarter and a half of the poor nation’s available capital. In July 1698 five ships set sail with 1,200 colonists and a year’s food supply. They landed on the Panamanian coast and set about clearing the forest to create the port of New Edinburgh. Just eight months later the ragged survivors—fewer than three hundred people—bolted for home, Paterson among them. They arrived just days after the departure of a second Panama expedition: four ships, 1,300 colonists. Nine months later it, too, fled. Not a hundred people made it home. Lost with the dead was every penny invested in the venture.

Calamity usually has many fathers, and Paterson’s colony was no exception. Thinking to get started by trading with the local Indians, the Scots had stuffed their ships with the nation’s finest woolen hose, tartan blankets, ornamental wigs, and leather shoes—25,000 pairs. Alas, it proved difficult to sell warm socks and itchy blankets in the tropics. Meanwhile, the hard equatorial rain rotted their stores and washed away all efforts to farm. As New Edinburgh grew desperate, William, king of England and Scotland, instructed his other colonies not to help, for fear of offending Spain. Spain for its part knew about the project and periodically attacked.

The main causes of the disaster, though, were malaria, dysentery, and yellow fever. Colonists’ accounts record dozens of deaths a week from disease. The first time Spain assaulted New Edinburgh, its soldiers found four hundred fresh graves. The colony had been well supplied, blessed with an adequate water supply, and never troubled by its Indian neighbors. European and African disease had filled that cemetery.

Back in Scotland, the debacle of New Edinburgh set off riots—it had wiped out much of the nation’s capital. At the time, England and Scotland remained separate nations despite sharing a monarch. England, the bigger partner, had been pushing a complete merger for decades. Scots had resisted, believing they would become an afterthought in a London-dominated economy. Now England promised to reimburse New Edinburgh’s investors as part of a union agreement. “Even some committed Scottish patriots such as Paterson endorsed the Union Act of 1707,” the historian J. R. McNeill wrote in Mosquito Empires, a pioneering history of Caribbean epidemiology, ecology, and war. “Thus Great Britain was born, with assistance from the fevers of Panama.”

More than that, New Edinburgh showed that Scots—and other Europeans—died too fast in malarial areas to be useful as forced labor. Individual Britons and their families continued to make their own way to the Americas, to be sure, but businesspeople increasingly resisted sending over large groups of Europeans. Instead they looked for alternative sources of labor. Alas, they found them.

“NO DISTEMPERS EITHER EPIDEMICAL OR MORTAL”

The colony of Carolina was founded in 1670, when about two hundred colonists from Barbados relocated to the banks of a river that empties into Charleston Harbor (the settlement was initially called Charles Town, after the reigning king). Like Virginia, Carolina was a commercial enterprise, founded by eight powerful English nobles who hoped to take advantage of the now-established traffic to Virginia by redirecting some of it to the south. The proprietors intended to lease pieces of the colony to would-be planters, realizing a profit without actually having to expend much effort or money. Barbados, full of sugar plantations, was crowded. Some of its English inhabitants, looking to acquire land, decided to take a flyer on Carolina. Knowing of Virginia’s labor problem, the proprietors promised extra land to anyone who imported indentured servants, as well as to the servants themselves.

Whereas Jamestown had confronted a single Indian empire under a strong leader, Carolina began amid a chaotic swirl of native groups. Beginning in about 1000 A.D., hundreds of densely packed towns—“Mississippian” societies, as archaeologists call them—arose in the Mississippi Valley and the Southeast. Ruled by powerful theocrats who lived atop great earthen mounds, they were the most technologically sophisticated cultures north of Mexico. For reasons that are not well understood, these societies fell apart in the fifteenth century. The disintegration was accelerated by the onset of European diseases. By the time Carolina came into existence, the fragments of Mississippian societies were coalescing into confederacies of allied communities—Creek, Choctaw, Cherokee, Catawba—that were jostling for power across the Southeast.

Slavery occurred in most Indian societies, but the institution differed from place to place. Among Algonkian-language societies like the Powhatan, for instance, slavery was usually a temporary state. Slaves were prisoners of war who were treated as servants until they were either tortured and slain, ransomed back to their original groups, or inducted into Powhatan society as full members. Occasionally, Jamestown’s tassantassas were able to buy Indian captives for their fields, but they were not generally a source of labor either for the Powhatan or the English. South of Chesapeake Bay was a cultural border where Algonkian societies ran into the nascent confederacies, many of which spoke Muskogean languages. War captives also became slaves in the confederacies, but there slavery was both more common and longer-lasting—traditions dating back to the Mississippians, whose leaders viewed captives as symbols of power and vengeance. Slaves worked in fields, performed menial tasks, and could be given away as gifts; female slaves provided sexual services to honored male visitors (a gesture frequently misunderstood by Europeans, who thought that the Indians were offering their wives). When foreigners appeared in Carolina, the confederacies were more than willing to trade surplus captives for axes, knives, metal pots, and, above all, guns.

In the late seventeenth century, the new flintlock was becoming available—the first European firearm that native people regarded as superior to their bows. The matchlocks John Smith brought to Virginia used a lever to lower a burning match onto a small pan of gunpowder; the resultant flash ignited the main charge, driving the projectile out of the barrel. Heavy and unrifled, matchlocks had to be braced on tripods; because soldiers had to carry around burning fuses to fire them, the weapons were unsuitable for beaver wetlands and almost useless in rain. In optimal conditions, matchlocks could shoot a deadly projectile farther than a bow. But in warfare, conditions are never optimal. Colonial records are replete with descriptions of tassantassas unhappily discovering that as a practical matter their weapons were outmatched by native bows—weapons with no moving parts, weapons that could get wet, weapons that could be fired in an instant. Flintlocks, by contrast, ignited the gunpowder by snapping a chunk of flint against a piece of steel, creating a spark. The spark ignited a small charge that in turn set off a bigger charge in the barrel. Smaller, lighter, and more accurate than matchlocks, they could be fired quickly and used in wet weather.

The southeastern confederacies, quickly understanding the new weapons’ superiority, determined not to be outgunned, either by the English or their native rivals. An arms race ensued across the Southeast. To build up their stores of flintlocks, native people raided their enemies for slaves to sell—an action that required more firearms. Needing guns to defend themselves, they in turn staged their own slaving raids, selling the captives to Europeans in return for guns. Demand fed demand in a vicious cycle.

Despite the fears of the Virginia Company, Jamestown never was directly threatened by Spain or France. Carolina, closer to Spanish Florida and French Louisiana, had much more reason to worry; indeed, Spain tried to extinguish the colony within months of its founding. Carolina’s leaders came up with an elegant scheme: they asked nearby native groups to provide them with slaves by raiding the Indians who were allied with Spain and France, destabilizing their enemies and reducing their labor shortage at the same time.

Economically speaking, indigenous slavery was a good deal for both natives and newcomers. In the Charleston market Indians sometimes could sell a single slave for the same price as 160 deerskins. “One slave brings a Gun, ammunition, horse, hatchet, and a suit of Cloathes, which would not be procured without much tedious toil a hunting,” a Carolina slave buyer noted, perhaps with some exaggeration, in 1708. “The good prices The English traders give them for slaves Encourages them to this trade Extreamly.”

“Good prices” from the Indian point of view, but cheap to the English. Indian captives cost £5–10, as little as half the price of indentured servants, according to the Ohio State University historian Alan Gallay, author of The Indian Slave Trade (2002), a widely lauded account of its rise and fall. More important, the annual cost of ownership was much lower, because slaves did not have to be released after a few years—the purchase price could be amortized over decades. Unsurprisingly, the colonists chose Indian slaves over European servants. A 1708 census, Carolina’s first, found four thousand English colonists, almost 1,500 Indian slaves, and just 160 servants, the majority presumably indentured.
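The grim arithmetic of amortization can be sketched in a few lines; the contract length and the twenty-five-year working life below are my own illustrative assumptions, while the purchase prices are the ones quoted here and in Menard’s figures earlier:

```python
# Rough annualized cost of labor under each arrangement, using the purchase
# prices quoted in the text. Service lengths are illustrative assumptions.

def annual_cost(purchase_price_pounds: float, years_of_service: float) -> float:
    """Spread the up-front price evenly over the years of labor obtained."""
    return purchase_price_pounds / years_of_service

servant = annual_cost(10.0, 5.0)   # ~£10 contract, four-to-seven-year term
indian  = annual_cost(7.5, 25.0)   # £5-10 purchase, assumed 25-year working life
african = annual_cost(25.0, 25.0)  # £25 purchase, same assumed working life

print(f"indentured servant: £{servant:.2f} per year")  # £2.00
print(f"Indian slave:       £{indian:.2f} per year")   # £0.30
print(f"African slave:      £{african:.2f} per year")  # £1.00
```

Even before counting resale, the spread-out purchase price upends the comparison that made servants look cheap: the servant’s £10 bought only a few years of labor, while a slave’s price, however much higher, was amortized over decades. This, coldly indifferent to the human beings inside it, is the calculation the colonists were in effect making.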

In time Carolina grew famous as a slave importer, a place where the slave ships arrived from Africa and the captives, dazed and sick, were hustled to auction. But for its first four decades the colony was mainly a slave exporter—the place from where captive Indians were sent to the Caribbean, Virginia, New York, and Massachusetts. Data on Indian shipments are scarce, because colonists, wanting to avoid taxes and regulations, shipped them on small vessels and kept few records. (The big slaving companies in Europe didn’t have this choice.) From the fragmentary evidence, Gallay has estimated that Carolina merchants bought between thirty and fifty thousand captive Indians between 1670 and 1720. Most of these must have been exported, given the much lower number found by the Carolina census. In the same period, ships in Charleston unloaded only 2,450 Africans (some came overland from Virginia, though).3

Here notice a striking geographical coincidence. By 1700, English colonies were studded along the Atlantic shore from what would become Maine to what would become South Carolina. Northern colonies coexisted with Algonkian-speaking Indian societies that had few slaves and little interest in buying and selling captives; southern colonies coexisted with former Mississippian societies with many slaves and considerable experience in trading them. Roughly speaking, the boundary between these two types of society was Chesapeake Bay, not far from what would become the boundary between slave and non-slave states in the United States. Did the proximity of Indian societies with slaves to sell help grease the skids for what would become African slavery in the South? Was the terrible conflict of the U.S. Civil War a partial reflection of a centuries-old native cultural divide? The implication is speculative, but not, it seems to me, unreasonable.

In any case, the Indian slave trade was immensely profitable—and very short-lived. By 1715 it had almost vanished, a victim in part of its own success. As Carolina’s elite requested more and more slave raids, the Southeast became engulfed in warfare, destabilizing all sides. Victimized Indian groups acquired guns and attacked Carolina in a series of wars that the colony barely survived. Working in groups, Indian slaves proved to be unreliable, even dangerous employees who used their knowledge of the terrain against their owners. Rhode Island denounced the “conspiracies, insurrections, rapes, thefts and other execrable crimes” committed by captive Indian laborers, and banned their import. So did Pennsylvania, Connecticut, Massachusetts, and New Hampshire. The Massachusetts law went out of its way to excoriate the “malicious, surly and revengeful” Indian slaves.

The worst problem, though, was something else. As in Virginia, malaria came to Carolina. At first the English had extolled the colony’s salubrious climate. Carolina, one visitor wrote, has “no Distempers either Epidemical or Mortal”; colonists’ children had “Sound Constitutions, and fresh ruddy Complexions.” The colonists decided to use the warm climate to grow rice, then scarce in England. Soon after came reports of “fevar and ague”—rice paddies are notorious mosquito havens. Falciparum had entered the scene, accompanied a few years later by yellow fever. Cemeteries quickly filled. In some parishes, more than three out of four colonists’ children perished before the age of twenty. As in Virginia, almost half of the deaths occurred in the fall. (One German visitor’s summary: “in the spring a paradise, in the summer a hell, and in the autumn a hospital.”)

Unfortunately, Indians were just as prone to malaria as English indentured servants—and more vulnerable to other diseases. Native people died in ghastly numbers across the entire Southeast. Struck doubly by disease and slave raids, the Chickasaw lost almost half their population between 1685 and 1715. The Quapaw (Arkansas) fell from thousands to fewer than two hundred in about the same period. Other groups vanished completely—the last few dozen Chakchiuma were absorbed by the Choctaw. The Creek grew to power by becoming, in the phrase of one writer, “the receptacle for all distressed tribes.” It was God’s will, Carolina’s former governor observed in 1707, “to send unusual Sicknesses” to the Westo Indians, “to lessen their numbers; so that the English, in comparison to the Spaniard, have but little Indian Blood to answer for.”

Naturally, the colonists looked for a different solution to their labor needs—one less vulnerable to disease than European servants or Indian slaves.

VILLA PLASMODIA

Like other cells, red blood cells are covered by a surface membrane made up of proteins, the long, chain-like molecules that are the principal constituents of our bodies. One of these proteins is the Duffy antigen. (The name comes from the patient on whose blood cells the protein was first discovered; an “antigen” is a substance recognized by the immune system.) The Duffy antigen’s main function is to serve as a “receptor” for several small chemical compounds that direct the actions of the cell. The compounds plug into the receptor—think of a spaceship docking at a space station, scientists say—and use it as a portal to enter the cell.

The Duffy antigen is not especially important to red blood cells. Nonetheless, researchers have written hundreds of papers about it. The reason is that Plasmodium vivax also uses the Duffy antigen as a receptor. Like a burglar with a copy of the front-door key, it inserts itself into the Duffy antigen, fooling the blood cell into thinking it is one of the intended compounds and thereby gaining entrance.

Duffy’s role was discovered in the early 1970s by Louis H. Miller and his collaborators at the National Institutes of Health’s Laboratory of Parasitic Disease. To nail down the proof, Miller and his collaborators asked seventeen men, all volunteers, to put their arms into boxes full of mosquitoes. The insects were chockablock with Plasmodium vivax. Each man was bitten dozens of times—enough to catch malaria many times over. Twelve of the men came down with the disease. (The researchers quickly treated them.) The other five had not a trace of the parasite in their blood. Their red blood cells lacked the Duffy antigen—they were “Duffy negative,” in the jargon—and the parasite couldn’t find its way inside.

The volunteers included both Caucasians and African Americans. Every Caucasian came down with malaria. Every man who didn’t get malaria was a Duffy-negative African American. This was no coincidence. About 97 percent of the people in West and Central Africa are Duffy negative, and hence immune to vivax malaria.

Duffy negativity is an example of inherited immunity, available only to people with particular genetic makeups. Another, more famous example is sickle-cell anemia, in which a small genetic change ends up deforming the red blood cell, making it unusable to the parasite but also less functional as a blood cell. Sickle-cell is less effective as a preventive than Duffy negativity—it provides partial immunity from falciparum malaria, the deadlier of the two main malaria types, but its disabling of red blood cells also leads many of its carriers to an early grave.

Both types of inherited immunity differ from acquired immunity, which is granted to anyone who survives a bout of malaria, much as children who contract chicken pox or measles are thereafter protected against those diseases. Unlike the acquired immunity to chicken pox, though, acquired malaria immunity is partial: people who survive vivax or falciparum acquire immunity only to the particular strain that infected them; another strain can readily lay them low. The only way to gain broad immunity is to get sick repeatedly with different strains.

Inherited malaria resistance occurs in many parts of the world, but the peoples of West and Central Africa have more than anyone else—they are almost completely immune to vivax, and (speaking crudely) about half-resistant to falciparum. Add in high levels of acquired resistance from repeated childhood exposure, and adult West and Central Africans were and are less susceptible to malaria than anyone else on earth. Biology enters history when one realizes that almost all of the slaves ferried to the Americas came from West and Central Africa. In vivax-ridden Virginia and Carolina, they were more likely to survive and produce children than English colonists. Biologically speaking, they were fitter, which is another way of saying that in these places they were—loaded words!—genetically superior.

Racial theorists of the last century claimed that genetic superiority led to social superiority. What happened to Africans illustrates, if nothing else, the pitfalls of this glib argument. Rather than gaining an edge from their biological assets, West Africans saw them converted through greed and callousness into social deficits. Their immunity became a wellspring for their enslavement.

How did this happen? Recall that vivax, covertly transported in English bodies, crossed the Atlantic early—certainly by the 1650s, given the many descriptions of tertian fever, and quite possibly before. Recall, too, that by the 1670s Virginia colonists had learned how to improve the odds of survival; seasoning deaths had fallen to 10 percent or lower. But in the next decade the death rate rose again—a sign, according to the historians Darrett and Anita Rutman, of the arrival of falciparum. Falciparum, more temperature-sensitive than vivax, never thrived in England and thus almost certainly was ferried over the ocean inside the bodies of the first African slaves.

Falciparum created a distinctive pattern. Africans in Chesapeake Bay tended to die more often than Europeans in winter and spring—the result, the Rutmans suggested, of bad nutrition and shelter, as well as unfamiliarity with ice and snow. But the African and European mortality curves crossed between August and November, when malaria, contracted during the high mosquito season of early summer, reaches its apex. During those months masters were much more likely to perish than slaves—so much more that the overall death rate for Europeans was much higher than that for Africans. Much the same occurred in the Carolinas. Africans there, too, died at high rates, battered by tuberculosis, influenza, dysentery, and human brutality. Many fell to malaria, as their fellows brought Plasmodium strains they had not previously encountered. But they did not die as fast as Europeans.

Because no colony kept accurate records, exact comparative death rates cannot be ascertained. But one can get some idea by looking at another continent with endemic malaria that Europe tried to conquer: Africa. (The idea that one can compare malaria rates in places separated by the Atlantic Ocean is in itself a mark of the era we live in, the Homogenocene.) Philip Curtin, one of slavery’s most important historians, burrowed into British records to find out what happened to British soldiers in places like Nigeria and Namibia. The figures were amazing: nineteenth-century parliamentary reports on British soldiers in West Africa concluded that disease killed between 48 percent and 67 percent of them every year. The rate for African troops in the same places, by contrast, was about 3 percent—an order-of-magnitude difference. African diseases slew so many Europeans, Curtin discovered, that slave ships often lost proportionately more white crewmen than black slaves—this despite the horrendous conditions belowdecks, where slaves were chained in their own excrement. To forestall losses, European slavers hired African crews.

The disparity between European and African death rates in the colonial Americas was smaller, because many diseases killed Europeans in Africa, not just malaria and yellow fever. But a British survey at about the same time as the parliamentary report indicated that African survival rates in the Lesser Antilles (the southern arc of islands in the Caribbean) were more than three times those of Europeans. The comparison may understate the disparity; some of those islands had little malaria. It seems plausible to say that in the American falciparum and yellow fever zone the English were, compared to Africans, somewhere between three and ten times more likely to die in the first year.

For Europeans, the economic logic was hard to ignore. If they wanted to grow tobacco, rice, or sugar, they were better off using African slaves than European indentured servants or Indian slaves. “Assuming that the cost of maintaining each was about equal,” Curtin concluded, “the slave was preferable at anything up to three times the price of the European.”
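To see the arithmetic behind Curtin’s break-even figure, consider a minimal sketch (the survival rates below are illustrative assumptions for the example, not numbers from colonial records). If maintenance costs are equal, as Curtin assumed, the effective price of labor is the purchase price divided by the chance of surviving the first year, so the slave is the cheaper source of labor whenever

$$\frac{p_{\text{slave}}}{s_{\text{slave}}} < \frac{p_{\text{servant}}}{s_{\text{servant}}} \quad\Longleftrightarrow\quad p_{\text{slave}} < \frac{s_{\text{slave}}}{s_{\text{servant}}}\, p_{\text{servant}}.$$

With assumed first-year survival rates of $s_{\text{slave}} = 0.9$ and $s_{\text{servant}} = 0.3$—mortality of 10 percent versus 70 percent, a sevenfold gap, within the three-to-tenfold range above—the break-even multiple is $0.9/0.3 = 3$: the slave remains the better purchase at anything up to three times the servant’s price, which is Curtin’s figure.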

Slavery and falciparum thrived together. Practically speaking, P. falciparum could not establish itself for long in Atlantic City, New Jersey; the average daily minimum temperature there is above 66 degrees, the threshold for the parasite, for only a few weeks per year. But in Washington, D.C., just 120 miles south, slightly warmer temperatures let it become a menace every fall. (Not for nothing is Washington called the most northern of southern cities!) Between these two cities runs the Pennsylvania-Maryland border, famously surveyed by Charles Mason and Jeremiah Dixon in the 1760s. The Mason-Dixon Line roughly split the East Coast into two zones, one in which falciparum malaria was an endemic threat, and one in which it was not. It also marked the border between areas in which African slavery was a dominant institution and areas in which it was not (and, roughly, the division between indigenous slave and non-slave societies). The line delineates a cultural boundary between Yankee and Dixie that is one of the most enduring divisions in American culture. The immediate question is whether all of these boundaries are connected.

For decades an influential group of historians argued that southern culture was formed in the cradle of its great plantations—the sweeping estates epitomized, at least for outsiders, by Tara in the movie Gone with the Wind. The plantation, they said, was an archetype, a standard, a template; it was central to the South’s vision of itself. Later historians criticized this view. Big colonial plantations existed in numbers only in the southern Chesapeake Bay and the low country around Charleston. Strikingly, these were the two most malarial areas in the British colonies. Sweeping drainage projects eliminated Virginia’s malaria in the 1920s, but coastal South Carolina had one of the nation’s worst Plasmodium problems for another two decades. From this perspective, the movie’s Tara seems an ideal residence for malaria country: atop a hill, surrounded by wide, smooth, manicured lawns, its tall windows open to the wind. Every element is as if designed to avoid Anopheles quadrimaculatus, which thrives in low, irregular, partly shaded ground and still air. Is the association between malaria and this Villa Plasmodia style a coincidence? It seems foolish to rule out the possibility of a link.

“What would be the attitudes of a population that had a relatively high rate of illness and short life expectancy?” asked the Rutmans. Some have suggested that the reckless insouciance and preoccupation with display said to be characteristic of antebellum southern culture are rooted in the constant menace of disease. Others have described a special calm in the face of death. Maybe so—but it is hard to demonstrate that southerners were, in fact, unusually rash or vain or stoic. Indeed, one could imagine arguing the opposite: that the steady, cold breath of mortality on southerners’ necks could make them timid, humble, and excitable.

Tara (shown behind Scarlett O’Hara in this publicity image from Gone with the Wind) was created on a studio backlot. Nonetheless, it was a faithful image of the classic southern plantation. High on a nearly treeless hill, with tall windows to admit the breeze, it was ideally suited to avoid mosquitoes and the diseases that accompanied them. (Photo credit 3.3)

More than four hundred species of mosquito belong to the genus Anopheles. Perhaps a quarter can transmit malaria, but only about thirty species are common vectors. More than a dozen of these thirty exist in the Americas, the most important being A. quadrimaculatus, A. albimanus, and A. darlingi. Their habitat ranges, together with average temperatures, go far toward explaining why the history of certain parts of the Americas—and not others—was dominated by malaria.


A different point is more susceptible to empirical demonstration: the constant risk of disease meant that the labor force was unreliable. The lack of assurance penalized small farmers, who were disproportionately affected by the loss of a few hands. Meanwhile, the Rutmans noted, “a large labor force insured against catastrophe.” Bigger planters had higher costs but were better insulated. Over time, they gained an edge; smaller outfits, meanwhile, struggled. Accentuating the gap, wealthy Carolinian plantation owners could afford to move to resorts in the fever-free mountains or shore during the sickness season. Poor farmers and slaves had to stay in the Plasmodium zone. In this way disease nudged apart rich and poor. Malarial places, the Rutmans said, drift easily toward “exaggerated economic polarization.” Plasmodium not only prodded farmers toward slavery, it rewarded big plantations, which further lifted the demand for slaves.

Malaria did not cause slavery. Rather, it strengthened the economic case for it, counterbalancing the impediments identified by Adam Smith. Tobacco planters didn’t observe that Scots and Indians died from tertian fever and then plot to exploit African resistance to it. Indeed, little evidence exists that the first slave owners clearly understood African immunity, partly because they didn’t know what malaria was and partly because people in isolated plantations could not easily make overall comparisons. Regardless of whether they knew it, though, planters with slaves tended to have an economic edge over planters with indentured servants. If two Carolina rice growers brought in ten workers apiece and one ended up after a year with nine workers and the other ended up with five, the first would be more likely to flourish. Successful planters imported more slaves. Newcomers imitated the practices of their most prosperous neighbors. The slave trade took off, its sails filled by the winds of Plasmodium.

Slavery would have existed in the Americas without the parasite. In 1641 Massachusetts, which had little malaria, became the first English colony to legalize slavery explicitly. During the mid-eighteenth century, the healthiest spot in English North America may have been western Massachusetts’s Connecticut River Valley, according to an analysis by Dobson and Fischer. Malaria there was almost nonexistent; infectious disease, by the standards of the day, extremely rare. Yet slavery was part of the furniture of daily life—at that time almost every minister, usually the most important man in town, owned one or two slaves. About 8 percent of the inhabitants of the main street of Deerfield, one of the bigger villages in the valley, were African slaves.

On the other side of the hemisphere’s malaria belt, the southern terminus of the habitat of Anopheles darlingi, the main South American vector for falciparum, lies near the Río de la Plata (Silver River), the border between Spanish and Portuguese America. South of the river is Argentina. With few mosquitoes to transmit Plasmodium, Argentina had little malaria. Yet, like Massachusetts, it had African slaves; between 1536, when Spain founded its first colony on the Río de la Plata, and 1853, when Argentina abolished slavery, 220,000 to 330,000 Africans landed in Buenos Aires, the main port and capital.

On the other side of the mosquito border were the much bigger Brazilian ports of Rio de Janeiro and São Paulo, where at least 2.2 million slaves arrived. Despite the difference in scale, southern Brazil and Argentina were demographically similar: in the 1760s and 1770s, when Spain and Portugal first conducted systematic censuses of their colonies, about half of the population in both areas was of African descent. Yet the impact of slavery on the two regions was entirely different. Slavery was never critical to colonial Argentina’s most important industries; colonial Brazil could not have functioned without it. Argentina was a society with slaves; Brazil was culturally and economically defined by slavery.

All American colonies, in sum, had slaves. But those to which the Columbian Exchange brought endemic falciparum malaria ended up with more. Falciparous Virginia and Brazil became slave societies in ways that non-falciparous Massachusetts and Argentina were not.

YELLOW JACK

In the 1640s a few Dutch refugees from Brazil landed on Barbados, the easternmost Caribbean island. Unlike the rest of the Caribbean, Barbados never had a large Indian population. English colonists moved in, hoping to capitalize on the tobacco boom. When the Dutch refugees arrived the island had about six thousand inhabitants, among them two thousand indentured servants and two hundred slaves. Tobacco had turned out not to grow particularly well on Barbados. The Dutch showed the colonists how to plant sugarcane, which they had learned during an ill-fated venture in Brazil. Europe, then as now, had a sweet tooth; sugar was as popular as it was hard to come by. Barbados proved to be good cane territory. Production rapidly expanded.

Sugar production is awful work that requires many hands. The cane is a tall, tough Asian grass, vaguely reminiscent of its distant cousin bamboo. Plantations burn the crop before harvest to prevent the knifelike leaves from slashing workers. Swinging machetes into the hard, soot-smeared cane under the tropical sun, field hands quickly splattered themselves head to foot with a sticky mixture of dust, ash, and cane juice. The cut stalks were crushed in the mill and the juice boiled down in great copper kettles enveloped in smoke and steam; workers ladled the resultant hot syrup into clay pots, where the pure sugar crystallized out as it cooled. Most of the leftover molasses was fermented and distilled to produce rum, a process that required stoking yet another big fire under yet another infernal cauldron.

The question as ever was where the required labor would come from. As in Virginia, slaves then typically cost twice as much as indentured workers, if not more. But the Dutch West India Company, a badly run outfit that was desperate for cash, was willing to sell Africans cheap in Barbados. Slaves and indentured servants there were roughly the same price. As one would expect, the island’s new sugar barons imported both by the thousands: the sweepings of English streets and luckless captives from Angolan and Congolese wars. Covered in perspiration and gummy cane soot, Europeans and Africans wielded machetes side by side. Then the Columbian Exchange raised the relative cost of indentured servants.

Hidden on the slave ships was a hitchhiker from Africa: the mosquito Aedes aegypti. In its gut A. aegypti carried its own hitchhiker: the virus that causes yellow fever, itself of African origin. The virus spends most of its time in the mosquito, using human beings only to pass from one insect to the next. Typically it remains in the body no more than two weeks. During this time it drills into huge numbers of cells, takes over their functioning, and uses the hijacked genetic material to produce billions of copies of itself. These flood the bloodstream and are picked up by biting aegypti. For imperfectly understood reasons this cellular invasion usually has little impact on children. Adults are hit by massive internal bleeding. The blood collects and coagulates in the stomach. Sufferers vomit it up blackly—the signature symptom of yellow fever. Another symptom is jaundice, which gave rise to the disease’s nickname, “yellow jack.” (A yellow jack was the flag flown by quarantined ships.) The virus kills about half of its victims—43 to 59 percent in six well-documented episodes McNeill compiled in Mosquito Empires. Survivors acquire lifelong immunity. In Africa yellow fever was a childhood disease that inflicted relatively little suffering. In the Caribbean it was a dire plague that passed over Africans while ravaging Europeans, Indians, and slaves born in the islands.

The first yellow fever onslaught began in 1647 and lasted five years. Terror spread as far away as Massachusetts, which instituted its first-ever quarantine on incoming vessels. Barbados had more Africans and more Europeans per square mile than any other Caribbean island, which is to say that it had more potential yellow fever carriers and potential yellow fever victims. Unsurprisingly, the epidemic hit there first. As it began a man named Richard Ligon landed in Barbados. “We found riding at Anchor, 22 good ships,” he wrote later,

with boats plying to and fro, with Sails and oars, which carried commodities from place to place: so quick stirring, and numerous, as I have seen it below the bridge at London. Yet notwithstanding all this appearance of trade, the Inhabitants of the Islands, and shipping too, were so grievously visited with plague (or as killing a disease) that before a month was expired, after our arrival, the living were hardly able to bury the dead.

Six thousand died on Barbados alone in those five years, according to one contemporary estimate. Almost all of the victims were European—a searing lesson for the island’s colonists. McNeill estimates that the epidemic “may have killed 20 to 50 percent of local populations” in a swathe from coastal Central America to Florida.

The epidemic didn’t kill off the sugar industry—it was too lucrative. Incredibly, Barbados, an island of 166 square miles, was then on its way to making more money than all of the rest of English America. Meanwhile sugar had expanded to nearby Nevis, St. Kitts, Antigua, Montserrat, Martinique, Grenada, and other places. (Cuba had begun growing sugar decades before, but production was small; Spaniards were much too preoccupied by silver to pay attention.) A heterogeneous mass of English, French, Dutch, Spanish, and Portuguese was clearing these islands as fast as possible, sticking cane in the flatlands and cutting trees on the slopes for fuel. Deforestation and erosion were the nigh-unavoidable result; rainfall, no longer absorbed by vegetation, washed soil down the slopes, forming coastal marshes. In the not-too-distant future workers would be ordered to carry the soil in baskets back up the hills—“a true labor of Sisyphus,” McNeill remarked in Mosquito Empires. McNeill quotes one Caribbean naturalist marveling at “the inconsideration or rather stupidity of west Indian planters in extinguishing many useful woods that spontaneously grow on those islands.” Writing in 1791, the naturalist judged that many islands were “almost rendered unfit for cultivation.”

Even the worst ecological mismanagement benefits some species. Among the winners in the Caribbean was Anopheles albimanus, the region’s most important malaria vector. A resident of the bigger Caribbean islands and coastal areas in Yucatán and Central America, A. albimanus is a reluctant malaria host, hard for falciparum to infect and slow to pick up vivax (many mosquitoes have bacteria in their gut that inhibit the parasite). It likes to breed in coastal, algae-covered marshes under the open sun. Erosion and deforestation are its friends. Field experiments have shown that albimanus can reproduce in huge numbers when it has favorable habitat. Given its preferences, the European move into the Caribbean must have marked the beginning of a golden age. As the mosquito population soared, P. vivax had more opportunities to overcome the mosquito’s reluctance to host it. (Indeed, vivax may have arrived with Colón himself; in addition to the reference to çiçiones in the chronicle of the admiral’s second voyage, his son Hernán later claimed that “intermittent fever” appeared on the fourth voyage.) From the Caribbean, vivax malaria spread into Mexico. Falciparum came much later, the delay due in part to A. albimanus’s more complete resistance to that parasite.

Sugar plantations denuded Barbados, as shown in the background of this photograph of workers’ huts in the 1890s. (Photo credit 3.5)

Another beneficiary was Aedes aegypti, the yellow fever vector. A. aegypti likes to breed in small amounts of clear water near human beings; naval water casks are a well-known favorite. Sugar mills abounded with equivalent vessels: the crude clay pots used to crystallize sugar. Plantations had hundreds or thousands of these pots, which were used for only part of the year and often left broken. Today we know that aegypti likes to breed in the puddles that collect inside cast-off automobile tires; sugar pots were a seventeenth- and eighteenth-century equivalent. McNeill noted that the pots would have been full of sugary residue, fodder for the bacteria that aegypti larvae feed upon. Sugar plantations were like factories for producing yellow fever.

Incoming Europeans didn’t know these details. But they were entirely aware that the Caribbean was, as historian James L. A. Webb wrote in a recent history of malaria, “a lethal environment for non-immunes.”

Malaria percolated from the Caribbean into South America, and thence up the Amazon. The river has a plenitude of hosts: a 2008 survey of the Madeira River, an important Amazonian tributary, found no fewer than nine Anopheles mosquito species, all of them carrying the parasite. The first Europeans to visit Amazonia described it as a thriving, salubrious place; malaria and, later, yellow fever turned many of its rivers into death traps. By 1782 the parasite was sabotaging expeditions into the upper reaches of the river basin. For two centuries the disease was a sometime, scattered thing: big stretches of Amazonia, depopulated by smallpox and slavery, had too few inhabitants to sustain the parasite. It may have been more common in far western tributaries like the Madeira, which experienced fewer Dutch and Portuguese slave raids and thus had more people to infect. Malaria nearly killed the French naturalist Alcide d’Orbigny in 1832 in the Madeira region, but a decade later another naturalist, the U.S. amateur William Henry Edwards, “encountered but one case” of it on the river, despite camping for days near its mouth.

Much worse was the northeastern bulge of South America, the region the geographer Susanna Hecht has called the Caribbean Amazon. Bounded to the south by the Amazon River in Brazil and to the west by the Orinoco River in Venezuela, it was a watery place that Arawak and Carib people controlled with sprawling networks of dikes, dams, canals, berms, and mounds. Large expanses of forest were managed for tree crops, especially the palms that in tropical places provide fruit, oil, starch, wine, fuel, and building material. Beneath the palms lay scattered patches of manioc (cassava). This landscape of gardens, orchards, and waterways served for centuries as a bridge between the interior and the islands. Such complex arrangements typically are supervised by strong, well-organized governments. Europeans certainly thought the Indians had them—it explained why their repeated efforts to seize this rich agricultural land were repelled. Only in the eighteenth century did the foreigners gain a foothold, aided by the introduction of European diseases: smallpox, tuberculosis, and influenza cleared the way for malaria. Indians retreated into the interior as Europeans seized the coast, creating sugar plantations in what eventually became, after much international squabbling, Guyane (French Guiana), Suriname (formerly Dutch Guiana), and Guyana (formerly British Guiana).

The archetype may have been Guyane, formally acquired by France by treaty in 1763. Initial colonization efforts proved so disastrous that the nation almost forgot the colony’s existence until three decades later, when a military-backed coup overthrew the parliament established by the French Revolution. The new dictatorship piled 328 unwanted deputies, clergymen, and journalists into small vessels and dumped them in the colony. P. falciparum greeted them on the shore. Within two years more than half were dead, either killed by malaria or sufficiently weakened by it to be slain by other ailments. Undeterred, the French state kept sending over criminals and undesirables. French prisoners had in the past served as galley slaves on special prison ships in the Mediterranean. After the steam engine made galleys obsolete, convicts were dispatched to Guyane. Violent offenders ended up in the infamous prison on Devil’s Island, seven miles off the coast; the rest labored in agricultural chain gangs. Disease claimed so many that Guyane became known as a “dry guillotine”—a blade that killed without needing to wet itself with blood. Perhaps eighty thousand Frenchmen made the passage. Very, very few returned.

Unable to settle in disease zones, Europeans never established communities there. The ideal was offshore ownership. Europeans would remain in the safety of the home country while small numbers of onsite managers directed the enslaved workforce. Because captives would outnumber captors, intimidation and brutality would be necessary to keep the sugar mills grinding. In the realm of falciparum and yellow fever, sugar despotism became the rule: tiny bands of Europeans atop masses of transplanted Africans, angry or demoralized or stoic according to their characters.

Nothing is wrong with offshore ownership per se. If French wine makers buy wineries in California or U.S. wine makers buy wineries in Bordeaux or Burgundy, the acquisitions may sting local pride but are unlikely to have any larger effect on either nation. It is different if foreign wine makers buy every winery—or, stronger yet, if people thousands of miles away dominate every industry. One all-too-representative example: a single Liverpool firm, Booker Brothers, controlled three-quarters of British Guiana’s economy for almost a century. All the profits ended up on the other side of the ocean. So did all of the entrepreneurial, managerial, and technical expertise. Locals provided only labor. Indeed, they were punished if they tried to do anything else.

The French artist Édouard Riou, now best known for his illustrations of Jules Verne, traveled to France’s colony of Guyane in 1862–63. A visit to the prison islands produced this image of the sea burial of a convict, presumably a victim of malaria or yellow fever. (Photo credit 3.4)

As the economists Acemoglu, Johnson, and Robinson noted, distant and disconnected owners had little interest in building the institutions necessary to maintain complex societies: schools, highways, sewers, hospitals, parliaments, legal codes, agricultural-extension agencies, and other governmental systems. In places with a full array of functioning institutions, locals can compete economically with foreigners by developing new technologies and new business methods. In extractive states, locals never got the chance. Most of the English colonists who went to Virginia or Australia were servants or convicts, the lowest tiers in the social pyramid. But despite their bottom ranking their status as citizens gave them some ability to use the institutions of the homeland to push back when their leaders tried to oppress them. (Australia’s convicts, for example, began winning lawsuits against their would-be abusers almost as soon as they landed.) The slaves in extractive states had no such ability to tap into these institutions. Indeed, elites actively sought to cut off their access. A particular worry was education; echoing many in British Guiana, Booker president Josiah Booker denounced the notion of teaching his company’s employees to read because it would encourage them to aspire “far above their station in life.” Wrong ideas in the wrong people’s hands could put the elites’ political power at risk.

History suggests, Acemoglu, Johnson, and Robinson wrote, that industrialization cannot occur without “both investments from a large number of people who were not previously part of the ruling elite and the emergence of new entrepreneurs.” Both are next to impossible in extractive states. Over the decades, reformers tried to counteract the system’s effects. Missionaries provided education for Guyana’s children; the British Anti-Slavery Society thundered unceasingly against mistreatment, launched investigations, and provided aid. “Jock” Campbell, the visionary head of Booker Brothers’ corporate successor, spent decades improving sugar workers’ conditions. The reformers did everything but change the basic extractive system. When Guyana gained formal independence in 1966, 80 percent of its export earnings were controlled by three foreign companies, one of them Campbell’s. The new nation had just one university, a night school established three years before.

WAR AND MOSQUITOES

In malaria zones, the primary victims are children. Adults as a rule have already had the disease and, having survived, are immune. The adults with the most to fear are recent arrivals—a lesson learned in the Americas again and again, perhaps most dramatically during the U.S. Civil War. Much of the war was fought in the South by troops from the North. Crossing the Mason-Dixon Line, Yankees broke an epidemiological barrier. The effects were enormous.

In July 1861, three months after the conflict began, the Union’s Army of the Potomac marched from Washington toward the Confederate capital of Richmond, Virginia. It was repulsed at what became known to Yankees as the Battle of Bull Run and to Confederates as the Battle of Manassas. After fleeing back to Washington, the generals dragged their feet about further action. President Lincoln railed against their pusillanimity, but they may have had a point. In the year after Bull Run, more than a third of the Army of the Potomac suffered from what army statistics describe as remittent fever, quotidian intermittent fever, tertian intermittent fever, quartan intermittent fever, or congestive intermittent fever—terms generally taken today to mean malaria. Union troops in North Carolina fared still worse. An expeditionary force of fifteen thousand landed at Roanoke Island in early 1862 and spent much of the war enforcing a naval blockade from a fort on the coastline. The air at dusk shimmered with Anopheles quadrimaculatus. Between the summer of 1863 and the summer of 1864, the official annual infection rate for intermittent fevers was 233 percent—the average soldier was felled more than twice.

From the beginning the Union army was bigger and better supplied than the Confederate army. As at Bull Run, though, the North lost battle after battle. Incompetent generalship, valiant opponents, and long supply lines were partly to blame. But so was malaria—the price of entering the Plasmodium zone. During the war the annual case rate never dropped below 40 percent. In one year Plasmodium infected 361,968 troops. The parasite killed few directly, but it so badly weakened its victims that they succumbed readily to dysentery or measles or what military doctors then called “chronic rheumatism” (probably a strep infection). At least 600,000 soldiers died in the Civil War, the deadliest conflict in U.S. history. Most of those lives were not lost in battle: disease killed twice as many Union troops as Confederate bullets or shells.

Malaria affected the course of the war itself. Sick soldiers had to be carried in litters or shipped out at considerable cost. With so many sick for so long the resource drain was constant. Confederate generals did not control malaria or even know what it was, but it was an extra arrow in their quiver. Plasmodium likely delayed the Union victory by months or even years.

In the long run this may be worth celebrating. Initially the North proclaimed that its goal was to preserve the nation, not free slaves; with few dissenting votes, Congress promised rebel states that “this war is not waged” for the “purpose of overthrowing or interfering with [their] rights or established institutions,” where “established institutions” was taken to mean slavery. The longer the war ground on, the more willing grew Washington to consider radical measures. Should part of the credit for the Emancipation Proclamation be assigned to malaria? The idea is not impossible.

Plasmodium’s contribution to the birth of the United States was stronger still. In May of 1778 Henry Clinton became commander in chief of the British forces during the Revolutionary War. Partly on the basis of inaccurate reports from American exiles in London, the British command believed that the Carolinas and Georgia were full of loyalists who feared to announce their support of the home country. Clinton decided upon a “southern strategy.” He would send a force south, which would hold the region long enough to persuade the silent loyalist majority to declare its support for the king. In addition, he promised, slaves who fought for his side would be freed. Although Clinton didn’t know it, he was leading an invasion of the malaria zone.

Although almost forgotten today, yellow fever was a terror from the U.S. South to Argentina until the 1930s, when a safe vaccine was developed. This cartoon illustrated a magazine article about an 1873 outbreak in Florida. (Photo credit 3.6)

British troops were not seasoned; indeed, two-thirds of those who served in 1778 came from malaria-free Scotland. To be sure, many British soldiers had by 1780 spent a year or two in the colonies—but mostly in New York and New England, north of the Plasmodium line. By contrast, the southern colonists were seasoned; almost all were immune to vivax, and many had survived falciparum.

British troops successfully besieged Charleston in 1780. Clinton left a month later, instructing his troops to chase the Americans into the hinterland. The man he put in charge of the foray was Major General Charles Cornwallis. Cornwallis marched inland in June, high season for Anopheles quadrimaculatus. By autumn, the general complained, disease had “nearly ruined” his army. So many men were sick that the British could barely fight; Loyalist troops from the colonies were the only men able to march. Cornwallis himself lay feverish while his Loyalists lost the Battle of Kings Mountain. “There was a big imbalance. Cornwallis’s army simply melted away,” McNeill told me.

Beaten back by disease, Cornwallis abandoned the Carolinas and marched to Chesapeake Bay, where he planned to join another British force. He arrived in June 1781. Clinton ordered him to take a position on the coast, where the army could be transported to New York if needed. Cornwallis protested: Chesapeake Bay was famously disease ridden. It didn’t matter; he had to be on the coast if he was to be useful. The army went to Yorktown, fifteen miles from Jamestown, a location Cornwallis bitterly described as “some acres of an unhealthy swamp.” His camp was between two marshes, near some rice fields.

To Clinton’s horror and surprise, a French fleet appeared off Chesapeake Bay, sealing in Cornwallis. Meanwhile, Washington marched south from New York. The revolution was so short of cash and supplies that his army had twice mutinied. Nonetheless an opportunity had arisen. The British army was unable to move; Cornwallis later estimated that only 3,800 of his 7,700 men were fit to fight. McNeill takes pains to credit the bravery and skill of the revolution’s leaders. But what he wryly referred to as “revolutionary mosquitoes” played an equally critical role. “Anopheles quadrimaculatus stands tall among the Founding Fathers,” he said to me. With Cornwallis’s troops falling to the Columbian Exchange in ever-greater numbers, the British army surrendered, effectively creating the United States, on October 17, 1781.


1 It may seem odd that malaria, a tropical disease, flourished in the England of the Little Ice Age. But history is an interplay of social and biological processes. Just as Elizabethan marsh-draining techniques unintentionally helped vivax flourish, the improved drainage methods of the Victorian era dramatically cut malaria: because they left behind no brackish pools, they simultaneously eliminated mosquito habitat and created better pasturage for cattle, which A. maculipennis, given the choice, prefers to feed upon. Even so, researchers routinely found “thousands” of the insects roosting “in the dark and ill-ventilated pigsties” of poor coastal farmers as late as the 1920s. Today some fear that global warming will foster the spread of malaria. But if people continue destroying mosquito habitat by draining wetlands, the hotter weather may have no impact on malaria rates.

2 Early arrival of the parasite could help explain, too, why Opechancanough never expelled the colonists, even after almost wiping them out in 1622. Debilitated by disease, the Powhatan might have had difficulty mounting a sustained war. Unhappily, these intriguing speculations have the disadvantage of having no empirical support.

3 These figures do not include Indians seized in other colonies. During a vicious Indian war in 1675–76, for instance, Massachusetts sent hundreds of native captives to Spain, Portugal, Hispaniola, Bermuda, and Virginia. And the French in New Orleans seized thousands more. Carolina was a bigger slaver than others, but every English colony in North America was in the same business, with or without the cooperation of local Indians.
