FIVE

THE STABLE AND THE LABORATORY

Far from the battlefields of the nation’s first overseas colonial wars, American health officials on the U.S. mainland encountered rising resistance after 1900 to their own widening war on smallpox. The contentious politics of smallpox control centered on the growing divide between public health authorities and the public itself regarding the risks of vaccination.

Turn-of-the-century Americans lived in a world filled with risk. Each year one out of every fifty workers was killed on the job or disabled for at least four weeks due to a work accident. Railroad and streetcar accidents annually killed and maimed tens of thousands of people. Children worked in mines, stole rides on the back of moving cars, and played stickball in alleys carpeted with horse manure. Apart from a few things recognized by the courts as “imminently dangerous,” such as arsenic or nitroglycerin, product liability did not exist. The average American breadwinner carried just enough insurance to cover his own burial.1

During the first two decades of the twentieth century, a spate of new progressive social policies would create an enlarged role for the American government in managing the ordinary risks of modern urban-industrial life. The resulting “socialization” of risk, though narrow by the standards of Britain and Germany, was a dramatic departure for American institutions that prized individual freedom and responsibility. European-style social insurance gained traction in the first American workman’s compensation laws, enacted in forty-two states between 1911 and 1920. Mothers’ pension programs (launched in forty states during the same decade) provided aid to families that lost the wages of the “normal” (male) breadwinner due to his sudden death or disability. In tort law, too, the courts had women and children first in mind as they imposed tougher standards of liability upon railroad corporations. U.S. social politics still had a long way to go before a recognizably modern national welfare state insured its citizens against the financial insecurities of old age, or an American court seriously entertained the argument that an exploding Coke bottle entitled the injured party to compensation from the manufacturer. But the foundation was laid, in the social and political ferment of the Progressive Era, for a government that would one day promise its citizens “freedom from fear.”2

Arriving just as the American people and their policy makers began to seriously debate these issues, the turn-of-the-century smallpox epidemics raised broad public concerns about the quality and safety of the nation’s commercial vaccine supply. The ensuing controversy caused ordinary Americans, private physicians, and public officials to revise old expectations about risk and responsibility and the role of government in managing both.3

By the fall of 1901, the wave of American epidemics had carried smallpox to every state and territory in the union. The new mild type smallpox was the culprit in the majority of places, but deadly variola major struck several major American cities, particularly in the Northeast. Compulsory vaccination was the order of the day, enforced at the nation’s borders, in cities and towns, at workplaces, and, above all, in the public schools. The public policy was a boon to the vaccine industry, driving up demand for smallpox vaccine. American vaccine makers of the day ranged in size from rising national pharmaceutical firms such as Detroit’s Parke, Davis & Company and Philadelphia’s H. K. Mulford Company (a U.S. forerunner of today’s Merck) to the dozens of small “vaccine farms” that sprouted up around the country. To meet the unprecedented demand for vaccine-coated ivory points or capillary tubes of liquid lymph, the makers flooded the market with products, some inert, some “too fresh,” and some seriously tainted. Complaints of vaccine-induced sore arms and feverish bodies filled the newspapers and medical journals. Every family seemed to have its own horror story.

Popular distrust of vaccine surged in the final months of the year, as newspapers across the country reported that batches of tetanus-contaminated diphtheria antitoxin and smallpox vaccine had caused the deaths of thirteen children in St. Louis, four in Cleveland, nine in Camden, and isolated fatalities in Philadelphia, Atlantic City, Bristol (Pennsylvania), and other communities. In all but St. Louis, where antitoxin was the culprit, the reports implicated vaccine. Even The New York Times, a relentless champion of compulsory vaccination, expressed horror at the news from Camden, the epicenter of the national vaccine scare. “Vaccination has been far more fatal here than smallpox,” the paper told its readers. “Parents are naturally averse to endangering their children to obey the law, claiming that the chances of smallpox seem to be less than those of tetanus.”4

Pain, sickness, and the occasional death after vaccination were nothing new. But the clustering, close sequence, and staggering toll of these events were unprecedented in America. Newspaper stories of children dying in terrible agony—their jaws locked and bodies convulsing, as helpless parents and physicians bore witness—turned domestic tragedies into galvanizing public events. Allegations of catastrophic vaccine failure triggered extraordinary levels of conflict between angry citizens and defensive officials. In one typical incident, which occurred as the ninth Camden child entered her death throes, the health officials of Plymouth, Pennsylvania, discovered that many parents, ordered to get their children vaccinated for school, were secretly wiping the vaccine from their sons’ and daughters’ arms.5

Jolted from their professional complacency, physicians and public health officials were forced to reconsider the existing distribution of coercion and risk in American public health law. In one sense, compulsory vaccination orders, whether they applied only to schoolchildren or to the public at large, already socialized risk. The orders imposed a legal duty upon individuals (and also parents) to assume the risks of vaccination in order to protect the entire community from the presumably much greater danger of smallpox. Spreading the risk of vaccination across the community made its social benefit (immunity of the herd) seem a great bargain. As any good progressive knew, the inescapable interdependence of modern social life required just such sacrifices for the public welfare and the health of the state. Still, the state did almost nothing to ensure vaccine quality. The bacteriological revolution spawned a proliferating array of “biologics”—vaccines, antitoxins, and sera of endless variety—that were manufactured in unregulated establishments and distributed, by the companies’ druggist representatives and traveling detail men, in unregulated markets. The risks of these products lay where they fell—on the person left unprotected by an inert vaccine or poisoned by a tainted one.6

The situation illustrates the larger dualism of American law at the turn of the century. Ordinary Americans, particularly working-class people, were caught between the increasingly strong state presence in their everyday social lives and the relatively weak state regulation of the economy. And the government insulated itself from liability. In a leading decision, handed down just three years before the Camden crisis, the Georgia Supreme Court took up the question of whether a municipal government could be sued for injuries caused by bad vaccine used by its public vaccinators. The answer was an unblinking No. Citing “a principle as old as English law, that ‘the King can do no wrong,’” the court refused to allow a resident of Rome, Georgia, who had submitted to vaccination “under protest,” to sue the government for using “vaccine matter which was bad, poisonous and injurious, and from which blood poisoning resulted.” To allow such a case to proceed, the court warned, “would be to paralyze the arm of the municipal government, and either render it incapable of acting for the public weal, or would render such action so dangerous that the possible evil consequences to it, resulting from the multiplicity of suits, might be as great as the smallpox itself.” The arm of the state was protected; the arm of the citizen was not.7

Supporters of compulsory vaccination defended the policy in a quasi-scientific rhetoric of risk assessment. From the expert point of view, lay concerns about vaccine safety were steeped in ignorance and fear, which should have evaporated in the face of hard statistical evidence. Officials assured the public that vaccines were safer than ever: “the preparation of glycerinized vaccine lymph has now been brought to such perfection that there should be no fear of untoward results in its use,” Surgeon General Walter Wyman said three years before Camden. Even if untoward results did arise, the social benefits of vaccination outweighed the costs. As the Cleveland Medical Journal put it, “Better [by] far two score and ten sore arms than a city devastated by a plague that it is within our power to avert.”8

The vaccine crisis of 1901–2 revealed that cost-benefit analysis was not the only way Americans thought about risk. When the Times observed that Camden parents reasonably concluded that vaccination had become more dangerous than smallpox, turning the public health argument on its head, the paper made a rare concession to vaccination critics. As the Times said, the incidents were “furnishing the anti-vaccinationists with the only good argument they have ever had.” But most worried parents would not have called themselves “anti-vaccinationists.” And much more was involved in the rising popular resistance to vaccination in 1901 than a cool-headed consideration of quantifiable facts.9

Perceptions of risk—the intuitive judgments that people make about the hazards of their world—can be stubbornly resistant to the evidence of experts. This is because risk perceptions are mediated by experience, by culture, and by relations of power. Certain factors tend to elevate the sense of risk that a person associates with a specific thing or activity, even in the face of countervailing statistical data. A mysterious phenomenon whose workings defy the comprehension of laypeople causes more dread than a commonplace hazard. A hazard whose adverse effects may be delayed, rather than immediate, heightens perceived risk. Significantly, perceived risk tends to spike when the hazard is not voluntarily undertaken. This is especially true when the social benefits claimed for a potentially hazardous activity are not readily apparent to those ordered to undertake it.10

All of which helps to explain why in the fall of 1901 popular perceptions diverged so radically from the official line on vaccine safety. A century after the introduction of Jennerian vaccination, vaccines remained mysterious entities—even to the companies that made them and the physicians who used them. Many American communities had experienced neither a smallpox epidemic nor a general vaccination in over fifteen years, increasing both the public’s sense of complacency about the disease and its unfamiliarity with the prophylactic. By force of law, local health boards and school boards ordered citizens to assume the risks of vaccination. Many did, some eagerly, some grudgingly, some only with a billy club against their back. Then the St. Louis and Camden tragedies shocked the nation. Public confidence in the vaccine supply, already shaky, plummeted. Opposition to compulsory vaccination, already strong, surged. Ultimately, these events pierced the veil of official certitude and corporate confidence. Vaccine companies publicly accused each other of peddling poisonous virus. Some health boards suspended vaccination orders. Others launched investigations of vaccine purity and potency. In medical meetings, in newspaper columns, and on statehouse floors across the country, the debate increasingly turned on a single issue: the right of the state to regulate vaccines. In the fall of 1901, regulation was a controversial idea. A few months later, it was federal law.11

A South Jersey industrial city of 76,000 people, Camden lay just across the sewage-choked Delaware River from Philadelphia. Times were good. Camden’s population had grown by 30 percent during the 1890s. Decent jobs could be had at the Pennsylvania Railroad and in the city’s ironworks, chemical plants, shoe factories, cigar companies, lumber mills, oil cloth factories, and woolen mills. Though the presence of immigrants and other newcomers was more keenly felt than in the past, Camden people remained overwhelmingly white and American-born, a generation or more removed from Europe. Crowded tenements of the sort found in New York and Chicago were scarce. Wage earners lived in low-slung neighborhoods of single-family homes. As in most communities, the people of Camden invested their pride and dreams in the rising generation. In September 1901, eight thousand children took their seats in the city’s thirty-two public schools. By mid-November, half of those desks would be empty.12

The trouble started on October 7. Eight-year-old Pearl Ludwick took ill with smallpox, followed, in quick succession, by her father, an oil cloth printer, and all seven of her brothers and sisters. Only Pearl’s mother was spared the pox; those days must have been among the most trying of her life. Then Pearl’s father and eldest brother rose from bed one night and, both delirious with the fever, bumped a table, which knocked over a lamp. The ensuing blaze burned the Ludwick house to the ground—but not before the Ludwicks got out and hundreds of neighbors rushed to the scene. All, of course, were exposed to smallpox. With this improbable chain of events commenced the Camden smallpox epidemic of 1901–2.13

New Jersey had seen little smallpox during the past sixteen years, and vaccination had fallen out of practice. But in 1901 smallpox seemed to be causing trouble everywhere in the United States, including Philadelphia. That summer, anticipating an epidemic year, the New Jersey Board of Health issued a public warning. “An extensive outbreak of small-pox can be prevented with absolute certainty if vaccination of all susceptible persons is secured,” the board declared. “[T]he question now arises, Shall general vaccination be done before a great calamity compels resort to this preventive measure, or must there first be startling losses of life to arouse parents, guardians, school boards, the public, and in too many instances the health authorities also, to a realizing sense of their duty to institute precautions against the spread of this pestilential disease?” No matter how you parsed that question, the message was dead serious. But it took the Ludwick family fire to bring its meaning home to Camden.14

Camden authorities ordered a municipal pesthouse built, and physicians worked long hours to meet the “rush to get vaccinated.” For those families who still needed convincing, the Camden Board of Education announced that it would enforce an 1887 state law that authorized local boards to exclude unvaccinated children. The Camden Board of Health president, Dr. Henry H. Davis, who happened also to be the medical director of the school board, dispatched vaccinators to the city schools. The Camden Medical Society opened a free vaccine station on Federal Street, in the heart of the city. And many residents were vaccinated by private physicians or, on the cheap, by the neighborhood druggist. Within a month, an estimated 27,000 people—more than one third of the city’s residents—had undergone vaccination, including five thousand public schoolchildren. And the scraping continued. Across the city, children and adults alike had the sore arms and fresh scars to show for it.15

The state board advised physicians to exercise care when performing the procedure. “The operation of vaccination should be conducted with aseptic precautions,” the board instructed, “and none but glycerinized lymph from a trustworthy producer should be employed.” The board was referring to liquid vaccine that had been treated with glycerin, which acted as a preservative and killed bacteria in the product. Glycerinized vaccine was the state of the art. Whether from a sense of political propriety or fair play to the Philadelphia area’s many vaccine companies—including H. M. Alexander’s Vaccine Farm, H. K. Mulford Company, and John Wyeth & Brother—the board refrained from endorsing any make of vaccine and offered no advice as to how anyone might distinguish the “trustworthy” from the more dubious products on the market. Trust was a commercial transaction, not a public dispensation.16

In early November, word spread in Camden that a sixteen-year-old boy named William Brower had come down with tetanus. Few of life’s hazards caused parents more worry than the infectious disease most folks called lockjaw. The New York writer W. J. Lampton called it “one of the strangest and most horrible maladies known to man.” In 1900, more than 2,200 Americans died from it. The tetanus bacillus was discovered in 1884 in a Göttingen laboratory. Since then, scientists had found the germs in hay dust, crumbling masonry, garden soil, and, especially, horse manure. Turn-of-the-century America—from the farms to the cities—crawled with the stuff. Even so, as Army Surgeon General Sternberg noted in his treatise, Infection and Immunity, simply ingesting bacilli-rich filth would not cause infection. Nor was tetanus contagious. The bacilli did not grow in the presence of oxygen. It usually took a traumatic event—a wound of some kind, the narrower and deeper the better—to introduce bacilli into a human body in a way that could cause infection. The classic culprit was a rusty nail—not because of the chemical composition of the rust itself, but because it made the surface of the nail rough enough to hold an abundance of bacilli which the sharp, skinny nail could drive home without much bleeding. Every Fourth of July, hundreds of American boys caught tetanus after cutting their hands with toy pistols.17

The symptoms of lockjaw were terrible. William Brower suffered them all. The son of a plumber, the boy had seemed in fine health until he fell suddenly ill around November 1. He suffered a high fever. He felt the telltale stiffness in his face. His jaws tightened like a vise. Excruciating contractions spread from the jaw and neck to all the muscles of the body. His spine arched, as convulsions racked his body. The doctors administered the tetanus antitoxin, a relatively new product with a low rate of success. No one expected the boy to survive. According to the Philadelphia North American, William’s mother Sarah said, in her grief, “Never, never again shall I have one of my children vaccinated.” William had been vaccinated nineteen days earlier. To his parents there seemed no better explanation for his misery. The trusted family physician who had vaccinated William, Dr. William H. Kensinger, disagreed. “Vaccination doesn’t produce tetanus; that I know,” he said.18

Then came the news that sixteen-year-old Lillian Carty was critically ill with tetanus. The daughter of a railroad clerk, Lillian had been vaccinated twenty-one days earlier by Dr. S. G. Bushey, the city coroner and a prominent member of the Camden Board of Health. Lillian’s parents posted a sign at their front door, asking passersby to keep quiet, because the slightest noise agitated her and sent her into convulsions. Antitoxin was administered. No one expected her to survive.19

Neither Brower nor Carty was the first child to die. On November 11, Thomas B. Hazelton, age eleven, the son of a shipping clerk, was in the street playing when he started to feel ill, with a pronounced stiffness in his jaw. Someone called for Dr. Bushey, who as the Hazeltons’ family physician had vaccinated the boy about three weeks earlier. Never had Bushey seen a patient suffer such “terrible agony.” Less than twenty hours after Thomas took to his bed, he was dead. According to the New York Tribune, now covering the Camden story, Bushey moved to set the record straight. “[T]he boy’s death was not the result of vaccination,” the coroner declared. But Thomas’s parents had doubts. Mr. Hazelton said he might seek legal advice. He wanted to know whether the vaccine used on his boy was pure and, if it was not, whether the manufacturer could be held responsible for his death.20

The next day, November 13, tetanus struck nine-year-old Anna Cochran, the daughter of a teamster. She had been vaccinated about three weeks earlier. The story of little Anna’s courage, as convulsions shook her small frame, was, as the New York Sun told it, “particularly sad.” Just before she died, on November 14, Anna “turned to her parents and whispered through her clenched teeth: ‘Don’t worry, papa and mamma, I’m going to get well.’”21

As parents’ initial suspicions swelled into a panic, Dr. Davis of the board of health made a statement to the press. Camden’s most prominent physician attributed the tetanus cases to a period of unusually dry and dusty weather. “I am satisfied that none of them have been caused by vaccination,” said Davis, “but by the tetanus germs in the air.” Local physicians formed a unified public front with Davis and the board, insisting that the vaccine they had used was safe. But a few expressed doubts. Dr. Dowling Benjamin, considered a local authority on tetanus, broke ranks. “This talk of germs being in the air is all absurd,” he said. “If that were so there would be more lockjaw than there now is. I think it is highly probable the tetanus germs were in the vaccine tubes before they were sealed.”22

Local newspapermen turned up three more dead children whose deaths by tetanus had previously gone unreported. Eleven-year-old Anna Warrington, the only child of an illiterate ship carpenter and his wife, had died on November 8, after suffering in “great agony.” Six-year-old Frank Cavallo, the child of Italian immigrants (his father was an illiterate rag dealer), had been vaccinated in Philadelphia during a visit to his grandmother; he died three weeks later, on November 9. The other new victim, unnamed, lay buried in the Evergreen Cemetery, believed to have died on November 5. A growing distrust of the authorities strengthened the public’s fears. Why hadn’t public health officials reported these cases earlier?23

On the night of November 15, Lillian Carty gave up her fight. The doctors had done all they could, the newspapers said, administering antitoxin and trying to ease her suffering as her muscles contracted. “Conscious through it all,” the New York Tribune reported, “she suffered frightfully for two days.” Her parents, exhausted from the long ordeal at her bedside, were prostrated in their grief. Remarkably, William Brower was still alive, but in critical condition. The bad news kept coming. The day Lillian died, another child had been diagnosed with tetanus following vaccination. Her name was Mamie Winters. She was eight years old.24

Camden was now in a full panic, and regional newspapers had taken notice. With the tetanus outbreak now weighing far more heavily on people’s minds than the continuing smallpox epidemic, city health officials and parents searched, in their own ways, for connections between the lockjaw cases. They found few. The children ranged in age from six to sixteen. No two of them lived in the same ward of the city. None had visited the free vaccination station, and no more than two had been vaccinated by the same physician. As the Camden Board of Health saw things, though, there were significant commonalities. Board representatives observed that most of the children were from “lower class” families (a dubious claim, as Hazelton’s father was a shipping clerk; Brower’s, a plumber; Carty’s, a railroad clerk); that the parents were “ignorant” (also unfair, for most of the parents were at least literate); and that they inhabited a dirty city that had experienced a spell of dusty weather (demonstrably true). For the lay public, the salient commonalities had nothing to do with social status or the weather. All of the children had been healthy until they were vaccinated. Roughly three weeks later each fell ill with lockjaw. Now six of them were dead. Most of the children had received glycerinated vaccine. To these links, the New York Sun, in a November 17 report, added another: most of the vaccine used in Camden had apparently come from a single, trusted Philadelphia firm, H. K. Mulford Company.25

It was probably inevitable that suspicion would fall upon the Mulford Company vaccine farm and laboratory in Glenolden, Pennsylvania, just outside Philadelphia. Mulford marketing materials boasted of the company’s vaccine sales in eastern Pennsylvania and New Jersey. When the Camden Board of Health announced its plan for wholesale vaccination, Mulford and Marietta-based Alexander Vaccine Farm vied to corner the market. According to the Sun, a local chemist who represented Alexander approached the Camden Medical Society and seemed poised to win the contract for the vaccine station. Mulford countered by offering the society a thousand free points. Demand quickly exhausted that gratis supply, and the society bought more vaccine from Mulford, as did many private physicians. Almost all of the afflicted children had received Mulford virus. Company executives insisted the vaccine was pure. The allegations, they said, had come from pharmacists who served as agents for their rival companies, Alexander and Parke, Davis.26

The parents of Camden demanded a public investigation of the tetanus outbreak. James B. Cochran, Anna’s father, swore that if the authorities did not “fix the blame,” he would “spend his last dollar doing it himself.” Every family in the city had cause for concern. Parents whose sons and daughters had dutifully submitted to vaccination were terrified they would be the next to fall ill. (The children were afraid, too. At Lillian Carty’s funeral, her schoolmates cried for her and worried for themselves.) Parents whose children had not yet been vaccinated feared that submitting now would expose them to an unacceptable risk of lockjaw.27

Camden families launched a school strike, hundreds of parents declaring that their children would not return to the classroom until the school board rescinded its vaccination order. Some parents also talked about litigation, considering whether to sue the vaccine company or seek a court order to open the schools to unvaccinated children. To a knowledgeable lawyer, neither avenue would have looked promising in 1901. One prevailing principle in tort law (“privity of contract”) insulated manufacturers from liability for injuries to anyone other than those to whom the makers sold the vaccine directly; while another principle (“contributory negligence”) limited a defendant’s liability if he could show that the plaintiff had negligently contributed to his own injury (for example, by carelessly letting dirt enter a vaccination wound). Moreover, under New Jersey’s wrongful death statute, if the plaintiff’s lawyer somehow proved the manufacturer’s liability in court, the child’s next of kin (normally, the father) would have been entitled only to compensation for his direct pecuniary loss: the child’s wages, if any. As for the other legal strategy—seeking a judicial writ to compel school officials to admit their unvaccinated children—two circumstances would have hampered that claim: the school board was acting in accordance with a state law, not merely at its own discretion, and the board had promulgated the order in the midst of an actual smallpox epidemic. In the American legal environment of the era, a school strike was a far more viable option than a lawsuit. But even that option carried a risk: school officials could have had the parents prosecuted for violating the compulsory education law.28

Increasingly, people in Camden asked if the compulsory order had really been necessary. On the day Anna Cochran died, the Camden Board of Health had released its monthly report. There had been just fourteen cases of smallpox since October, with only one fatality. The toll from tetanus was much higher. “Camden people are demanding to know where the benefits of vaccination come in,” said the Sun. According to the Times, some citizens now saw the health board as an “autocratic” institution, unaccountable to the people.29

Events came to a head on November 18, six days after Thomas Hazelton’s death. Camden’s vaccine crisis was no longer just a local or regional story. It was a national event. Reports of isolated postvaccination tetanus deaths—more schoolchildren—surfaced from Atlantic City and Bristol, Pennsylvania. Philadelphia, too, reported “several cases of tetanus following vaccination, but no official action has been taken.” As telegraph wires fed newspapers from Charlotte to San Francisco the latest from Camden, journalists dusted off other stories from the past year. “The tetanus bacillus has admittedly found its way into commercial virus to such an extent as to have given serious trouble in at least five widely separated districts, and probably in isolated cases wherever vaccination is practiced,” said the Times. Cleveland had lost four people to postvaccination tetanus during the past year. Previously, postvaccination tetanus was a rare complication. One investigator would turn up more than sixty U.S. cases from 1901 alone; most had occurred in November. All of those local events and stories seemed connected, like an epidemic, creating a widening sense of collective connectedness and complicity that transcended local political boundaries.30

Also on November 18, the St. Louis coroner announced his verdict regarding the first seven deaths from tetanus that had followed the administration of diphtheria antitoxin to children in that city. Citing bacteriological tests, the coroner said the cause of the deaths was the administration of antitoxin containing tetanus toxin. The city health department, not a private firm, had prepared the antitoxin—an experiment in public production that had won the department no small amount of criticism from private companies and druggists. All of the tainted antitoxin had been produced from the blood of a single animal, a horse named Jim, “stabled at the Poorhouse Farm.” Jim had developed tetanus in October and was put down. But serum had been drawn from Jim before his symptoms became apparent, and the serum had not been destroyed. Compounding the public relations disaster was the revelation that the job of bottling the serum had been entrusted to a janitor. The coroner charged the health department with negligence. American newspapers readily extrapolated from the coroner’s findings to the vaccine cases. “No other suggestion is reasonable,” said the Duluth News-Tribune, “than that the unwelcome bacilli secured a lodging place in the virus and the antitoxin in the laboratory.”31

The tetanus scares triggered opposition to vaccination in many American communities. In Rochester, New York, in the midst of its own smallpox outbreak, parents responded to the news from Camden, 350 miles away, by refusing to allow their children to be vaccinated. Two schools were “practically closed for want of attendance.” In response, the city health officer, according to the New York Tribune, “deprecated the displaying of the Camden news.” His peers in many other American communities shared his frustration.32

From the beginning of the crisis, Camden health officials and doctors had maintained a united front in defense of vaccination. But with six children dead so far, the schools half-empty, and a national scandal brewing, the board of health called a halt. On the night of November 18, the members passed a resolution ordering physicians to cease vaccination until further notice. The board advised the school board to suspend enforcement of the vaccination law, which that body did the following day. The health board launched a scientific investigation to determine the causes of the tetanus outbreak and, as James Cochran had demanded, fix the blame. The Mulford Company promised its full cooperation.33

The board members were not the only medical men determined to settle these same questions. Working on their own, three other men had quietly begun their own investigations—inquiries that would push the limits of medical science. Two of them, Robert Willson and Joseph McFarland, were physicians from neighboring Philadelphia. Willson had recently lost a patient to postvaccination tetanus. McFarland was one of America’s leading bacteriologists; his work with diphtheria antitoxin had put Mulford on the map, but he had left the company for academia and a consulting job with Mulford’s rival, Parke, Davis. The third investigator, Milton J. Rosenau, was an officer of the federal government, working in a small Washington laboratory, within the U.S. Marine-Hospital Service, that would one day be known as the National Institutes of Health.

All three men believed vaccination was medical science’s greatest gift to humanity. All sought an answer to the crisis that had discredited that operation during the most serious visitation of smallpox the nation had seen in years. Their investigations ensured that the Camden Board of Health would not have the last word on the matter.

The Camden tragedy cast unwanted light upon a hitherto little-known sector of the U.S. economy. Part animal husbandry, part laboratory science, the vaccine industry exemplified the distinctive historical in-betweenness of life at the century’s turn. On city streets, automobiles and streetcars vied for the road with horse-drawn carriages. In the public sphere, a new scientific rhetoric of social statistics and structures pressed against the older Protestant moralism of individuals and strictures. And in one of the most profitable manufacturing sectors of the U.S. economy, future giants of the nation’s pharmaceutical industry—companies such as Wyeth and Parke, Davis—were making names for themselves by harvesting pus from the undersides of barnyard animals. Poised between the stable and the laboratory, the farm and the firm, the vaccine industry embodied a world in transition. Of course, the vaccine makers had no way of knowing what their industry would one day become, but the most innovative among them dared to dream big. They forged close ties with government health departments and universities. And they embraced medical science—not just for the technical innovations that science enabled but for the credibility it offered to an industry built upon incredible promises.34

Although vaccination arrived in America in 1800, vaccine manufacturing did not emerge as a commercial industry until the 1870s, with the shift from “humanized” to “bovine” virus. Of course, Edward Jenner had obtained his original vaccine material from a cow, albeit by an indirect method: he took the “lymph” from a pustule on the hand of a milkmaid infected with cowpox. Uncertain about the origin of this disease, the doctor named it “variolae vaccinae,” smallpox of the cow. And though Jenner speculated that the disease might have originated in an affliction of horses (and he may have been right), the name vaccine stuck.35

Naturally occurring cases of cowpox were rare. Fortunately, Jenner established that vaccine could be serially reproduced in humans. The method entailed taking fluid directly from the vaccination vesicle on the arm of a donor (“vaccinifier”), usually a healthy young child, and applying the virus to the scratched arms of an assembly of recipients. Humanized virus: vaccine without the vache. The possibilities were breathtaking, as the Balmis expedition showed the world in 1803–5, transporting vaccine in the arms of orphans to the Spanish colonies of the Americas and the Philippines. In England, the National Vaccine Establishment assumed responsibility in 1808 for maintaining a supply of humanized virus, through serial “arm-to-arm” transfers. The virus could also be preserved and transported by drying the fluid on pieces of thread, quills, or ivory points; or by peeling the crust from a vaccination sore. The lymph-saturated crust could be carried or even sent in the mail; the vaccinator would triturate (crush) and moisten the crust, producing a pasty vaccine material, and then set to work. A North Carolina physician recalled vaccinating his entire town in 1854 with a “very ugly little scab” that he received by post from Wilmington.36

Humanized virus worked. When properly collected and used, it took “with great regularity” and produced immunity for years. But there were problems. If the vaccinifier was not as healthy as she appeared, the virus could communicate other human diseases, including erysipelas (an acute skin infection) and syphilis. In Rivalta, Italy, in 1861, sixty-three children were vaccinated with material from the vaccinal sore of a single, seemingly healthy infant. Forty-six of the children fell ill with syphilis, several died, and some passed the disease to their mothers and nurses. The risks of arm-to-arm transfer inflamed antivaccination sentiment almost everywhere it was practiced. Herbert Spencer called it “wholesale syphilization.”37

A second disadvantage of humanized virus was the challenge, even in a densely populated community, of keeping a fresh supply on hand. For a small town or sparsely settled rural area, keeping up an arm-to-arm relay or a good stock of crusts might prove impossible. Moreover, some physicians believed that humanized virus became attenuated or compromised over time, with the ever-increasing distance from the original bovine source. Humanized virus had one other major shortcoming, which only became fully apparent in retrospect. There was never much of a market in it.38

The idea of using cows, instead of people, to manufacture cowpox first took hold in Italy. Throughout the century, in Europe and America, some vaccine propagators practiced what came to be called “retrovaccination”: inoculating heifers with humanized virus, either to restore some bovine quality to the vaccine or simply to make animals do the work. But the production of the stuff that came to be known as “true animal vaccine” or “bovine virus”—and that would launch a new industry and market—did not catch on in Europe until the 1860s. The idea was to inoculate a heifer with seed virus obtained from a naturally occurring case of cowpox (not with humanized virus) and to keep the strain running from calf to calf in a continuous relay, all the while harvesting vaccine for use in humans.39

Bovine vaccine avoided the worst problems that plagued humanized virus. As the Italian practice was adopted by France (1864), Belgium (1865), Japan (1874), and Germany (1884), government officials and private entrepreneurs greeted each newly discovered outbreak of cowpox as a wellspring of vaccine. One of the most famous cases of “spontaneous cowpox” came to light in 1866 in Beaugency, in France’s Loire Valley. Although vulnerable to contamination, bovine virus did not spread syphilis. A calf could produce vaccine in far greater quantity than an infant could (while raising fewer qualms). And as doctors, farmers, and druggists soon realized, there was money to be made in bovine vaccine.40

Dr. Henry A. Martin of Boston introduced bovine vaccine to the United States in 1870. Using seed lymph from the Beaugency strain, which by that time had already passed through 260 heifers in France, Martin established a vaccine farm in suburban Roxbury. Martin may also deserve credit for initiating the American vaccine makers’ practice of tarring rivals’ products. An early advertisement said Martin virus should not be “confounded with the feeble, uncertain, and generally quite worthless product of retrovaccination.” Martin’s family-run establishment operated continuously and with good reputation into the early twentieth century. Others quickly followed in Martin’s footsteps, most notably Dr. E. L. Griffin of Fond du Lac, Wisconsin, and Dr. Frank P. Foster of New York. By the mid-1870s, vaccine farms were sprouting up all over the country.41

Many of the earliest vaccine producers were men much like Martin and Griffin—reputable local physicians who knew their way around a stable. Some traded on their prominence as members of state or local boards of health. But so low were the barriers to entry—a bit of seed virus, a few cows, and some ivory points—that men on the make from many walks of life entered the business. With equal parts admiration and distaste, the Brooklyn Eagle captured the spirit of the new enterprise. “If it be true that what is one man’s meat is another man’s poison, it is equally true, of course, that what is one man’s poison is another man’s meat,” the Eagle said. “The axiom, as amended, is fully verified in this good city of Brooklyn, where men are deriving handsome incomes from that most disgusting and abhorrent of all diseases, small-pox. A new business of vital importance to the community has been started, and hundreds of thousands of men, women and children are walking about with its badge on their arms.” In 1871, the New York Department of Health became the first municipal agency in the United States to produce its own vaccine. But elsewhere private makers had the field almost entirely to themselves. And as compulsory vaccination and its handmaiden, compulsory education, spread in the late nineteenth century, the opportunities for profit expanded apace.42

To distinguish their products on the open market, vaccine makers appealed to the late nineteenth-century romance of the pastoral and the era’s penchant for pedigree. Americans had a fascination with animal breeding and family genealogies, informed by the transatlantic flourishing of hereditarian ideas in the age of Darwin and Galton. Dr. W. E. Griffiths of Brooklyn boasted that his stock derived from a case of spontaneous cowpox discovered in Central New York. Dr. J. W. Compton & Son of Indiana advertised “pure Beaugency cow-pox lymph, non-humanized.” In 1885, John Wyeth & Brother, Philadelphia druggists, announced their entry into the field with a full-page advertisement in Drugs and Medicines of North America. Calling its new Chester County farm “the model vaccine propagating establishment of the United States,” the Wyeth Company obtained its seed virus from the Belgian government. Like many vaccine ads, this one pictured a cow: a healthy-looking heifer, bound to a table beneath a lace-curtained window; on the calf’s lower belly were several rows of incisions, where the seed had been introduced. Vaccine companies’ claims to exalted origins for their products were greeted with jeering from some quarters. Dr. J. W. Hodge insisted, to everyone who would listen, that no vaccine maker had “any definite or exact knowledge as to the real nature, composition or original source of the complex poisonous mixture which they foist upon gullible doctors as ‘pure calf lymph.’”43

Hodge had a point. In an era when neither smallpox nor cowpox could be seen under the most powerful microscope, the manufacturers’ genealogical claims were beyond verification. It was not until 1939 that a British scientist established that most vaccines contained neither smallpox nor cowpox, but a related orthopoxvirus called vaccinia. At some time between Jenner’s first experiments with cowpox in 1796 and the 1930s, vaccine makers had started working with this different virus, which also occurs naturally in cows. No one knows when the switch occurred, though the late nineteenth century would seem a good bet. In any event, vaccinia worked. Like cowpox, when introduced in the human system it caused an immune response, usually mild, that conferred a lasting (though not permanent) immunity to smallpox.44

The makers’ claims to product purity were easier to test than their pedigree claims. In 1895, Walter Reed of the Army Medical Department presented a paper to the District of Columbia Medical Society entitled “What Credence Should Be Given to the Statements of Those Who Claim to Furnish Vaccine Lymph Free of Bacteria?” His answer: none at all. Reed had examined points from several leading U.S. makers. The number of bacteria per point ranged widely, from 43 to 89,000. Most of those germs appeared to be harmless, but others were pathogenic, capable of causing sore arms and infections. Bovine virus was liable to contamination from the common bacteria, such as streptococci and staphylococci, that thrived upon calves’ skin or in the stable.45

The following year, the Pennsylvania State Board of Health dispatched one of its own bacteriologists, Dr. R. L. Pitfield, to inspect American vaccine farms. Of the fourteen farms Pitfield visited, he could recommend only four. Amidst the wildly various production standards, Pitfield found a common ground of rank commercialism and “a tenacious adherence to original and old and rather preaseptic measures.” In one Missouri establishment, a worker used his own fingernail to remove the crust; in another, the heifers were kept in a “dusty and dirty apartment,” with urine streaming in from the operating room on the floor above them. Even at the New York City Health Department, one of America’s leading scientific establishments, Pitfield found “the accommodations are not as good as they should be.” Among the many troubling statements in Dr. Pitfield’s detailed report was this one: “In many establishments, tetanus bacilli might find their way to the vesicle and thence to the points and tubes, because dust in large quantities abounds in the incubating stables.”46

In the late 1890s, American vaccine makers adopted a new production technique that reduced the problem of bacteria-ridden vaccine. For decades, European makers had added glycerin to their product to keep it from decomposing. In 1891, the Englishman Sydney Monckton Copeman established that glycerin not only preserved vaccine but gradually killed unwanted bacteria without damaging the virus. Glycerin also acted as a diluent, allowing makers to stretch lymph and thus greatly increase the number of vaccine units that could be produced from a single calf. By 1898, glycerinated calf lymph had become the international standard of vaccine, widely preferred by the leading local, state, and federal health officers in the United States.47

While some companies (such as the Martin Vaccine Farm and the Washington, D.C.–based National Vaccine Establishment) still dealt chiefly or exclusively in smallpox vaccines, other industry leaders had a much larger footprint in the marketplace. Firms like Parke, Davis and Wyeth Company sold a growing number of biological products as well as compounded drugs of almost infinite variety. Even as firms opened branch houses in major U.S. cities and overseas, their vaccine lines required that they keep one foot planted on the farm. H. K. Mulford Company, one of America’s most reputable manufacturers of biologics and drugs, still adorned its vaccine ads in 1901 with a healthy heifer standing contentedly by a gentle stream and thought nothing of running those vaccine ads directly beneath another for pint bottles of “Mulford’s Pre-Digested Beef.” That both products might come from precisely the same source was a fact worth publicizing. Field, laboratory, and slaughterhouse were stages of an industrial life cycle that bound urban life, as ever, to the domestication of rural animals and landscapes.48

For the H. K. Mulford Company, “Manufacturing Chemists,” a newcomer to the vaccine market in 1898, dealing in biologics meant reversing the expected American trajectory. Mulford was born in the city and moved to the country. The company got its start in the late 1880s when twenty-one-year-old Henry K. Mulford bought the “Old Simes” corner drugstore in downtown Philadelphia. At first, Mulford seemed poised to follow the conventional road of the entrepreneur in Philadelphia’s robust drug trade, the largest in the United States outside of New York. He introduced his own line of medical preparations, including elixirs, lozenges, liquors, tinctures, antiseptics, and soda fountain syrups. In 1891, with new financial backing, Mulford incorporated and began its swift transformation from retail druggist to nationally prominent manufacturing firm. Henry Mulford and an associate patented their own machine for tableting water-soluble pills, and by 1893 the company, with two Philadelphia laboratories and a branch office in Chicago, was marketing no fewer than eight hundred medical products.49

In 1894, the Mulford Company entered the biologics market at its cutting edge, racing to become the first U.S. firm to produce diphtheria antitoxin. Germany’s Koch Institute was already preparing the lifesaving antitoxin, which like smallpox vaccine was an animal product. (A horse was inoculated with diphtheria toxin and given time to produce antibodies; later the horse was bled and the antibodies separated from the serum.) The New York City Health Department was developing its own antitoxin. To develop a commercial product, Mulford hired Dr. Joseph McFarland, a bacteriologist who had trained in Heidelberg and Vienna and who was at that time employed by both the University of Pennsylvania and the Philadelphia Board of Health. In a display of the public-private cooperation that drove biologics innovation in the 1890s, the New York City Health Department bacteriologist, Dr. William Park, provided McFarland with the cultures necessary to start his laboratory in a West Philadelphia stable. The University of Pennsylvania’s new Laboratory of Hygiene agreed to test lots of McFarland’s antitoxin. By 1895, Mulford had placed America’s first commercial diphtheria antitoxin on the market. The following year, the company moved its biologics department to newly constructed stables and laboratories in rural Glenolden, eight miles outside Philadelphia. In 1898, the company hired Dr. W. F. Elgin from the National Vaccine Establishment and put him to work making glycerinated vaccine. By 1902, Mulford’s annual sales topped $1 million.50

Mulford benefited from all of the innovations that had taken place since Martin brought bovine virus to America in 1870. According to Mulford marketing materials, the company’s stables and laboratories were state-of-the-art operations modeled after “the leading vaccine establishments in Europe and America.” The company used suckling female calves, just four to eight weeks old, tested for tuberculosis. “The animals are kept at all times under the most rigid sanitary surroundings in buildings all the materials of which—stone, cement, metal, slate, and porcelain-finish—permit of immediate and thorough disinfection.” The calves were fed sterilized milk, their excretions “disinfected and removed as soon as voided.” The inoculations and collection of the virus took place in a special operating room set apart from the stables.51

Dr. Elgin detailed his procedures in a presentation, complete with lantern slides, to the 1900 meeting of the Conference of State and Provincial Boards of Health of North America. After having its underside shaved, the calf was strapped to an operating table where “the operator,” clad in a sterilized gown and wielding an aseptic scalpel, made a series of linear incisions along its lower body. Glycerinated lymph (harvested from a previous calf) was slathered over the entire area and rubbed into the incisions. A worker removed the animal to the sanitary stable, returning the calf to the operating table six days later. Along the incisions had risen a line of vesicles covered with “a slight crust or scab.” Using sterilized water, the crust was softened and then removed, “leaving behind rows of pearly white vesicles,” which the operator scooped out (using a tool of the trade called “Volkmann’s spoon”) and deposited in a sterilized box. This “pulp” was then placed on glass rollers in a grinding machine and mixed with glycerin. The mixture was stored in large stock tubes and placed in an icebox while the glycerin did its work. Finally, glycerinated lymph was placed in capillary tubes (each containing enough for a single vaccination), hermetically sealed, and prepared for shipping. Mulford followed the practice at the best firms of killing the calves immediately after the collection of vaccine and conducting a postmortem examination to ensure that the animal was in fact healthy; if the exam showed otherwise, the company discarded the vaccine. The postmortem was a costly practice, not universally followed. Some makers still sold their used calves to the local stockyards.52

In most European countries, the government controlled vaccine production, either through licensing or through outright government manufacture. Regulating the manufacture of potentially hazardous goods fell well within the ambit of the American police power. But little regulation of vaccines existed. Just seven states had laws providing for some supervision of the vaccine manufactured or used in the state. The Massachusetts statute, the nation’s strongest, declared that “All vaccine institutions in the commonwealth shall be under the supervision of the state board of health”; but even that law specified no penalties for bad practices. Several of the states governed the use of humanized virus, which had fallen out of favor in most places anyway. Even these measures showed a narrow conception of the rightful powers of government in this area. Florida banned humanized virus outright; Maryland made physicians liable for “knowingly or willfully” using humanized virus that spread disease to a patient; and Michigan required that only bovine product be used in public vaccinations.53

In the late 1890s, the first glimmerings of a new regulatory approach appeared when a few state and local governments started inspecting vaccine farms, testing the virus in the laboratory, and publishing the findings. In an industry that had relied on the endorsements of public health boards to market its products, this must have been for many makers a most unwelcome intrusion. In addition to the Pennsylvania Board of Health, some of the nation’s most advanced municipal health departments—Chicago, Minneapolis, Brooklyn, Charleston, Denver, and St. Louis—began using their new bacteriological laboratories to test vaccines for potency or purity. At the most dramatic level of control, the city of New York manufactured its own virus—still in 1901 the only city in the country to do so. The New York health board rankled commercial makers by selling limited amounts of its well-regarded virus on the open market.54

American vaccine makers were regulated almost exclusively by their reputations, their commercial “trustworthiness.” This encouraged nasty advertising practices. In a 1900 article titled “The Pot Calls the Kettle Black,” the Indianapolis physician W. B. Clarke observed that vaccine advertisements published in the medical journals had become a scandalous “squabble literature.” “In their greedy commercialism,” Clarke wrote, the manufacturers “have fallen out, and are each vying with the others in desperate attempts to show the medical profession that theirs is the one pure virus, and that all the others are dangerous and unfit for the use even of a health (?) officer.” Dr. Clarke was an antivaccinationist (the giveaway was that parenthetical question mark), but he did not exaggerate. As early as 1898, an Alexander Company advertisement had thrown down the gauntlet: “At the beginning of this second century [of vaccination] the tendency is to propagate impure virus, in order to meet the demand for great discounts, thus lowering the price and making it impossible to propagate in a proper manner.” Even assertions of product purity were phrased so as to condemn rivals’ wares by implication: “This Vaccine is Entirely Free from Blood Corpuscles,” a distributor for the New-England Vaccine Company proudly announced. Parke, Davis did not wait for the Camden crisis to cool down before taking a swipe at Mulford (unnamed) in its advertising. “The most successful vaccination is not the vaccination that inflicts the most suffering upon the patient,” said one Parke, Davis ad. “The best virus is our Aseptic Vaccine. It effectually protects against smallpox—it does not infect with disease-breeding organisms.” And then came the kicker, underscored and printed in boldface type: “Not a single fatality was ever charged to our Vaccine Virus.”55

Still, negative advertising was a risky marketing strategy: the American public did not need much encouragement to think that vaccines were vile and dangerous.

The American newspapers followed the Camden vaccine investigation like a criminal trial. The story certainly had the elements of a good police procedural: dead schoolchildren, intimations of a corporate cover-up, and men in laboratory coats keeping a sober vigil over culture dishes and white rats.

To lead its investigation, the Camden Board of Health secured the services of a young Philadelphia physician named Albert C. Barnes. A brilliant and eccentric man who never shrank from a fight, Barnes grew up in the hardscrabble section of South Philadelphia known as “The Neck.” Educated at Philadelphia’s renowned Central High, he paid his way through the University of Pennsylvania Medical School by boxing and playing semiprofessional baseball. An M.D. at twenty, he studied chemistry at the University of Berlin (and later at Heidelberg). Returning to Philadelphia in 1896, Barnes began working as a consulting chemist for Mulford Company and quickly rose to a full-time position as advertising and sales manager. Placing a Mulford man in charge of an investigation of Mulford products may seem scandalous today. But the move raised few eyebrows at a time when business, medicine, and public health authority often moved in unison. (Not long after the Camden episode, Barnes began amassing his own pharmaceutical fortune by inventing, with a German colleague, the antiseptic Argyrol; he spent that fortune building one of the great private collections of modern French art, the Barnes Foundation in Merion, Pennsylvania.)56

Barnes’s unique combination of talents made him an able defender of the Camden Board of Health and Mulford. The doctor traveled to New York to keep the city’s leading papers and their wire services apprised of the ongoing investigation. Barnes pressed the point that the same virus had been used on a million people living within thirty miles of Philadelphia, “and few, if any, fatal results were reported.” The Tribune praised the board’s man: “Dr. Barnes, the expert employed by the authorities of Camden to look into this trouble, throws a flood of light on its origin.” Papers that a few days earlier had impugned Mulford vaccine now lingered over local factors: Camden’s dry weather, dusty streets, filthy children, and negligent parents. As Barnes told his audience, the fatal cases had occurred “among the lower class of people, who by their own carelessness poisoned the wounds with tetanus, or lockjaw, bacteria.” The vaccine and the physicians who used it were “perfectly blameless.” The Sun, one of the first newspapers to implicate Mulford, now told its readers that tetanus was simply “in the air,” just waiting for “any cut or scratch . . . to give it a lodging place.” It was “highly unfortunate,” the paper added, “that a period of prevalence of tetanus germs should have coincided with a period of vaccination.” The chemist cum adman had sold the press the oldest story in the annals of public health: the poor begot filth, and filth begot disease.57

The Camden Board of Health finally released its full report on November 29, 1901. By then all of the major findings had already been delivered by Barnes to the New York papers. The terse report combined bacteriology and epidemiology with an older emphasis on atmospheric and environmental factors. The board had tracked down samples of the various makes of vaccine used in the city and sent them to the New Jersey state bacteriologist. Laboratory tests failed to detect tetanus in the samples. Meanwhile, physicians at Camden’s Cooper Hospital had used vaccine purchased from fifteen separate Camden pharmacies to inoculate white rats, known to be highly susceptible to tetanus. Not one developed the disease. Epidemiological evidence supported the laboratory data. According to standard medical treatises, acute tetanus occurred within five to nine days after the introduction of bacilli in the body. But the Camden children had not fallen ill for about three weeks after vaccination. All of this constituted “indisputable evidence,” in the words of the report, “that the tetanus germs were not introduced at the time of vaccination.” Following Henry Davis’s original suggestion, the report attributed the tetanus outbreak to the peculiar “atmospheric and telluric conditions” (the dry, dusty weather) that had prevailed in Camden that fall. To demonstrate that the germs were present “in the atmosphere,” the board cited the case of a boy who got tetanus from a gunshot wound that fall. Other germs had made their way onto the vaccine wounds of a few luckless children who had left their wounds uncovered, wrapped them in filthy rags, or, worse, “scratched the vaccinated area with their dirty fingers and nails and infected the wound.” At the conclusion of the report, the board expressed its “unanimous opinion” that compulsory vaccination should resume in Camden.58

Medical science and public relations know-how had come to the defense of the vaccine industry at its hour of greatest need. The American Druggist and Pharmaceutical Record, an industry trade journal, expressed relief. The Record urged “every intelligent person” to “do all that is possible to prevent the spread of unnecessary and ill-founded alarm from the accidental occurrence of tetanus following, but in no wise due to vaccination.”59

And yet there were doubts. The board’s “vigorous ex parte denial,” as a New York Times editorial skeptically referred to the report, did not silence the public narrative that tied the suffering of little children to tainted commercial vaccine. The Philadelphia North American agreed: “The prima facie evidence of connection between vaccination and tetanus is too strong to be refuted by mere assertion of opinion by the vaccinators.” Addressing the New York Academy of Medicine, W. R. Inge Dalton, a physician and professor, said he was not persuaded by the report. “In Camden the manufacturer and the medical men have co-operated in exonerating themselves, and have thrown all the blame on the parents of the children,” Dalton said. If tetanus bacilli were simply “in the air,” it was remarkable that they had a “selective predilection for sores produced by particular kinds of vaccine virus.”60

In Philadelphia, a scientific debate on the merits of the board’s argument had begun even before the report’s release. Addressing the Philadelphia Medical Society on November 27, Dr. Robert N. Willson presented a paper about a case he had recently handled. An eleven-month-old child had died of tetanus following vaccination. Willson concluded that the child’s father, a coachman, had carried the tetanus from the stable to the nursery. Insisting there had never been any connection between vaccine and tetanus, Willson told his audience that the only cure for “rampant” opposition to vaccination was a “new scrupulousness” toward the vaccination wound. No doubt many of his listeners applauded. But at least one remained unconvinced. Dr. Joseph McFarland, the man who had built Mulford’s antitoxin laboratory, took the floor. He had been following the tetanus cases closely, he said, and was conducting his own study of the subject. He had learned enough already to suspect that Willson’s “extremely optimistic view . . . concerning the harmlessness of vaccine virus might not be correct.” Five months later, the two physicians would meet again in that same room to debate the issue.61

Nor had the board’s investigation stopped the pain and death in Camden. On the night of November 25, thirteen-year-old Ada Heath died of lockjaw. Her parents had paid a local druggist twenty-five cents to vaccinate her. On November 26 came the death of nine-year-old Georgiana Overby, the first African American child among the afflicted, and the first of the tetanus victims to have received her vaccination in the free dispensary. “[S]he, too, died in agony,” the Tribune reported. From nearby Jordantown came the news that four-year-old Flora Johnson, also African American, had died, “apparently suffering from tetanus, following vaccination.” The final Camden case was reported on December 4. Three days later Bessie Rosevelt, age seven, the daughter of a local horse dealer, died at her home on Ferry Avenue. No two of the new cases had been vaccinated by the same physician. The Tribune awkwardly noted that Bessie’s was the “fourteenth case since the epidemic [of tetanus] made its appearance, and despite the fact that it has been found that the disease does not come from vaccination lymph, all of the victims have been vaccinated.” A headline in the Omaha World-Herald suggested that for much of the public, the story was more straightforward: “Poisoned Vaccine Still Proving Fatal at Camden, N.J.”62

If the mere assertion of expert opinion could not restore public confidence in vaccine, at the height of the most extensive U.S. epidemic of smallpox in recent memory, then what could? The Times warned that this was “not a momentary sensation.” St. Louis and Camden had done “incalculable injury” to medical progress, while the profession whose “pride and business interest” were most closely tied to that cause stood idly by. In the coming months, the American medical profession would be anything but idle. The New York County Medical Society resolved to investigate the “entire subject” and to determine “the steps that should be taken to guard against the possibility of a repetition of such deplorable disasters.” Other societies followed suit, as one local and state organization after another called for investigations of the vaccine industry and debated the need for government control. Physicians stepped away from both the biologics makers and the public health boards, seriously considered their own interests, and worked to restore public confidence in vaccination.63

Physicians knew better than anyone that even under the best of circumstances vaccination carried health risks. The same late nineteenth-century developments in bacteriology that had made U.S. military medicine a much safer and more ambitious enterprise had introduced a heightened concern for aseptic practices in routine medical procedures, including vaccination. As Arthur Van Harlingen, a Philadelphia doctor, noted approvingly in 1902, “few men will now come to the delicate infant with the odor of stable and animal on the unwashed hands, or will moisten their instruments with their own saliva.” And still physicians knew that introducing animal vaccine into the human system could produce unpredictable results, especially if the patient did not have the constitution for it, or if the vaccine itself was impure.64

American doctors had been concerned about vaccine quality since the first wave of the turn-of-the-century smallpox epidemics spread across the southern states in 1898 and 1899. But the doctors had kept their worries mostly to themselves, maintaining a solid (if occasionally splintery) defense of vaccine before the public. Their own medical society minutes and journals told a different story. Physicians and health officials—including a few federal officials such as C. P. Wertenbaker of the U.S. Marine-Hospital Service—complained that contaminated tubes and points were producing sore arms and open rebellions. At a meeting of the North Carolina Medical Society, local physicians swapped stories about “the violent results” caused by the vaccines they were receiving from northern manufacturers. “The popular prejudice against vaccination is not wholly without justification,” one doctor confessed. He recalled many “very sore arms” and lamented the suffering of his “own little daughter [who] was for three days violently ill” after he vaccinated her. As the epidemics spread north, the stories were much the same. From Omaha, Dr. F. T. Campbell wrote of the “vile vaccine” found on the shelves of grocery stores. “[A]nd so the ‘sores’ ran wild with contiguous and constitutional infection. From such cases came complaints that vaccination was ‘worse than smallpox.’” By the time tetanus broke out in Camden, American physicians had good reason to wonder what was really in those skinny tubes and points they carried around in their pockets.65

At the Marine-Hospital Service’s Hygienic Laboratory in Washington, Milton Joseph Rosenau was wondering the same thing. In the winter of 1901–2, he determined to find out, secretly buying up samples from eight different vaccine makers on the open market and taking them back to his laboratory. The thirty-three-year-old scientist knew Philadelphia and its environs well: a native of the city, like Albert Barnes and Joseph McFarland, he had received his education in its public schools and at the University of Pennsylvania. After completing his medical training in 1889, Rosenau joined the Marine-Hospital Service, serving as a quarantine officer in San Francisco and, at the close of the Spanish-American War, in Santiago. After a decade in the field, he took over the Hygienic Laboratory, which he transformed from a one-man outfit into a leading government scientific institution. A brilliant scientist with the heart of a reformer, Rosenau had scientific interests that ranged across bacteriology, chemistry, and pharmacology. As early as April 1900, Wertenbaker had focused Rosenau’s attention on the problem of vaccine purity by sending him some points and lymph for testing. A few teeming cultures and one dead mouse later, Rosenau confirmed Wertenbaker’s suspicion that the dry points on sale in the South crawled with pathogens. In a private letter, Surgeon General Wyman had cautioned Wertenbaker against reading too much into Rosenau’s report. “The work confirms the well known fact that glycerinized lymph is superior to dry points and no other conclusion should be drawn from the report,” Wyman advised.66

A broader conclusion was inescapable after Rosenau tested the vaccine samples he collected on the open market, at the height of the national vaccine crisis, in the winter of 1901–2. The federal scientist presented his preliminary findings to the New York Academy of Medicine in February 1902. Like Walter Reed before him, Rosenau found a great unevenness in the quality of vaccine on the market. On average, each nonglycerinated dry point Rosenau examined had 4,809 bacterial colonies, while the glycerinated lymph averaged 2,865 colonies per sample. (The journal Pediatrics recoiled at this “ridiculous amount of impurity.”) The contaminants included staphylococci, pus cocci, and an assortment of molds common to the hay and dust of the stable. What made Rosenau’s report news was his argument that vaccine makers placed too much confidence in the germicidal powers of glycerin. The makers had “become careless of contamination, trusting to the glycerin to purify their product.” And in their haste to meet the high demand for vaccine during the national wave of smallpox epidemics, makers had not given the glycerin sufficient time to work, flooding the market with “green” virus.67

Rosenau did not shy away from the political implications of his data. He told Wyman, “Our results so far have plainly indicated that the manufacture of vaccines is too important a subject to leave to commercial enterprise without restrictions.” Many in the medical profession agreed. As the Medical News observed, “The enforcement of government inspection with power to prevent the sale of improper material seems to be the desideratum.”68

Rosenau’s paper accomplished what only a federal report could do. Coming so fast on the heels of St. Louis and Camden, it persuaded American doctors and public health officials, working in local communities across the United States, that defective vaccine was a national problem that required a national solution. Many had seen the hideous effects of bad vaccine in their own patients, and their consciences troubled them. “The inoculation of such vaccine is followed by severe reaction, including fever, erysipelatous dermatitis, a deep, sloughing sore, and great swelling of the arm,” the Cleveland Journal of Medicine reported. And after all of that, some vaccine still produced “no immunity to subsequent smallpox.” The Sanitarian, a leading voice of the public health profession, lamented “the poisonous character of much of the vaccine that is put upon the market at the present day.” Nine tenths of that vaccine might be fine, but there was “no telling how much harm may be done by the remaining one-tenth . . . or how many anti-vaccinationists it may produce.” “Something will have to be done,” the Sanitarian concluded, “to rehabilitate vaccine virus in the estimation of the medical profession as well as of the general public.”69

The old rhetoric of the vaccination argument had lost its persuasive powers, even for some of the measure’s strongest supporters. Cost-benefit arguments were not enough. Vaccination was a political measure, ordered for the most benevolent of purposes. But vaccine was a commercial product, and like all such wares, its success depended upon the confidence of consumers. Public confidence in the market—and thus in the measure—had collapsed. Vaccination itself was, as one New York physician observed, “at a crisis.” And that crisis exposed to all the fundamental contradiction characterizing the procedure: the government compelled vaccination, but it would not vouch for vaccine.70

Dr. Theobald Smith, a scientist with the Massachusetts Board of Health, was one of the growing number of officials and physicians who demanded reform in 1902. “Without the specific protection given by vaccination, small-pox cannot be efficiently controlled and suppressed,” Smith said. “The acceptance of this proposition by the medical profession and the State creates the responsibility of supplying as pure and efficient vaccine virus as can be made under present conditions.”71

The vaccine crisis seemed to require a new role for the state in controlling production. But what sort of control? Like their European social-democratic counterparts, progressive reformers in the United States insisted that certain areas of life were too precious to leave entirely to the unregulated market. This call for a sort of decommodification—to replace capitalist price with government discipline—was a common thread running through a great many otherwise disparate reform causes, from the movement for public ownership of streetcars to the campaign to ban child labor. The disasters in St. Louis and Camden convinced many physicians and health officials that vaccine production had been left to the free market for too long. “The lesson we have principally to learn from these catastrophes,” said Dr. Dalton of New York, “is the necessity of eliminating commercialism from matters pertaining to public health.”72

The professional debate centered on two options. The first was for states to manufacture their own vaccines, in effect socializing the industry (as Japan had done in 1896). Eugene A. Darling, director of the Cambridge, Massachusetts, Bacteriological Laboratory, noted the ethical clarity in this approach. He said, “The State compels the child to be vaccinated, and should furnish the lymph for the operation, guaranteed to be pure and efficient.” The other option was to bring commercial vaccine makers under the discipline of a new regime of licensing and inspection. Since vaccines were an interstate business, most supporters of regulation called for the involvement of the federal government. This, too, was a bold idea: the federal government did not regulate drugs or biologics manufactured in the United States. (Since 1848, federal law had banned the importation of adulterated or spurious drugs, but that law did not touch domestic manufactures.) The entire professional debate took place in the context of rising antivaccination sentiment. In early February, the Massachusetts legislature held hearings on a bill to repeal the state’s compulsory vaccination law. The committee heard an emotional appeal from the mother of Annie Caswell, a five-year-old Cambridge girl who had died the previous month from tetanus after vaccination. The bill failed. But that effort and others like it helped keep the vaccine purity question before the press.73

The idea of government production, which American Medicine dismissed as “almost out of the question,” met with powerful opposition from vaccine makers and the druggists who sold their goods. The makers had long enjoyed a cozy relationship with state and local health boards, aggressively seeking their contracts and endorsements. And, of course, every vaccination order created a demand for commercial products. Not surprisingly, the makers did not welcome competition from their longtime sponsors. “A Board has no right to enter into commercial enterprises,” the St. Louis Medical Journal declared in 1898, a few years after the city health department introduced its ill-fated antitoxin. That same year, the New York County Medical Society sponsored a state bill that would have forbidden the Tammany-controlled New York City Health Department to sell its surplus biologics; the bill failed. In 1900 and 1901, manufacturers and druggists urged Congress to stop the Department of Agriculture from providing ranchers with free vaccine for blackleg, a disease of cattle and sheep. At a time when some of America’s more progressive municipal governments were taking steps to provide their citizens with necessary services—including water, electricity, and gas—production of vaccines and antitoxins by local health boards was met with slippery-slope charges of “municipal socialism.” (Bona fide socialists bristled at the association. Socialist Labor Party leader Daniel De Leon countered, “The vaccination laws are capitalist laws: they were framed by capitalist legislatures; they have been passed upon by capitalist courts; they are enforced by capitalist officials. From first to last the spirit of capitalism has dominated the whole procedure.”)74

In the winter of 1901–2, druggists and vaccine manufacturers waged a protracted campaign to beat back government production in the few places it already existed. (The great exceptions were in the new U.S. colonies in Puerto Rico and the Philippines.) On the U.S. mainland, the vaccine interests held up the St. Louis tetanus outbreak as the tragic but inevitable result of placing production in the hands of a political machine. “It is difficult enough to keep politics pure,” said the Minneapolis-based Medical Dial, “but it is impossible to make pure political antitoxin.” Seizing the moment, the makers and druggists pressed Mayor Seth Low of New York to stop the city health department from the “destructively competitive” practice of selling its highly regarded vaccine and antitoxin on the market. Even reformers worried that municipal governments controlled by political machines would produce products more dangerous than those already available on the commercial market. Others insisted there was something un-American about the whole idea. “No government has the right, morally, legally, or commercially to enter into any business for pecuniary profit,” declared the Medical Record. Neither purity of product nor cheapness to consumers could justify it. “A municipal laboratory is not a shop.”75

The so-called Continental method of monopolistic government production was not going to happen in the United States. Government regulation was controversial enough. Here, too, there were European models. In Italy, which had the most extensive system of regulation in Europe, would-be makers of any biologics (including antitoxin and vaccine) had to first secure the consent of the interior minister. (Germany, France, and Russia also had national systems of control covering specific biologics). In the United States, some commentators objected that any such system was impractical and contrary to the American way. “In a country as large as ours, and with our republican form of government,” American Medicine commented, “it would be very difficult, if not impossible, to carry out the supervision suggested.” In the United States a dozen commercial establishments made diphtheria antitoxin. Each had 25 to 250 horses. Was the government really prepared to “test the serums of 100 or more bleedings a day” at sites around the country?76

But by the spring of 1902, it seemed increasingly clear to the medical profession that a national licensing and inspection regime was an idea whose time had come. The events in Camden and St. Louis had made such a move seem inevitable to organized physicians and vaccine makers alike. In late March, the Medical Record described the emerging professional consensus. “Of late, owing chiefly to the accidents which have occurred recently in this country from the use of diphtheria antitoxin and vaccine virus, there has been a movement in favor of Government control of such products,” the journal said. “This proposition is not only highly proper under present circumstances, but absolutely imperative.” But regulation was as far as this journal, or the profession, was ready to go. Government competition with free enterprise was unacceptable. Much the same conclusion was reached in an informal discussion at the annual meeting of the American Medical Association that spring. The old arrangement in American public health law—which allowed compulsory vaccination with unregulated products—was no longer tenable. A resolution introduced to the Homeopathic Medical Society of New York caught the spirit of many others: “when the State or local authorities enforce vaccination they are in justice bound to surround it with all the modern safeguards.”77

There were a few precedents for such state-level regulation. In the most ambitious effort, Pitfield’s grand tour of American vaccine farms for the Pennsylvania board in 1896 had demonstrated just how revealing on-site inspections could be. But the power of a state health board only reached so far; it could only use such information to control vaccine sold or produced within the state. The vaccine business was an interstate trade; the larger firms like Parke, Davis even manufactured and marketed their wares beyond the nation’s borders. An effective system of government regulation, many reform-minded physicians concluded, would have to be a federal government responsibility. Rosenau’s study of the vaccine market had shown the potential of that idea; in fact, Rosenau did not conceal his belief that the Marine-Hospital Service (with his laboratory) was the natural agency for the job.

On April 4, 1902, a bill was introduced simultaneously in the U.S. House and Senate, sponsored by the Medical Society of the District of Columbia, to create a new regime of federal regulation of biologics. The commissioners of the District drafted the bill, which received a strong endorsement from the District’s health officer, William C. Woodward. The District was not home to a single biologics manufacturer. But Woodward noted that there was “no legal reason why any person whosoever should not enter into the business at any time.” In the nation’s capital, as in most American states, no restrictions at all governed the production and traffic in biologics. Woodward explained that the “manner in which these substances are produced and marketed” made it impossible to efficiently control them by inspecting only the finished product. The nature of biologics production justified a more intrusive system of licensing and unannounced inspections of manufacturers.78

As if anyone needed reminding, Dr. George M. Kober, chairman of the D.C. Medical Society, advised Congress of the moral urgency of the biologics bill and its connection to the “unfortunate accidents” in St. Louis and Camden that had brought so much discredit upon antitoxin and vaccine. The social value of these lifesaving products—and the considerable risks that attended their manufacture and sale—demanded “that action be taken to preserve the confidence of the medical profession and of the community generally in them.” Like Woodward, Kober expressed dismay at the low barriers to entry in this industry of vital national importance: “Any kind of a stable, a little technical skill, and a fair amount of nerve are all that is needed.” Individual states were “powerless to protect themselves against impure and impotent materials,” especially since most of them consumed biologics made out-of-state. Testing a vial here or a package there was not enough; the whole industry required continuous government surveillance. “For these reasons Federal supervision is necessary,” Kober declared. The House and Senate committees on the District of Columbia went to work on the bill.79

Memories of a city and its nine lost schoolchildren lingered in the air of the vaccine debate. The report of the Camden Board of Health had not sat well with everyone. Many Americans refused to accept that the vaccine makers were blameless or that public health officials understood the risks of vaccination better than they. Conscientious physicians entertained doubts about the purity of the vaccine in their hands, and considered the possibility, however remote, that they might infect a patient with tetanus. Even some leading vaccine makers found the circumstantial evidence difficult to dismiss. “I am inclined to believe that the New Jersey cases were due to after infection and that the vaccine was not at fault,” confided Ralph Walsh of the National Vaccine Establishment in a private letter, “yet the fact that the cases in Philadelphia, Camden and Atlantic City occurred almost simultaneously and from vaccine propagated by the same party staggers me.”80

Ultimately, the Mulford Company’s complicity in the deaths of the nine Camden children (not to mention scattered other fatalities) was a scientific question. As men of science, Robert Willson and Joseph McFarland determined not to let that question go unanswered. On April 23, 1902, as the two congressional committees considered the biologics bill, the Philadelphia County Medical Society assembled to hear Willson and McFarland present their findings.

Dr. Willson spoke first, taking up the gauntlet Dr. McFarland had thrown at his feet back in November. Since then, Willson had prepared abstracts on fifty-two cases of postvaccinal tetanus, which he had found in the medical literature and through personal correspondence with physicians and health officials. The cases dated as far back as 1839, but the majority of them were in children who had fallen ill between October 1, 1901, and March 30, 1902. Willson had discerned, as well as he was able, the circumstances surrounding the production of the vaccine used in each case, as well as the method of vaccination and the care of the wound. Laboratory tests had never detected evidence of tetanus in vaccine virus. And most physicians now understood the importance of following the best aseptic practices during vaccination. That left the patients. Mulling over his abstracts, Willson observed that in almost every case there had been “some gross breach in the care of the wound.” For Willson, the evidence pointing to secondary infections was too strong to dismiss. As he reminded his audience, the streets of American cities were blanketed with tetanus bacilli. The Camden outbreak was unique: there had never been such a cluster of well-marked cases implicating a single maker of vaccine. But Willson concluded this was nothing more than a coincidence. “That vaccine virus may be infected with tetanus no one will deny,” he conceded. “But that it has been, and in such cases as here come to view, deserves the full denial that has been given by the clinical symptoms and a careful scientific study.”81

Joseph McFarland took the floor. Dr. McFarland was anything but a disinterested party. The highly regarded scientist had built the Mulford Company’s biologics department back in the 1890s, though his work was primarily in antitoxins, not vaccine. He had left Mulford for a position as professor of pathology and bacteriology at the Medico-Chirurgical College of Philadelphia. McFarland had also been employed, since early 1901, as a consultant for Parke, Davis, Mulford’s greatest rival. McFarland’s conflict of interest was apparent (Mulford executives certainly thought so). But in the cozy medical world of turn-of-the-century Philadelphia, his position did not discredit his investigation, any more than had the Camden Board of Health’s decision to place its investigation in the hands of Mulford’s man Albert Barnes. And who in McFarland’s audience could resist the chance to hear his paper? It remains to this day a pioneering study in the epidemiology of a pharmaceutical disaster. The quality of the paper is indicated by the fact that it was republished, with only a few significant changes, in The Lancet, the preeminent British medical journal of the era and an unwavering advocate of vaccination.82

McFarland spoke as a friend of vaccination, not a critic. Since the first reports of postvaccination tetanus from Cleveland and Camden, he had recognized in this complication “a matter of the gravest importance”—not only because tetanus increased the risk of vaccination but because it aroused “the animosity of those who have banded themselves together for organized opposition against this well recognized and only safeguard against smallpox.” (In the Lancet version, the doctor would insert the words “misguided persons” after “those.”) Nor was McFarland above the class prejudices of his peers. Though many Camden parents were still in mourning, he casually observed that the deceased had been “ignorant and filthy children.”83

Like Willson, McFarland had spent the past few months tracking down American cases of postvaccination tetanus. He had found just fifteen in the medical literature, dating back to the 1850s. All had been attributed to secondary infection of the wound. Through correspondence with physicians and health officials, McFarland had turned up eighty more cases, for a total of ninety-five. (Had McFarland access to modern newspaper search engines, he would have found still more.) The first significant fact about these cases, McFarland said, was that sixty-three of them had occurred in a single year, 1901. Most of those had occurred in a single month, November. “Some exceptional condition,” McFarland observed, had “changed an unimportant and infrequent complication into a very important and frequent one.”84

The scientist proceeded to consider, in turn, each of the conventional explanations for the occurrence of tetanus after vaccination. To the argument (espoused by Willson and the Camden Board of Health) that tetanus was an “accidental secondary infection of the vaccination sore,” McFarland conceded that such cases might occasionally occur. But “to content one’s self with such a simple explanation may be to fall into egregious error, for if tetanus can thus occur it should do so in all parts of the world, with more or less regularity.” According to McFarland’s correspondence with the Imperial Health Office in Berlin and the Pasteur Institute at Paris, the complication was unknown in either Germany or France. Evidently the complication was “chiefly American” and had only become important within a single year.85

McFarland had still less patience for the argument, made by the board of health, that the Camden epidemic was caused by “atmospheric and telluric conditions.” If tetanus were simply “in the air,” Camden and the other afflicted areas should have been plagued by more than the usual incidence of ordinary traumatic tetanus. Instead, the board of health reports of both Camden and Philadelphia showed fewer tetanus cases than usual in 1901 (not counting the vaccination-related cases).

To the argument that secondary infections were caused by careless treatment of the vaccination wound, McFarland again raised the question, But why now? Vaccination had been practiced for more than a hundred years, for most of that time “with a total disregard to cleanliness and asepsis.” Why was the complication so prevalent now—decades after Koch and Pasteur—when vaccination was practiced with greater aseptic precautions than ever? And why was postvaccination tetanus epidemic only among Americans, rather than, say, among “the densely ignorant and filthy people of the island of Puerto Rico,” where the Army had performed 860,000 vaccinations in 1899, with only two or three cases of tetanus reported?86

McFarland proceeded to the tougher part of his argument: to show that tetanus must have been present in the vaccine itself. The Camden health board investigators had tested samples of the locally available makes of vaccine and had found no evidence of tetanus in any of them. McFarland, who had made his name in the laboratory, did not present fresh laboratory evidence. What he did offer was evidence, gathered presumably from his correspondents, as to precisely which vaccines had been used in the ill-fated procedures. The rumors had been right. The “great majority of the cases” in 1901—thirty out of the forty cases that he was able to document—had followed the use of a single make of virus. Cleveland, Camden, Atlantic City, Philadelphia—in every locale, the closely clustered cases implicated “chiefly if not exclusively” one vaccine. McFarland named no names (he labeled the offending vaccine “virus E”), but as everyone in that room knew (and as McFarland’s personal papers confirm), the maker was his former employer, Mulford. McFarland was ready to stipulate that “no care or expense” had been spared to produce these products. But the evidence, he said, “leads me to conclude that tetanus bacilli may be contained in the virus and distributed with it.” In the Lancet version, McFarland would strike that “may be” and write “is.”87

McFarland’s most powerful piece of evidence—also epidemiological, rather than bacteriological—came from the Philadelphia Hospital. Smallpox had broken out among the hospital’s 4,500 “inmates.” Physicians went through the hospital vaccinating everyone, the sick and the well, with the exception of one section, the Men’s Insane Department. The inmates of that department “were obliged to wait until a new consignment of the virus arrived.” The new consignment was “virus E.” All of the men were vaccinated. Now, McFarland had done some digging in the hospital records. Not a single case of spontaneous traumatic tetanus had occurred in the Insane Department for at least twelve years. As vaccination proceeded, though, five men in the department developed tetanus. All of them died. The outbreak caused a great deal of alarm in the hospital, and afterward, the doctors took additional precautions in dealing with suspicious vaccination wounds. Eleven more men fell ill with tetanus; after receiving “enormous doses of antitoxin,” all recovered. With one possible exception, every patient who developed tetanus had been vaccinated with “virus E.” At this moment, McFarland must have looked out at his audience. “There is something about virus E,” he said.88

As to how the vaccine of one of the nation’s most reputable and scientific makers might have been so terribly corrupted, McFarland invoked the world of the biologics stables that he knew so well: the manure of the calves, the hay, the dusts. . . . Glycerin seemed powerless before tetanus, as the cases implicated all of Mulford’s vaccine products: dry points (unglycerinated), glycerinated points, and glycerinated lymph. (Later that year, Milton Rosenau would report that glycerin preserved tetanus spores.)89

Good scientist that he was, McFarland conceded to his audience that his argument had a “sole weakness.” And that was the incubation period. Tetanus usually set in within ten days after an injury. Everyone cited William Osler’s standard medical treatise on this point; McFarland had studied under the man at the University of Pennsylvania. In the vaccination cases, though, the average time elapsed between the procedure and the onset of tetanus was twenty-two days. But McFarland had a theory. He suggested that while the tetanus bacilli had been “ingrafted into the skin at the time of vaccination,” they did not start to grow until “the development of the vaccine lesion pave[d] the way by the local destruction of tissue.” This hypothesis would add about two weeks to the usual incubation period, for a total duration of about three weeks.90

We may never know for certain what caused the deadly outbreaks of postvaccination tetanus in Camden and other American communities in the fall of 1901. McFarland put forth compelling evidence to implicate Mulford’s vaccine, but the argument’s weak point—the incubation period—does leave a remainder of doubt. Still, there is no mistaking the political repercussions of these events. The vaccine crisis that erupted at Camden shocked the nation, roused the medical profession, and, ten weeks after Willson and McFarland presented their findings, ushered in a major change in American political institutions: the creation of the first effective system for regulating the production and sale of biologics.

On July 1, 1902, President Theodore Roosevelt signed the bill now known as the Biologics Control Act. Drafted by the District of Columbia Medical Society, the bill had been introduced by the Republican senator John Coit Spooner of Wisconsin. Although born of a great public controversy, the bill itself seems to have provoked little controversy of its own. Spooner’s papers contain little correspondence regarding the legislation, and both houses of Congress enacted it without debate. As The New York Times noted, “The bill . . . would involve a dangerous expansion of Federal authority were it not aimed to correct an evil yet more dangerous as directly and immediately affecting the public health.” The case for government regulation, the Times observed, “has been emphasized by recent experiences with virus and serums charged with tetanus germs and pus organisms.”91

Although the law originated in the District, its provisions reached the nation. Effective January 1, 1903, the law established a system of licensing and inspection for all biologics sold in interstate commerce or imported from abroad. Practically speaking, this meant that all substantial makers of vaccines, antitoxins, serums, and toxins in the United States would need to seek a federal license to continue to trade in biologics. The act empowered a federal board—composed of the surgeons general of the Army, Navy, and Marine-Hospital Service—to promulgate regulations to be enforced by the Treasury Department. Unannounced inspections would be carried out at the discretion of the treasury secretary. The act also required manufacturers to plainly label each product with the maker’s address and license number and the date “beyond which the contents cannot be expected beyond reasonable doubt to yield their specific results.” Penalties included suspension of the license, a maximum fine of $500, and up to one year’s imprisonment.92

On the same day, Congress passed another law that enlarged the authority of the Marine-Hospital Service and gave it a commensurately bigger name: the U.S. Public Health and Marine-Hospital Service. Service medical officers would serve as the frontline inspectors of the new biologics licensing regime, and Milton Rosenau’s Hygienic Laboratory would administer the act. The federal biologics board promulgated its first regulations in February 1903; they became effective that August. To receive a license, makers had to submit to an inspection by a medical officer from the Service. Licenses were good for just one year and could be reissued only after another inspection. If an inspector turned up any problems—bad production standards, impure or impotent products—the government could suspend a maker’s license for thirty days; if the maker did not correct the problem, the government could revoke its license. Parke, Davis received license no. 1; H. K. Mulford no. 2; and H. M. Alexander no. 3. By 1904, the government had inspected and licensed thirteen biologics establishments, mostly for the manufacture and sale of diphtheria antitoxin and smallpox vaccine. Forty-one companies would hold licenses by 1921; all told, those companies marketed more than a hundred different biological products.93

The new law had an immediate impact on the biologics industry. The government refused to license some shoddy makers and suspended the licenses of others. Some smaller companies simply shut down, knowing they could not afford to meet the new standards. In the first few years of the new regime, Mulford’s Pennsylvania rival H. M. Alexander had its license suspended and was twice ordered to remove tainted products from the market. In 1908 and 1909, Mulford and Parke, Davis had their licenses suspended when hoof and mouth disease broke out among their antitoxin horses and vaccine cows. Rosenau’s Hygienic Laboratory continued his old practice of secretly buying up biologics on the open market and testing them for potency and purity. Vaccine quality in the United States rose dramatically. Between 1902 and 1915, laboratory staff routinely tested smallpox vaccine for tetanus bacilli; none were found. The Hygienic Laboratory grew apace with its new responsibilities and powers. From Milton Rosenau’s one-man operation in 1902, the laboratory grew to a staff of thirteen by 1904, and it would keep expanding. In 1930, the Laboratory would be given a new name: the National Institute (later Institutes) of Health.94

The vaccine crisis of 1901–2 also prompted local and state health boards to increase their interventions in the vaccine market. In 1903, the Massachusetts legislature authorized the state board of health to manufacture its own lymph and antitoxin under the supervision of Theobald Smith. “Having provided for compulsory vaccination in this state,” The Boston Globe commented, “the authorities are at least bound to see to it that the humblest citizen is provided with as perfect vaccine as it is possible to secure.” Other state and local boards—including Cleveland’s—regularly inspected vaccine lots in their own bacteriological laboratories.95

Leading biologics makers, particularly the largest firms, welcomed the new regime. The new system defused the vaccine crisis and gradually strengthened public confidence in vaccine and antitoxin. The new regulatory system, like other progressive business regulations instituted during the early twentieth century, fostered corporate consolidation by driving many small competitors out of the industry altogether (a welcome benefit to the likes of Parke, Davis and Mulford). Government licensing conferred a federal stamp of approval upon commercial vaccines, and the law established the government as a cooperative partner rather than a rival manufacturer (or, worse, a monopolistic one) in the brave new world of biologics. The Hygienic Laboratory shared its research with private firms, ultimately saving those firms a great deal of money. The revolving door between government, academia, and the pharmaceutical industry continued to spin, as Joseph J. Kinyoun, Rosenau’s predecessor at the Hygienic Laboratory, left the government in 1903 for a position as director of the Mulford laboratories in Glenolden. Rosenau himself would leave the laboratory six years later for a position at Harvard.96

Over time, all of this government activity increased the quality of American-made vaccines (not to mention other biologics) and assured the physicians and the public that they were not being compelled to undertake unnecessary risks in the name of the public health. The public would accept that assurance only gradually, and never fully. Four years after the passage of the Biologics Control Act, Congress would enact another, much better remembered statute modeled closely after it, the Pure Food and Drug Act. Together, the two laws introduced an unprecedented level of federal regulatory authority over one of the most profitable areas of American commerce and manufacturing, the pharmaceutical industry.97

The Biologics Control Act resolved one of the greatest contradictions in the practice of the nation’s burgeoning public health systems: compulsory vaccination of the people without any governmental review of product safety. The new inspection regime saved compulsory vaccination at its moment of greatest crisis in the United States. Testifying before a House committee in 1910, Dr. C. T. Sowers of Washington, D.C., recalled the days, before the Biologics Control Act of 1902, when anyone who had a few cows could start up a vaccine farm. “There was no government inspection at that time of these farms, and the consequence was a very impure product,” he said. “For us to have enforced vaccination before government inspection I have always regarded as extremely wrong, but now we can do it with the utmost propriety in stopping epidemics of smallpox.”98

It is possible that Dr. Sowers had always regarded the old arrangements as fundamentally unjust. Or maybe he, like most doctors and public health officials, only came to appreciate that injustice—and its political untenability—after nine children died at Camden and the parents of that city, echoed by the protests of ordinary Americans in communities across the country, demanded a new dispensation of coercion and risk in American law.

For Camden, the new era arrived too late. The tetanus outbreak of November and December 1901 had sharpened public fears of that mysterious product of the stable and the laboratory called vaccine. So many parents revolted against vaccination that school officials delayed reopening the schools after Christmas break. Many residents continued through the winter to tell their doctors that they viewed vaccination as an unacceptable health risk for them and their children. They preferred to take their chances with smallpox, rather than risk exposing their loved ones to tetanus.

No more postvaccination tetanus deaths occurred in Camden after Bessie Rosevelt’s death in December. But the toll from smallpox rose. By March 1902, smallpox had struck 165 people in the city, killing 15. Few among the dead in Camden had ever been vaccinated—none of them within the past three years. By the time the epidemic wound down that spring, smallpox had indeed proved more fatal there than vaccination.99
