2
No one will be able to stand against you all the days of your life. As I was with Moses, so I will be with you; I will never leave you nor forsake you.
—JOSHUA 1:5
A COMBINATION of three conditions made the United States from its early years unique in the historically Christian world: (1) a religiously engaged culture of “dissent,” (2) church-state separation, and (3) massive immigration. No other nation or empire, including Canada, which came the closest, experienced this particular combination with equal intensity.
The society of British North America and the nation created out of its thirteen colonies were dominated by Protestant “dissenters,” named for the religious communities they built outside the established churches of England and Continental Europe. The Congregationalists, Presbyterians, Baptists, Methodists, and Quakers dissented from the Church of England. In the New World, members of these groups lived alongside Mennonites, Moravians, Huguenots, and others who had dissented from Lutheran and Catholic establishments on the Continent. Dissenting Protestantism entailed a more intense religiosity than was the norm among nondissenting Protestants. The latter did not strive, as dissenters did, to distinguish themselves from some empowered religious regime. This heightened religious consciousness was then reinforced by the act of leaving an ancestral domain to help populate a new land that was defined in part by its greater opportunities for religious particularism. In a settler society that pushed aside or killed the Indigenous population, these dissenting Protestants easily achieved a measure of control over public life that endured well into the twentieth century. They took the land and on it built a country largely in their own image.
Some of these dissenting Protestants would have been happy with an established church, if it could be their own. But when the United States was founded in a society with so many different confessions, any ecclesiastical establishment was out of the question. The First Amendment to the US Constitution was prompted in part by the secular leanings of some of the founders, but the devout realized that it was a practical necessity: “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof.” This restraint on the national government did not prevent state governments from using tax money to support particular churches, as several states did until 1833, when the last of these, Massachusetts, ceased to support the Congregationalists. Even with the end of established churches at the level of individual states, however, a generalized Protestantism saturated public life. What historian David Sehat calls a “moral establishment” was firmly in place in every state, compromising church-state separation to a degree not recognized until well into the twentieth century.1
Yet, by formally separating the federal government from religion, the First Amendment left churches free to perform a function they were less able to perform in countries where religion was part of the state. Churches in the United States could serve as intermediate solidarities, mediating between the kinship network and the nation.
Immigration made this mediation more important than it otherwise would have been. The United States received wave after wave of people who left behind, in their societies of origin, the structures for intimacy and belonging on which they had previously depended. Immigrants were obliged to build replacement structures for themselves. Independent churches were ideal for this purpose. Immigrant groups organized churches by ethnicity, providing individuals with opportunities to be deeply part of something larger than their extended families yet smaller and much more intimate than the country as a whole. These churches were the voluntary associations that, in Tocqueville’s view, made the United States unique in world history.2
Lutherans, while emerging from a European experience of establishment, became de facto dissenters, accountable to no state church and dividing themselves up according to ethnicity—a Swedish Lutheran synod, a Danish Lutheran synod, a German Lutheran synod, and so forth. Thus in America Lutherans became independent communities, like the Congregationalists and the Baptists. The Episcopalians inherited the traditions of the established Church of England, but they too adjusted to the pluralism of American Protestantism.
Catholic immigrants and their descendants also looked to their churches as intermediate solidarities in Tocqueville’s sense, often organized on an ethnic basis, as Polish, Italian, or Irish parishes. These parish-centered communities were all the more vital given the frequent hostility Catholics experienced at the hands of the majority Protestants. New confessions formed, too, creating communities based on experiences in the New World. These included churches formed by the descendants of captured and enslaved Africans, including the African Methodist Episcopal Church, the largest of the predominantly Black denominations. By 1906, the AME counted nearly half a million members, had created Wilberforce University in Ohio, and was cooperating with Black Protestants abroad. Other new denominations included the Mormons, the Seventh-day Adventists, and the Disciples of Christ. Denominationalism flourished in the United States as nowhere else.
The need for intimacy and belonging led to many schisms, as smaller collectives formed to reflect their own perceived priorities, rendering the nation an ever denser expanse of religiously defined communities. By the time of the Civil War, the largest Baptist, Methodist, and Presbyterian bodies had divided themselves not only into northern and southern denominations in relation to the conflict over slavery, but into numerous smaller confessions defined by often minute differences in ritual practice and doctrine. The federal census of 1890 found seventeen Methodist bodies (four of which were African American), thirteen Baptist bodies (one African American), and twelve Presbyterian bodies. Even the less numerous Mennonites divided themselves into a dozen groups wanting to be identified as distinctive denominations.3
At the start of the twentieth century the United States was a much more churchgoing, Christianity-affirming country than it had been at the end of the eighteenth.4 Church-state separation, immigration, and a culture of dissent had created a country Protestant on steroids. It is no wonder that Max Weber, visiting the United States in 1904, was struck by how extensively the lives of Americans were affected by membership in voluntary religious groups.5 Like Tocqueville seven decades before, the great German scholar understood that religion made America different.
The religious imagery of the Civil War illustrates how the United States gradually became more intensely Christian than it had been at the time of the nation’s birth. The founding generation invoked the deity as a matter of routine, but by the 1860s the Bible was a more substantial part of American culture, North and South. Lincoln took for granted something he flagged explicitly in his Second Inaugural Address: that both sides “read the same Bible” and prayed “to the same God.” Only in an America much changed from the 1780s could partisans of the Union welcome Julia Ward Howe’s incandescently Christian “Battle Hymn of the Republic”: “He is trampling out the vintage where the grapes of wrath are stored … I have read a fiery gospel writ in burnished rows of steel.” As Christ “died to make men holy, let us die to make men free.”
A peculiarity in the structure of authority in most American denominations protected Christianity from some of the critical scrutiny it received from the intelligentsias of Western Europe. Governance arrangements in churches in the dissenting tradition were relatively democratic, giving the people in pews more power than was the case not only in the Roman Catholic Church but in the top-down polities of the Lutherans and Anglicans. The dissenting churches conferred large measures of juridical authority on local congregations and on regional para-church associations. This was especially notable in the many Baptist bodies, as well as among the uniquely influential Congregationalists of New England and to only a slightly lesser extent among the Presbyterians. Preachers and professors had to pay attention to the dispositions of rank-and-file church members and regional associations. The clergy could not get too far out in front of their constituencies without endangering their standing as leaders and their livelihoods as salaried employees.
Preachers had to be especially cautious about two major intellectual developments of the nineteenth century that potentially threatened the faith of churchgoers. One was the Darwinian revolution in natural history. Many clergymen denounced evolution altogether. Others assured the faithful that Genesis need not be taken literally, that evolution was consistent with God’s supervision of a steadily improving world, and that Darwin’s harsh mechanism of natural selection was only part of the picture. But until the rapid expansion of high school education early in the twentieth century, much of the public paid little attention to evolution. Preachers were able to keep the topic at the margins, even while it was debated in the seminaries and in the colleges and universities.
The second development was much more important, and harder to handle. An innovative approach to the Bible developed in German universities, known as the Higher Criticism, threatened to weaken the authority of revelation, the very standard by which evolution had been judged problematic. By the early nineteenth century, breakthroughs in archaeology and philology showed that each snippet of the Bible had been composed under specific conditions. The book of Genesis, it turned out, was cobbled together from as many as four documents, written during different phases in the consolidation of the ancient Hebrews as a single people. The book of Isaiah was written by at least two authors living centuries apart. This new historical-critical approach was also applied to the New Testament. The letters of the Apostle Paul were designed to meet particular needs and circumstances, which needed to be taken into account in order to understand their true meaning. Some of the letters were not even written by Paul. Ephesians and Hebrews were the work of unknown authors. This study of the historical origins of texts, including the linguistic, archaeological, and cultural conditions of their composition, was called the Higher Criticism to distinguish it from Lower Criticism, which referred to the physical piecing together of the early manuscripts that had been recognized as biblical. Just as the Darwinian revolution embedded human beings directly into physical nature, so the Higher Criticism embedded the Holy Scriptures directly into human history. Even the soundest of faiths was developed by real people living in real time.
Justifiably fearful of how the people in their pews might react to this apparent diminution of the supernatural authority of the Bible, most preachers avoided talking about the actual history of the scriptures. President Grover Cleveland voiced a common sentiment: “The Bible is good enough for me: just the old book under which I was brought up. I do not want notes or criticism, or explanations about authorship or origin or even cross-references. I do not need them, and they confuse me.”6 The Unitarian minister and transcendentalist thinker Theodore Parker was unusual for writing extensively in a higher critical mode before the Civil War. The seminaries paid increasing but cautious attention to it later, during the last third of the nineteenth century. A historical perspective on the Bible was a delicate topic for nearly all of American Protestantism all the way to the end of the nineteenth century and beyond.
Even in the practice of science, where church members had less authority, Christian commitment influenced perspectives on evolution and on biblical scholarship. American scholars did not need the policing of church members to make them loyal to the faith, at least in its liberal constructions. The great majority of American scientists resisted natural selection as the sole mechanism of evolution, preferring an emphasis on the heritability of characteristics acquired during an organism’s lifetime and making more room for divine supervision of the process. This “Lamarckian” school of evolutionary science, named for the French naturalist Jean-Baptiste Lamarck, came to be known in European circles as the “American school.” The random character of species change as described by Darwin offended the persistent faith that a benevolent deity oversaw the whole process. The popularity down through the early twentieth century of this less rigorously naturalistic version of evolution marked the enduring authority of Protestantism even within the community of American scientists.
In America, as historian Henry F. May established, the Enlightenment took place largely within Protestantism rather than as a critique of it.7 In Western Europe, by contrast, the critical spirit of the Enlightenment often took form as a critique of religion. Spinoza, Voltaire, and Hume exemplified this sensibility in the eighteenth century, followed in the nineteenth by Feuerbach, Mill, and Marx. But nineteenth-century America never developed anything comparable to the famous Victorian cohort of British free thinkers that included Mill, George Eliot, T. H. Huxley, John Tyndall, and W. K. Clifford.
The greatest rationalist voices in antebellum America were the Unitarian divines Parker and William Ellery Channing. Scorned or forgotten was Thomas Paine, whose radicalism was still part of the conversation at the time of the American Revolution. The transcendentalist philosophy of Ralph Waldo Emerson, by contrast, allowed plenty of room for God. Margaret Fuller, an even bolder, more radical thinker than Emerson, always remained loosely connected to her natal Unitarianism. An important exception was the pugnacious agnostic Robert Ingersoll, but even in the late nineteenth century, freethinking orators never gained the respectability in America that their British counterparts enjoyed at home.
A clear indication of what had happened to the Enlightenment in nineteenth-century America appeared in 1896 with the publication of A History of the Warfare of Science with Theology in Christendom. This formidable tome was written by Andrew Dickson White, a figure of national stature who was president of Cornell University and a former US ambassador to Russia. A devoted Episcopalian layman, White explained that the great conflict of modern times took place within “Christendom,” where good, liberal Christians fought off the bad Christians who fell afoul of the theological dogmatism that prevented people from appreciating the advances of science.8 Written explicitly to refute the popular concern that religion and science were in conflict, White’s thesis registered the Protestantization of the Enlightenment in the United States.
By the time White exemplified the American intelligentsia’s tendency to shelter Christianity from the more skeptical scrutiny it received in Western Europe, a liberalized Protestantism was largely in place in America’s extensive system of private and public higher education. This was true in 1900 at the most distinguished of the new public universities, including Michigan, Wisconsin, Illinois, and California. Religion was more upfront at the leading private universities, especially at those that had been founded and sustained by the major denominations, including Yale (Congregationalist), Columbia (Episcopalian), Princeton (Presbyterian), Brown (Baptist), and Northwestern (Methodist). In 1892 the religiously serious Baptist John D. Rockefeller chose an eminent biblical scholar, William Rainey Harper, as the first president of the greatest of the new, nonsectarian universities, the University of Chicago. Even on that campus, enlightened religion and Wissenschaft lived quite comfortably together.
Beyond these cosmopolitan campuses, private and public, hundreds of denominationally focused, regional colleges educated many more Americans than did the universities. As late as 1900, college education was still a rarity in the United States. Only about 2 percent of Americans between the ages of eighteen and twenty-four were enrolled, but more than half of these attended church-related liberal arts colleges.9 Not all confessions were equally committed to higher education, but the Presbyterians, Quakers, Congregationalists, Lutherans, and Methodists planted their own colleges wherever they had a substantial number of churches. The Methodists, for example, established a string of “Wesleyans” all over the middle section of the country. Ohio Wesleyan and Iowa Wesleyan were both founded in 1842, Illinois Wesleyan in 1850, and Kentucky Wesleyan in 1858, followed after the Civil War by several others, including Kansas Wesleyan (1886) and Nebraska Wesleyan (1887). These colleges exist to this day, as do many of the colleges founded by the various Lutheran synods, including Gettysburg (1832), Muhlenberg (1848), Augustana (1860), Luther (1861), St. Olaf (1874), Bethany (1881), and Pacific Lutheran (1890).
The standing and scope of education differed greatly by region, especially at the level of colleges and universities. The slaveholding elites of the pre–Civil War South were less committed to education than their socioeconomic counterparts in the rest of the nation. The Civil War’s economic and social consequences then weakened what secondary education there had been and created further obstacles to the building of substantial postsecondary educational systems. The Confederate states were excluded from the Morrill Act of 1862, which provided funding for public universities and gave unprecedented impetus to higher education. At the end of the nineteenth century, none of the southern states had developed public universities comparable to those of Michigan, Wisconsin, Illinois, and California. Even the University of Virginia, founded by Thomas Jefferson, did not become a major player in the academic world until the second half of the twentieth century.
Of the private universities that had emerged in America as centers of academic distinction by 1900, only Johns Hopkins was in a former slave state, and Maryland never joined the Confederacy. In the late twentieth century a number of southern states built strong public universities, and many religiously affiliated colleges and universities (especially Methodist-sponsored Duke, Emory, and Vanderbilt) rose to national prominence. But these regional differences persisted and help to explain the enduring appeal of evangelical Protestantism to southern populations. As late as 1970, 18 percent of pastors of congregations affiliated with the Southern Baptist Convention had no schooling beyond high school.10
Differences in education contributed to a division between two families of Protestants that persisted throughout the twentieth century and became more important than ever in the twenty-first. In what historian Martin E. Marty called American Protestantism’s informal “two-party system,” one cluster of Protestants focused on individual salvation and personal morality, while another “lost faith in revivalism and worked instead … for some transformation of the world.”11 The first party was adamant that it understood an unchanging gospel literally, while the second was more willing to grant that people read the Bible in the light of worldly experience. The first party eventually fostered the Fundamentalist movement and a great variety of Pentecostal and Holiness churches that were unconnected to the traditional denominations, while the second flowered in the Social Gospel and Modernist movements.
Richard Hofstadter, in his classic study of 1963, ascribed to the revivalist-evangelical tradition “a one-hundred per cent mentality—a mind totally committed to the full range of the dominant popular fatuities and determined [to] tolerate no ambiguities, no equivocations, no reservations, and no criticism.”12 Later scholars have agreed that millions of Americans welcomed the firm biblical guidance the revivalist preachers claimed to offer. “In a world where violence, cheating, and unrest were common,” notes Amanda Porterfield, “appeals to the authority of the Bible and punishments for sin proved more effective means of discipline” in many communities than did “appeals to the rational nature of mankind.”13 Most denominations were home to adherents of both parties, although the Unitarians and Episcopalians were always aloof from revivalism. After World War II, the two-party system became the ecumenical-evangelical divide, flagged by the two prominent magazines, the Christian Century and Christianity Today, and by two transdenominational organizations, the National Council of Churches and the National Association of Evangelicals.
Protestantism’s two-party system developed independently of the two-party system of electoral politics. Federalists, Jeffersonians, Whigs, Republicans, and Democrats were from time to time more favored than not by members of one Protestant party or another. Just as the electoral two-party system shifted in ideological emphasis over the decades while remaining binary—consider how different the Republican Party of today is from what it was in Lincoln’s time—so, too, did the two-party Protestant system experience pushes and pulls in different directions. The two kinds of Protestants sometimes engaged one another directly, but in many locations simply coexisted, practicing their own versions of the faith and quietly feeling superior to their neighbors.
Although both Protestant parties had vibrant constituencies in every section of the nation, there were striking regional variations. At the time of the Civil War, many northerners of both parties were antislavery, but southern whites—three-quarters of whom were on the revivalist-evangelical side of the religious divide—“believed,” as one historian has explained, “that they were living in a Christian society precisely because they upheld the institution of slavery.”14 In the post–Civil War era, evangelicalism deepened its hold on the majority of southern whites. Both families flourished in the northern states following the war, but the liberals were able to strengthen their control over the denominations from which the more conservative southerners had departed in defense of slavery. Not until 1939 did the northern Methodists compromise their relative liberalism in order to unite again with their southern kin, at the significant cost of administratively segregating predominantly Black congregations, and not until 1983 did the northern and southern Presbyterians unite again. The yet more conservative Southern Baptists never rejoined their northern siblings.
For all their differences, the two families formed a single ethnoreligious tribe that inherited—and until relatively recently was committed to—the mission of helping their shared country achieve a Christian destiny, however differently they construed that goal. Not every empowered American was a white Protestant, but most of the people in control of social institutions were, and they assumed the same of each other, whether associated with tent meetings or with enlightened lecture halls.
William Jennings Bryan’s “cross of gold” speech of 1896 worked so well because the scripture-centered Nebraska politician caught the connection between Jesus and the American population perfectly: “You shall not press down upon the brow of labor this crown of thorns, you shall not crucify mankind upon a cross of gold.” President William McKinley thought and spoke conventionally when he concluded prayerfully in 1898 that Christian duty entailed his taking of the Philippines. When President Woodrow Wilson propelled the nation into global leadership in 1917, the connection between Christianity’s missionary impulse and that of the secular American nation was obvious. When Wilson’s rival, Theodore Roosevelt, shouted in a campaign speech of 1912, “We stand at Armageddon and we battle for the Lord,” he spoke in keeping with popular understandings of religion and nationality. Even Elizabeth Cady Stanton, the freethinking feminist who rejected vast portions of the Bible as misogynist, spoke casually of “our Protestant” outlook on life.
This Anglo-Protestant tribe ran the country for a very long time. Even as late as 1960, someone in the upper echelons of any of the three branches of government, or otherwise in a position to influence the direction of society, was more likely than not—there were exceptions of course—to be at least nominally affiliated with a church belonging to one of the classic denominations: Methodists, Congregationalists, Presbyterians, Episcopalians, Disciples, Lutherans, Reformed, Unitarians, Baptists, and Quakers.
By 1960, however, ethnoreligious diversification had made the country more responsive to inclusive notions of community and more welcoming of the critical approaches to the world being advanced elsewhere in the wake of the Enlightenment. Diversity challenged the hegemony of a generic Protestantism and placed the old “Protestant establishment” on the edge of what turned out to be its precipitous decline after 1960.
Two changes in the ethnoreligious demography of the nation, taken up in the next two chapters, facilitated this opening up of public life. One was Jewish immigration. No other modern nation experienced Jewish immigration on the American scale, and no other immigrant-receiving national population was so thoroughly dominated by Protestants before the arrival of Jews. Jewish immigrants and their descendants confronted the well-established Protestants with something genuinely new: a substantial body of American citizens who achieved rapid upward class mobility and cultural influence but did not share a Christian background at all. The second change in ethnoreligious demography took place by long distance: Protestant missionaries and their children engaged non-Christian peoples face-to-face and for long periods of time, then returned home as vigorous critics of American cultural and religious parochialism. Insistently and persistently, missionary-connected Americans bore witness to the humanity of non-Christian and nonwhite peoples. Adherents of non-Christian faiths were not “heathens,” according to this relatively novel line of thought, but were “brothers and sisters.” The missionaries expanded the scope of factual knowledge about the world and also achieved a substantial measure of “sentimental education,” enabling greater empathic identification with previously exotic peoples.
Other influences, too, opened up a world beyond even the most capacious varieties of Protestantism. Urbanization put more Americans in contact with each other than before. Non-Jewish immigrants, especially Catholics, made society even more diverse. The rapid increase of secondary education from the Progressive Era onward made millions of Americans more aware of life beyond their own communities. Radio, motion pictures, and the greater circulation of national magazines from the 1920s onward brought more of the world into popular consciousness. The enlarged role of the “foreign correspondent” made a striking difference in what readers of newspapers learned about how lives were lived abroad. The rise of the United States as a power in international affairs also had a deprovincializing effect. Both world wars diminished ignorance of the breadth and complexity of the globe. The decolonization of the Global South after World War II made it harder to remain unaware of how small a percentage of the world was constituted by Anglo-Protestants.
All of these developments are well known and have been extensively studied. They would have diminished Protestant cultural hegemony even without substantial Jewish immigration, and even if missionaries had not campaigned against American cultural provinciality. But none of these other developments, with the exception of Catholic immigration, is ethnoreligiously defined. Jewish and missionary cosmopolitanism affected Christianity’s place in the United States more directly. Each of these two cosmopolitanisms had its greatest impact, moreover, during the interregnum between the Johnson-Reed Act’s ending of large-scale immigration in 1924 and the return of massive immigration in the 1970s following the Hart-Celler Act of 1965. That interregnum was a unique period of American history, during which the lack of massive immigration rendered Jewish and missionary diversifying influences all the more important.
Neither of these two influences has been sufficiently incorporated into either academic or popular narratives of the twentieth century. The Jewish impact has been recognized as important only within the subfield of American Jewish history and sometimes within intellectual history. Missionary cosmopolitanism, when recognized at all, has been treated almost exclusively within the subfield of American religious history. But each episode broadened the cultural horizons of millions of Americans.
Other non-Christian peoples were part of the country, too, but did not seriously threaten Protestant cultural hegemony. The Indigenous population that survived genocide was kept at a distance through the reservation system. Enslaved Africans brought their own religions with them, but they were not in a social position to challenge the ruling white Protestants, and eventually most of their descendants became Protestants themselves. Once that happened, African American Protestants threatened white supremacy in every national arena where they could make themselves known, including within the edifice of American Christianity. But the white Protestants of both major families proved capable of ignoring the interests and creativity of Black people, much more often than not, until well into the twentieth century. Thereafter, African American Protestants challenged and greatly broadened the ecumenical family and shamed the evangelical family for its slower response to their presence as Christians, as Americans, and as human beings.