15

The Great Miscalculation

I warned readers in the first chapter of this book that Israel occupies a continent in the American mind—and that telling the story of the relationship would involve an exploratory journey into the heart of that continent. Already our journey has taken us into topics like theology and cultural history that foreign policy studies often try to avoid. As the story approaches the present, the connections between America’s Israel policy and the inner life of the American people play an ever more decisive and ever more surprising role both in the story of America’s Israel policy and in the story of American domestic politics.

From George H. W. Bush’s campaign to drive Saddam Hussein out of Kuwait to the Abraham Accords signed while Donald Trump was in the White House, events in the Middle East were more central to American politics than at any other time in our history. At the same time, events ranging from the evangelical religious revivals of the 1990s to the attacks of 9/11 put the Middle East and Israel into the heart of America’s escalating culture wars and internal debates. To complicate matters even further, the post–Cold War years brought a torrent of economic and social changes that precipitated something like an identity crisis in the United States as long-accepted narratives about America’s place in the world came under attack. Debates over Israel policy in these years both reflected and helped to shape a profound internal debate taking place in a nation that was no longer certain of its role in world history or the value of its inherited institutions and beliefs. We cannot really understand American policy toward Israel in this era if we do not see how the ideas that shape American perceptions of Israel, ideas that also shape America’s perceptions of itself, were changing as the United States struggled to navigate the unexpectedly choppy waters of the post–Cold War world.

Understanding American politics and perceptions in these years also matters because to a degree not seen since the 1920s and 1930s, the post–Cold War decades were a time when American foreign policy was based on ideas about the state of the country and the state of the world that turned out not to be true. The American foreign policy consensus of the era was, like the 1920s and 1930s Lodge consensus before it, based on heroic assumptions about the power of liberal ideology more in accordance with American wishes than with external facts. The old foreign policy establishment, mostly based among prominent East Coast banks, newspapers, law firms, and organizations like the Council on Foreign Relations, had changed profoundly since Lodge’s day. The new American foreign policy universe—a loose group that included journalists, academics, activists, community and civil society leaders, leaders in the world of business and labor as well as career officials in the vast national security complex and elected officials—had never been as large, as diverse, as well traveled or as credentialed as it had become by the end of the Cold War. It would only grow larger, better credentialed, and more diverse in the years that followed.

But despite all these advantages, with some notable individual exceptions the broad group of Americans who concerned themselves with guiding the country’s foreign policy was not very good at understanding either the state of American society or the state of the post–Cold War world. The gap between the world as these Americans conceived it and the world as it was seemed relatively narrow in the 1990s, but as time went by the gap between the intellectual basis of American foreign policy and the forces driving world events would widen until American foreign policy became increasingly controversial at home even as it became less successful overseas.

What can fairly be called the Great Miscalculation would have a profound impact on American politics and American foreign policy. While the miscalculation only occasionally involved Israel policy directly (for example, by promoting the persistent illusion that an Israeli-Palestinian peace agreement was just one more summit away), the consequences of a fundamentally misguided American national strategy, and a national self-perception increasingly dissociated from reality, would inevitably affect American policy toward Israel and the broader Middle East.

This complex and frustrating post–Cold War era is also the time in American history when Israel policy loomed largest, both in American foreign policy and in American politics at home. The thirty years after the end of the Cold War saw the rise and fall of the Israeli-Palestinian peace process. These years also witnessed both the eruption of the Middle East into the center of American foreign policy after 9/11, and the subsequent shift in American foreign policy priorities to the Indo-Pacific.

As U.S.-Israeli cooperation intensified, the alliance became more controversial. Prominent scholars and advocates of “realist” foreign policy blamed Israel and its American supporters for a series of disastrous foreign policy steps in the Middle East, even as public support for Israel reached all-time highs. These were also years in which the politics of Israel policy changed dramatically in the United States. Support for Israel had once been a left-wing, internationalist cause in the United States; by the Trump years Israel’s most vociferous and uncritical advocates were on the right, while left-wing Democrats and groups like Human Rights Watch had become some of Israel’s most dedicated critics.


Americans were, to use a phrase that Joseph Stalin made famous in 1930, “dizzy with success”[1] after the fall of the Berlin Wall. Many agreed that the end of the Cold War and the collapse of the Soviet Union meant what Francis Fukuyama called the “end of history.” They believed, for the most part, that the unipolar moment in world politics that followed the USSR’s demise could be converted into a lasting democratic peace around the world. American power deployed in the service of a rules-based international order could secure vital American and western security and economic interests at minimal risk and with low costs. Opening the system to rising powers like China and India would improve economic outcomes for all countries, integrate the new powers into not just the structures but also the values of the American system, and promote rising standards of living around the world.

These policies, American elites generally felt, would not just promote a peaceful world; they would also enhance the prosperity of the American people. In what some called “a great moderation,” Democrats were moving right on economic issues and Republicans were moving left. The chief instrument of social progress, leading Americans believed, was the rapid growth facilitated by a deregulated economy and a dynamic financial sector. What was good for Wall Street, with relatively few exceptions, was good for the country. And what was good for the country was good for the world. While the parties differed on the details, they agreed that on the whole only modest interventions would be required to make the United States a more just society. Racial inequality was believed to be on the road to extinction. While social problems remained, progress was possible without wrenching changes to the status quo.

The first two decades of the new century, however, witnessed a series of shocks to the optimism and confidence of post–Cold War era triumphalism. Beginning with the 9/11 attacks in 2001, continuing with the unhappy wars in Afghanistan and Iraq and the financial crisis of 2007–09, world events diverged from the path many Americans once confidently assumed they would take. As the global wave of democratization subsided and went into reverse, as Russia and China moved increasingly aggressively to counter American power, as automation and losses to foreign competition created new problems for the American middle class, and as the Covid-19 pandemic stunned the world and sent the global economy spiraling into its sharpest contraction on record, the gap between the future that the American establishment had expected and prepared for and the future that had actually arrived gradually became too wide to ignore.

Over time these developments led to an erosion of political support for post–Cold War global policy, to a rise in antiestablishment political movements on the right and the left, and to a rupture in the previously dominant Sun Belt coalition that broke the continuity of Republican Party leadership and policy. The rise of a socialist left and an ultranationalist right was a disquieting sign that many Americans had lost faith in the American economic and political system.

Disenchantment came in waves. When Bill Clinton left office in 2001, many Americans still felt as if they were inhabiting a unipolar moment at the end of history. That changed with the shocks of the George W. Bush years, but many Americans believed that if only Bush could be replaced, the world would return to the calmer conditions of the 1990s. Barack Obama swept into office on a tsunami of optimism and his supporters confidently expected that a new approach to foreign policy would bring back happier times.

By the time the Obama administration ended, these hopes had largely faded away. Donald Trump was elected in large part because he appeared to represent the kind of deep change in both domestic policy (“Drain the Swamp!”) and foreign affairs (“America First!”) that many Americans had come to believe was required. With the swamp still undrained and the world less stable than ever, by 2020 many Americans were wondering if the problems of the United States, much less those of the world system, could ever be solved.


The end of the Cold War left American policymakers and their colleagues in the European Union and Japan committed to a set of ideas that proved to be less than satisfactory as guides to the future.

First, they believed that the wrinkles had been smoothed out of capitalism, that the institutions and ideas that were generally accepted among the financial and business elite of the day would serve as adequate guides into the future, and that the global capitalist system they were building was adequately protected against the great economic storms that had marked the preceding three hundred years of capitalist development, sometimes with catastrophic social and geopolitical consequences. They were wrong, and the global financial crisis of 2007–09 shook the United States and its order-building efforts to their foundations.

Second, they believed that the social and economic system that took shape in the Atlantic world and Japan after World War II represented the culmination of the historical process, that the prosperity and social harmony of the postwar era were permanent and unassailable, and that no further serious political or economic challenges would threaten the institutional foundations of the so-called developed world. They were wrong about this, and rising resentment over income inequality and other economic conditions would progressively undermine the ability of political leaders in the United States to carry out the domestic and foreign policies they believed that the times demanded.

Third, they believed that the economic and political model that had taken shape in the Atlantic world was easily exportable—to Russia, to China, to the Middle East, to South Asia and around the world. They did not believe that culture really mattered all that much. People everywhere wanted more or less the same things, and the laws of economics were the same all over. Now that the best policies were known, and the technical skill to apply them was becoming common all over the world, progress would be rapid, and nobody would ever want to go back. They were wrong. There were large parts of the world where the model did not work as advertised, and within a generation of the triumph of 1990, the world would once again be filled with those who believed that the American model would never and perhaps should never work for them.

Fourth, they believed that the identity wars were over. Nationalism and religious fanaticism, they believed, would quickly fade away in the postmodern, post-historical, and post–Cold War world. It was the old Age of Hope all over again. As their nineteenth-century predecessors had done, Americans at the end of the Cold War failed to grasp the power of identity politics and the drive for cultural recognition. As in the nineteenth century, rapid and far-reaching social, political, and technological changes brought on by the advance of capitalism would combine with the resentments and hatreds that seethed in much of the world to create a much more dangerous international environment than optimistic Americans expected.

Finally, as a result of all of these errors of analysis and judgment, America’s elite opinionmakers and foreign policy gurus fundamentally misread the international position of the United States. They seriously underestimated the difficulties in their path and did not understand just how ambitious their goal of global transformation was. They believed, like the proponents of the Lodge consensus, that Americans could steer world history in their preferred direction on the cheap.

As a result, the United States and its principal allies set off on a noble and hopeful adventure in the early 1990s. Hamiltonians believed that the time had come at last when the United States could build a truly global economic order that would ensure both American prosperity and world peace. Wilsonians believed that the time had come at last when tyranny and oppression could be banished from the world. The United States and its allies could overthrow the customs of many generations to bring equal rights to women and sexual minorities, replace personalistic, premodern forms of government with sleekly modern legal-bureaucratic structures, engineer transitions to democracy, and impose a model of international relations that was deeply grounded in Euro-American history on countries like China, Russia, India, and Pakistan without a lot of trouble. Along the way, we could stop global warming, religious persecution, poverty, and war.

Incredible as it may appear in hindsight, most of America’s foreign policy makers and commentators sincerely and passionately believed that all this could be achieved without a lot of hard work. There was no need to mobilize public opinion behind this ambitious program of global transformation. The job did not require a major arms buildup, huge foreign aid budgets, or massive American military commitments around the world. Indeed, even as Americans undertook the most far-reaching and challenging program of global transformation in the history of the human race, they debated how best to use the “peace dividend”—the anticipated savings that would come from cutting defense, foreign aid, and public diplomacy budgets after the demise of the Soviet Union.

What makes this pandemic of hubris at least partially explicable is that the end of the Cold War seemed like part of a larger story. From 1945 to 1990 the world seemed to be finding its footing. Societies first in the West and then in much of the rest of the world progressively mastered the art of managing the problems of industrial society. The bitter class struggle that defined European politics from the time of Marx to the end of World War II softened into a political competition between center-left social democrats who accepted the basic assumptions of capitalist society and center-right parties that accepted the basic features of the welfare state. Big business, big labor, and big government learned to negotiate with one another. Across the world of NATO and Japan, most workers were assured of stable employment, rising wages, and opportunity for their children. Inherited problems from the frenzied industrialization of the past were gradually being addressed. The air and water in the great cities of the West began to clear. The gates of higher education opened to the children of the working class. Old scourges like child labor and dangerous working conditions faded into memory.

The end of the Soviet Union, it did not seem unreasonable to suppose, would accelerate this benign trend toward global prosperity and peace. If Americans and their allies had the vision and the courage to seize the opportunity before them, the world could move to what people called a “post-historical” state of abundance and peace.

Even in hindsight, with so many of the ambitious plans of the early 1990s in disarray, and the enticing illusions of the era now painfully exposed, it is impossible to write off thirty years of American and western foreign policy based on this vision as a total failure. The post–Cold War world order gave unprecedented economic opportunities to billions of people in Asia, Africa, and Latin America. Living standards rose, life expectancies increased, and medical emergencies like the HIV/AIDS pandemic met with an effective response unthinkable in past times. More children survived the diseases of infancy and fewer people lived in the presence of chronic hunger and famine. Yet despite some of the most dramatic economic progress in the history of the world, both America and the world seemed less secure and less stable a generation after the Cold War than in those halcyon days when the Berlin Wall had fallen and history was thought to be over.

What so many Americans and others missed in those years was that the interlude of relatively stable social and political life that characterized the post–World War II decades was moving toward a close as the twentieth century neared its end. This was not only because the growing competition from low-wage manufacturing economies undercut wages and union power in the advanced industrial democracies. The Information Revolution was going to be at least as disruptive, as transformative, and as challenging as the Industrial Revolution itself had been. Just as the Industrial Revolution had shaken every society on earth to its core, transforming the most powerful and embedded institutions and bringing waves of political and social revolutions in its train, so the Information Revolution would test the foundations of the social and political order in countries around the world, the United States of America very much included.

The American leadership class was not blind to the existence of the Information Revolution. Indeed in 1990 information technology was already triggering massive economic and social changes in the United States and beyond, and more change clearly lay ahead. But this change would, the American establishment generally assumed, reinforce the foundations of American stability at home and power abroad, not challenge them. After all, the collapse of the Soviet Union was driven in large part by the inability of the Soviet system to adjust to the requirements of the Information Age. What Americans and many others missed was the degree to which the Information Revolution would challenge and even reverse what many political scientists and economists, to say nothing of policymakers, considered one of the most fundamental features of the modernization process: convergence.

The Industrial Revolution had been an era of convergence. As countries around the world mastered the secrets of industrial production, they tended to converge toward similar bureaucratic and economic models, and cultural and religious differences tended to fade into the background. This process accelerated dramatically after World War II as more and more postcolonial states learned the secrets of industrial development. The growth model based on exports of light consumer industrial products from low-wage, loosely regulated economies into the rich consumer markets of the industrialized global North ignited extraordinary growth across much of East and Southeast Asia. Over time, economies like those of Taiwan, South Korea, Vietnam, and mainland China followed the Japanese path.

Most development economists and political scientists who studied these phenomena in the 1980s and 1990s believed that the process would continue indefinitely and that as Asian laggards like Bangladesh and Pakistan jumped on the light manufacturing conveyor belt to prosperity, the earlier adopters would continue to advance toward higher value-added manufacturing and higher levels of prosperity and peace. Meanwhile, countries in Africa, Latin America, and the Arab world could and would jump onto the prosperity train, and all these countries would ultimately achieve western-style affluence. As these countries prospered, they would achieve the kind of social stability that characterized the advanced industrial democracies after World War II. That stability in turn would promote the peaceful democratization of emerging economies (as happened in places like Taiwan and South Korea), and the stable new democratic governments springing up around the world would eagerly sign up to be “responsible stakeholders” in a peaceful and democratic world order.

This vision of global progress and gradual, voluntary convergence was the foundation on which Americans and not only Americans thought the twenty-first century would rest. It was a beautiful dream, but history had other plans.

Like the Industrial Revolution before it, the Information Revolution would arrive on the wings of a storm. Waves of economic and social disruption would test the stability of the advanced industrial democracies as old industries disappeared or fled offshore, as manufacturing and clerical jobs vanished, as tensions over migration grew, and as a threatening new great power rose in the Far East. Those same waves washing across the postcolonial world would undermine the forces powering those societies’ social and political convergence with the West.

It is important to grasp three large facts that became more evident and more disruptive in the three decades following the fall of the Berlin Wall. The first was that instead of becoming more manageable and tractable and turning toward the kind of global liberal order Americans hoped to build, the world beyond America’s borders was becoming a more geopolitically competitive and less liberal place. The second of these inconvenient truths, as former vice president Al Gore might have called them, was that American society, which many at home and abroad believed to be the solid rock on which a permanent global economic and political order could safely be founded, revealed previously unsuspected fissures and cracks. Far from stabilizing the rest of the world, American society lost much of its unity and élan. The third bitter reality was that the sense of existential danger and apocalyptic potential that briefly faded when the terrifying nuclear standoff between the two hostile Cold War superpowers came to an end would gradually return even as it shifted its shape.

These three disagreeable truths were not of course the only features of the post–Cold War era, but over time they exerted an increasingly powerful and, for the most part, baleful influence over American foreign policy worldwide. They would also elevate the importance of the U.S.-Israel relationship in the minds of many Americans, and the relationship would become more fraught, more visible, and perhaps also more mysterious than ever before. These three inconvenient realities put American foreign policy onto increasingly dangerous terrain.

The Return of Great Power Rivalry

As the Cold War drew to a close, political leaders and intellectuals from Seattle to the Saarland thought that they were witnessing the end of the great power rivalries of the past. Now that communism was thoroughly discredited by the Soviet Union’s ignominious collapse, the United States and its prosperous allies dominated the world, and other countries would need to become like them to enjoy the prosperity and power they had. Countries that rejected the path of democratic industrial development would remain backward and weak—too weak to challenge the world system taking root around them.

There would still be rivalries and jostling between countries, much as the nations of the European Union continued to advance their national interests within the framework of common European institutions, but the nature of international competition would change. We would not see a new Wilhelm II or Napoleon seeking dominance, and the days of powerful antiliberal and antidemocratic empires like tsarist Russia and the Soviet Union were past. Giving credence to this line of thought was the behavior of both Russia and China in the years immediately following the fall of the Soviet Union. Russia in the early years of the Boris Yeltsin presidency appeared to be introducing both political and economic reforms to convert itself into a western country. China, too, despite its continued emphasis on one-party communist rule, appeared to be borrowing as much from the West as it could absorb. Whether it was the African National Congress (ANC) seeking advice from the IMF and the World Bank as it took power in South Africa, or Brazil and Argentina embracing liberal economic ideas, the world seemed to be heading Washington’s way.

International politics took on a new and more peaceful aspect. Saddam Hussein’s 1990 invasion of Kuwait triggered the greatest display of global unity in the history of the world as the Soviet Union supported a U.S.-led resolution at the Security Council to reverse Saddam’s invasion.[2] Rogue states like North Korea and Iraq might defy the international consensus, but these were small, economically marginal powers. The storms of the twentieth century seemed to have burned themselves out, and the world appeared to be embracing the wisdom of the American Way.

The world, as it turned out, was a more complicated place than the architects of the post–Cold War order understood. The post–Cold War economic paradigm of free trade, deregulation, and financial liberalization, sometimes called the Washington Consensus, did not last. Too many countries were unwilling or unable to adopt it, and in those that tried, political resistance and unexpected economic upheavals often yielded results very different from those the textbooks predicted.

After a lost decade of floundering as it experimented with western-style “shock therapy” and privatization, the Kremlin turned to former KGB officer Vladimir Putin to employ more traditional Russian means to grapple with the new economic order. Putin would not turn Russia into an economic superpower, nor could he match Washington in conventional military power, but he succeeded brilliantly at returning Russia to the world stage as a significant force. Putin’s Russia did not offer the kind of comprehensive ideological challenge to western ideas that Soviet communism developed, but his Russia was unbendingly hostile to the kind of order that Washington sought to build. His ability to perceive western weakness and to act on it allowed him to stage the most dramatic revisions to the map of Europe since World War II, end NATO’s eastern expansion, and exploit the divisions within the European Union on a scale few in the West had foreseen.

Although economic weakness placed limits on Putin’s power, he was able to harness the power of the Information Revolution to his national ambitions. Russia is not a vital cog in global manufacturing, nor has it created a Wall Street or Silicon Valley on the Volga, but the Russians have developed some of the most advanced cyber capabilities in the world and have become masters at exploiting global financial markets, employing dark money networks, and at using information warfare to disorient, divide, and in many cases defeat their opponents.

The other great power that rose to challenge the American vision of a post–Cold War order was of course the People’s Republic of China. Defying the American belief that political liberalization would necessarily follow economic modernization, China would ultimately strengthen the stranglehold of the Chinese Communist Party on national life even as its economy exploded in size and sophistication.

Russia and China do not see eye to eye on every issue, but both governments see Washington’s vision of post–Cold War order as a mortal threat to their regimes and as a strategic threat to their national power. Some in both countries fear that democratic reforms would lead to the kind of collapse that the Soviet Union experienced under Mikhail Gorbachev, followed by the weakness, misery, and national humiliation Russia suffered under Yeltsin. They fear that their countries are neither ready for nor suited to western institutions, and that efforts to introduce them would have catastrophic results. More pragmatically, leading figures in both governments do not believe that their personal wealth and power would survive the transition away from the political status quo.

As both countries took stock of the changed 1990s world, they were initially unsure how to respond. Over time, however, as China’s economy surged and as Putin consolidated the power of his new state, they began to test the limits of American and western power. The West, convinced of the superiority of its political and economic models, failed to take the two Eurasian giants seriously for many years, giving them both the opportunity they needed to prepare for the competition that, unlike the West, they were certain was on the way.

The return of traditional great power competition to world politics did not happen all at once. In the 1990s, Russia and China appeared relatively quiescent and compliant. In the 2000s, as the United States bogged down in Iraq and Afghanistan and as the European Union struggled with the consequences of its poorly designed currency union, the Eurasian giants began to reassess. Beginning with the 2008 Russian attack on Georgia, which fell between the Fannie Mae bailout and the collapse of Lehman Brothers, when an exhausted and unpopular George W. Bush administration was struggling with the greatest American financial crisis since the Great Depression,[3] Russia moved to a policy of open defiance and contestation against what westerners considered the most sacred tenets of the “rules-based international order.” China, which interpreted the financial crisis to mean that American economic power was entering an era of decline, abandoned the “peaceful rise” policy of Deng Xiaoping to assert its status as a rising superpower around the world. Aligned rather than allied, the Eurasian powers brought complementary strengths and focuses to the game of thrones. None of the Bush, Obama, or Trump administrations managed to develop an effective counterstrategy, leaving a sense of failure and drift to take hold in the United States.

The sense that the country’s foreign policy was adrift, and that other powers were pushing successfully against American designs, would contribute to the crisis of confidence within the United States that, a few short decades after the triumphant end of the Cold War, left many Americans wondering whether their basic institutions were strong enough to survive.

American Transformation

The series of disruptions that shook American society between 1990 and 2021 was as unexpected as it was, for many, unwelcome. Since World War II the United States had built the largest and most prosperous middle-class society the world had ever seen. The majority of Americans owned their own homes.[4] Two successive generations had seen their living standards rise far beyond what their parents had known, and the next generation was expected to see more of the same. I call this the “blue model” economy and society because of its abiding popularity in “blue state” and “blue city” America: a highly regulated system of economic and political order based on a mature industrial economy in a liberal political system.

In the United States and in many other “advanced industrial democracies,” as these countries were known at the time, it seemed that the historic problems of capitalism had been solved once and for all. Competent, independent central banks and sophisticated financial regulation had eliminated the financial crises and panics that periodically cast the capitalist world into turmoil from the time of the seventeenth-century boom and bust known as the Dutch tulip mania to the Great Depression.

For working people, this mid-century system offered stable employment, reasonable wages, and the opportunity for social mobility. While women and minorities experienced systematic discrimination in education and employment, for white males—along with the women and children who depended on a white male breadwinner—the new system represented a significant advance over Depression and prewar-era conditions. The unprecedented productivity of America’s vast industrial capacity, greatly expanded during World War II and largely converted to civilian production after 1945, produced a cornucopia of consumer goods that made life-changing products like cars, television sets, washing machines, and dishwashers affordable for a growing proportion of the public. The expansion of the road and highway network opened the possibility of suburban home ownership to blue-collar as well as white-collar workers; the 1950s saw millions of Americans who grew up in crowded cities, often in slums, move their young families to leafy suburbs.[5]

At the time, most Americans believed that this system of organized industrial capitalism represented the final point of human social evolution. The future, they believed, would bring more of the same. Over time, as labor productivity rose and technology advanced, real wages would continue to rise, and the quality of consumer products would improve, but the fundamental features of this elegant social model would not change. Intellectuals bemoaned the conformity that seemed to be overtaking American life, and radicals grumbled about the emptiness of a society organized around the production and marketing of an endless succession of consumer goods, but these problems were much less distressing to the average person than the poverty and insecurity that they remembered from the Depression.

This was the American example that inspired imitators around the world. The American model appeared to demonstrate that capitalism plus democracy led to mass prosperity and deep social stability. Both during and after the Cold War, it was this model of a developed industrial society that Americans sought to export. In the 1950s the successful adaptation of liberal democratic capitalism across Western Europe and Japan anchored the western alliances and blocked the expansionist dreams of Stalin and his heirs. In the 1980s the success of East Asian countries that built advanced industrial societies of their own increased American confidence that a bright new era in world history was well under way.

The end of the Cold War, however, did not usher in the era of tranquil progress for which so many hoped. Capitalism is a revolutionary social and economic force and it does not stop being revolutionary merely because some members of a capitalist society would like to get off the train. At the end of the Cold War, fundamental changes, many directly rooted in the Information Revolution, were transforming American society and would upend assumptions and challenge institutions that many Americans thought were solidly planted. The foundations of the post-1945 economic system were already beginning to totter by 1990, and the golden age of the American middle class was drawing toward a close even as the nation’s intellectuals and politicians celebrated the victory of American capitalism over the Soviet Union.


Nostalgia is myopic; the blue model society of the 1950s and 1960s was far from the utopia that later generations sometimes imagine it to be. Americans born into the twenty-first century, for all its problems, would be horrified if transported back to this supposed middle-class paradise. Filthy air and bad food, shoddy and unsafe gas-guzzling cars, clunky phones and bad service, tiny houses, sclerotic bureaucracies, inflexibly hierarchical workplaces, racism and xenophobia at levels inconceivable in a later time, stultifying social conformity, stifling oppression of women and their talents, openly expressed hatred and contempt for sexual minorities: it was not, by later standards, a model to emulate. Yet at the time blue model America was the wonder of the world, and the shining example of American success that it offered was the foundation not only of American world power but of the domestic stability and political coherence of the United States itself.

Americans often attributed the country’s global “soft power” to what they saw as its compelling moral example. Our dedication to human freedom, our inspiring record of international leadership, the self-evident excellence of our political institutions: these are, many otherwise quite intelligent and sensible people believe, the vital foundations of American power around the world and the reasons why other countries have so often been willing to work with us and follow our lead.

To travel and engage with people around the world is to learn that others do not see Americans in quite these flattering terms. Our ideals may be inspiring, but being human, we betray them as often as other people do. Other people note our betrayals and think harder about them than we do. As for international leadership, while our thoughts often turn to such triumphs as the Marshall Plan, others around the world find less-inspiring episodes in American history to be equally compelling. Our institutions, however satisfying we find them, can seem quaint, funny, or even barbarous to people grounded in different cultural and political traditions.

The primary foundation of American soft power after World War II and again at the end of the Cold War was something more basic: success. America won the major international conflicts of the twentieth century, and mastered the challenges of the Industrial Revolution to create the kind of stable, prosperous, powerful society that people all over the world wanted to live in. American institutions and ideas were appealing in large part because they appeared to work. People listened to American economic, agricultural, medical, and industrial experts and consultants because they wanted their own societies to be as prosperous and stable as America was, and they studied American business methods because they wanted to build corporations as global, as innovative, as large, and as profitable as the American-based multinationals. People might resent American arrogance, repudiate American racism, recoil from the vulgarity of American culture, reject American hypocrisy, and mock American pomposity—but they wanted what America had. The cow might be smelly, but the milk was sweet.

As the Soviet empire imploded and American military and economic prestige reached new highs, it appeared to many around the world that the way forward for your country was to align with America politically and to adapt its economic and social model to your own situation as best you could. This in turn encouraged the American foreign policy universe to believe that its goal of transforming the world into something like America’s image could be achieved with relatively low levels of cost and risk. Rather than imposing a model, one was assisting a movement.

The gradual decline in American soft power as the American model ran into trouble was one of the principal factors turning American foreign policy from a merry summer ramble to a grim winter slog after the Cold War, but the growing difficulties of blue model society would have an even more profound, and equally unexpected, impact on American domestic order. The perception of American success and American progress wasn’t just politically important in the sense that happy voters tend to support centrist parties and stable policies. It wasn’t just important because prosperity helped legitimize the social order and marginalize radicals who might otherwise seek massive change. It was important because of the role that American success played in the construction of American identity in the twentieth century. That so many Americans embraced the idea that a (relatively) egalitarian and prosperous America had developed gradually through adherence to principles which had all along been part of the country’s cultural DNA helped unite a complex society around a common vision. The success of the blue social model was not just a piece of American history; it had become a building block of American identity. The gradual decay of blue model American society would eventually produce a profound crisis of American identity and ideology that is now the most important issue in American life.

All national identities are to some degree arbitrary and artificial, and the United States, unusually large, unusually diverse, and unusually young among the great nations, is no exception to this rule. As a child growing up in the Carolinas in the 1950s and 1960s, I lived among people who were Americans the way French people are French. That is, we were Americans because we lived in America. We knew as a historical fact that our ancestors had come from other parts of the world, but no living members of our families had any memory of living anywhere but America or being anything but American. The America we knew was diverse: some Americans were white and some were Black, most but not all belonged to various Protestant denominations, and there were clear differences of accent, class, and culture even among whites. Southern whites also thought of themselves as southerners, very different from Yankees, who were also American, but who we were not particularly encouraged to like. However, when outsiders—Germans, Japanese, or Russian communists—attacked the United States in one way or another, they attacked all of us and we would stand together against them.

It was not until I came north for my higher education that I learned that there were people who experienced their American identity in ideological terms. For them, America was defined as a set of ideas about democracy, liberty, equality, and the rule of law. I believed then and still believe all these years later in those ideas, but neither then nor now does it seem to me that my identity or anyone else’s as an American rests on accepting these or indeed any ideas. To me, a person is an American because either by choice or by birth they are citizens of the American republic and therefore members of the American people.

But solid and self-evident as this conviction is for me, I have learned that the other, more ideological approach to American identity is as deeply grounded for many people I like and admire as mine is for me. American identity means different things to different people.

In the eighteenth and nineteenth centuries, many white Americans experienced their identity in the ways I did as a child. They were “American” because of who they were, not because of how they thought. People were aware that Americans generally thought in different ways from Europeans and others, but those differences were not seen as constitutive of American identity.

The development of an ideological vision—that those who accepted a certain set of ideas were “true Americans” while those who rejected them were “un-American” even if they were legal citizens—was the product of both a nativist suspicion of Roman Catholic immigrants, primarily Irish, in the mid-nineteenth century and attempts by said Catholic immigrants to demonstrate that their religious beliefs did not preclude an acceptance of the main articles of what came to be known as the American creed. The construction of an ideological American identity served both to integrate and to police immigrants. “Good” immigrants bought into the American creed; “bad” ones didn’t, and could be deported, as in the post–World War I Red Scare, or marginalized, as in the McCarthy era. That Congress convened the House Un-American Activities Committee is a sign of how important the creedal nature of American identity had become.

In substance, being American in this creedal sense meant assent to the principles of the Declaration of Independence, to the institutions established by the Constitution, and to the ethics (though not the specific doctrines) of Judeo-Christian religion. The Pledge of Allegiance, “I pledge allegiance to the flag of the United States of America, and to the republic for which it stands, one nation, indivisible, with liberty and justice for all,” was seen as the American creed. (The words “under God” were added in the 1950s to underline the distinction between atheistic communists and good Americans.)[6]

The construction of a creedal identity was particularly important to groups whose integration into American society was otherwise questionable or controversial. Not only did it mean that Catholics, Jews, and Black Americans could lay claim to full membership in the American nation on the basis of the creed; the creed itself seemed to cover all the basics for the old majority population while being broad and inclusive enough that most (though never all) members of these groups could, with whatever reservations they may have had, generally subscribe to it.

The challenges of assimilating the waves of immigration culminating in the Great Wave of 1880–1924 made creedal nationalism more important than ever. Orthodox Greeks, Catholic Sicilians, Russian Jews, and Syrian Arab Christians did not have a lot in common with each other or with the “old stock” of mostly Anglo-Protestant settlers already ensconced in the United States. It was not just that they came from cultural and religious backgrounds far removed from the old American mainstream. The America into which they came—increasingly urban and industrial—was very different from the agrarian republic into which the earlier settlers had come. For new immigrants, it was increasingly difficult to see themselves in the portraits of silk-stockinged Anglo Founders in tricorn hats; for “old stock” Americans, it was also difficult to see the exotic newcomers and the bustling smoky cities in which they lived as recognizably related to a past that seemed more idyllic as it receded from living memory. As the European Great Wave subsided during World War I and was curtailed by the Johnson-Reed Act of 1924, the Black American Great Migration filled the vacant workstations of bustling factories across the North, and for the first time since Reconstruction a large, enfranchised Black population began to make its presence felt. “Blood and soil nationalism” could not cover this multitude. Developing a national identity strong enough to command enthusiastic adherence and elastic enough to cover the extraordinary diversity of this newly assembled multitude was a task that preoccupied American thinkers and social activists from the 1880s through the 1960s. The long pause in immigration, the national ordeals of the Depression and World War II, and, not least, the broadly shared prosperity of postwar, blue model America appeared to have answered the anxious petition of the Episcopal prayer book and fashioned “into one united people the multitudes brought hither out of many kindreds and tongues.”[7]

Creedal nationalism was and had to be progressive—not in a partisan or narrow way, but in the broad sense that an American idea that could hold the allegiance of its citizens had to be forward looking. America in the first half of the twentieth century was more affluent than any large country in the world, but it was a profoundly unequal society, and not only for people from racial minority groups. Life for working-class families could be and often was bitterly hard: long hours, unsafe working conditions, dismal and crowded living conditions, brutal policing, polluted water and air, poor-quality and often contaminated food. Social mobility existed, but for people from ethnic groups whose physical appearance or cultural habits differed widely from what “old stock” Americans considered “normal,” opportunities were limited and discrimination was both common and legal. Single women faced both social prejudice and low pay. Asian Americans, whose numbers remained limited at this time due to race-based immigration discrimination well antedating Johnson-Reed, faced extraordinary barriers in many states and met with suspicion almost everywhere. The majority of Black Americans still living in the old Confederacy suffered under Jim Crow laws,[8] while those who came north faced popular prejudice, unsympathetic teachers, hostile law enforcement as well as discrimination in housing, employment, and pay.

An American creed that could mean something to these Americans could not be a triumphalist celebration of the status quo any more than it could simply be the mythologization of an imagined past. The American creed needed to point to a better, fairer future. America was a place that had high ideals, and gradually approached them over time. For “old stock” Americans, this vision acknowledged the value of American heritage, but the vital principle of that heritage that made it valuable was its openness to progress and change. That was, like any historical myth, something of a cosmetic simplification of a complicated reality, but it rang true to enough people that it helped the country maintain a sense of continuity and groundedness even as it underwent massive social and demographic change.

Many of the problems of American identity seemed to sort themselves out after World War II. Second- and third-generation Americans descended from the Great Wave immigrants both assimilated into American material and cultural life and faced declining levels of discrimination. The 1950s and 1960s were the decades in which white Catholics, as well as Jews, saw their full entry into American life, and racial minorities experienced a significant opening. The election of John Fitzgerald Kennedy to the White House demonstrated to Catholics that they were equal members of the body politic—and reassured WASPs that American Catholics were pillars of, rather than dangers to, the American way of life.

The great postwar wave of middle-class prosperity helped cement the loyalty of the descendants of the Great Wave to the American system both because that prosperity demonstrated that the system worked and because the transition from ethnically homogeneous urban neighborhoods to intermixed suburbs broke up old ethnic enclaves and institutions and propelled the suburbanites into a new and more purely American pattern of life. The children of these suburban residents attended school with one another, worked at the same factories or offices, and intermarried. Families that were Polish on one side and Italian on the other were much more American than they were anything else. The rising tide of mid-century prosperity was the background against which a new American identity emerged, one suited to the multiethnic, largely Catholic character of the tens of millions of immigrants attracted to the United States by the booming factories of the Industrial Revolution. For most of them, the American Dream of secure prosperity and self-respect had become a reality. They believed in an America that they saw.

This was an America of fusion and convergence. It was not only that any differences in status and opportunity between “old stock” Americans and the descendants of the Great Wave were fading away. It was more broadly that the differences between classes were also fading. In 1900 the difference between the working class and the professional middle class was an unbridgeable chasm. The classes wore different clothes, ate different foods, lived in very different environments, got their information from different sources, and American class differences in speech were almost as marked as in the U.K.

Rising incomes, suburbanization, and the homogenizing effects of mass advertising and mass media changed all that. America’s stratified class society simplified during the twentieth century. At the fringes were the plutocrats and the marginalized poor. In between, the working class, middle class, and administrative professional elite merged into the vast American “middle class.” This class was not homogeneous; people spoke of lower middle (blue-collar factory workers), middle middle (post office workers, police and firefighters, K–12 teachers, and lower-level office employees), and upper middle (middle- and upper-level managers, doctors, professors, lawyers, and other “professional” workers) strata. Nevertheless, most Americans thought they lived in a relatively open and equal society, that this openness and equality were the natural consequence of American values, and that further progress would bring more leveling up and more fusion.

It was the context of convergence that gave so many hope that the deepest wound in American society might also heal. If different religious, ethnic, and regional groups could come together in an increasingly equal, prosperous, and homogeneous union, could not this assimilative principle in American life also bring Black America into its field of operation? There were signs of hope. As the white South lost some of its regional distinctiveness, southern whites clung to their ancient racial prejudices less obsessively than before. The barriers that once blocked the advancement of Irish, Italian, Asian, and Jewish Americans had fallen or were visibly melting. Would not the passage of more time extend this dynamic to Black Americans, and was not the collapse of Jim Crow and the passage of civil rights laws evidence that this was in fact happening? Beyond this, the rising tide of working-class prosperity that had brought American blue-collar workers to living standards that were the envy of the world would surely in due course elevate Black living standards as well. Here, too, there was evidence. Black wages and incomes rose dramatically after World War II.[9] The magic of the American Way was visibly working.


The progressive disintegration of blue model America under the blows of the Information Revolution and the globalization to which that revolution gave birth is the most important driver of American domestic politics and foreign policy today. But our concern here is the U.S.-Israel relationship, and we must restrict our focus to the aspects of these unexpected and disturbing social changes that most directly affected the domestic politics around that relationship, the fears and hopes that shaped the actions of American policymakers during this era, and the consequences of those policies in the Middle East and beyond.

The post-1990 developments in American life most relevant to our subject were the gradual loss of faith among many (by no means all) Americans in the ideological form of American identity that dominated the post–World War II generation, and the consequent splintering of American politics, culture, and identity that gradually appeared. As the sun set, the stars and the planets stepped out onstage, and not all of the planets were benign.

What happened was that a failure of convergence led to a crisis of creed, and that crisis, in turn, led to an identity crisis as more Americans doubted whether the mid-century vision of a diverse nation united around a set of values was possible or even desirable.

The failure of convergence was, first and foremost, a failure of class convergence. A mix of deindustrialization, the decline of clerical work, global competition, and other factors undermined the power of the (private sector) labor movement and brought a long stagnation, and in some cases a serious decline, in blue-collar wages and living standards. The upper middle class, the professional and administrative elites, on the other hand, prospered. Those differences were magnified by the entry of more women into the professional workforce. Upper-middle-class families often had two six-figure incomes, and the gap between college and noncollege workers continued to widen.

The term “working class” came back into wider usage as these differences widened and as studies showed declining social mobility between classes in American life. After 2000, America in some ways started to look more like the America of 1900, with deep and easily visible class differences.

As economic inequality increased and social mobility became more difficult, the consequences for racial politics in the United States were profound. While there had long been a vibrant Black middle class, and while Black access to professional and managerial jobs widened significantly after the civil rights era, most Black Americans were in the lower economic strata of the postwar middle class. When high-wage manufacturing jobs began to disappear, and blue-collar wages generally stagnated or fell, Black Americans were disproportionately affected by the widening class divide.

Worse, systemic racism in the pre–civil rights era had excluded many Black families from the most important engine of American middle-class wealth development: housing. Black borrowers suffered mortgage discrimination and Black homebuyers were often excluded from all but a handful of neighborhoods. For millions of American blue-collar families, buying affordable homes in suburbs on thirty-year mortgages at low fixed rates, often subsidized by G.I. benefits as well as generous tax deductions for mortgage interest and property tax payments, served both as a wealth generator and as an introduction to the principles of investment and finance. As housing prices rose over time, homebuying families reaped substantial profits, adding to their growing equity as they paid down their mortgages. By 1989, American households owned $4.7 trillion worth of real estate,[10] and homeownership accounted for 40 percent of the total worth of middle-class families.[11] Black families, largely excluded from this engine of wealth creation, lagged far behind: in the second quarter of 2019, the rate of Black homeownership fell to 40.6 percent, its lowest level since 1970, two years after the passage of the 1968 Fair Housing Act.[12]

Adjusting for inflation, the average middle-class Black household in 1968 had $6,674 in wealth compared to $70,786 for the average middle-class white household. In 2016 those numbers stood at $13,024 and $149,703, respectively.[13] Black households also inherited less wealth than white households, with 10 percent of Black households receiving an inheritance at an average level of $100,000 in 2019, compared to 30 percent of white households at an average level of $195,500.[14]

The failure to achieve something like equality of income and status was not just a Black problem. It undercut the perceived legitimacy of the American system. As long as Americans believed that racism was a legacy from the past that the country was progressively overcoming, awareness of the stain of racism could be squared with a faith in America’s ultimate goodness and success. But what if the country wasn’t leaving racism behind? If the United States was not ultimately an inclusive society but if, as some maintained, racism was so deeply encoded in the country’s DNA that it could not be eradicated, what made the country worthy of support? What was the meaning of common citizenship in an ineradicably racist nation?

The questions went deeper. The American creed was more than a belief that the United States was a prosperous country and a nice place to live. Americans going back to the colonial era had seen their country as more than just another nation enjoying its hour on the historical stage. The United States was a providential nation, a nation whose destiny it was to elevate the human race to a new and higher kind of life. The political, religious, and economic ideas that defined the American Way were relevant far beyond America’s shores and it was America’s special responsibility to bring those principles to the world. Like the ancient Hebrews, the American people had been entrusted with a providential message intended for the whole human race. By defeating fascism and communism in the twentieth century, by creating a society of mass middle-class abundance, and by moving from a past of discrimination and injustice to a new and better type of national existence, America was fulfilling its providential mission.

But what if none of that was true? What if American providential nationalism was an illusion grounded in the egotism of white supremacy? What if the whole inspiring idea was a kind of fever dream from which a disillusioned people was beginning to awake?

For a creedal nation, evidence that the national creed wasn’t true was an existential problem. If working-class Americans were falling behind the rest of the country with less opportunity to rise, and if Black Americans were also being left behind, and the legacies of slavery and Jim Crow were passing down to yet another generation, it was reasonable to ask whether the American Way really meant something anymore. And, if it didn’t, what did it mean to be an American?

Adding to questions about the nature of American identity and the health of the American project were the effects of a new Great Wave of migrants, ultimately much larger in absolute terms than the wave that arrived between 1880 and 1924. The 1965 Hart-Celler Act, which went into effect in 1968, raised the number of new immigrants, abolished the discriminatory national origins quota system, and encouraged migration from all over the world. While substantial numbers of immigrants, often coming from highly educated backgrounds in their countries of origin, found quick economic success in the United States, many more joined the reemerging working class in American cities and metropolitan areas. As in the year 1900, an urban working class that seemed alien and perhaps ideologically or socially dangerous to significant American subcultures was a growing presence in American life.

The successful incorporation of the Great Wave migrants, though, led to historical amnesia about how difficult and contentious the process was—or how patchy it was, as Asian and Latin American migrants faced significantly greater barriers to acceptance and integration.

Also missing from the historical memory was a recognition of the political consequences of the Great Wave: a rise of socialist and anarchist movements among immigrants facing the shock of migration contributed to the emergence of a nativist backlash and eventually the first Red Scare. Limited economic opportunities in the United States added to these tensions and to the recent migrants’ disillusionment. The extensive history of violent suppression, mob action, and irresistible growth of the anti-immigrant consensus that led to the slamming shut of the nation’s doors was all but forgotten.

The new Great Wave saw a return of many of the tensions and social and economic conflicts that characterized the first Great Wave. These were going to be disruptive and difficult in any case—immigration is beneficial but never stress free—but the combination of major demographic shifts with the economic and social changes made the post–Cold War era a far more tumultuous time than many in the elite had expected.

That the new immigrant wave came at a time when the shape and value of American identity was hotly contested made it more difficult for Americans to think about questions like assimilation. Early in the twentieth century, it seemed obvious to “old stock” Americans that it was the duty of public institutions to “Americanize” newcomers, to make them fit and eager participants in the mission that the United States was appointed to carry out. That point of view was more controversial a century later; if native-born Americans were ambivalent about the value of their inherited ideas and institutions, why should these values be foisted on newcomers?

The widening income and class disparities, the persistence and even widening of the racial gap, and the cultural and social stress arising from a new Great Wave gradually opened up a crisis of confidence in the national creed and, therefore, the American project as a whole. This could not have been a more unexpected and unwelcome development for an establishment that essentially took the health and unity of its country for granted. While many countries were experiencing political difficulties as the Information Revolution swept across the world, America’s unique role in the international system made the stability and success of the American project of vital interest to people everywhere—and meant that the consequences of political failure in the United States would be incalculably large.

The thirty years after the Cold War saw the United States travel from the smug sense of success and self-confidence that followed the Soviet collapse to an era of questioning, conflict, and doubt. The growing disorder at home came as the world situation also darkened. In 1990 the United States seemed effortlessly dominant; by 2020 it faced hostile opponents in a tumultuous world. It is essential for the American elite, like elites anywhere, to be able to explain events to the public at large in ways that maintain public confidence in, and support for, the country’s direction. An elite must be able to justify and legitimate its privileges while effectively making the case either for the justice of existing social arrangements or for the wisdom of, and prospects for, reform efforts currently under way. Unfortunately, even as the American situation became more difficult and the world situation became more challenging, the American leadership class—among Democrats and Republicans alike—was becoming less able to fulfill this vital and necessary role. Part of the reason for this was that the relationship between the establishment and the polity had changed drastically over the course of several decades.

After World War II the American elite transformed from a small group, predominantly from a relatively narrow stratum of long-established families based largely in the Northeast, to a much larger group of administrators, managers, and researchers. This new elite was demographically and geographically more diverse than the old establishment and in its early decades more representative of class distribution in the United States. Rapid upward mobility meant that many members of the new establishment had blue-collar backgrounds. For members of the “new elite,” understanding the outlooks and concerns of the working class was intuitive because they had come from it.

But the social and economic forces that were separating the classes in American life meant that the social distance between the upper middle class and the rest of the country expanded as income stratification and sharp divides in educational opportunities grew. And while many Americans outside the elite bubble struggled to achieve or maintain a middle-class living standard, and the industrial working class suffered a long and painful decline, the post-1990 era on the whole was a time of unprecedented and even glorious opportunity for America’s upper middle class.

There were exceptions. Journalists were hit almost as hard as steelworkers when the internet disrupted the media oligopolies and business models on which the profession rested. Academics, except for a dwindling minority who still enjoyed the advantages of limited teaching loads and lifetime tenure, faced an increasingly precarious future as an oversupply of aspirants flooded the market and colleges staffed more positions with poorly paid, insecure adjunct teachers.

But on the whole, the burgeoning American upper middle class of professionals like bankers, lawyers, doctors, tech engineers, and consultants enjoyed a boom as long lived and as intoxicating as the postwar prosperity for the country as a whole. For these Americans, the problems of the lower middle class looked like sad but necessary shifts to respond to the ultimately healthy forces of globalization. They were for the most part only dimly aware of the growing discontent and alienation that, in due course, would power political movements as diverse as the Tea Party, Occupy Wall Street, the 2016 Trump campaign, the Bernie Sanders movement, and Black Lives Matter. Living in class-segregated suburbs, attending hothouse colleges with students mostly from homes and families like their own, the American elite increasingly shared all the characteristics of an aristocracy—with the exception in too many cases of a sense of noblesse oblige. Feeling that their accomplishments were the result of “merit” and hard work, they felt entitled to the widening income gap that separated them from the slackers and losers who occupied the lower rungs on the American ladder.

Meanwhile, more and more Americans were questioning institutions and ideas that their predecessors had seen as essential building blocks of American life. Rising levels of education and affluence, combined with the way an advertisement-rich consumer society heightened the endemic individualism of American life, led many Americans to reject traditional social norms. Questions related to gender identity, sexual mores, and family structure had already appeared during the Cold War, but their impact on society grew dramatically in the 1990s and beyond. As previously unthinkable concepts like gay marriage gained widespread acceptance, and trans activists and others sought to push the shift in mores beyond the comfort zone of many, the impact on American politics went beyond the division of the country into factions in the “culture wars.”

If the western civilization in which America was rooted was not only irredeemably racist but entirely patriarchal, homophobic, and transphobic to boot, and if the Judeo-Christian morality at the foundation of traditional American values was little more than an assemblage of false and even harmful ideas, what exactly was the American creed? Was there enough of a “there” there to hold a political society together, or should Americans forge a new identity around a struggle to free themselves from the wreckage of the old one?

The widening gap between traditional values and contemporary practices caused political rifts in many countries. But because many Americans drew their sense of national identity from adherence to a common creed, the culture wars found across the western world divided Americans more profoundly, and potentially more dangerously, than they divided people in many other countries.

Belief in progress was getting harder to sustain, and that waning belief would contribute to another of the problems dogging the footsteps of those charged with the shaping of American foreign policy.

Steering the Apocalypse

In AD 1000, fears that the milestone date would mark the end of the world and the return of Christ are said to have swept across the medieval world. One thousand years later, as a new millennium approached, there was a similar round of foreboding, this time directed at the prospect that, due to programming oversights, older computer systems would be unable to read dates in the new century, leading to a massively catastrophic global computer crash. The first millennial panic was good for church institutions, which received rich gifts and bequests from those hoping to square their accounts. The second was a gold mine for computer consultants. The best available records suggest that neither the church nor the consultants returned any funds when the predicted apocalypse failed to appear. But if the Y2K bug[*] did not bring the world to a juddering halt, apocalyptic concerns would play a large and growing role after the end of the Cold War.

We’ve seen how the historicization of the eschaton, the movement of the concept of the end of history and even the end of the world from the realms of mythology and religion to the realms of politics and statecraft, has been shaping world history from the Enlightenment onward, and how central the idea that America has a unique role to play in this historical drama has been to the construction of American identity. We’ve also noted an unexpected consequence of historicization. The idea that the human condition could be fundamentally changed through social progress and political action was originally seen as a secularizing force in world affairs. If human beings could either create utopia or destroy life on earth through their own actions, why spend a lot of time thinking about God?

In practice, however, things worked out differently. As the question of the survival of human civilization and the human race entered the realm of politics, politics became more intense, more confrontational, and more infused with moral and religious values. The historicization of the eschaton leads to the mythologization of politics, as ordinary political debates transform into arguments about the construction of utopias or the avoidance of massive catastrophe.

The twentieth century marked a new stage in the history of humanity’s encounter with the idea of apocalypse. From the Enlightenment forward, people believed that the historical and political processes they saw unfolding around them were leading to a fundamental change in the conditions of human existence as significant as those that various religious traditions had long predicted. In the twentieth century, more and more people came to believe that this great change was no longer something that would appear at some distant point in the future. Apocalypse was happening around us.

For communists, the Russian Revolution was the moment when the world transitioned from a time of anticipation and preparation to the apocalyptic era in which the great climax of world history was happening before their eyes. Soviet ideology held that the Soviet Union was a kind of living apocalypse, the great embodiment of a revolutionary movement designed to transform the human race and to end war and injustice forever. To study the apocalypse, communists read the morning news.

The nuclear explosions at Hiroshima and Nagasaki meant that humanity as a whole now lived in what can fairly be called an age of apocalypse. The Cold War was a new kind of conflict. With both sides building enough nuclear weapons and intercontinental missiles to annihilate not only the adversary but the whole human race, the potential for political misjudgments to have apocalyptic consequences had never felt more real. But that was not all. Neither of the great parties to the Cold War was what could be called an ordinary great power.

Both of the antagonists saw themselves as the agent of rival programs for bringing history to a successful conclusion. The Soviets believed that Leninism was the magic formula that would end war and establish universal prosperity forever. The Americans believed that liberal capitalist democracy was the magic bullet. Each saw the other as a rival faith whose precepts would lead to destruction. The Cold War was a religious conflict between secular states, fought under the shadow of the apocalypse, over conflicting visions of the path to utopia.

The apocalyptic nature of the Cold War is one reason that so many Americans interpreted their victory as the final vindication of the American Way and the beginning of a new era of universal freedom and peace. The apparent end of the long nuclear nightmare of the Cold War combined with the disappearance of the most formidable antagonist to liberal society to make a heady brew, and the American establishment drank the cup to the dregs. Future generations who had not lived under the shadow of the nuclear standoff, when serious, well-informed people understood that nuclear annihilation could happen by accident—if, for example, Soviet or American radar systems confused a flock of geese with a sneak missile attack, triggering a retaliatory strike—cannot fully grasp how sweet the world suddenly seemed when that threat was removed.

However, the end of the Cold War did not end the age of apocalypse. If the specter of nuclear war faded in the 1990s, other visions of human suicide began to appear. The most prominent was the specter of climate change. The uncontrolled emission of carbon dioxide and other greenhouse gases through human industrial and agricultural activity threatened to unleash a climate cascade. Warming temperatures, an effect expected to be especially marked near the poles, would melt the vast polar icecaps, leading to rapid and disastrous flooding as sea levels rose and cities and coastal areas sank beneath the waves. Vast areas of the tropics would become too hot for human habitation or for agriculture. Rivers would dry up as glaciers melted, and hundreds of millions of people would be on the march from their devastated homelands in search of subsistence. Some foresaw what climate scientist James Hansen called a “Venus syndrome” as feedback loops led to temperature increases that made the planet uninhabitable.[15] Others foresaw wars of survival leading ultimately to a nuclear holocaust on an exhausted, dying planet.

Climate change was only one in a series of existential threats that captured the imagination of the twenty-first century. The terror attacks of 9/11 raised the specter of a nondeterrable nuclear threat. If nonstate actors ranging from terrorists to blackmailers managed to acquire nuclear weapons, they could detonate a bomb without there being a homeland or government against which the aggrieved party could retaliate. The almost simultaneous anthrax attacks of the fall of 2001 demonstrated another peril. The steady progress of biology raised the possibility that biological warfare, either by a state or by a terror organization, could wreak untold havoc on the world. The 2020 Covid-19 pandemic showed what even a relatively mild contagious disease could accomplish in an unprepared world. As the century unfolded, what new and artificial horrors could be engineered in untraceable labs?

The internet, which had first appeared to many as a harmless phenomenon promoting the free exchange of ideas and information, quickly took on a more sinister aspect. Cyberwar offered new scenarios of attacks that paralyzed a country’s electrical grid, opened the floodgates of its dams, or wiped out its financial system. The marriage of information technology and totalitarian social control seemed to be consummated in China, where an all-seeing state looked determined to use the power of electronic surveillance to impose the kind of control over its citizens that no king, no religious dictatorship of the past, had ever been able to acquire. Applied ruthlessly in Tibet and Xinjiang, the new instruments of total social control demonstrated that the fall of Soviet communism had not exhausted the ability of human society to produce monstrous tyrannies capable of the most horrifying crimes.

The nuclear specter, meanwhile, never really disappeared as nuclear proliferation continued to dominate both popular and elite concerns about American foreign policy. The North Korean nuclear program was one of the thorniest policy issues that presidents from Bill Clinton to Joe Biden had to face. Millions of Americans supported George W. Bush’s invasion of Iraq because they believed that Saddam Hussein had a nuclear program that might provide terrorists with nuclear weapons. The controversy over how best to deal with the Iranian nuclear program was one of the most bitter partisan issues in the debates of the Obama and Trump administrations. The intensification of great power competition made nuclear arms talks with Russia a major issue in American politics and diplomacy. And China’s rapidly growing nuclear arsenal suggested that the nuclear balance of terror was about to return to the center stage of both world politics and the human imagination. The apocalyptic specter could not be wished away.


During the Cold War, the task of American foreign policy had been threefold: to defeat the Soviet Union, to avoid a catastrophic nuclear war, and, among the nations of what in those times was called the “Free World” of noncommunist countries, to further the construction of the peaceful international order which most American liberals believed would usher in an age of universal peace and abundance.

After the Cold War, American foreign policy makers initially thought that their lives had become much easier. They could concentrate on peaceful order building in a world without opponents. Yet the task was still herculean; the Soviet heresy had fallen by the wayside, and now the American faith would have to deliver. The alternative futures for humanity were as stark as ever. On one side a utopian future of freedom, abundance, and peace beckoned invitingly. On the other, a dystopian world of eco-catastrophe, conflict, and, quite possibly, the extinction of the human race sulphurously loomed. Was the human race headed toward a “hot” apocalypse of eco-catastrophe, nuclear holocaust, or some other hideous culmination, or were we on track toward a “cool” apocalypse of gradual progress toward a world of freedom and peace? Policymakers and national leaders had a new task in the twenty-first century: managing apocalypse.

The idea that America was on the path that would lead, not only locally but ultimately globally, toward a gloriously cool apocalypse had been central to American thought and identity since the eighteenth century. This view of the future corresponds both to the theological position in Christianity known as “postmillennialism” and to the secular, liberal ideas associated with the so-called Whig Narrative of peaceful and gradual progress. It was a fundamental element of the American creed that emerged in the twentieth century, the creed around which post–Great Wave America was able to unite.

The successful conclusion of the Cold War seemed to confirm that this vision of history was correct, as was the providential nation thesis that gave the United States a starring role in the historical drama. In sixty years, the United States had played a critical role in the destruction of fascism, ended generations of bitter class conflict in the West by building an affluent middle-class society, and defeated a nuclear superpower adversary through the power of its ideas and capacity for innovation—while also laying the foundations for unprecedented waves of economic prosperity and democratic governance across much of the world. The dragon was dead and the princess was saved; that is how many Americans interpreted the fall of the Soviet Union. Elites noted that it was their wise leadership that had brought about this grand result and faced the future with renewed confidence in their own capabilities, and with an assumption that these dramatic successes would teach the rest of the country to follow their lead.

But American culture is complicated, and the expectation of a hot apocalypse also has a place in it. In Christian theology, the belief that the apocalypse will be a hot one—that divine wrath will destroy the world before divine mercy rebuilds it on a new and better foundation—is known as “premillennialism.” In secular politics, this view is associated with those who believe the status quo is so unjust or dysfunctional that only a great and cleansing catastrophe can bring about the kind of change that people need. In the twentieth century, Bolsheviks were Marxists who believed in a hot, revolutionary road to the utopian workers’ state while more moderate socialists hoped for a cooler, gentler road to the workers’ utopia through gradual reform. Belief in a hot apocalypse has often been strongest among the marginalized, the dispossessed, and the alienated.

The twentieth century, with its string of historical catastrophes stretching from World War I through the rise of communism and fascism, World War II, the Holocaust, Hiroshima, the upheavals associated with both the rise and the fall of European colonialism, and the terrifying nuclear standoff of the Cold War, saw a global surge in fears that a hot apocalypse was at hand. The establishment of an apparently stable and prosperous democratic order in the advanced industrial democracies after World War II calmed these concerns for many people, supporting faith in a cool apocalypse, and of course the peaceful end of the Cold War greatly strengthened that faith.

However, to an extent that few realized at the time, this confidence was fragile. As the stable post–World War II social order showed signs of decay, as the global political scene darkened, and as fears of climate change grew, hot apocalypse scenarios seemed more compelling to more people. People animated by a strong belief that the technocracy and the political establishment were failing, and that under their leadership the world was slouching closer to some kind of existential crisis, are not reliable supporters of centrist, moderate political movements. The sense in much of the Islamic world that its institutions and leaders were failing and that hostile powers opposed to Islam were in charge of the world provided a powerful boost to apocalyptic cults and radical Islamist movements. Populist movements in Europe drew strength from the growing sense that the EU was unable to cope with problems ranging from economic growth to mass migration from the south and east. And in the United States itself, growing dissatisfaction with economic and social conditions, along with disenchantment with a globalist foreign policy that was costing more and providing less than many Americans felt they had been promised, helped stimulate the rise of anti-establishment populism among both Democrats and Republicans.

From visions of apocalypse to thoughts of the Middle East is always a short road in Abrahamic cultures. In much of the Islamic world, the seemingly inexorable rise of Israel and the progressive marginalization of Palestinians contributed to the sense of civilizational and religious crisis out of which radical Islamist movements grew. In much of Europe and for many Americans, what they saw as Israel’s defiance of international law seemed to pose a serious threat to the establishment of the liberal, rules-based order on which they placed their hopes for a quiet, cool, and liberal apocalypse. And among many other Americans, attacks against the United States by radical jihadi terrorists animated by a hatred of Israel and of America as Israel’s ally appeared to confirm their intuitions that the biblically prophesied hot apocalypse was well on its way.

More than ever before, America’s Israel policy after the Cold War would struggle to reconcile the convictions of differing American political movements about the national interest in the Middle East with the consequences of passionate internal debates about the nature of American identity, the role of the United States in the world, and the state of the human condition. From the relatively sunny days of the 1990s, when the Arab-Israeli peace process dominated American policymaking, through the consequences and aftermath of the George W. Bush administration’s “war on terror” in the region, to the messy efforts of both the Obama and Trump administrations to reduce American commitments in the Middle East as the country turned its attention to the Indo-Pacific, America’s Israel policy would continue to occupy a uniquely important symbolic role in American politics.

The “Middle East peace process,” as the tangle of diplomatic activity around the Israeli-Palestinian conflict is frequently called, was the most sustained and most expensive single diplomatic effort in the history of the United States. In the 1990s, advocates of the peace process saw Middle East peace as a critical task for the new world order. Previous generations of Americans saw the United Nations as the only way to prevent a nuclear war and Palestine’s partition as the first test of the new peace; their successors saw Israel-Palestine negotiations as the most visible sign that peaceful diplomacy could stop global violence, spread democratic values, and keep the apocalypse cool.


* Y2K = Year Two Thousand
