One need not be a devotee of astrology or sunspot theories to recognize that certain rhythms seem to exist in our history. Many different types of business cycles have been traced over the years. During the nineteenth century, for example, major economic slumps occurred with disturbing regularity: 1819, 1837, 1857, 1873, 1893. Another economic collapse was beginning “on schedule” in 1914, when the outbreak of war in Europe revived the American economy.
Less universally accepted, but hard to deny when our history is viewed in its entirety, are periodic swings of mood, opinion, and values among the American people. An oscillation between the dominance of reform sentiment and standpattism has taken place throughout American history, with each wave of reform and each trough of reaction usually lasting between ten and twenty years. The opposing moods have been called by many names: Jeffersonian and Hamiltonian, democratic and aristocratic, liberal and conservative. The terms inevitably lead to confusion, because their meanings are so imprecise that the opposing names are sometimes used to describe the same policy. Jefferson’s famous declaration in his first inaugural address, “We are all republicans—we are all federalists,” hinted at the ambiguity. In a speech during his 1928 presidential campaign, Herbert Hoover restated the point: “We are a nation of progressives; we differ as to what is the road to progress.” Precisely. But the differences over the meaning of progress and the best road to reach it are often great.
The fundamental difference was well stated by Jefferson. Some men, he declared, “fear the people, and wish to transfer all power to the higher classes of society.” Others “consider the people as the safest depository of power in the last resort.” Despite the frequent confusion over means, the goals of the former group are conveniently labeled conservative; those of the latter, who have been more concerned with the wrongs done to people than with the rights of property, are generally termed liberal. Given this admittedly loose distinction, American history in its broadest outline can be seen as consisting of periods of liberal reform punctuated by conservative breathing spaces. The peaks of reform were reached during the American Revolution, the ages of Jefferson and Jackson, the crusade against slavery, the Progressive era, the New Deal, and Lyndon Johnson’s Great Society. The motive forces in the decline of reform eras seem to have been principally three: First, people eventually grow weary of social problems. They choose instead to withdraw into more personal interests. This sentiment was best summed up by the slogan Republicans used in the 1946 congressional campaign and several times since: “Had enough?” When a majority is sufficiently tired of the activism of such leaders as Jefferson, Lincoln, the Roosevelts, Wilson, Kennedy, or Johnson to answer “yes,” a period of reaction sets in. Such notable ages in American history have been the Federalist era, the Era of Good Feelings, the 1840s and early 1850s, the Gilded Age, the 1920s, the 1950s, and the 1980s.
The second main factor in bringing a period of liberalism to a close is that the pace of change in our society outdistances a generation of reformers. Their ideas become dated. Much of their program has been put into effect, yet old problems remain and new ones arise. Some of those who seek change in their youth come to favor the status quo as they grow older.
A third reason for the decline of reform eras is that political leaders who start out committed to principles are likely, in time, to come to prefer, unlike Henry Clay, being president (or congressman) to being right. Thus their zeal for the cause diminishes, and with it the reform era itself cools.
New reform eras arise for reasons similar to those that lead to the demise of their predecessors: a rejection of the dominant mood and a generational change. Just as people tire of calls for self-sacrifice, they eventually become upset with themselves after a period of hedonism and disregard for others. For some, at least, a time of reform offers an opportunity for personal redemption as well as for social change. If the public ultimately grows weary of activist leaders, it finally becomes bored or disgusted with the Adamses, Fillmores, Coolidges, Eisenhowers, and Reagans, too. Zealots of the right come to find office-holding as comfortable as do their liberal counterparts. And as a new generation arises, the old battles are history, and new configurations of problems bring forth new attempts at solutions.
The fact that at least some new solutions are sought in each wave of reform indicates that the often-used metaphor of a swinging pendulum is not quite suitable for describing changes in America’s mood. No conservative era succeeds in wiping out all the gains of the preceding period of reform. The Republicans in the Eisenhower years were unable (and in many cases unwilling) to repeal the New Deal. The Reagan Republicans appeared for a time to be on the verge of dismantling the Great Society, if not the New Deal, but the public has proved unwilling to follow its reactionary leaders that far. Each liberal age is able to build upon some accomplishments of those that came before it. Some positions of the Reagan administration that are today considered extremely conservative would have been denounced as liberal in the 1920s.
The limits of the pulsations of national opinion here under discussion must be made clear. They do not, of course, involve everyone. Nearly 17 million Americans voted against Franklin Roosevelt in 1936; more than 27 million voted for Barry Goldwater in 1964. We are speaking of shifts between about 40 and 60 percent of the electorate, which seem to be the lower and upper limits of support for one viewpoint or the other. Nor are the shifts of mood usually as dramatic as we sometimes make them seem. The continued strength of some items on the progressive agenda during the 1920s, the “white backlash” of the 1960s, and the surprising popularity in opinion polls of many social programs in the 1980s are cases in point. It should also be plain that this is no simple argument about “history repeating itself.” There has been a substantial variation in the length and intensity of periods of reform and reaction. All sorts of external factors ranging from economic conditions to wars to individual leadership can affect the timing and strength of reform waves. None of these qualifications, however, detracts from the usefulness of the concept of oscillations of public sentiment to an understanding of the American past. Those who remain skeptical of the idea may be persuaded by noting the startling accuracy of the following prediction Arthur Schlesinger, Sr., made in 1949: “We may expect the recession from liberalism which began in 1947 to last till 1962, with a possible margin of a year or two in one direction or the other. The next conservative epoch will then be due around 1978.” And, it might now be added, liberalism can be expected to be in the ascendancy again in the 1990s, if not before.1
I have taken the time to go into this discussion of tides of public sentiment in some detail because the concept is vitally important to an understanding of Depression America. It is notable that the cycles of opinion appear to be unrelated to those of the economy. The two worst depressions in American history prior to the 1930s occurred in the long conservative period of the late nineteenth century. Populism was already strong when the Panic of 1893 hit, but it remained an overwhelmingly agrarian movement and did not become nationally dominant during the nineties. When reform did win majority support after the turn of the century, the United States was again enjoying prosperity. Prosperity gives middle-class reformers the security they feel is necessary to undertake change. The last great liberal uprising we have experienced, that of the 1960s, also took place in a period of great prosperity for middle-class Americans. Clearly hard times are not a prerequisite for reform. Indeed, only one of the major reform eras in American history occurred during a depression and, to state the same point differently, all of the major economic collapses save one happened during conservative eras. The one exception is the Great Depression of the 1930s, our worst depression, during which the New Deal, our most significant era of liberalism, took place. The so-far unique coincidence of the nation’s economic and values cycles in the 1930s is a little-noted but critical fact about the history of Depression America.
Superimposed upon the economic and mood shifts are basic differences in class values. These are far from absolute, but as a rule of thumb, working-class people have been more likely to hold values centered on cooperation, sharing, equity, fairness, and justice than have their affluent countrymen. The latter have, throughout our history, been more likely to defend the marketplace as the sole determinant of the distribution of the economy’s fruits. The poor—whether farmers or industrial workers—have generally been less willing worshipers at the shrine of Adam Smith. So long as the marketplace economy provided reasonable opportunities, they were likely to accept it. But during hard times or when maldistribution was especially evident, working-class people have tended to call for community or government action to supplement—or counteract—the marketplace. In short, they have believed that morality ought to have a role in the workings of the economy.
The basic reason for this difference is not hard to find. The self-interest of have-nots is better served by a more equitable distribution, whereas the self-interest of the wealthy is obviously better served by keeping things as they are. This in no sense makes the two positions ethically equivalent. The self-interest of the poor coincides with justice, that of the rich with injustice. The significance of this for the Great Depression is that it, like previous depressions, led the lower classes to demand government action to help them. Workers and farmers had made similar protests in the 1870s and 1890s. The key differences in the 1930s were that the country was “due” for a swing to more humanitarian values and that the depression was so much deeper, wider, and longer than previous slumps that a far larger segment of the middle class was directly affected and hence came to identify its interests with those of the poor. The combination of all these ingredients made the 1930s the time in which the values of compassion, sharing, and social justice became the most dominant that they have ever been in American history.
Each periodic depression that plagued the United States from the early nineteenth century through the 1930s was worse than the one before it. Many reasons for this could be cited, but one stands out as by far the most important. As the United States became less agrarian and more industrial, less rural and more urban, an ever-increasing percentage of its population became susceptible to the vagaries of the market economy. This is not to say that farmers were not victims of economic collapse—much of the history of the late nineteenth century and of the 1920s and 1930s makes plain that they were—but that people who owned farms and were not deeply in debt could at least feed their families during hard times. Stated simply, more and more people became dependent as the nation industrialized. Urban working-class people who rented their living quarters and whose income was derived entirely from wages found themselves in desperate straits when they lost their jobs and could find no other employment.
The Panic of 1893 was America’s worst depression in the nineteenth century. Real income of Americans is estimated to have dropped some 18 percent between 1892 and 1894. That depression led to massive unemployment, which in turn fueled protest. Workers were angry and ready to undertake cooperative and, in some cases, radical action. These facts alarmed middle-class Americans, some of whom believed revolution might be imminent, particularly since the farm belts of the nation were seething with populism. Under the circumstances, middle America chose to cast its lot with business and dig in to defend economic orthodoxy. No quarrel on this would come from President Cleveland, who, in vetoing the Texas Seed bill of 1887, which would have provided a scant $10,000 for relief of farmers hard-hit by drought, uttered the immortal words, “though the people support the Government the Government should not support the people.” Cleveland, like most others in power in the 1890s, believed that depressions were natural events, part of the working of the business cycle, and therefore the government could do little about them. His formula for recovery (which was continued in all its essentials by his Republican successor, William McKinley) was to maintain sound money, preserve the sanctity of contracts, and cut federal spending. It has an uncannily familiar ring.
If the effects of the Panic of 1893 on the economy, workers, and business were dramatic, the depression’s impact on American politics was equally spectacular. Although only one Democrat (Cleveland) had been elected president between 1860 and 1892, the Republicans had failed to establish themselves as a clear-cut majority party. Democrats won decisively in the House elections in 1890 and 1892, and Cleveland regained the White House in the latter year. This was the first time since Reconstruction that either party had held the White House and both houses of Congress simultaneously. But what seemed a happy occasion for the Democrats soon proved to be quite otherwise. It meant that their party was fully “in charge” when the panic struck. They were blamed, as the Republicans would be four decades later, for the economic collapse and had to pay the price at the polls. The elections of 1894 and 1896 saw what has been called “one of the greatest bloodless political realignments that this country has ever experienced.” The shift of House seats from Democrats to Republicans in 1894 was the largest in modern history, exceeding even the transfer in the other direction in 1932. The political results of the Panic of 1893 clearly foreshadowed those of the Great Depression: the party in power was devastated and the opposition party became dominant for more than a generation. Woodrow Wilson’s two victories under unusual circumstances notwithstanding, the Republicans remained the majority party from 1894 until a new economic collapse turned the tables on them in the early 1930s. For an earlier generation, Grover Cleveland and the Democrats represented what Herbert Hoover and the Republicans have stood for to so many Americans in the last half century. “I was a child of four during the Panic of ’93,” Walter Lippmann wrote two decades later, “and Cleveland has always been a sinister figure to me. His name was uttered with monstrous dread in the household … And to this day I find myself with a subtle prejudice against Democrats that goes deeper than what we call political conviction.”2
Preindustrial America was, in historian Robert Wiebe’s apt phrase, a nation of “island communities.” What was dominant in the lives of people of all classes was the local community. Each community was to a large extent self-contained, its significant contacts with the rest of the nation infrequent. Given the large number of limited stages on which Americans performed, there was an abundance of leading roles. Each town had its local elite—its merchants, bankers, clergymen, attorneys, physicians, editors, and so forth—to whom the community looked for leadership. These elites, of course, enjoyed their status. But the existence of tight-knit communities with a shared set of values was also beneficial to other groups. Several social and labor historians have shown in recent years that nineteenth-century workers were often able to use shared values to gain community support for strikes and fair treatment.
The rapid industrialization of the United States during the late nineteenth century upset this world. The development of national systems of transportation and communications, of highly concentrated industrial and financial systems, and of mass circulation newspapers and magazines, along with the explosion of urban populations, submerged the independence of the nation’s separate local communities. In the useful concept developed by nineteenth-century German sociologist Ferdinand Tönnies, Gemeinschaft—the small, personal community of the nineteenth century—was replaced by Gesellschaft, a large, anonymous society in which the importance of one’s place and achievements in the local context nearly dissolved.
The Progressive era in the broadest sense was an attempt to regain the old values or construct new ones. Progressivism may have meant many different things, but what tied it together was that all its manifestations were responses to a common stimulus: the myriad effects of industrialization on traditional American values. Because of the vast changes, even true conservatives found it necessary to become reformers. The only way to preserve at least some of the old world in the face of the upheaval being wrought by industrialization was to try to make social changes that would compensate for some of the economic changes. Paradoxically, by the beginning of the twentieth century, it was necessary for a sincere conservative (that is, someone genuinely interested in preserving the old values, not simply an apologist for concentrated wealth) to become a reformer, a “progressive.” Such people saw “certain radical but strictly limited reforms as the only way to salvage the system.”
For the middle-class reformer, alterations might be needed, but they could not be undertaken safely during the economic crisis of the 1890s. By the turn of the century, though, both prosperity and optimism had returned. The United States had recovered from its worst depression to that time, had won a “splendid little war,” the Spanish-American War, that most believed was fought for the highest, most unselfish objectives, and was emerging as a world power. Under such conditions, it was possible for the “better sort” to attempt to guide the discontent that had been so evident among farmers and workers in the nineties into acceptable channels. Middle America could rediscover poverty, as it seems to do once a generation. Thus, while progressivism had important working-class, farmer, and corporate components, its dominant form represented a “cautious uprising of the better classes.” There can be little doubt that most of these people sincerely wanted the “uplift” of the lower classes and a better, more honest, just, and moral society. They also wanted the beneficiaries to know to whom they should be thankful. Improvements were not to be won by the struggles of the lower classes, but to be bestowed as gifts from the moral, disinterested people above them. Much of what passed for progressivism was pushed by people who saw themselves as altruistic—they wanted to do good for others, not themselves. As Richard Hofstadter pointed out, populism had been a reform dictated by empty stomachs, while progressivism was a reform movement guided by the mind and heart.3
World War I not only brought the culmination of the Progressive era and intensified the reaction that followed; it also provided, in several important ways, the soil in which the seeds of the Great Depression and the New Deal would germinate.
When the basically pacifist Woodrow Wilson finally took his country into the war in April 1917, he justified it, as he had to in order to convince himself and the American people, in the moralistic rhetoric of progressivism. “There is not a single selfish element, so far as I can see,” Wilson said, “in the cause we are fighting for.” Unlike other belligerents, Wilson insisted, the United States fought for altruistic purposes.
In addition to demonstrating to himself and a majority of his countrymen that it was “right” for the United States to enter the war, Wilson’s use of the language of idealistic self-sacrifice as a rationale for participation had at least two other notable effects. It tied the war so tightly to progressive values that, if people became disillusioned with the war, they would also lose their faith in reform. Wilson’s rhetoric also made the war seem an unprecedented example of national self-sacrifice and hence it served finally to release the combined guilt and moral energy that had been at the base of much of middle-class progressivism. Having at last paid their dues and made their sacrifices, Americans after the war could turn with somewhat clear consciences to personal concerns. The hedonistic, greedy, self-centered norms of the twenties, which played no small part in the coming of the Depression, were at least intensified by the wrapping of the war in the mantle of progressive moralism.
The war also created an international economic situation that helped bring on the Great Depression. Following Versailles there was an “unquenchable jingoism” throughout the industrial world. The reparations and war debts questions fueled dangerous international tension and weakened both the domestic economies of some European nations and the international economic structure. European countries became debtors to, rather than creditors of, the United States. Among other factors, this meant that these nations were no longer in a position to provide a good market for the surplus farm production of the United States.
The contraction of Europe’s ability to buy American agricultural products aggravated another fundamental economic problem produced by the war. Wartime demand for food was great and the normal supply in Europe had been cut sharply. Both market forces and government policies led during the war to a vast increase in America’s farm output. When wartime Food Administrator Herbert Hoover allowed the price of wheat to be set at $2.20 a bushel, farmers increased wheat acreage by nearly 40 percent and output by almost 50 percent. This in turn meant chronic “overproduction” in the 1920s, a massive agricultural depression during America’s “prosperity decade,” and a further dislocation in the world economic structure.
The First World War had other results with a direct bearing on the Great Depression. It represented what historian Henry May has called “the end of American innocence.” The great optimism of the progressive years, as well as most of what was left of the old familiar values, was shattered by the war and the disillusionment of Versailles and the struggle over the League of Nations. Unmistakable clouds now hung over the previously sunny American horizon. “Those of you who did not live in that period before 1914, or are not old enough to remember it,” clergyman John Haynes Holmes reminisced years later, “cannot imagine the security we enjoyed and the serenity we felt in that old world.” Like most old worlds when they are gone, prewar America took on more of a glow than reality warranted, but the feeling of loss was a significant part of the mood of the twenties and the motive for the desire to return to “normalcy.” The word itself, coined accidentally by Warren Harding during his 1920 campaign for the presidency, symbolized perfectly what the nation sought after the war: a return to a golden age that, like Harding’s word, only resembled what had existed before.
To carry on the war effort, the United States government had embarked, for the first time, on serious social and economic planning. Indeed, it has been persuasively argued that the models for many programs and practices of the New Deal are to be found in wartime Washington rather than in the proposals of progressive reformers.
John Dewey, for one, took note in 1918 of “the social possibilities of war.” The American economy was controlled in a completely unprecedented fashion during World War I. The War Industries Board, when placed under the leadership of Wall Street speculator Bernard Baruch early in 1918, held extensive powers over business during the war. The WIB tightly regulated and managed the economy, achieving remarkable results that many observers believed were proof of the efficacy of a planned economy.
Other precedents for New Deal actions are readily discernible in wartime practices. The National War Labor Board mediated labor conflicts. Workers were guaranteed the right to bargain collectively. Union membership increased by more than 40 percent between 1915 and the Armistice, and real earnings for laborers increased a modest 4 percent during the same period.
Major wars are sufficiently grave that orthodox thinkers are willing to suspend their dogma for the duration. Most American leaders who would not consider having an unbalanced budget in peacetime saw no alternative during the war. About half of the more than $32 billion cost of the war was met by borrowing. The rest was paid for through sharply increased taxes, including levies on corporate profits and a sharply graduated income tax (the latter made possible by the ratification of the Sixteenth Amendment in 1913). All of these wartime precedents were rummaged through for useful ideas when a different but equally momentous crisis had to be faced more than a decade later.4
The fanatical patriotism of World War I left a legacy that would play an important part in the decade ahead. The insistence on “100 percent Americanism” helped turn many people against anything with a hint of “internationalism” about it. This included Communists, Catholics, and Jews, all of whom became prime targets of the new Ku Klux Klan. The Klan of the 1920s represented a carry-over of the intolerance and reaction against progressivism, the war, and internationalism that had become so popular in 1919–20. It cannot be blamed for creating these feelings. Indeed it was, like so much else in the twenties, a business dictated by the marketplace. Its organizers were interested in nothing so much as making money through the sale of memberships. It was organized rather like a modern fast-food franchise, but in this case the product was not hamburgers but hatred. Accordingly, the Klan had to be willing to be intolerant of anyone the potential consumers did not like. Basically it was: “You don’t like Group X? Well we don’t either. Join us.” This meant that while the KKK certainly exacerbated intergroup tensions in the twenties, it did not manufacture them. That a sizable part of the nation—principally in rural and small-town regions—continued for so long the passionate intolerance of the postwar period is closely related to another of the prominent realities of the twenties: the decline of rural America and its values.
The war itself was the last and greatest example of the progressive spirit of sacrifice, a mood that dissolved rapidly once the fighting ended. But the progressive-wartime state of mind carried through one last sacrifice before it expired: the Eighteenth Amendment. Although the fact would be forgotten quickly in the twenties, prohibition enjoyed solid majority support when the amendment sailed through state legislatures to ratification in January 1919. “Prohibition,” The New York Times accurately observed, “seems to be the fashion, just as drinking once was.”5
By definition, fashions change. Few have changed more rapidly than this one. The shift of opinion on drinking was one of the more dramatic examples of how, by the fall of 1920, much of America had tired of the whole idea of self-sacrifice. The temper of the twenties was to be vastly different from what had gone before.
Whenever the decade of the 1920s is mentioned, certain images come automatically to mind. Few periods are so clearly etched in the popular mind. It is difficult to think of the time as anything but the Roaring Twenties, the years of flappers, the Charleston, bathtub gin, petting parties, and the Stutz Bearcat. These were the days when America withdrew from the world and went into an orgy of self-indulgence. The decade has had more titles applied to it than any other similar time span in our history: the Jazz Age, the Prosperity Decade, the New Era, the Lost Generation, the Incredible Era, the Era of Excess, the Era of Wonderful Nonsense, the Dollar Decade, the Ballyhoo Years, and the Dry Decade.
Historians have been harsh in their views of the twenties. Typical are the comments of Henry Steele Commager and Richard B. Morris, editors of the generally authoritative New American Nation Series, in their introduction to the series’ volume on the 1920s. “We think of the second and fourth decades of our century in positive terms, but of the third largely in negative,” they rightly say. “The mark of failure is heavy on these years.” “Not since the fateful decade of the 1850’s,” Commager and Morris devastatingly conclude, “had there been so egregious a failure of leadership in American politics.” The reason for examining the decade, these noted historians suggest, is that “physicians study sickness rather than health.”
The decade, though, was far more complex than the catchy names and flickering images suggest. It was a time of enormous contrasts. The staid, proper Puritan Calvin Coolidge was immensely popular in the age of the speakeasy and the rumble seat. It was a time of political conservatism and moral latitude, of great prosperity and grinding poverty.
The complexity has been more difficult to perceive with the twenties than with other periods because that decade’s images are so sharp in the public mind. In view of what came after it, the Jazz Age took on the appearance of a carefree time. For some, this clouded hindsight with nostalgia; for others, it indicated a time in which the American people, and particularly their leaders, were treading water when they should have been swimming vigorously. Combining these images but emphasizing the latter, Malcolm Cowley remembered the twenties as “an easy, quick, adventurous age, good to be young in; and yet coming out of it one felt a sense of relief as on coming out of a room too full of talk and people into the sunlight of the winter streets.”
For others who were young in the twenties, it has remained a golden age of freedom and opportunity, before the New Deal began to “ruin” the country. Most of us tend to have fond but distorted memories of the period of our youth. A generation now approaching middle age looks back on the 1950s as “Happy Days.” Ronald Reagan was twelve years old when Calvin Coolidge succeeded to the presidency. Reagan’s admiration for Coolidge remained undiminished nearly sixty years later when, in one of his first acts as president, Reagan ordered a portrait of Thomas Jefferson in the East Wing of the White House removed to make way for one of Coolidge. Four months after his own inauguration, President Reagan made it clear that Coolidge’s were the presidential shoes he most hoped to fill. Reagan publicly praised his tight-lipped predecessor for cutting “taxes four times.” During the Coolidge years, Reagan continued, “we had probably the greatest growth in prosperity that we’ve ever known.”6
That President Reagan was overlooking what happened to “Coolidge Prosperity” less than eight months after Coolidge left office is plain. But the point here is that what Reagan—and, presumably, many if not most other Americans—remembers about the twenties is also misleading. The prosperity of the decade was real enough, yet it was anything but evenly spread.
With the timely passing of Warren Harding in 1923, Calvin Coolidge came to power, or more accurately, power came to Calvin Coolidge. He used little of it, though. Irving Stone put it well: Coolidge “aspired to be the least President the country had ever had; he attained his desire.” As long as the nation prospered, few could argue with the Coolidge approach. Indeed, most people had no quarrel with it. “Coolidge’s signal fault,” biographer Donald McCoy has shrewdly observed, “was his pursuit of the policies approved by most Americans.” There may be some questions about why Coolidge was so reassuring to the American public, but there is no doubt that he was.
The decline of Coolidge’s reputation was a result of the Great Depression that followed his nearly six-year presidency. His refusal to worry about the problems of the future and his complete deference to business were perceived, with the benefit of hindsight, as causes of the economic collapse. “If you see ten troubles coming down the road,” Coolidge had said, “you can be sure that nine will run into the ditch before they reach you.” This attitude epitomized the twenties; the consequences it helped to produce were grim. In his obituary for Coolidge, H. L. Mencken guessed that the former President “would have responded to bad times precisely as he responded to good ones—that is, by pulling down the blinds, stretching his legs upon his desk, and snoozing away the lazy afternoons.” Mencken summed up the Coolidge years by saying: “There were no thrills while he reigned, but neither were there any headaches. He had no ideas, and he was not a nuisance.” Although President Coolidge’s policies were hardly the cause of the Depression, they made their contribution, and it would be difficult to disagree with Peter Levin’s conclusion that “Coolidge, the success of his times, is a failure in history.”7
In his veneration of businessmen, Calvin Coolidge was wholly representative of his times. Business leaders emerged from the war with greatly enhanced reputations. They had “lost some of the tarnish of the muckraking era” because of the excellent performance of American industry in war production. It was ironic, though, that the success of war production came under strict government control. That success was, nevertheless, used as an argument that the economy should be governed by the marketplace and the wisdom of independent businessmen.
The twenties cannot be comprehended without understanding that after two decades of reform sentiment, businessmen were once again in the saddle. But they were willing to be more flexible in the methods they used in dealing with workers, consumers, the general public, each other, and government. This attitude was reflected in a new vocabulary current among corporate leaders in the twenties. Such terms as “trusteeship,” “service,” “harmony,” “social responsibility,” “industrial statesman,” “enlightened entrepreneur,” and “industrial democracy” were heard frequently. “Welfare capitalism” was the order of the day.
One of the primary aims of welfare capitalism was to keep the workers happy. Contented workers were better workers and, if they saw the paternalism of the company as the source of their well-being, they could be kept subservient. The critical point about the welfare capitalism programs is that they were given by employers to “their” workers. Enlightened corporate leaders provided their employees with recreational facilities, athletic teams, company picnics, reading rooms, housing, insurance plans, opportunities to purchase company stock, and other fringe benefits. Since one of management’s major problems was dealing with still unacculturated immigrants, many employers included “Americanization” programs (English and citizenship classes) among the benefits they offered employees.
The experiment in corporate welfare collapsed early in the Depression, but it served to confirm among workers the idea that economic relations ought to be based on justice. When the Depression made it clear that business was not so benevolent, workers would seek their justice elsewhere. In the meantime, though, welfare capitalism seemed to be making great strides. It made even as unlikely a convert as former muckraker and later Communist Lincoln Steffens. “Big business in America,” Steffens wrote after the 1928 election, “is producing what the Socialists hold up as their goal: food, shelter, and clothing for all. You will see it during the Hoover administration.”
Business was proclaiming a New Era of eternal prosperity and people were believing it. It was wonderful while it lasted, but he who prospers by claiming responsibility for good times is likely to be wiped out in hard times. Business was credited with everything good in the twenties; it was blamed for everything bad in the thirties. As deflated as the economy was in the Depression, it was not so deflated as the reputations of businessmen. They were, journalist Elmer Davis wrote in 1933, “about as thoroughly discredited as any set of false prophets in history, and most of them know it.” Herbert Hoover may have said it best: “The only trouble with capitalism is capitalists. They’re too damned greedy.”8
From the end of the small depression of 1920–22 to the crash of October 1929, relatively few Americans would have dared to say such nasty things about capitalists. These, it seemed, were the biblical “seven years of great plenty.” What soon came to be known as Coolidge Prosperity was the greatest economic boom the world had yet seen. It was a time in which, economics writer Stuart Chase noted, the businessman had replaced “the statesman, the priest, the philosopher, as the creator of standards of ethics and behavior.” The businessman in the Coolidge years was “the final authority on the conduct of American society,” “the dictator of our destinies.”
The priest was not merely replaced; the very Object of his worship was converted into a businessman. In the number-one best-seller of 1925 and 1926, The Man Nobody Knows, advertising executive Bruce Barton related how Jesus had “picked up twelve men from the bottom ranks of business and forged them into an organization that conquered the world.” Christ was “the founder of modern business.” “The parable of the Good Samaritan,” Barton pointed out to his readers, “is the greatest advertisement of all time.” The worship of business verged on the literal. “The man who builds a factory,” President Coolidge intoned, presumably with his usual straight face, “builds a temple, the man who works there worships there.” And on it went. Moses was said to have “appointed himself ad-writer for Deity.”
It is not surprising that professional advertisers most often brought together business and religion. Those who wanted to market a product had, of course, to persuade consumers that it would do them some good. Advertisers planted and then harvested fears. Ads informed women that “health authorities … tell us that disease germs are everywhere.” Fortunately, Lysol was available to protect the children. Another ad played upon the fear of mothers that they might not succeed. A “famous scientist” said: “Tell me what your children eat and I will show you the kind of men and women they will be!” Grape-Nuts were the solution. ’Twas admen that taught consumers’ hearts to fear, and admen their fears relieved.
The growing power of advertising in the 1920s reflected more than the resurgence of business dominance. It was a sign of a fundamental change in the economy. The nation’s industries had seemingly conquered the age-old problems of production. John Stuart Mill had expressed the nineteenth-century economic view when he said “if we can only produce enough, consumption will take care of itself.” He was wrong. The key to the economy was now consumption. To stimulate sufficient consumption, advertising was essential. “We grew up founding our dreams on the infinite promises of American advertising,” Zelda Fitzgerald later recalled. “I still believe that one can learn to play the piano by mail and that mud will give you a perfect complexion.” Old habits of thrift and sacrifice, deeply ingrained in early stages of industrialism, now were to be altered, if not reversed, and advertisers would be the instructors in the new ways. All the virtues of the age of scarcity were now questioned. Even before the war some advanced thinkers were spelling out the new values that would win far more converts in the twenties. “The non-saver is now a higher type than the saver,” economist Simon Patten wrote in 1907. “I tell my students to spend all they have and borrow more and spend that. It is foolish for persons to scrimp and save.” Advertisers spread this new consumer ethic widely in the prosperity decade: Don’t tighten our belts, loosen them—the more we spend the more prosperous we’ll be. Many Americans brought up on the old virtues converted to consumerism in the twenties.
The relationship between expanding consumption and prosperity in the 1920s is clear. Productivity increased astronomically. Between 1920 and 1929 output per person-hour soared by 63 percent. If the economy was to stay afloat, someone had to buy these products. Many of them were new. Starting from a tiny base in 1922, sales of radios had increased by 1400 percent by 1929. There was a similar, if not quite so spectacular, explosion in sales of such household appliances as vacuum cleaners, electric irons, refrigerators, and washing machines. Such new industries helped greatly in producing the economic boom of the decade.
But no product was more representative of the mass consumption culture of the twenties than the automobile. Much of the twenties’ prosperity was directly related to this industry. Between 1919 and 1929 the number of motorcars in the United States leaped from fewer than 7 million to more than 23 million. It was during the 1920s that the automobile became a “necessity” for Americans.9
The greatest seeming paradox of the 1920s centers on the attitudes of Americans toward “new” things. The decade’s love of newness was hardly unique in United States history. We have always placed a premium on anything new. We were a new country in a new world, a new people, and we have seen such curiosities as a New Freedom, a New Nationalism, a New Deal, a New Frontier, a—or rather, many—New Souths, a New Left, a New Right, innumerable New Nixons, at least two New Federalisms, and even a New Conservatism. Manufacturers of well-established products feel constrained every few years to advertise them as “new, improved,” or with “new ingredient X-17.”
This persistent American affair with novelty reached special heights in the 1920s, the New Era. It was an age in which one fad displaced another with astonishing rapidity: mah-jongg, crossword puzzles, the teachings of Emile Coué, the wearing of open galoshes (which gave us the term “flappers” for those who wore them—a name that far outlasted the fad), marathon dancing, flagpole sitting. Each burned briefly, only to be replaced by another. A fad could move from “new” to passé in a matter of months. Being new was not enough; what was sought was the newest of the new.
Technologically also, the decade was a New Era. Henry Ford put it simply: “Machinery is the new Messiah.” To prove his assertion, Ford perfected his assembly line to the point where, at the end of October 1925, a car came off the end of the line every 10 seconds, with a total of 9575 autos produced in a single day. By 1929 the ratio of cars to people in the United States was 1:5, and one-third of the nation’s homes were equipped with radios. Inventions that had seemed impossible within the living memory of much of the population came quickly to be seen as necessities. Obviously, new was better than old.
Or was it so obvious? This same era of newness saw Warren Harding achieve the biggest landslide victory the nation had yet experienced by promising a return to normalcy. The Klan thrived by defending the old morality. And surely there was nothing “new” about Calvin Coolidge.
Some aspects of the New Era were not as pleasing as others. The new technology appeared to be beneficial, but many were not so sure about the new morality. Young and urban Americans seemed especially to be gripped by new ideas that were upsetting to their older and more rural fellow citizens. New gadgets might be one thing, but a New Woman was quite another. (What, after all, was wrong with the old one?)
Here again we are on ground that spreads throughout American history and is not peculiar to the 1920s. It is the clash between old and new values, between farm and city, between the garden and the machine. But this recurrent clash reached a particular crescendo in the 1920s. The revolutions in technology and morals came on top of the finding of the 1920 census that for the first time the nation’s urban population exceeded the number of rural residents. Although the statistics are rather misleading, since the census classified as “urban” anyone living in a community of 2500 or more residents, the conclusion was in its own way as great a shock as had been the finding of the 1890 census that the frontier had closed. American values since before Jefferson had been rural values. What would happen now?
To many horrified Americans of traditional beliefs, the answer was all too apparent in the actions of the young people of the Jazz Age, especially the women. It was always expected that boys would be boys, and since they would be irresponsible, women had to assume, as The Atlantic stated in 1920, “the entire responsibility of the human race.” But now young ladies—if that was the appropriate word—were beginning to smoke (some even in public), drink alcoholic beverages, wear rouge, lipstick, and daringly short skirts, bob their hair, ride in automobiles with men, use strong language, and engage—with men to whom they were not even engaged—in such unspeakable practices as kissing and … who could be sure what else? “Gazing at the young women of the period,” historian Paula Fass has said, “the traditionalist saw the end of American civilization as he had known it.” The president of the University of Florida verbalized the fears of many: “The low-cut gowns, the rolled hose and short skirts are born of the Devil and his angels, and are carrying the present and future generations to chaos and destruction.”
Farmers and small-town residents, long accustomed to being hailed as the backbone of the nation, now found themselves ridiculed as hayseeds and hicks. An urban-industrial society that they neither liked nor understood was engulfing them. In the early twenties, farmers saw the last bulwarks of their values being attacked. Theirs was not the world of speakeasies, petting parties, rosary beads, or bagels. To rural Americans the city was the center of sin. It was associated with all evils: crime, prostitution, Catholics, Jews, immigrants, bootlegging, alcohol, Wall Street, and everything else that was wrong with America. And, adding insult to injury, the cities were rapidly attracting the sons and daughters of the Middle Border, thus corrupting the last hope for the future.
Rural America seemed to be making its last stand in the twenties (actually its values would often be heard again). The struggle might be hopeless, but it must be carried on. It was a many-fronted war. The Ku Klux Klan and rural churches provided the foot soldiers. Such politicians as William Gibbs McAdoo and the aged William Jennings Bryan offered their skills as general officers. Battles took place on the fields of both sides, from New York City to Dayton, Tennessee.
The clearest sign that the old, rural values were on the defensive was that their champions began enacting laws to uphold the old ways. If the old morality worked properly, it would be self-enforcing; no laws would be needed. Turning to the law was an admission of defeat. A bill introduced in the Utah legislature in 1921 would fine and imprison persons (presumably females) who were caught in public wearing “skirts higher than three inches above the ankle.” A proposed Ohio law would prohibit the sale of any “garment which unduly displays or accentuates the lines of the female figure.” The Ladies’ Home Journal called for the “legal prohibition” of jazz dancing. Numerous books and plays were banned in Boston—and elsewhere. And several states passed laws prohibiting the teaching of the theory of evolution.
To see the values conflict of the twenties merely as a struggle between groups, however, is to miss its most important aspect. The apparent paradox of a decade of loving everything “new” and Calvin Coolidge at the same time would be no paradox at all if we were dealing with distinct groups, one committed to the old and the other to the new. But in fact the tension between stability and change often took place within individuals. Here Henry Ford was the representative American of the 1920s. No one did more than he to destroy the old rural America, yet he continually tried to escape from the age he did so much to create. “It was,” Ford affirmed in his autobiography, “an evil day when the village flour mill disappeared.” He attempted to recreate the old world at his Greenfield Village, a replicated nineteenth-century farming community. There he employed a blacksmith, a cobbler, and a glassblower.
The irony goes even deeper. Henry Ford had gotten into tinkering with machines as a boy because he despised farm work. Here was the apotheosis of the American love-hate relationship with both farm and city. Ford did his best to try to reconcile the two, and to convince himself that his lifework had not really been destructive of the lifestyle he professed to favor. “You can’t fix a dead horse with a monkey wrench,” the nature-loving industrialist observed. “Unless we know more about machines and their use …,” he said, trying to persuade himself as much as others, “we cannot have time to enjoy the trees and the birds, the flowers and green fields.” Ford went so far as to promote village industries at the same time he was building the enormous River Rouge plant. The ultimate symbol of his ambivalence, though, must have been the seventy-six-unit “bird hotel” he constructed at his estate. Each apartment was equipped with an electric heater and other mechanical devices.10
In times of rapid change—newfangled contraptions, new morality, new women—old values take on a new importance; they give people something to which they can cling. Thus Americans held fast to Calvin Coolidge, the security blanket of the 1920s.
If American farmers were roaring in the twenties, it was not in the same sense of the word that is usually connected with the decade. While other sectors of the economy recovered from the collapse of 1920–22, in agriculture that depression continued until it blended into a new one at the end of the decade. Farm prices fell to a much greater extent than did the cost of industrial products. The relationship between the prices of farm and nonfarm commodities determines the welfare of farmers. In 1922 the farmer was at a 19 percent disadvantage compared to the price relationship that had prevailed in 1913. The gap narrowed only slightly over the rest of the decade. The nation’s farm families, representing some 22 percent of the American people in 1920, received 15 percent of the national income in that year. Eight years later, farm families accounted for only 9 percent of national income. In 1929 annual per capita income of farm persons was $273; the average for all Americans was $750.
It seems significant that, as the major economic group that shared least in the prosperity, farmers remained the most committed to “progressive” programs in the 1920s. The farm bloc pushed for government intervention, redistributive (in the direction of farmers, of course) taxation, and government waterpower projects.
Industrial workers found a bit more to roar about in the twenties than farmers did, but not much. For most working-class families in the decade, labor historian Irving Bernstein has said, “the appropriate metallic symbol may be nickel or copper or perhaps even tin, but certainly not gold.” At first glance, workers seem to have shared in the decade’s new material abundance. Many obtained cars, radios, and other trappings of at least modest prosperity. But despite surface appearances, the American economy of the 1920s was seriously out of balance, and workers as well as farmers bore the brunt of the cost of the maladjustments. In Middletown, their classic study of a “typical” middle American city (Muncie, Indiana), Robert and Helen Lynd found that work was very unsteady in “good times” as well as bad. In the prosperous year of 1923, more than one-quarter of a sample of Muncie workers had been laid off, 15 percent for more than one month. During the first three-quarters of 1924, when “times were bad,” 62 percent of the same workers had been out of work, 43 percent for a month or more out of nine.
National unemployment figures cannot be stated accurately because government statistics were not collected on the subject. Estimates of nationwide unemployment at the height of Coolidge Prosperity vary from 5.2 percent to 13 percent. The class difference in views of the problem can be seen in Muncie. The wife of a prominent businessman stated the upper-class view: “I believe that anyone who really tries can get work of some kind.” When jobs were scarce, though, Muncie workers became very fearful.
The conditions under which many Americans labored in the twenties are often obscured by the surface image of general prosperity. In 1929 the American wage earner had a longer workweek than did his counterpart in other industrialized nations. The American enjoyed almost no protection against the constant specter of unemployment. All the nations of Western Europe had gone further than the United States in acknowledging public responsibility in this area. By 1929 the United States stood with two underdeveloped countries, China and India, as the only major nations allowing women and children to work at night.
All of this said, however, it should not be concluded that prosperity had no effect on workers in the twenties. Despite the inequities, the material well-being of workers who had jobs was improving. “Ownership of a Model T,” Irving Bernstein points out, “even if shared with the finance company, was more than entertaining: it inclined one to accept things as they were.” Industrial workers were generally persuaded that they had never had it so good. “Falling prices and fuller employment,” as E. P. Thompson has observed of a similarly prosperous period in England a century before, tend to “take the edge off” worker anger.
By 1923 unions had lost nearly 1.5 million members, a decline of almost 30 percent from the 1920 high. The loss of union membership during hard times was not unexpected. This was the usual pattern. What was unusual about the status of unionism in the twenties was that unions failed to regain members when prosperity returned. In the greatest period of prosperity the United States had yet known, union membership stagnated and even declined a bit, hitting a low for the decade in 1929. It was the first period of prosperity in American history during which unions failed to increase their membership. There were several reasons for this reversal of form. Most important was that the prosperity was based principally upon increased productivity resulting from automated, assembly-line manufacturing. The number of person-hours worked fell by 7 percent between 1920 and 1929. This meant that despite the prosperity unemployment continued to be a problem. The persistent labor surplus made it difficult for workers to organize. The usual aspect of prosperity most helpful for unions—plentiful jobs—was absent during the twenties.11
If Calvin Coolidge himself made little use of his office, the same could not be said of his two leading Cabinet members, Herbert Hoover and Andrew Mellon. Hoover was, one contemporary quipped, “Secretary of Commerce and Undersecretary of all other departments.” He provided much assistance for business. Hoover’s role, though, seemed to be less important than that of Mellon, the Pittsburgh banker and industrial magnate who was secretary of the Treasury in the administrations of all three Republican presidents of the twenties. One of the richest men in America, Mellon strove unstintingly toward his major goal: the reduction of the tax “burden” on the rich. Until 1926 progressives in Congress were able to block Mellon’s most generous proposals for his own class. Of Mellon’s bill Senator George Norris said without much exaggeration, “Mr. Mellon himself gets a larger personal reduction than the aggregate of practically all the taxpayers in the state of Nebraska.” Under Mellon’s Revenue Act of 1926, a person with an annual income of $1 million had his taxes lowered from more than $600,000 to less than $200,000. Mellon’s own income was considerably higher, and he benefited even more from his own program. Small taxpayers received minimal reductions. Nor did Mellon’s munificence end there. He and the few people in his income bracket also enjoyed clandestine administrative refunds of taxes. Mellon was generous in granting tax credits, abatements, and refunds. Between 1921 and 1929 (when his activities were exposed by House Speaker John Nance Garner), $3.5 billion was granted to corporations and individuals friendly to the Republican party.
In the twenties, however, helping the rich was not a partisan issue. Pierre S. du Pont, president of E. I. du Pont de Nemours & Co., was—for understandable reasons—of the opinion that the rich should not bear the burden of taxation. Taxes, Du Pont reasoned, ought to be paid by the working class, not the “productive” class (which he defined as large employers). This view was shared by John J. Raskob, an executive of Du Pont, General Motors, and other corporations. In seeking a way to bring about their objective, these two eminent gentlemen, along with some wealthy associates, hit upon a brilliant scheme. They would seize control of the Democratic party and commit it to the repeal of prohibition. When this was achieved, they would place a tax on beer, the workingman’s drink. Estimates indicated that such a levy could raise $1.3 billion. This would allow a further 50 percent cut in corporate and individual taxes. In short, Du Pont and friends would give workers beer, and with it they would also give them half the tax burden of the rich. It seemed to them to be a reasonable trade-off.
New York Governor Al Smith became the instrument of the Du Pont plan. When Smith won the 1928 Democratic presidential nomination, he named Raskob—who listed his occupation in Who’s Who as “capitalist”—as chairman of the Democratic National Committee. Interestingly, Raskob had until that time been a Republican. Despite Smith’s loss to Hoover, Raskob owned the Democratic National Committee for the next four years. He and the Du Ponts still hoped to put their plan into effect in 1932.
Thus by the end of the twenties, Raskob seemed to be close to the mark when he said the only difference between the two parties was that the Democrats were wet and the Republicans were dry. He might have added that Republicans favored shifting taxes from the rich to the poor by means of a national sales tax (an idea that Mellon had been pushing since 1920), while the Democrats would try to achieve the same end by taxing beer. The conservative era had reached its zenith—or its nadir, depending upon one’s point of view.12