IN THE EARLY 1970S the unexpected rise in oil prices forced people to give some attention to other negative indicators in the industrial world: the slowing growth rate, intractable inflation, rising unemployment, the plunging dollar, and fluctuating exchange rates. The comfortable understanding among big business, big labor, and big government was coming apart. The unwelcome appearance of stagflation also signaled that national policy makers could no longer depend upon the economic prescriptions of John Maynard Keynes. He had given a central role to government to spend when private investments could no longer achieve full or near-full employment, as in the Great Depression. Most countries in the postwar West followed Keynesian policies to ward off recessions. Alas, few had had the courage to cut off popular spending programs when they no longer were needed to boost the economy. This negligence contributed to inflation, exacerbated by the 1973 spike in oil prices. But now inflation was accompanied by high unemployment. The facts no longer supported the original Keynesian proposition. Government spending, which he had recommended in times of falling demand, had created the “flation” in stagflation, and stagnating sales the “stag.” What had seemed a stable, comprehensible, and predictable economic environment became fluid and puzzling.
When the smooth performance of the advanced industrial countries came to an abrupt end in the early 1970s, it was time to look for help in a new theory. This gave an opening to Milton Friedman, who had some insights appropriate to the time, or so they seemed. A University of Chicago economist, Friedman wrote extensively on consumer behavior and public policy, often in partnership with his wife, Rose. Friedman analyzed the new data and explained why a volatile inflation rate actually contributed to unemployment because it increased uncertainty. Its harm to creditors and those on fixed incomes also put pressure on governments to do something—wise or not. Friedman advised cutting back on government activity in the economy so that the market could do what it does best: communicate simple, unadulterated information through its prices to market participants, who could then make the soundest decisions with their resources.
As an influential writer on monetary theory Friedman recommended that government confine itself to a small increase in the money supply. As a public figure he wrote tirelessly to bring the public back to an appreciation of “economic man,” that rational chooser upon whom Keynes had cast doubt. Friedman reaffirmed economists’ early conviction that the market helped people choose what was in their interest. Competition, he said, worked best for consumers and producers alike. He won a Nobel Prize in economics in 1976. His ideas soon percolated into public policy first in Great Britain with Prime Minister Margaret Thatcher and then in the United States. As President Ronald Reagan announced in 1981, “It is time to check and reverse the growth of government,” though he recognized that the imperative was to make government work better.1 While Thatcher and Reagan were in power, Friedman was showered with awards, prizes, and appointments. In actual practice, monetarism enjoyed Federal Reserve support only for the years between 1979 and 1982. It failed to keep the country from sliding into recession.
Not all government intrusion into the economy had been inspired by Keynesian theory. Much of it was in pursuit of a social goal. In the 1960s legislatures began controlling how factories affected the environment or endangered species. Other laws dealt with worker safety, discrimination in hiring and housing, and the protection of consumers. Without debating their social and moral benefits, Friedman pointed out their adverse effect upon competition.2 His work became the basis of the deregulation movement that liberated credit institutions, telecommunication corporations, and the energy sector. His faith in self-interest’s capacity to trump prejudice led him to predict that employers would not discriminate because it hurt them not to offer jobs to the best applicants, a position contested by many field studies. Perhaps the most interesting of these involved blind auditions for orchestras, which gave a considerable boost to female candidates. The recession in Japan in the 1990s and the meltdown of the Argentine economy in 2001 offered a reprise of the Keynes-Friedman debate over the relative merits of government spending and government restraint. Keynes came out the better in the contest of ideas, while the countries themselves suffered from following Friedman’s prescriptions.
Still, Friedman’s ideas exercised a great influence on corporate heads and policy makers alike. Supported by both Democratic and Republican administrations, the first wave of deregulation came in the late 1970s. Laws freed the airlines and trucking companies for competitive pricing. Regulation had been especially heavy in the transportation industry because it was seen as a public service in need of stability and protection. More slowly a broad band of civic-minded men and women worked to deregulate investment banking. This came at the same time that traditional long-term relationships between banks and their corporate customers were breaking up under pressure from newly minted MBAs moving into the banks’ executive suites with new ideas about improving banking profits.3 Two laws in 1980 and 1982 eased accounting rules on savings and loan institutions and reduced minimum down payments on their mortgages. Sailing faster with less ballast, they floated many more loans, and American personal indebtedness began its three-decade climb. Within the next decade over seven hundred S&Ls went under at a cost of over one hundred billion dollars to their insurers, the American taxpayers, but without slowing the movement for deregulation.
The 1980s also brought wrenching changes to manufacturing in the homelands of capitalism. The worldwide circulation of people, investment, and goods took an unexpected turn when multinational corporations sought out countries with cheap labor to build new plants. The enhancement of global communication made this easier to do. The United States, in particular, lost high-paying factory jobs that had boosted millions of families into a prospering middle class. Soon the steel centers that stretched from Buffalo, New York, to Gary, Indiana, lost out to Mexico, China, South Korea, and Brazil. Cheap imported steel entered the country from Japan and Europe. The land of smokestacks became a Rust Belt. Millions of jobs were opening up in finance, computers, and the service sectors, but Americans were used to their manufacturing might. And the new areas promoted an income split: minimum wage work in fast-food outlets and nursing care facilities and higher wages for the denizens of Wall Street and Silicon Valley.
The novelist Tom Wolfe commented recently that we were witnessing “the end of capitalism as we know it.”4 That’s a statement that could have been made many times in the past two centuries, for capitalism is a system constantly reinventing itself, a set of prescriptions peculiarly open to disruption, a work in progress. It looks the same only if you examine the categories instead of the participants and practices. For instance, people have long insisted that market economies flourish only in open, secular societies where property rights are enforced and individual ambition is cultivated at the knees of mothers. Seven success stories in the second half of the twentieth century suggest that capitalism can take hold in diverse social contexts under government supervision and within communitarian cultures—that it is, in fact, always adapting.
The Formidable Economic Power of Japan
First among the countervailing examples is Japan, which started its economic transformation more than a century ago. Next, the Four Little Tigers—Singapore, Hong Kong, Taiwan, and South Korea—charged out of their traditional cages in the 1960s and 1970s. Sometimes called the East Asian NICs (newly industrialized countries), they followed takeoff paths that diverged from Japan’s, as Japan’s had diverged from those of Western Europe and the United States. India and China, coming along more slowly, portend even greater influence in the global economy, as befits the first and second most populous nations, with 37 percent of the world’s people.
Japan appeared an unlikely candidate for industrialization, much less for rapid industrialization. An East Asian island nation of thirty million people in the mid-nineteenth century, deliberately cut off from the world, it burst into prominence as a military and economic power at the end of that century. In a report card for worldwide economic development between 1820 and 1970, Japan placed first. Its GDP grew twenty-fivefold, a growth spurt unique in human history.5 Starting at the most advanced level in 1820, Great Britain multiplied per capita income ten times, Germany fifteen, and the United States eighteen. The Western sequence of industrialization went from textile making and mining to metallurgical industries, railroad building, and heavy industry generally, its source of energy moving from waterpower to coal-fired steam to generator-driven electricity. Consumer goods slowly diverted investments from the production of capital goods, the whole accomplished in a more or less trial-and-error fashion, through the decisions of entrepreneurs and investors.
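Those multiples are easier to compare as compound annual rates. A back-of-the-envelope conversion (my arithmetic, not the author’s) uses the rule that a growth multiple \(M\) over \(T\) years implies an annual rate \(r = M^{1/T} - 1\):
\[
r_{\text{Japan}} = 25^{1/150} - 1 \approx 2.2\%, \qquad r_{\text{U.S.}} = 18^{1/150} - 1 \approx 1.9\%, \qquad r_{\text{Britain}} = 10^{1/150} - 1 \approx 1.5\%.
\]
Sustained over a century and a half, a lead of well under one percentage point a year is what separates first place from the middle of the pack.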
Japan did not reverse the process, but its deviations from the pattern set in the West show the diverse paths that capitalism can follow. Japan lacked the raw materials that mattered in heavy industry, meaning that it would have to import its iron and coal. The government outlined a program of exporting textiles, shoes, and trinkets that could pay for these essential imports. Hastened by its ability to borrow foreign technology and guided by a very determined elite, it did everything fast. Its traditional industries like cottage silk reeling, food processing, and various handicrafts used waterpower well into the twentieth century, but electrical motors replaced steam engines so quickly in the first decade of the twentieth century that you could almost say that Japan skipped the steam age. It also followed its own traditional path in placing the modernization of production and finance in the hands of a very few families like the Mitsuis, Mitsubishis, and Sumitomos, who in turn launched joint-stock trading companies in different sectors of the economy like steel and automobile making. These family concerns formed pyramids from the top down, unlike the United States, where managers usually came up from the bottom rungs of business. The great industrial families exercised tight control from the center and cultivated a privileged group of insiders. They also blocked investment opportunities for foreigners.6
Even with military expenditures, government spending in Japan represented only 7 to 11 percent of the total annual investment in the economy, compared with 28 percent in the United States. The government nonetheless played a much larger role in capital formation—probably 30 to 40 percent—until private investments took off during World War I. As would be expected, Japan had its Wedgwoods, Watts, Carnegies, Rockefellers, Thyssens, and Siemenses, who laid the corporate foundation for Japan’s successful industrialization. Sakichi Toyoda, like Thomas Edison, was a natural inventor and an even better business organizer. Born in 1867 into a family of carpenters, he set out single-mindedly to design a better power loom and devoted his life to this goal. In his province almost every farmer had a cottage loom on which the family earned extra income weaving cloth, so he was familiar with its construction and operation.
After decades of work Toyoda caught the attention of the British firm Platt Brothers, which dominated the world market in textile machinery. In 1929 he sold it the right to manufacture his G-type automatic loom. The contract rewarded his genius and tenacity and signaled the progress of Japanese technology. For cotton cloth makers his loom looked like a good investment: it cost three times the price of a conventional loom but increased tenfold the output of one operator. Yet it failed to catch on in Britain, and that failure uncovered a central weakness in the British textile industry: the strength of organized labor. Few manufacturers bought the Platt-made Toyoda loom because their workers objected to being displaced. By acceding to them for short-run peace, the British industry lost its preeminence in the world market. Between the 1880s and 1930s, Britain’s market share dropped from 82 to 27 percent while Japan’s climbed to 39 percent. The ability of Japan’s association of textile makers to buy cheap raw cotton contributed to this rise in market share. Eventually Japan lost out to countries with cheaper labor but retained the lucrative business of making the textile machinery.7
You probably already realize that the Toyoda Automatic Loom Works would not be getting this attention were it not for its parentage of the Toyota Motor Company. Sakichi Toyoda, on his deathbed in 1930, advised his son Kiichiro, another inventing genius, to find his own passion. Having been astounded by the mass production of Model T Fords when he visited the United States, Toyoda pushed his son in the direction of automobile making with a pot of cash to get started. At that time Ford and General Motors dominated the Japanese automobile market. Kiichiro Toyoda built his early automobiles on the technology developed for the family’s automatic looms. He changed the name of his car from Toyoda to Toyota for reasons pertaining to Japanese calligraphy.
In 1936 the Japanese government, already well advanced in an aggressive colonial policy, used a new licensing law to throw most of the automobile business to Toyota and Nissan. Both became giant holding companies in the 1930s. Their executives were military men who based their market strategies on advanced technology. They used state funds rather than banks for their capital, though Toyoda did extract working money from Toyoda Loom’s accumulated earnings.8 The decision of the Japanese government to go to war in 1894 and 1904 and a generation later in 1937 and 1941 put Japanese industry on a war footing. The country’s industrialists had been the most ardent supporters of the doomed Greater East Asia Co-Prosperity Sphere and Japan’s aggressive prewar foreign policy. American bombing raids during 1945 destroyed Japan’s war machine, but not the know-how that had built it.
From the point of view of the history of capitalism, Japan’s capitulation to the United States in 1945 was more portentous than its earlier, but short-lived, imperial successes. Once Japan accepted “unconditional” surrender with the single condition of maintaining the emperor, the United States was free to rebuild the country in its own image. General Douglas MacArthur, the supreme commander for Allied powers in the area, took charge of the occupation. His thoroughness and the absence of atrocities by American troops stunned the Japanese. The country was demilitarized; jails were cleared of dissident liberals, socialists, and Communists; and political parties and labor unions were encouraged to participate in the hoped-for establishment of a postwar democracy.
When the Japanese were slow to produce a constitution, General MacArthur’s staff did it for them, investing power in a legislature like that of Great Britain and giving women equal political rights with men. Land reform placed more than two million acres in the hands of nearly five million tenant farmers. The rural economy began to blossom. Turning their attention to the manufacturing sector, the occupiers became intent on breaking up the giant holding companies of the prewar period.9 World politics then intervened.
When Soviet-backed North Korea invaded South Korea, the United States led a United Nations action against the invaders. American attention went from reforming the Japanese state to strengthening its power to resist communism. The intensification of the Cold War in the East with the emergence of a Communist regime in China deflected American advisers from their initial push for democracy.10 Becoming a frontier in the Cold War had major consequences for Japan. It sped up the end of the American occupation with a formal treaty ratified in 1952. At the same time, Japan signed an agreement to provide bases for American troops, ships, and aircraft, an act that aligned it with the West to the exclusion of Russia, China, and neutral countries in Asia. The Korean War jump-started industries, light and heavy, as Japan extended hospitality to American forces and supplied munitions and equipment to the war effort.
A remarkable American, William Edwards Deming, came to Japan in 1950 as an assistant to the supreme Allied commander and stayed long enough to impress an obsession with quality on the country’s leading industrialists. Trained in physics, mathematics, and statistics, Deming was a natural teacher, churning out such student-friendly aids as fourteen points for transforming business effectiveness, seven deadly diseases, and four obstacles to progress. The gist of his message was that manufacturing is a system that can be improved by exquisite attention to detail and made cost-efficient by constant improvements in every phase of production. He originated the famous Japanese team system, in which personnel in research, design, sales, and production worked closely together, often achieving an esprit de corps that banished tensions from the work site. Japanese leaders consider Deming, awarded the emperor’s Order of the Sacred Treasure, practically the father of Japan’s postwar industrial rebirth.
American advisers threw their considerable weight behind Japan’s conservative politicians once the Korean War erupted. Throughout the second half of the twentieth century, the Liberal Democratic Party enjoyed an almost unbroken run of dominance in Japanese politics, though in 1993 there was a temporary pause in the party’s hegemony. Bringing the occupation to an end, the Japanese government began its own program of economic reform, what it called rationalization. The goal was to make Japanese producers competitive on the world market, beginning with steel. The most modern integrated steel plant on the globe rose from land reclaimed in Tokyo Bay in 1953. It took raw materials through a continuous series of processes to the finished products.11 Soon other Japanese steel firms copied it, demonstrating the perverse advantages of the wartime destruction of Japan’s industrial base. Obsolescence was swept away with the rubble. In defeat, its industrialists discovered the virtues of flexibility.
That flexibility did not extend to the government. Japan’s continued attachment to the American dollar has extended its dependence upon its World War II conqueror. Collusion among the bureaucracy, the leaders of big business, and members of the Liberal Democratic Party has frozen opponents out of making and implementing policies. The Socialist and Democratic parties have regularly elected members to the lower and upper houses of the Parliament but rarely exercised any real power. The big exception came in the 1970s, when the leftist parties, with a national consensus behind them, pushed the LDP to address the deterioration of the environment that industry had brought about. In this sense, they act as something of a safety valve on an otherwise closed system. This pattern of course is consistent with Japan’s prewar institutions. It is also the case that Japanese leaders have been hampered in making policy changes or responding to unfolding events by the intransigence of a very independent governmental bureaucracy.
While Europe and the United States were enjoying those two decades of strong prosperity between 1953 and 1973, the Japanese record was even more impressive, with an annual growth rate of 10 percent, then unique in the history of capitalism but now matched by China.12 After the war, Toyota, Nissan, and Honda benefited from the growing domestic market, which the government protected from European and American competition. Under this umbrella Toyota and Nissan built new plants in the early 1960s. They pioneered so-called lean production with the “just-in-time” productive system. This program of using one machine to do several tasks was born of wartime necessity, when producing a large backup of items was impossible. Factory managers did not have the luxury of assigning one machine to turn out, say, left fenders. They also didn’t have the space for a long assembly line. Hence they didn’t stockpile parts, and they put together their vehicles in tight quarters.
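What a 10 percent annual rate means is worth pausing over. A quick compounding check (my arithmetic, not the author’s) shows that
\[
1.10^{20} \approx 6.7,
\]
so an economy growing at 10 percent a year between 1953 and 1973 expanded nearly sevenfold, doubling roughly every seven years, since \(\ln 2 / \ln 1.1 \approx 7.3\).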
Outsiders after the war analyzed just-in-time processes and declared them superior. They dubbed it lean production to contrast it with America’s mass production. More than merely reducing the inventories of parts, lean production emphasizes precision assembly with defect-free components put together by skilled teams of workers who show little tolerance for any imperfections along the line. An echo of “small is beautiful,” lean production uses less space and fewer backups and instills an appreciation of the importance of each move, each piece of material going into the vehicle. Parts are ready just in time.13 General Motors, Ford, and Chrysler have seen their market shares melt like a piece of ice at a July picnic, but they’ve resisted copying some successful Japanese techniques. Here is another reminder that innovation keeps capitalism moving forward, but entrenched managerial elites can avoid responding to its promise.
Japan secured a very important quid pro quo from the United States. In return for basing troops and aircraft in Japan after the Korean War, the U.S. government promised the Japanese access to the American market. Detroit probably didn’t pay much attention to the United States-Japan Mutual Security Treaty signed in 1960, though it would soon enough feel the competition from Japanese automobile exports.14 During the ratcheting up of oil prices in the 1970s, Japanese automakers moved into the huge American market with their small, snappy, fuel-efficient models. Rather than buy into foreign companies to get a share of their markets, both Toyota and Nissan set up their own dealerships, putting a lot of money at risk by doing so. Soon they were building their own manufacturing sites in the United States. Almost fifty million new vehicles roll out of auto plants worldwide every year, making automaking the world’s number one industry.15 The Japanese were astute marketers of their cars, which helps explain how Toyota was able in 2008 to pass General Motors after its seventy-seven-year run as the world’s largest automaker.
The structure of European economies is corporate, with the interests of labor and management worked out together through public and private organizations. That of the United States is more competitive than corporate, and we can characterize the Japanese economy as paternalistic. Its most prominent firms operate like an extended family, with joint-stock companies running specific enterprises under the benevolent guidance of a holding company. This arrangement offered protection from hostile takeovers. Paternalism shouldn’t be confused with patriarchy, for unlike America’s hierarchical decision making, in Japanese companies ideas percolate up from the bottom. Middle and local managers make many of the operational moves; all focus on cultivating skills and talent from within, with eyes on long-term growth.16
Rather than members of a cartel for a single industry, Japanese firms belong to holding companies, but the competition among the parts of such a company can be fierce. While in recent years family ties—both real and metaphorical—have loosened, loyalty to one’s own group has retained an importance not found in the West. And like stable families and their friends, Japanese firms develop and keep long-term relationships. Even relations with labor have been marked by mutual trust after some long, bitter strikes. In exchange for being allowed to fire 25 percent of its work force during a slump in the late 1940s, Toyota promised its unions lifetime employment, pay increases for seniority, and bonuses tied to profits for the workers who remained.17 Lifetime employment policies for firms with more than one hundred workers served to stabilize Japanese labor relations, even as they put in place a rigidity that would hurt farther down the road.
Japan benefited from another event in the United States. Consent decrees settling federal antitrust cases against RCA, IBM, and AT&T in the 1950s forced these companies to give patent licenses to domestic applicants for free and to sell them to foreign firms. This lucrative possibility entranced RCA, which moved quickly to maximize short-term profits from its patent licenses while ignoring the research and design that had made it a leader in television and radio equipment.18 The race for color TV resembled a NASCAR event. The lead car was RCA, flying the colors of its founders, General Electric, Westinghouse, and Telefunken. Way out in front for many laps, it made a fatal mistake. Sony, the hungry upstart in the field, saw its advantage and rushed to the lead. Having invested heavily in design and maintenance, the Sony entrant kept its lead. RCA’s policy of selling its patents sped up the transfer of color TV technology to Japan’s leading consumer electronics firms.
RCA finally dropped out of the race altogether, pulling with it every other consumer electronics company in the United States. The race went to the Japanese firms of Sony, Sanyo, and Matsushita, which began buying up failing American companies. Then Sony, just twenty-six years old in 1972, moved to construct its own color TV plant in California, producing 450,000 sets a year. Toshiba, Mitsubishi, and Hitachi followed suit during the rest of the decade. So grateful were the Japanese that when RCA’s CEO David Sarnoff visited Tokyo in 1960, the emperor awarded him the Order of the Rising Sun! (Sarnoff, as a young telegraph operator, had had the distinction of receiving the distress message from the sinking Titanic.) The size and scope of the Japanese corporate giants made possible these aggressive moves.19
The transistor stirred more military than commercial interest in the United States when it made its appearance in 1947, but physicists and engineers working for Sony quickly turned out a very popular commercial product, the transistor radio. Sony specialized in miniaturization. Having very deep pockets, it invested in the research that yielded audiotape recorders, stereo equipment, videocassette recorders, digital videodiscs, video games, and camcorders. In 1996 the Korean firm LG bought the major share of stock in Zenith, the last American firm to make TV sets. It looked like a fitting epitaph to America’s consumer electronics industry until Apple took away Sony’s lead in digital music a decade later with its iPod.
Arrival of the Personal Computer
America maintained the lead with one of the market stars of the 1980s, the personal computer. Before PCs appeared on the scene and stole the show, data processing with computers had spread through manufacturing, retailing, and financial firms. This created a market for peripherals, software, and something called a chip, a small crystal of silicon semiconductor onto which an integrated circuit is built, enabling it to do a lot of electronic tricks. A semiconductor, by the way, is not a part-time railroad employee, but an element like silicon that is halfway between a conductor and an insulator.
As the price of computer components came down, hobbyists around the country began putting together their own small computers. The cover of Popular Electronics of January 1975, featuring one of these amateur efforts, caught the attention of Paul Allen and Bill Gates, ages twenty-two and twenty respectively. They joined the lists of computer tyros. Around the same time, Stephen Wozniak and Steven Jobs started their Apple company in a garage when they too were in their twenties. Michael Dell went one better, assembling custom-built IBM-compatible computers in his University of Texas dorm room! From these early efforts emerged three commercial PCs, those of Apple, Commodore, and Tandy.
Observing these start-up companies, IBM set up a task force to consider the future of microcomputers. PCs were possible because of the great advances with silicon chips. The size of a postage stamp, they could hold millions of transistors. IBM’s task force reported back in 1980 that IBM could enter this field quickly if it set up an autonomous unit within the company and designed an open machine that operated more like a system than an appliance. More crucially, it recommended that IBM buy the component parts for its microcomputers from those available on the market rather than create and patent its own.20 Management gave the green light to the project. IBM chose Intel’s microchips. From Microsoft it ordered a language for its PCs and then an operating system.
These decisions assured the fame and fortune of Intel’s Robert Noyce and Gordon Moore and Microsoft’s Gates and Allen. Noyce, who came into computers by way of transistor inventor William Shockley, had put together an integrated circuit that combined transistors on a single silicon wafer. Gates and Allen finally settled down near Seattle, where they produced an array of computer languages along with a disk operating system, MS-DOS. IBM’s powerful marketing system redounded to the benefit of these two principal suppliers. When IBM further agreed to let Microsoft license its system to others, Intel and Microsoft gained the most lucrative franchises in industrial history. Bill Gates was on his way to becoming the wealthiest man on the planet, for unlike Intel, which counted on venture capital, he and Allen had borrowed for their start-up and plowed earnings back into it.21 The early competition among PCs had produced a variety of incompatible systems. People wanted compatibility so that they could share files. Gates exploited this potential market with his IBM-compatible MS-DOS system, which worked with the same generic hardware components that IBM used. With its closed system, Apple made itself a marginal player.
IBM executed its plan with dispatch. Within two years it was producing a PC every forty-five seconds and still couldn’t keep up with demand. PCs flew into people’s homes. Writers and teachers moved their sleek IBM Selectric typewriters to the garage or gave them to a charity as they began a journey of amazement and frustration with their new desktop computers. Like television sales in the 1950s, the popularity of PCs astounded everyone. Why would thousands of people, with no real need for a computer and far from conversant with its peculiar ways, plunk down upwards of thirty-five hundred dollars for a crude version of today’s personal computers? Businesses soon discovered that they could use PCs at every workstation and construct a network among them. By the mid-1990s PCs accounted for 80 cents of every dollar spent on information technology.22 “Interfacing” went from a term in dressmaking to one for attaching an electronic device like a memory chip or a peripheral like a printer to a computer. A Time magazine cover named the personal computer “Machine of the Year,” and typists became word processors.
IBM’s success put an end to the British, French, Italian, and German computer companies that had sprung up to contest the American near monopoly in the field. In 1997, more than a third of American homes had at least one PC, with sales mounting each year.23 That same year IBM was shipping more than three million microcomputers to businesses. The opportunity to build PC clones spawned dozens of start-up companies that took advantage of the absence of patents for PCs. The popularity of desktop computers created a market as well for software designed for specific applications. The computer industry continued to be ferociously competitive, more Darwinian than anything IBM had faced before. By the end of the 1980s there was a tight race among Apple, IBM, Dell, and Compaq for PC customers. IBM’s market share had slid from 50 to 22.3 percent by 1999.24
Automobile making was not the only major industry the Japanese took on. They moved into computers and consumer electronics like televisions, VCRs, and DVD players with alarming speed. They made a critical move in the 1970s, when British, French, Italian, and German companies failed to keep up with IBM’s mainframes and plug-in compatibles, an ugly term for those items like printers or modems that can be connected to a computer. Japanese companies decided to continue making mainframe computers for their domestic market. Two unpredictable developments rewarded this decision: IBM moved into PCs, and the Internet created a new demand for large systems. In addition, large corporations began creating their own private networking systems, which produced a new demand for the mainframe computers that had been abandoned in the rage for PCs during the 1990s. Japan regained its European market for big systems and stayed current with electronic advances while the Europeans fell back on their excellent software.25
Another Technological Advance for PCs
Soon PC users got to connect with one another and then to a cornucopia of knowledge, information, data, and a personal message system. Networks joining people using the same mainframe computer within a company or organization gave several researchers the idea to create the technology for similarly joining individual PCs through telephone or cable lines. The actual origins of the Internet lay with the U.S. Department of Defense, which in 1969 linked together minicomputers in government and university laboratories. From this network, called ARPANET, came other networks, initially involving universities. Slowly ARPANET lost its military starch and became more like the wrinkled academic. What started out under government sponsorship became the twenty-first century’s biggest commercial success story.
The telecommunications network Telenet went into operation in 1969 with a commercial component in 1975. The desire to connect with other computers grew exponentially as people acquired PCs. A little more technological tweaking perfected the Internet.26 Meanwhile in the late 1980s, Tim Berners-Lee and Robert Cailliau at the European Organization for Nuclear Research in Geneva came up with a system to go beyond connecting computers and arrange for transferring information over the Internet by using hypertext. Their World Wide Web did indeed go worldwide as computer users discovered the wonders of the Web. Commercial possibilities emerged immediately. Soon hundreds of newspapers from many countries were available on the Web. Banks and airlines encouraged their customers to do business through their Web sites. Then actually locating this plethora of informative stuff became a problem. The University of Illinois developed the first graphical Web browser, Mosaic, in 1993; it became more familiar to the public in its commercial successor, Netscape. Next, Microsoft’s Internet Explorer began eating away at Netscape’s market share, followed by Mozilla Firefox, in a seemingly endless race among improving services.
Like the PC’s, the Web browser’s popularity was not predictable, though retrospectively its delivery of online instruction, encyclopedias, and downloaded movies and music makes it hard to imagine the world without a telecommunications network and its Girl Friday, the browser. With capitalism’s insistent search for new ways to make a profit, the Internet became a vehicle for retail shopping. Free access to the Internet drew viewers, who in turn formed the basis for a booming advertising industry. The moving graphics enticed as the instant information satisfied. In 2005 start-up companies began putting lenders and borrowers together through the Internet to make loans without those middlemen called banks. Today Internet access is approaching a billion users, just ahead of the six hundred million mobile phone users.27 Alas, the flexibility that so delights consumers also opens up avenues of fraud. Both the music and publishing industries have encountered serious problems protecting their products from illegal sharing through the Internet.
Jeff Bezos started Amazon.com in his Bellevue, Washington, garage in 1994. A pioneer in Internet retailing, he popularized “.com” as part of a firm name. Soon the request from stores and services to their customers to go to “www [fill in the blank].com” became ubiquitous. Branching out from its initial stock of books, Amazon grew rapidly, hit a rough patch, but then recovered by opening up its site to other retailers. Today the Web has largely replaced the yellow pages of the telephone book as the place to go for information on everything from pleated lamp shades to Spanish anchovies. With all these retailing novelties, the cost of serving customers has sharply declined, contributing to American prosperity in the closing decade of the twentieth century, when most other economies were slowing. Competition in these ventures has acted as a goad to better performance, but the Internet’s vast catch basin of customers has also intensified success, making for superbig winners and lots of failures.
Alongside these developments sneaked in something called e-mail. Begun with messages sent by those using the same computer system, e-mail became accessible to a larger audience through the Internet. E-mail’s popularity soared in the 1980s; today more than six hundred million people e-mail, making it the most widely used facility of the Internet. While e-mail has not eliminated telephones, fax machines, and postal services, it has cut into their spheres of influence. The U.S. Postal Service acquired a new nickname, snail mail.
By 1996 a new Internet problem had emerged: how to retrieve easily the mass of information floating freely on the Web. Again two men in their twenties—Stanford grad students Larry Page and Russian-born Sergey Brin—came up with an answer in a new “search engine” that they called Google. Another phenomenal success, Google has been turned into a verb, as in “I googled Google to learn about its history.” Winning a protracted competition with Yahoo, another popular search engine, Google saw its market value soar to about two hundred billion dollars in 2005. By dint of constant improvising from ongoing research, it has developed an e-mail service with video chat capabilities. Google also bought YouTube, a video source where people can share news clips, entertainment, and amateur humor. Now the largest ad seller in the world, Google continues with seeming effortlessness to improve its proliferating features.28
One of Intel’s founders, Gordon Moore, announced—with remarkable accuracy—that the number of transistors that could be placed on an integrated circuit would double every two years, greatly expanding computer and phone capabilities. Cell phones became smart phones; functions of PCs squeezed into Palm Pilots and iPods. Equally impressive, though no one predicted it, the price of computers decreased annually by 20 percent.29 Yet nothing quite compares with the price history of cell phones. In 1987 a Motorola cell phone was a luxury that cost $3,996; today cell phones are given away with two-year contracts from telecommunications companies.
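A rough calculation (my arithmetic, not the author’s) shows how quickly these two trends compound. If transistor counts double every two years and prices fall 20 percent a year, then after \(t\) years
\[
N(t) = N_0 \cdot 2^{t/2}, \qquad P(t) = P_0 \cdot 0.8^{\,t},
\]
so in a single decade a chip holds \(2^{5} = 32\) times as many transistors while a machine of given power costs \(0.8^{10} \approx 0.11\) of its original price, roughly a ninth.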
Although Gates and Allen planted Microsoft near Seattle, William Shockley, who won a Nobel Prize for his work on the transistor, returned to his home in Palo Alto when he decided to begin his own firm, Shockley Semiconductor. Others followed him, collecting outside Palo Alto in what became known as Silicon Valley. Moving from Massachusetts to California proved propitious for start-ups because Massachusetts law favored established companies with strong “noncompete” clauses. Such laws limited the opportunities of former employees to set up businesses of their own with ideas learned on their previous jobs. Nearly invisible, these institutional arrangements can be crucial in nurturing innovation.
To the stylish world of high capitalism on Wall Street, the start-up millionaires in California were startling: youthful, informal, egalitarian in style, and decidedly brainy. Accepting the designation of “geeks,” these software and hardware engineers were anything but cool in the manner of the denizens of lower Manhattan. Initial successes enabled dozens of men (and some women) with an ingenious scheme to find backing from venture capitalists. The dot-coms of Silicon Valley flourished until they got their comeuppance. The NASDAQ stock market exchange attracted thousands of new buyers. With investors, large and small, rushing to get in on the action in information technology, initial public offerings became oversold. The hot air of speculation, building through the late 1990s, burst the dot-com bubble in 2000, but the avidity of Silicon Valley engineers for finding new applications did not abate. After a decade of shakeouts, Silicon Valley rebooted, as those in the computer world say. It attracted $10 billion worth of venture capital in 2007 compared with $7.2 billion for all of Europe.30 That same year American inventors filed for eighty thousand patents, a total larger than the rest of the world’s combined.
Information technology exploited the American openness to novelty. Consumers, by responding quickly to the chance to have personal computers, laid the foundation for IT’s phenomenal growth. When IBM raced to catch up with the new product, it brought the resources, learning base, and marketing know-how necessary to sustain the young industry. When IBM’s major suppliers Intel and Microsoft became billion-dollar companies, they poured much of their profit into advancing technology. At the same time, the PC craze undermined IBM’s dominance. Without patented components, it became easy for newcomers to enter the field. The simplicity of PCs freed owners from reliance on IBM’s legendary service. American success with computers, peripherals, the Internet, the Web, and e-mail boosted American capitalism, both materially and psychologically. But staying on top is never easy in a market of fierce competitors. After dominating the technological market, America in 2002 began importing more high-tech products than it exported. The trade gap in 2008 was more than fifty billion dollars.31 Worry about this might be mitigated by the awareness that when Apple imports iPods from China, it is Apple that has added the valuable design component to the product the Chinese assemble.
Unlike European computer companies, Japan’s giant firms—Fujitsu, NEC, Toshiba, and Hitachi—were able to weather the American storm of success because of their size and experience in electronic equipment and telecommunications. Supported by the government, Fujitsu, NEC, and Hitachi became the world’s largest producers of semiconductors.32 The resurgence of demand for large computer systems in Europe, occasioned by networking, enabled Japan to compete with IBM in that market. When IBM opened itself up to cloning with its nonproprietary components, Japanese firms entered the PC market in a big way in the late 1980s. By 1996 only IBM, Dell, and Compaq had sold more PCs worldwide than Fujitsu, Toshiba, and NEC. The versatility of PCs had a cascading effect. New applications, peripherals, and improvements abounded. Sony entered the computer market with its CD-ROM (compact disc read-only memory) drive, which transformed computers from data processing equipment into multimedia devices. Much of the sophistication and elegance of Japanese technology in consumer electronics and computers can be attributed to the circulation of ideas permitted by the close proximity of Japan’s giant firms concentrated in Tokyo and Osaka.33
Still, old contenders never seem to surrender. The makers of American semiconductors and PCs carried their competitive spirit back to the Japanese market itself. IBM and Microsoft made a major dent in the home market because they solved two problems of incompatibility: that between the West’s alphabet and the Japanese system of characters, or kanji, and that among the diverse Japanese computers and peripherals, which could not share programs. Conjuring up that old generic magic, the new Japanese-language version of Windows, 3.1J, became a great success. It can be run on any of Japan’s high-end computers as well as on all PCs.34 With this, the bouncing ball of technological innovation in computers bounced back to the United States.
Asia’s Four Little Tigers
The world market for everything connected to microprocessing gave a boost to four Asian countries, the Four Little Tigers of Singapore, Taiwan, South Korea, and Hong Kong. Their successful trajectory challenged the widespread belief that countries outside the circle of the economic giants would not be able to vault themselves into sustained growth. In the warm glow of sustained prosperity in the 1960s, when the economies of Western Europe rose like phoenixes from the ashes of war, analysts turned modernization into an end stage that all countries would reach given enough investment capital. It was an extremely optimistic prediction, the first to put prosperity into every country’s future.35 When countries in Latin America, Asia, and Africa failed to respond to mere infusions of Western capital and know-how, modernization theory slipped into deserved oblivion.
With the Cold War’s divvying up countries among capitalist, Communist, and nonaligned ones, a French demographer introduced the idea of First, Second, and Third Worlds to distinguish the First World of the West and the Second of the Soviet sphere of influence from the rest of the world, the unaligned. The first two categories never really caught on, but “Third World” served a real lexical need. A more sensitive public discourse exchanged the old term “backward” for “underdeveloped.” The Third World, according to thinking in the First, was not so much poor as underdeveloped. No longer colonial dependencies, countries outside the West could be induced to modernize on their own, a conviction suggesting that development was just a few steps away.
In the late 1940s and early 1950s, Western advisers suggested to Third World countries, particularly in Latin America, that they could best accumulate the capital necessary for development if they specialized in exporting their raw materials—beef, sugar, or soybeans—and imported manufactured goods from the West because the terms of trade were going to shift in their favor. This never happened, because the demand for manufactured goods pushed up their prices more than the demand for raw materials did. So the core of industrialized nations prospered while the peripheral nations stagnated. Tackling the question of why countries didn’t or couldn’t modernize, Raul Prebisch, Immanuel Wallerstein, and Andre Gunder Frank developed dependency theory, which argued that Third World backwardness emanated from decisions made in the First World. Underdevelopment was not a stage but a permanent condition, the deliberate result of policies adopted by the economic winners. Far from being inevitable, modernization was a chimera. By encouraging Third World countries to find their niche in the world economy, the advanced industrial powers were consigning them to a permanent periphery, a state of dependency from which they could never escape unless they pulled out of a game where the cards were stacked against them.36
Dependency theorists recommended that the countries in the periphery do an about-face. They should make their raw materials more expensive to outsiders and start to produce for themselves the manufactured goods they had been importing. This import substitution program would thwart the West’s exploitation and save precious exchange funds. They also hinted strongly that it was impossible for any undeveloped country to generate enough capital for a takeoff into economic development. Adding a conspiratorial note, some Latin American experts traced the region’s problems to exploitation by the United States operating through the CIA and multinational corporations.37 Further evidence of American malevolence seemed to accrue when Milton Friedman’s monetary prescriptions boomeranged in Argentina in 2001. The International Monetary Fund became implicated after encouraging the country to borrow heavily. The idea that peripheral countries could enter the world market on advantageous terms looked almost dead, until nations in East Asia showed how it could be done.
Hong Kong, Singapore, Taiwan, and South Korea proved dependency theory wrong. Their economies managed their own takeoffs into self-sustained growth, doing in thirty years what it had taken Japan a hundred to do. Successful development in Taiwan and South Korea started with land reform, a step strongly backed by the United States, which exercised a powerful influence on the leaders of Korea and Taiwan through its aid programs. Just moving landownership from the hands of a leisured elite to those of the working farmers had many profound and lasting consequences. Crop yields went up, lowering food prices and giving everyone more purchasing power. Tax revenues from the new landowners went into the purchase of fertilizer, equipment, and farmer education programs in a mutually enhancing spiral upward.38 As in England in the seventeenth century, agricultural improvement required fewer workers, releasing men and women for other occupations, like manufacturing. The more egalitarian distribution of wealth created by land reform made rural radicalism less likely while it undercut opposition to modernizing reforms that entrenched landed elites usually mount. Less tangibly, the relative income equality in Singapore, Taiwan, South Korea, and Hong Kong consolidated the support from a prospering working class. One can only wonder what would have happened to the economies of Argentina and Mexico if they had undertaken similar land reforms.
More important, the Korean War of 1950–1953 had introduced a big spender into the Pacific basin trade universe. The founder of Hyundai, Chung Ju-yung, for instance, found good customers in the American armed forces for his two lines of business, construction and car repair. Born into a poor peasant family in North Korea, Chung had already demonstrated his intrepid character and knack for business during the Japanese occupation. In the years that followed, Hyundai manufactured cars from Japanese components and moved onto the world construction stage, building expressways, ports, nuclear power plants, and shipyards.
At first the United States had supported democracy in these countries as well as in Japan, but the invasion of South Korea led American policy makers to take a sharp turn to the right. They tolerated repressive regimes in Singapore, South Korea, and Taiwan in exchange for a firm anti-Communist stance. Still, economic benefits followed. In 1960 Singapore became the principal host for the Seventh Fleet of the United States, providing a place of repair, rest, and recreation rather than a base for its ships. More relevant, the United States never wavered in its support of economic development, sending money and experts to South Korea and Taiwan.39 The Four Little Tigers all had political cores made up of technocrats and market advocates who were able through pressure or repression to insulate their policy preferences from domestic critics. They also developed alongside hostile Communist neighbors, a circumstance that helped their leaders stifle dissent. Legislatures, where they operated, were kept weak, leaving the field of political action open to strong executives. Still, over time South Korea and Taiwan became more democratic.40
In their economic ascent, the Four Little Tigers rejected the policy of import substitution and decided to promote exports instead. “Decided” is the correct word because their political leaders did the thinking and planning. In no case was the domestic market of the NICs big enough to support the economies of scale that would make them competitive in world markets. So, after some initial failures, they established free ports and became “superexporters,” starting with traditional clothes, textiles, and footwear, then moving to consumer electronics like calculators and color television sets. Korea even manufactured iron and steel items. In the 1960s the United States opened its market to such imports, as did Australia and New Zealand. By 1980 their exports represented more than 50 percent of their GNP, compared with 8 percent for the United States and 16 percent for Japan.41 This of course took money and workers. Here the population density of the NICs became an asset, as did their people’s commitment to acquiring the skills and learning for labor-intensive, complex production processes. All four countries maintained high levels of domestic savings—above 20 percent—so reliance upon foreign aid and investment tapered off once they got started.42 Their particular mix of advantages is not easily duplicated, but the blueprint is pretty clear: export, educate, innovate, and find niches in the world economy.
Because Hong Kong, Taiwan, Singapore, and South Korea succeeded in the same area at the same time—Hong Kong and Taiwan in the 1950s, Singapore and South Korea in the 1960s—their similarities seem more important than their differences. Still, the differences are worth noting. Hong Kong was a British crown colony until 1997, when it was reincorporated into China. Singapore was a poor city when it was expelled from Malaysia in 1965. Commenting on its spectacular success, its founder, Lee Kuan Yew, now calls it “a first world oasis in a 3rd world region.” The same could be said for Hong Kong. Korea shared with Vietnam and Germany the distinction of being divided between Communist and non-Communist parts, but unlike those countries, it is still so divided. Taiwan too has a perilous existence as a breakaway island province from China, run for a long time by the Chinese Nationalist Party, which fled China when the Communists took power in 1949. Singapore, steadfastly authoritarian, has had to integrate the most diverse population, composed of Chinese, Malays, and Indians with a strong representation of Christians, Hindus, and Sikhs. The other three are more homogeneous.
Timing, location, and luck played their parts in the phenomenal success of Hong Kong, Singapore, Taiwan, and Korea. Two industrial giants, Japan and the United States, had a positive impact on their growth, which averaged 7 to 9 percent in GNP annually during the 1960s and 1970s. Both “sugar daddies” offered a big market for the NICs’ products along with an infusion of investment capital. The Four Tigers caught the wave of explosive growth in consumer electronics and computers in part because Japan was challenging American dominance. The American firm Texas Instruments set up overseas assembly plants for semiconductors in Taiwan. Soon American firms were buying smaller peripherals and components from Taiwan. Taiwan also produced motherboards, monitors, keyboards, scanners, and mice, leading to a major production of notebook computers. Hong Kong and South Korea got into these productive lines as well, especially semiconductors in Korea. Singapore too developed a strong manufacturing presence in the American-led PC industry, becoming the source of most American PC disk drives. Korea and Taiwan could shift their exports toward more skill-intensive products, such as consumer electronics, in the 1960s and 1970s because they already had proficient work forces.43
They made long-range plans and were lucky enough, despite some turmoil, to enjoy the order and peace that ripened their plans into mature performance. Their governments invested enough in utilities and communications systems to prevent the bottlenecks that have plagued other developing countries where poor transportation has delayed the flow of goods between production and shipping sites. The courts have worked well and fairly, though the draconian laws of Singapore still appall. In a unique mix of government direction and free market dynamics, these countries have confounded many an economic prediction, none more hallowed than the idea that inequality accompanies economic development.44 They have benefited enormously from fitting themselves into the niches created by each new technological breakthrough. Korea, with a current population of forty-eight million, has a GDP ranking of eleventh in the world. Even their neighbors Malaysia, Thailand, and Indonesia are developing in promising ways.
Economic development transformed women’s lives in these traditionally patriarchal societies. The preference of Asian families for boys can be measured by their skewed sex ratios. Once ultrasound permitted a pregnant woman to learn the sex of her fetus, abortions of females began to climb. The normal ratio of the births of boys to girls is 105:100. In recent years it has reached a high of 120:100 in China, with other Asian countries close behind. Officials in most countries have soundly condemned the practice. In India it is illegal for a doctor or nurse to tell a woman the sex of the child she is carrying, a law passed to stem this practice, yet it is widely flouted. Estimates put the number of female fetuses aborted in India over the past two decades at ten million.45 This hardly sounds like a benefit to women, but something is happening today in South Korea that suggests a turnaround, as its sex ratio has dropped from 116 to 107.
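To see what such ratios mean in practice, the arithmetic can be sketched in a few lines of Python. This is a back-of-the-envelope illustration only; the cohort size is hypothetical, and the ratios are simply the ones cited above.

```python
# Back-of-the-envelope: how many "missing" girls does a skewed
# sex ratio at birth imply? The cohort size is hypothetical.
NORMAL_RATIO = 105        # boys born per 100 girls, the normal figure
SKEWED_RATIO = 120        # boys born per 100 girls, the recent Chinese high
TOTAL_BIRTHS = 1_000_000  # illustrative annual birth cohort

# Number of girls among all births under each ratio.
girls_normal = TOTAL_BIRTHS * 100 / (100 + NORMAL_RATIO)
girls_skewed = TOTAL_BIRTHS * 100 / (100 + SKEWED_RATIO)

print(f"Girls expected at 105:100 -> {girls_normal:,.0f}")
print(f"Girls observed at 120:100 -> {girls_skewed:,.0f}")
print(f"Missing girls per million births -> {girls_normal - girls_skewed:,.0f}")
```

At these figures, a ratio of 120:100 implies roughly thirty-three thousand fewer girls per million births than the normal 105:100.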
A new appreciation for girls has emerged in this once deeply traditional society.46 Good schooling has brought more and more women into jobs in business and the professions. At a practical level, parents no longer depend upon their sons to support them in retirement, for they are retiring with benefits. Their daughters, working outside the home, are no longer near servants to their husbands’ families. They earn their own support and maintain the family’s emotional ties better than their brothers do.
During the 1970s experts considered everything Japan did optimal. But even the best of times must come to an end, as the saying goes, or maybe “what goes up must come down” is more apropos. After astounding the world by becoming its second-largest economy, Japan slid into a prolonged recession in the 1990s. The quality of its cars and stereo equipment continued to impress, and its lean production put to shame American and European factory management, but these strengths couldn’t prevent a downward spiral of prices. Stock market and real estate values dropped, leading to an accumulation of bad debts. To boost the economy, the government finally poured trillions of yen into public spending, which pushed up the value of the yen and brought an unintended but consequent fall in exports. Nature kicked in with a major earthquake at Kobe in 1995.
These problems proved intractable, and they exposed some of the structural weaknesses in the Japanese economy, the most prominent being the cozy relationships among its leading banks, corporations, and the government. This revelation garnered some important support around the world for America’s strong antitrust policies. The Japanese had had antimonopoly legislation since 1947, but these laws were weakly enforced. When the economy took a dive in the 1990s, the government put some teeth into their enforcement. Japan broke up its telecommunications monopoly, as the United States had done in 1982.47
Although there was a modest recovery in Japan in 1997, prices declined again, and nothing seemed to relieve this deflationary pressure. When Thailand, Indonesia, Korea, and Singapore experienced a financial crisis that year, Japanese firms and households became more anxious, further deflating the economy.
The Asian crisis highlighted the need for more transparency in government programs, less rigid exchange rates, and stronger, better-regulated financial systems. The International Monetary Fund shored up Asian markets with large infusions of cash, much of which went to buy food, fuel, and medicine for those most distressed by the unexpected downturn. Other problems, like the absence of bankruptcy law in Korea, came to light. As one expert noted, “capitalism without bankruptcy is like Christianity without hell. There is no systematic means of controlling sinful excesses.”48
Walmart Retailing Wizardry
Microprocessing was by no means the only engine of capitalism in the last decades of the twentieth century, though it was integral to one of the most astounding successes of the century, Walmart. Sam Walton started a chain of discount stores in Arkansas, Missouri, and Oklahoma in 1962. He began a remarkable ascent to the position of the world’s number one retailer by figuring out how he could buy directly from manufacturers and bypass the wholesalers, who added 4 to 5 percent to prices. Walton turned his Bentonville, Arkansas, headquarters into a distribution center that could receive bulk orders from suppliers and send them to particular stores through a fleet of Walmart trucks. Being able to buy big-city items at low prices made a big hit with customers in the small towns where Walton placed the stores in his expanding empire. Reminiscent of Tom Watson, Sr., at IBM, Walton became a larger-than-life figure for his employees. His style was simple, direct, and a bit intrusive. Everyone was on a first-name basis; he drove around his vast empire in a pickup truck. He hired young men, often the sons of farmers, and instilled in them a spirit of company loyalty that merged into a shared evangelical piety.
Like Ford and Carnegie, Walton didn’t know how to think small. When he wanted to start a new store, he’d fly over the chosen area, mark the spot most easily reached by a cluster of towns, land his plane, and buy up a piece of farm property.49 And then another and another until some seven thousand Walmart stores sprang up, many outside the United States. Even though Walton was born in 1918, he became the retailing maven of the information technology revolution. First he networked his stores with computer connections. He installed the most advanced inventory control. Whenever a cash register rings up a Walmart sale, a message goes to company purchasing agents, the manager of the store, and the vendor saying that another Hewlett-Packard printer or Disney DVD should be sent to Bentonville.
This just-in-time restocking system helped both Walmart and its suppliers. It also enabled Walmart executives to analyze what customers wanted in winter or summer, in flush times or lean ones, when celebrating an anniversary or anticipating bad weather. Walmart truck drivers keep in constant radio or satellite contact with headquarters to learn where to pick up items so that they can return from making deliveries with full loads. Expanding size and scope made this system more and more efficient. Computers track the pallets moving endlessly through the vast Walmart loading area. When its managers discovered that bar codes on items could be mutilated or unreadable, they switched to radio-frequency identification (RFID) tags that convey all the necessary inventory information through antennas and radio waves into computers.50
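The logic of that point-of-sale-driven restocking loop can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not Walmart’s actual software; the item names, stock levels, reorder points, and notification targets are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Sale:
    store_id: str   # which store rang up the sale
    sku: str        # which item was sold
    quantity: int   # how many units

# Hypothetical per-store stock levels and reorder points, keyed by (store, item).
stock = {("store_001", "HP_printer"): 12}
reorder_point = {("store_001", "HP_printer"): 10}

def process_sale(sale: Sale) -> None:
    """Decrement stock at the register; if the level falls to or below the
    reorder point, notify the parties the text describes: the store manager,
    the purchasing agents, and the vendor."""
    key = (sale.store_id, sale.sku)
    stock[key] -= sale.quantity
    if stock[key] <= reorder_point[key]:
        for party in ("store manager", "purchasing agent", "vendor"):
            print(f"RESTOCK {sale.sku} for {sale.store_id}: notify {party}")

process_sale(Sale("store_001", "HP_printer", 3))  # 12 -> 9, triggers restock
```

In a real deployment the print statements would be messages over the company’s satellite network, and reorder points would vary by store, season, and sales history, exactly the sorts of signals the surrounding paragraphs describe Walmart analyzing.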
Everyone who works for Walmart is kept on a tight electronic leash. Critics say Walmart became the behemoth of world retailing by driving down wages and scaling back benefits for its own employees and those of its suppliers. Its vendors claim that its ruthless bargaining has reduced everyone’s profit margins, sometimes to the vanishing point. Admirers point to the boon of low prices for low-income families. Less entranced observers focus on Walmart’s arrogance in insisting that all business with it be done in Bentonville, Arkansas. Sam Walton liked flying around rural America, but he didn’t want to do business in Chicago, Los Angeles, or New York. Vendors have to travel to Walmart headquarters, and many keep offices there. A Disney executive wryly noted that when his company, not known as a pushover, had disputes with Walmart, it always lost and had to go to Bentonville to do so.
The Walt Disney Company has been selling its DVDs, toys, interactive games, and apparel in Walmart’s seven thousand–plus stores. With Disney parks in Japan, France, and Hong Kong, in addition to the United States, the company has developed a large customer base for the consumer products that Walmart distributes. Of course, having maintained a record of high-quality entertainment, especially for children, since 1929, Disney was already doing well. It had long been the largest publisher of children’s books and magazines, which go into the homes of one hundred million customers in seventy-five countries. Mickey Mouse, who turned eighty in 2008, is the most recognizable icon in the world. More Americans (95 percent) recognize him than recognize Santa Claus. Movies made in the 1930s and reissued regularly have introduced Disney characters around the world. When rivals to Spain’s prime minister José Luis Rodríguez Zapatero wanted to blast his gentleness, they called him Bambi.
Competitors took notice of Walmart’s success and emulated its system; otherwise three other American discount retailers—Home Depot, Costco, and Target—would not be among the top thirty-five of the Fortune 500 companies. A retailer as extensive as Walmart has greatly enhanced the commercial reach of many American companies like Disney, not to mention introducing a host of foreign goods into American homes. Walmart imports one-third of all consumer durables produced in China.51 Like a great oak that suppresses all growth around it, Walmart stores have made ghost towns of countless small cities by drawing customers to their exurban locations. Walmart has also steadfastly fought off unionization of its 1.4 million employees, an effort that has provoked a vigorous anti-Walmart campaign. Some of those employees took the company to court for violation of wage and hour laws, winning a fifty-two-million-dollar settlement. A substantial number of other Walmart employees share Sam Walton’s linkage of evangelical fundamentalism with free market competition.
The conspicuous difference between America’s number one employer at the end of the twentieth century and the automakers forty years earlier is that Walmart has not pushed its employees into the middle class. Instead it has been a cause and an emblem of a seismic shift in the fortunes of American workers. After World War II, powerful labor unions, buoyed by decades of prosperity, successfully bargained for high wages, worker safety, and generous benefits. Jobs in American industry that had been unsafe and unremunerative became the basis for a generous standard of life. People now regret the loss of all those great jobs in the steel mills, but they have put the cart before the horse. They were lousy jobs before their work force was unionized.52 Over the succeeding decades, a variety of factors battered organized labor: vigorous political opposition from management, a shift of jobs away from industry to the harder-to-organize service sector, failures in union leadership, the intense competition of the world market, and the new technologies that made it easier for producers to move abroad, where there were pools of cheap labor. Most devastating, the American public lost its sympathetic connection to organized labor, no longer viewing a strong labor movement as good for the economy and essential for a vigorous democracy.53 Once this was gone, an enterprise like Walmart could shift public attention away from securing good-paying jobs to gaining access to cheap goods.
Had wages not been driven down in recent decades, low prices would not figure so prominently in people’s calculation of their interest. Globalization, that combination of marketing and messaging, has played its part in the shift of American labor from a coordinated, stable, securely earning group to an aggregation of individuals perpetually uncertain about the next paycheck. Moving in tandem, information technology and globalization have opened up markets and moved them further beyond the reach of any government’s control. Even profits have been rendered less secure. Dependency upon unseen forces and unexpected shifts in supply and demand has introduced a stratum of worry that undermines the efforts of labor to win back its public voice and bargaining power. In the last decade, union leaders have raised the call for international standards for labor, revisiting the old issues of living wages, safe working environments, and the eight-hour day for a global economy. The rapidity with which cheap labor centers have moved from countries like Mexico to China and then on to Vietnam suggests that this campaign may gather force in the next decade.
Globalization has thrust Western culture into all the communities of the world, evoking a powerful reaction to the intrusive images coming out of American movies and television shows. Criticism often comes from members of the educated elite who seem unable to appreciate that the commercialization of entertainment has delivered a powerful antidote to boredom. With more disposable income, millions around the world are patronizing the products of Hollywood and Bollywood and hundreds of other sites that produce drama and documentaries. Although an important economic force in itself, the American entertainment industry has influenced people’s material aspirations in a way that’s probably been more significant for economic development than its revenues. An alternative way of life comes in with its CDs, DVDs, videos, TV shows, and movies.
More Personal Choices with Capitalism
Capitalism has encouraged countless new drugs and medical procedures. Perhaps the most revolutionary of all for women has been effective birth control. This means that while busy generating wealth, the market has also increased the number of options in people’s lives, setting off what might be called a rebellion of the womb. Birthrates are dropping precipitously, as women in the West and Japan are having fewer children, and many are having none at all. This has shocked those who thought that they understood women’s natures. In many countries, there are not enough births to replace the existing population. At the same time, in much of the West the sexual freedom of the 1970s has altered attitudes and practices about pregnancy and marriage. Now half the babies born in France and the United States, for instance, are born “out of wedlock.” While the plight of the single mother is real, it is also the case that in the past illegitimacy carried the onus of disgrace, one that fell more heavily on women than on men.
What capitalism has uncovered is what many people really want. The value systems of the past grew out of scarcity and restraint. Traditions prioritized ways to behave and ennobled values compatible with the scarcity of food and other goods. The international press, which capitalism has promoted, has also carried to the most remote places on the globe the spirit of the Helsinki Accords on human rights, if not the actual text. In the first decade of the twenty-first century, an eighteen-year-old woman living among the Stone Age inhabitants of New Guinea fled to Papua to assert her right to choose her husband, and in 2008 a ten-year-old girl in Yemen found a court where she could seek a divorce.
Pharmaceutical companies in the United States and Europe came into their own with a cornucopia of new drugs in the 1980s and 1990s. Many of them targeted the aging populations around the globe; new antidepressants also became hugely successful among men and women no longer willing to accept melancholy as a fact of life. In addition to the research done in corporate laboratories, European and American universities have devoted billions to finding new cures for old maladies, in some cases eliminating old diseases altogether. In 1990 the U.S. National Institutes of Health and the Department of Energy launched the Human Genome Project, which became an international effort to identify the genes in human DNA. An astoundingly ambitious effort, the project determined the sequence of the three billion chemical base pairs that make up human DNA. Slated to take fifteen years, it finished early, in 2003, after a private geneticist, Craig Venter, turned the project into a race between competing sequencing efforts. Still in its early stages, genomics is pushing genetics in new directions, many with commercial possibilities. This has aroused fears about interfering with natural processes. There is also concern that disinterested science will become a thing of the past, with pharmaceutical companies lavishing gifts upon researchers. Of the top twenty pharmaceutical companies, twelve are American, two Swiss, and two German. Britain, Sweden, Japan, and France have one each.
Capitalism has proven its adaptability and its capacity to nurture technology and commercialize its findings. Firms, universities, and whole countries have built impressive learning centers, most of them open to outsiders. Curiosity about how the world works has been sustained; talent has often trumped wealth, as the computer revolution demonstrated. Since World War II, institutions to promote development and cooperation among Western nations have gained in influence, but not without generating discontent. The industrialized nations have often played their strong cards with indifference to the other players in the world market, ignoring calls for eliminating domestic subsidies, for instance. Not exactly a tradeoff but relevant has been the decline of violence—both public and private—since 1975. Even with the war stories that clog our newspapers and nightly news shows, casualties on various battlefields are minuscule compared with those of the first three-quarters of the twentieth century. Forget the carnage on a single day in a World War I battle, and just compare the forty-seven thousand American battle deaths in five years of fighting in Vietnam with the more than four thousand in six years in Iraq.
And then there is the ambiguous link between capitalism and democracy. The United States has been a vocal booster of both. The connection was actually forged much earlier when the market economy in late-seventeenth-century England revealed that quite ordinary people could take care of themselves and make reasonable decisions about their welfare. Over time these observations replaced earlier assumptions that men and women were woefully fickle creatures, derailed by their emotions and cursed with a tendency toward wickedness. With an improved view of human nature, sober thinkers could entertain the idea that rule of the people—democracy—might be a good form of government. The United States put these ideas into practice after its Revolution. A few years later it ratified a constitutional order that sharply curtailed the will of the majority and guaranteed a panoply of civil rights. The joining of capitalism and democracy in popular thinking caused a Russian woman, looking at her empty cupboard, shortly after the collapse of the USSR in 1991, to announce that there would be no democracy in her country until that cupboard was full. To her, evidently, majority rule meant abundance, presumably because both were found in the United States.
The disintegration of the post-World War II synergy after 1973 introduced a period of fluidity and fluctuation in the fiscal and commercial stability of Japan, the United States, and Europe.54 Critics look for structural changes that will undermine capitalism as a system. They often underestimate the two enduring strengths of capitalism: the encouragement of innovation and the capacity to create new wealth, along with the real satisfactions that wealth brings to a growing population of recipients.
The shame in the flourishing of capitalism is the stark inequality between nations and regions of the world. Measures of well-being like life expectancy, family purchasing power, and children’s nutrition reveal greater inequalities than fifty years ago.55 A statistician might point out that this spread is as much a function of the improvement in the lives of billions of people as of the want of others. It is very much a matter of where one directs the spotlight: when you illuminate America’s Rust Belt or Zimbabwe’s child mortality rate, capitalism looks like a failure. Cities like New York, Geneva, Seoul, and Tokyo tell a different story.
The race, we know, goes to the swift, but we should also be wise enough to realize that the analogy between a footrace and life is imperfect. Lots of evidence indicates that competitive international trade brings wrenching social and moral pain. When the creative destruction of competitive enterprise hits, the losers suffer. We have vigorous preservationist movements because we often learn too late that not all improvements improve. The second millennium went out with widespread anxiety that the computers we had come to depend upon might not be able to make the change from 1999 to 2000. It turned out that they could, and so could we. Less than two years later, an attack on one of the symbols of capitalism, the twin towers of New York City’s World Trade Center, made it obvious that we all were moving into an unknown and disturbing future. Still, the arrows don’t point in the same direction. New threats, new opportunities, new problems, new solutions, new perplexities, and new possibilities abound.