7

THE CORPORATE PARADOX, 1975–2002

In 1973, Sears, Roebuck and Company proudly unveiled the world’s tallest building. The skyscraper in downtown Chicago was a study in superlatives: 1,454 feet high, with 16,000 tinted windows and 80 miles of elevator cable. For Sears, it was a convenient bit of one-upmanship over Montgomery Ward. It was also a proclamation of the self-confidence of American capitalism. The American corporation bestrode the world. Why shouldn’t it allow itself a little self-indulgence in the form of a 110-story tower?

Paradoxically, that self-confidence proved to be both eminently justified and hopelessly deluded. The justification was found in the triumph of private-sector capitalism, spurred on by privatization and deregulation around the world: the next twenty-five years saw the joint-stock company vastly expand its territory, trampling many of its rivals as it did so. The delusion was that it would be companies like Sears that would thrive in this freer world. By 2002, the basic idea of a big company—a multidivisional, hierarchical institution that could offer a lifetime career to its employees—had been unbundled. The increasing fragility of individual firms had profound effects on the company’s relationship with the rest of society, with many of the doubts and frustrations bubbling to the surface after the Enron scandal of 2001–2002.

HAIL THE COMPANY

In the early 1970s, big companies were expected to play a pivotal role in supporting the postwar consensus. In return for economic stability and social peace, they were expected to look after other stakeholders. But that consensus was beginning to get more burdensome. The economy in many countries was in a wretched state. Unions had seldom been more powerful: in 1974, the miners toppled Britain’s Conservative government. And even in America, governments kept introducing bothersome rules. In 1971, Richard Nixon introduced controls on wages and prices. His administration also launched affirmative action and established some of the country’s most powerful regulatory agencies, such as the Environmental Protection Agency and the Occupational Safety and Health Administration.1

The deregulatory revolution began in Britain, where Margaret Thatcher was swept to power in 1979 by a wave of resentment over strikes and stagflation. Privatization was such a radical idea that the Tories scarcely mentioned it in their 1979 manifesto, and the government initially flirted with “corporatization”—making public companies act more like private ones. Eventually, Thatcher and her guru, Keith Joseph, rejected the idea as insufficient—like trying to “make a mule into a zebra by painting stripes down its back.”2 The mules had to be put back in the private sector.

In 1982 and 1984, the government privatized its share in North Sea oil and gas; this was soon followed by British Telecom, British Gas, British Airways, and British Steel. Even the water supply and the electricity grid were handed over to private companies. By 1992, two-thirds of state-owned industries had been pushed into the private sector. Privatization was invariably followed by the downsizing of the workforce (sometimes by as much as 40 percent) and the upsizing of executive salaries, both of which raised the public’s hackles, and the Conservatives made a complete hash of privatizing British Rail. But in general the new companies improved the services on offer.

European governments soon followed suit, prompted by the introduction of the single market in 1992. Venerable national champions such as Volkswagen, Lufthansa, Renault, Elf Aquitaine, and ENI were wholly or partly privatized. Deutsche Telekom became Europe’s largest privatization. In Latin America and Southeast Asia, governments also sold off telecom companies and utilities, too often to their political supporters. But the most radical expansion of the company took place in the former Communist world.

In 1992, the Yeltsin government embarked on a gigantic program of privatization. It first “corporatized” state-owned enterprises by rechartering them as joint-stock companies, with the state owning all the shares. It then issued vouchers to every Russian citizen (including children) to buy shares. By 1996, some eighteen thousand companies had been privatized, including more than three-quarters of all the larger industrial ones.3 Even more than in Western Europe, this was far from an unqualified success: many of the new companies were still run by the old nomenklatura, and millions of people lost their jobs. Yet, some 40 million Russians became shareholders, and the idea of the company persisted.

The Chinese, by contrast, tiptoed into privatization. In the 1990s, they allowed small entrepreneurs to create companies. They also created a small class of “red chips”—favored state firms that were allowed to register on the Hong Kong stock market. But at the Fifteenth Party Congress in 1997, the pace of reform increased dramatically. The Party decreed that most of the country’s state companies—some of which employed thousands of people—would be freed from state control and operated as “people-owned companies.” They would issue shares and be subjected to mergers and bankruptcy.

Meanwhile, the company expanded its range within its traditional hunting grounds. In Washington, D.C., too, the talk was of deregulation. Jimmy Carter started the ball rolling by deregulating the airline industry. Next came railroads and trucking. America’s biggest regulated company, AT&T, was broken up in 1984. As we shall see, bureaucrats on both sides of the Atlantic were more ready than ever to circumscribe the activities of companies indirectly, adding social obligations. But even at the European Union they worked to make companies easier to set up: work began in Brussels on standardizing a Europe-wide company directive.

Meanwhile, within the existing private sector, the publicly quoted joint-stock company consolidated its hold over capitalism. There were still plenty of different sorts of businesses. Indeed, financiers and tax accountants conspired to invent new ones: open the back pages of any company’s annual report in 2002 and you would find a lengthy list of “special-purpose vehicles” and limited partnerships based in the Antilles (assuming such things were detailed at all, which, in some cases, notably Enron, they were not). But these were merely the outer defenses. In general, the larger businesses got, the more they tended to converge on the joint-stock idea. Around the world, institutions that had stuck to partnership structures or mutual societies for decades—Goldman Sachs, Lloyd’s of London, a whole host of insurers, friendly societies, and farmers’ banks—converted to joint-stock companies.

THE UNBUNDLING OF THE COMPANY

Sears’s perspective on the global triumph of the joint-stock company must have been mixed. Within a decade of the Sears Tower appearing, America’s biggest retailer was fighting for its independence. While Sears’s managers had been going about their ancestral business of battling Montgomery Ward, the department-store market was disappearing, undermined partly by Wal-Mart, an upstart from Arkansas that Sears’s internal positioning documents did not even mention until the 1980s. In 1992, Sam Walton (who incidentally had been offered a job by Sears as a young graduate from the University of Missouri back in 1940) died as America’s richest retailer. In the same year, Sears made a net loss of $3.9 billion. A new chief executive, Arthur Martinez, saved the business only with the help of dramatic cutbacks. In 2001, a slimmed-down Sears announced that it was ceasing to be a department store: it would concentrate on clothes. As for Montgomery Ward, it had gone out of business in December 2000, laying off 37,000 people.

The rate at which large American companies left the Fortune 500 increased four times between 1970 and 1990. Names that once bespoke corporate permanence, like Pan Am or Barings Bank, disappeared. Corporate Icari, like Netscape and Enron (named the most innovative company in America by Fortune for six years in a row), emerged from nowhere and changed their industries but in one way or another flew too close to the sun and plummeted to the ground.

Far from being a source of comfort, bigness became a code for inflexibility, the antithesis of the new credo, entrepreneurialism. In 1974, America’s one hundred biggest industrial companies accounted for 35.8 percent of the country’s gross domestic product; by 1998, that figure had fallen to 17.3 percent. Their share of the nation’s workforce and its corporate assets also roughly halved.4 Big firms grew (by 1999, the average revenue of the top fifty companies in America reached $51 billion); they just grew much more slowly than small ones, which supplied most of the new jobs throughout the developed world. Big firms were much more likely than ever to go out of business: by 2000, roughly half the biggest one hundred industrial firms in 1974 had disappeared through takeovers or bankruptcy.5

The big firms that survived this maelstrom only did so by dint of bloody internal revolutions. In the first five years of the 1990s, IBM, a company once so stable that it refused to sack people during the Depression, laid off 122,000 of its workers, roughly a quarter of the total. Jack Welch’s two-decade reign at General Electric began in the 1980s with a period of shocking corporate brutality. A series of quasi-Maoist revolutions followed, complete with slogans (Work Out, Six Sigma, Destroyyourbusiness.com) and methods (getting thousands of managers to measure each other’s “boundarylessness,” and sacking the underperformers). By the time Welch retired in 2001, GE, which had repeatedly been voted the world’s most admired company, had become at its heart a services conglomerate. Despite this painful metamorphosis, the company still looked vulnerable, with analysts wondering whether Welch’s successor could keep the group together.

By the turn of the millennium, it no longer seemed odd that, at least for a time, the biggest challenge to the world’s richest man, Bill Gates, should suddenly spring up in a Finnish university dorm or that its product—the new operating system, Linux—should be given away for free. Such uncertainty proved too much for the Sloanist idea of a company. It was too slow, too methodical, too hierarchical, too reliant on economies of scale that were withering away. It also proved too cumbersome when it came to husbanding knowledge.

Brainpower had always mattered in business. But this truism became far more valid after 1975, as Peter Drucker’s knowledge workers finally began to make their weight felt. By the end of 2001, General Motors boasted net-book assets (tangible things like factories, cars, and even cash) of $52 billion, but its market value of $30 billion was only a fifth of that of Merck, a drug firm that could muster a balance sheet value of $7 billion, but had a far more valuable trove of knowledge. In 1999, America’s most valuable export was intellectual capital: the country raked in $37 billion in licensing fees and royalties, compared with $29 billion for its main physical export, aircraft.6

The story of the company in the last quarter of the twentieth century is of a structure being unbundled. Companies were gradually forced to focus on their “core competencies.” Ronald Coase’s requirement of the company—it had to do things more efficiently than the open market—was being much more sorely tested.

The managers of big companies liked to claim that new technology made it more efficient to bundle things together in a single company. In a few cases, this proved correct. Big media conglomerates were able to sell the same “content” through different channels. New technology to monitor drivers in the trucking industry in the 1980s made it cheaper for shippers to employ them directly, so trucking firms got bigger.7

Yet, for the most part, the world was moving in the opposite direction. Despite all the consolidation at the top of the media world, the number of small companies in places like Hollywood multiplied, with many of these specialists sucking most of the value out of the industry. More people left big firms to set up on their own: in Britain, for instance, the number of companies rose by 50 percent between 1980 and 1996.8 And as big companies were forced to refocus on the things that they could do cheaper or better than outsiders, they discovered that such “core competencies” lay not in tangible things, such as factory equipment, but in intangible values: the culture of discovery at Glaxo Wellcome, for instance, or the traditions of engineering at Mercedes-Benz.

It is perhaps not surprising that hollowing out was commonplace in Silicon Valley: Cisco managed to become one of America’s biggest manufacturers while only directly making a quarter of the products it sold. But the same thing was also happening in older firms. For instance, Ford’s River Rouge plant in Dearborn, Michigan, had once represented the zenith of integration, employing 100,000 workers to make twelve hundred cars a day, and producing almost everything itself, including its steel. Yet, by 2001, 3,000 people at River Rouge produced eight hundred Mustangs a day, mainly assembling parts sent in by outsiders, and Ford’s bosses were talking about the carmaker becoming a “vehicle brand owner,” which would design, engineer, and market cars, but not actually make them.9

ROUND UP THE USUAL SUSPECTS

There was something strangely backward-looking about all this. The networks of specialists, the ever-changing alliances, the constant sense of foreboding: these might have been familiar to the Merchant of Prato.

Three groups of people played a leading role in unbundling the corporation: the Japanese, Wall Street, and Silicon Valley. The creativity, carnage, and (sometimes) corruption that this trio unleashed in turn set the scene for a fourth player to reassert itself in the wake of the Enron scandal: the government.

In the mid-1950s, a young Londoner with a taste for the open road and the wind on his face would never have dreamed of looking beyond Britain to buy a motorbike. What could be more stylish than a Vincent Black Shadow, a Triumph Thunderbird, or a Norton Dominator? Harley-Davidson commanded similar feudal loyalty from Americans. A decade later, bikers everywhere were aware of an alternative. At first, the main attraction of Honda, Yamaha, Kawasaki, and Suzuki was price. But the four Japanese firms soon became the industry’s pioneers, introducing electric starters, four-cylinder engines, and five-speed transmissions, and launching new models every year. By the early 1980s, Harley-Davidson had been forced to seek government protection, and the British motorcycle industry was to all intents and purposes dead.

This story seemed emblematic. In 1980, Chrysler, obliterated by better Japanese cars, lost $1.7 billion and had to be bailed out by the government. Sony and Matsushita had sewn up the consumer-electronics industry, and the Japanese had Silicon Valley on the run. Meanwhile, the idea that Japanese capitalism could work only with Japanese workers was about to be shattered. During the 1980s, the Japanese made direct investments overseas of $280 billion, ten times the figure for the previous three decades.10 That still left Japanese companies with a smaller share of corporate America than the British, and a much smaller share of corporate Europe than the Americans.11 But their change in stature was dramatic. They picked up a string of corporate trophies, including Firestone, Columbia Pictures, Rockefeller Center, and two of the world’s best golf courses, Turnberry and Pebble Beach.

In 1992, Rising Sun surged to the top of the best-seller lists: its author, Michael Crichton, painted a picture of a fiendish master race of businessmen, marshaled into families of firms and backed by an inscrutable government, cannily outmaneuvering their naïve American peers. This view was hardly confined to cheap novels. In the early 1990s, the business sections of American bookstores were crammed with paeans to Japanese capitalism. In Europe, the myth of the unstoppable Japanese company neatly replaced the 1960s myth of the unstoppable American company. Japanese manufacturers reinvented the once-reviled British car industry, turning allegedly work-shy Geordies into paragons of productivity—and creating what one French car boss unsubtly dubbed “an off-shore aircraft carrier” to attack the Continent.

Of course, Crichton was wrong. In the eight years following the publication of Rising Sun, the Nikkei index lost around two-thirds of its value, while the NASDAQ, reflecting those battered American high technologists, rose fivefold. The Japanese model of the company proved to have its problems. Yet it still managed to change business around the world, not least because it represented a cohesive alternative to the Western model.12

At the heart of the Japanese model was Toyota’s system of lean production. After the war, Toyota’s bosses toured American factories and became obsessed by the amount of muda, or wasted effort, that they saw. They turned to the ideas not only of Peter Drucker but also of W. Edwards Deming (1900–1993), who focused on improving quality. (Deming was virtually unknown in America; Japan, by contrast, established a much-publicized annual Deming Prize for manufacturing excellence in 1951.) Toyota treated all the different parts of the production system—development, purchasing, manufacturing—as a seamless process, rather than a series of separate departments. It brought together several important ideas, such as total quality management (putting every worker in charge of the quality of products), continuous improvement (getting those workers to suggest improvements), and just-in-time manufacturing (making sure that parts get to factories only when they are needed). Workers were put into self-governing teams, and there was far more contact with suppliers.

These ideas were shocking to American managers. Under the Sloanist system, “quality control” was a department. The idea of allowing a worker to stop a production line seemed heretical. Indeed, many American companies initially missed the point, and decided that Japan’s success was based on technology: General Motors, for instance, spent billions on robots in a desperate attempt to catch up with Toyota. But gradually they began to learn from the Japanese. In 1987, America launched its own equivalent of the Deming Prize, the Baldrige. America’s high-tech industries discovered that as long as they embraced Japanese manufacturing methods, they could compete in innovation and design. Harley-Davidson used its period under government protection to change its working practices as well as to upgrade its machinery. By the early 1990s, it was back on level terms with its Japanese peers.

The other part of the Japanese model might be dubbed “long-termism.” Japanese companies believed in lifetime employment for all (something that their Western rivals tended to reserve for white-collar workers). Management was usually by consensus—again something Jack Welch would have found inconceivable. Japanese companies operated in families or keiretsu—reinventing the zaibatsu that General MacArthur had broken up—while American companies operated as independent units. And while Western companies tended to be accountable to short-term investors, Japanese firms financed their expansion with loans from their keiretsu banks. As for profits, these were deemed less important than market share. Japanese firms were prepared to tolerate very low returns on investment. And they had the firm support of the Japanese government, which protected some of the weaker keiretsu industries, and also turned a blind eye both to corporate-governance questions and to antitrust considerations.

In the late 1980s, this “long-term” stakeholder capitalism represented a real challenge to shareholder capitalism—not least because critics could also point to other apparent successes. South Korea’s chaebol, which had broadly copied the keiretsu system, were seen as the next threat. German companies were outperforming their Anglo-American peers in some high-profile industries, notably luxury cars. They, too, were protected from the distractions of short-term capitalism by their two-tier board systems, the argument went; they, too, ruled through consensus and works councils rather than through strikes and layoffs; they, too, enjoyed government support, rather than being left to sink or swim.

In the 1990s, admiration gave way to doubt. There were several reasons why Japan stagnated, not least macroeconomic mismanagement, but the stakeholder ideal was one of them. Consensus management often became an excuse for paralysis; lifetime employment not only proved impossible to maintain but also was a formidable barrier to promoting young talent. Clever young Japanese bankers and businesspeople migrated to Western firms that were prepared to give them more responsibility, not to mention money. Keiretsu firms tended to overproduce and overinvest when compared with independent firms. Even in the boom years of 1971 to 1982, they derived significantly lower returns on assets.13 In the 1990s, they drifted from one disaster to another.

The decade was also a humbling one both for the chaebol, which were flattened by the Asian currency crisis and charges of crony capitalism, and for German companies, which were hamstrung by the high labor costs that stakeholder capitalism entailed. The relative absence of “short-term” shareholder pressure proved a comparative weakness—all the more so because Anglo-Saxon firms, particularly American ones, were just beginning to benefit from genuine investor pressure.

BARBARIANS AND PENSION FUNDS

In the heyday of managerial capitalism, “shareholder activism” was limited to cases so extreme that they made a nonsense of the term, such as the occasion in 1956 when City of London institutions forced the dictatorial Sir Bernard Docker out of BSA/Daimler, after a series of revelations about his luxurious lifestyle, including Lady Docker’s frequent use of a bespoke gold-plated Daimler. The only real option open to unhappy shareholders was to do “the Wall Street shuffle”—to sell the shares and look elsewhere.

This comfortable state of affairs relied on shareholders remaining both dispersed and submissive. But over the next quarter of a century, the power of big investment institutions rose relentlessly: by 1980, they owned a third of the shares on Wall Street; by 2000, more than 60 percent. Pension funds grew particularly fast—from 0.8 percent of the market in 1950 to more than 30 percent by the end of the century. This handed enormous power to organizations such as the huge California Public Employees Retirement System. (“Guys,” the treasurer of California remarked to his peers in New Jersey and New York, whom he had summoned to a hotel room in 1984 during an oil battle, “in this room we control the future of Phillips. All we have to do is vote the proxy.”)14

It was not just a question of numbers, but of status. Here mutual funds, which also grew quickly (from 2 percent of the market in 1950 to 12 percent in 1994), and various savings schemes launched by governments, such as ISAs and PEPs in Britain and 401(k) plans in America, were crucial, because they forced savers to start checking up on investment managers and their quarterly performance figures. From being rather dowdy creatures, fund managers became downright glamorous, their bespectacled faces peering owlishly from the front of magazines, their words of wisdom captivating the masses on CNBC. Peter Lynch, who built Fidelity’s Magellan Fund into the largest in the market, became better known than the corporate managers whom he backed—and occasionally sacked.

Fund managers were quick to dump shares in order to boost their quarterly performance, a habit that helped drive up volume on the New York Stock Exchange from 962 million shares in 1962 to 27.5 billion in 1985 and 262 billion in 2000. But they were also more likely to interfere in the companies they owned. For a huge institution like Calpers, it was not easy to off-load the 7.2 million shares it owned in General Motors; when the company began to lose billions in the early 1990s, Calpers called for heads to roll. In 1992, GM’s board ousted its slow-moving chief executive, Robert Stempel (the first such coup since Pierre du Pont got rid of Durant seventy years earlier). Stempel was soon followed by his peers at IBM, Westinghouse, American Express, and Kodak.

Meanwhile, the investment world became infinitely more complex, as markets deregulated and computers popped up in dealing rooms. The development in the 1960s of the offshore “euromarket” in London prompted more flexible rules in New York, which in turn prompted more flexible rules everywhere else. By the early 1980s, the Western world had an integrated foreign exchange market and, for big firms at least, a global bond market. Soon mathematicians were dreaming up ever more ingenious forms of swaps, options, and other derivatives. Hedge funds appeared on the scene, while phrases such as “off-balance-sheet liabilities” acquired new meanings.

Yet, the Wall Street figures who struck most fear into managers were the corporate raiders—particularly now that they focused on using debt to dismantle companies. One of the pioneers was Hanson Trust, a British conglomerate that did half its deals in America. Set up in 1964 by two buccaneering Yorkshiremen, James Hanson and Gordon White, it rose to prominence in the 1970s by taking over a series of unglamorous but cash-rich businesses, such as brick and tobacco firms. Once a takeover had been completed, Hanson rapidly repaid some of the debt by selling assets (typically a now-unnecessary head office) and then set about pruning management. Any acquired business was theoretically up for sale almost immediately: Hanson was rather like an antique dealer, buying slightly dingy assets, polishing them up, and putting them back in the shop window.

Most of the other raiders also had a sense of swagger. T. Boone Pickens was a folksy oilman who found that he could make a fortune by failing to take over oil firms: thanks to the rising share price, he made $500 million in one foray at Gulf alone. Carl Icahn, a former stock-market trader who liked to pontificate about the way that “the corporate welfare state” was smothering the American economy, bought TWA. The most beguiling of all was Sir James Goldsmith (1933–1997). Having made several hundred million dollars asset-stripping Diamond International, a timber firm, he bought 11.5 percent of Goodyear in 1986. The tire company’s hometown, Akron, Ohio, responded furiously. A subsequent congressional hearing was dominated by a broadside by Goldsmith against the corrupting effect of entrenched management, but he eventually retreated, selling his shares at a profit of $93 million.

The battle that came to define the 1980s takeover boom occurred in 1988. RJR Nabisco was formed by the marriage, in 1985, of the Reynolds tobacco business and Nabisco Brands. But the stock market was unimpressed by the union, and the firm’s high-spending chief executive, Ross Johnson, began to talk to Wall Street about taking the company private. He chose Shearson Lehman, part of American Express, to advise him; but after a fierce battle, the company was eventually bought by Kohlberg Kravis Roberts, a buyout firm that Johnson had somewhat foolishly spurned, for $25 billion. Johnson was given a $53 million payoff; thousands of his former workers lost their jobs in the subsequent rationalization.

KKR was a new sort of organization—a leveraged buyout partnership that created a succession of investment funds. Formed in 1976 by three bankers from Bear Stearns, KKR had already taken over Beatrice Foods (in an $8.7 billion buyout) and Safeway ($4.8 billion) and a string of smaller firms. The structures varied, but KKR’s fund put in a relatively small portion of equity—in RJR’s case, only $1.5 billion. Following the same sort of procedures as Hanson, it then paid off the debt, ideally leaving the equity-holders sitting on an enormous profit.

At heart, leveraged buyouts were an attempt to make managers think like owners. In 1989, Michael Jensen of the Harvard Business School claimed that such private firms heralded the “eclipse of the public corporation,” because they resolved the conflict between owners and managers in a much clearer way. He heralded debt as a more demanding way of financing companies than equity: “Equity is soft, debt hard. Equity is forgiving, debt insistent. Equity is a pillow, debt is a sword.”15

In fact, the success rate of leveraged buyouts depended enormously on the price that was paid. The main winners were usually the original shareholders, who sold tired-looking companies at massive premiums: the price KKR paid for Safeway was 82 percent above its market value three months before. Buyouts were less popular with unions, who associated them with large redundancies. This was unfair: the seventeen companies bought out by KKR in 1977–1989 increased employment by 310,000 (and also spent more on research and development).16 But the process could be savage. At Safeway, for instance, where the company motto had been “Safeway offers security,” 63,000 people lost their jobs.17

LBOs, in turn, relied on another Wall Street invention: “junk bonds.” Wall Street had always traded bonds in distressed companies. The genius of Michael Milken was to create bonds specifically for this “non-investment-grade” market, opening it up to ventures that were too small or risky to issue regular bonds. Milken first began to push his “high-yield” bonds in the 1970s; by the 1980s, his firm, Drexel Burnham Lambert, dominated the junk-bond market, and his annual Predators’ Ball in Los Angeles had become a fixture for entrepreneurs and politicians. In 1982, President Reagan made Milken’s job a little easier by allowing banks and, crucially, savings and loan institutions to buy corporate bonds. Between 1975 and 1986, some $71.5 billion of junk bonds were issued, with an average yield of 13.6 percent.

In some ways, the merger boom ended in disaster. Junk bonds lived up to their name: around a fifth of the bonds issued from 1978 to 1983 had defaulted by 1988.18 Many of the thrifts that bought junk bonds went bankrupt, as did Drexel Burnham Lambert itself in February 1990. Milken was indicted on almost one hundred counts of racketeering—and eventually sent to jail. Across in Britain, Hanson overreached itself: after an unsuccessful play for ICI, its two founders effectively broke up the company in 1996. Goldsmith ended his career as an antiglobalization crusader. Wall Streeters were pilloried for their greed in The Bonfire of the Vanities (1987), Liar’s Poker (1989), and Barbarians at the Gate (1990).

By the end of the century, shareholders had plainly failed to restrain managerial power in the way that many had hoped. Nine in ten big American companies were incorporated in Delaware, a state whose laws favored managers over shareholders. The experiment of making managers behave more like owners had been perverted, via excessive use of stock options, into a get-rich-quick scheme for bosses. By the end of the 1990s, the chief executives of big companies took home an average of $12.4 million—six times as much as in 1990.19 A couple of years later, the Enron scandal revealed managerial abuses on a scale that the staid Company Men of the 1950s could never have imagined. Hostile takeovers, meanwhile, remained far rarer in continental Europe and Japan, where companies’ managers were much better protected by their close ties to banks.

Still, these qualifications should not obscure how much the 1980s merger boom changed companies. LBOs, for instance, did not go away: indeed, the device spread to Europe and eventually Japan. Many of the management techniques pioneered by LBO funds, such as incentivizing managers with stakes in their businesses, spread widely. As for takeovers, ten years after the world gawked with disbelief at the size of the RJR Nabisco deal, three takeovers trebled the amount: Glaxo Wellcome bought SmithKline Beecham for $76 billion, Pfizer paid $85 billion for Warner-Lambert, and Exxon paid $77 billion for Mobil. In 2000, Vodafone, a British mobile phone company, stunned the German establishment by winning control of Mannesmann in a hostile takeover. And the power of investment managers continued to grow. By 2002, three groups—Fidelity, the Union Bank of Switzerland, and Allianz—each controlled about $1 trillion in assets.

These investors were far from omnipotent, as Enron showed. Yet, with the barbarians and pension funds at the gate, company managers were continuously reminded of the aphorism “Money goes where it wants and stays where it is well treated.” Companies had to ask hard questions about their scope. Investors, with a few prominent exceptions, wanted companies to be good at one thing: diversification was something they could do themselves. And they were remorseless about punishing bureaucratic flab. It was no coincidence that the main corporate heroes of the period all hailed from a place famous for small, agile firms—the thin sliver of land between San Jose and San Francisco that had once been known as the Valley of Heart’s Delight.

SILICON VALLEY

In 1996, with the Internet revolution gathering pace, John Perry Barlow, a Grateful Dead songwriter and cyber guru, issued the following warning: “Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of the Mind. I ask you of the past to leave us alone. You are not welcome among us.” Silicon Valley’s penchant for hyperbole can be grating. All the same, the business ideas that the Valley pioneered, combined with the technology it invented, helped unbundle the company still further.

Silicon Valley’s story actually dates back to 1938, when David Packard and a fellow Stanford engineering student, Bill Hewlett, set up shop in a garage in Palo Alto. In the 1950s, Hewlett-Packard, along with several other Stanford-inspired companies, moved into the university’s new industrial park. Over the next two decades, this cluster of small firms multiplied slowly with plenty of help from the government. By one count, in the period 1958 to 1974, the Pentagon paid for $1 billion worth of semiconductor research. Packard served as deputy secretary of defense in the first Nixon administration.

In the 1970s, the Valley began to acquire its identity. The name “Silicon Valley” was invented in 1971 by a local technology journalist—reflecting the success of its memory-chip makers. Meanwhile, the Valley began to be taken over by the sort of people who protested against the Vietnam War, rather than helped run it. In 1976, Steve Jobs and Steve Wozniak set up Apple Computer in the Jobs family garage. But the 1970s boom was brought to a halt by the Japanese. On “the black day,” March 28, 1980, Richard Anderson, an HP manager, revealed that tests had shown that Japanese memory chips outperformed the Valley’s. To its shame, the Valley turned to the American government for protection, but it also successfully changed shape, outsourcing its manufacturing and diversifying from chips into computer software.

This metamorphosis underlined what set the Valley apart.20 America’s other high-tech center, Boston’s Route 128, boasted just as much venture capital, and just as many universities. Yet, when both clusters fell from grace in the mid-1980s, Silicon Valley proved far more resilient. The reason was structural. Big East Coast firms such as Digital Equipment and Data General were self-contained empires that focused on one product, mini-computers. Silicon Valley’s network of small firms endlessly spawned new ones.

It was for much the same reason that the Internet business found a natural home in northern California. The late 1990s saw an unprecedented number of young Valley firms going public. In 2000 alone, some $20 billion of venture capital was pumped into the region. By then, the Internet bubble was already bursting. Even allowing for that (and all the Valley’s other drawbacks, such as high house prices, terrible traffic, and unrelenting ugliness), the region still counted as the most dynamic industry cluster in the world. By 2001, Silicon Valley provided jobs for 1.35 million people, roughly three times the figure for 1975, its productivity and income levels were roughly double the national averages, and it collected one in twelve new patents in America.21

Silicon Valley changed companies in two ways. The first was through the products it made. At the heart of nearly all of them was the principle of miniaturization. In the last three decades of the twentieth century, the cost of computing processing power tumbled by 99.99 percent—or 35 percent a year.22 Computers thrust ever more power down the corporate hierarchy—to local area networks, to the desktop, and increasingly to outside the office altogether. Meanwhile, the Internet reduced transaction costs. By the end of the century, General Electric and Cisco were forcing their suppliers to bid for their business in on-line auctions; and eBay, the main independent on-line auction house, had 42 million users around the world. In the last three months of 2001, those eBay customers listed 126 million items and spent $2.7 billion. Previously, those transactions, if they had happened at all, would have involved thousands of intermediaries.

The other way in which Silicon Valley changed the company was by pioneering an alternative form of corporate life. Some of its companies, such as Hewlett-Packard and Intel, lasted for decades, but the Valley epitomized the idea of “creative destruction.” An unusual amount of the Valley’s growth came from gazelle companies—firms whose sales had grown by at least 20 percent in each of the previous four years. It also tolerated failure and even treachery to an unusual degree. Many would argue that its real birth date was not 1938 but the moment in 1957 when the so-called “traitorous eight” walked out of Shockley Laboratories to found Fairchild Semiconductor, which in turn spawned Intel and another thirty-six firms. Virtually every big firm in Silicon Valley was a spin-off from another one.

Right from the beginning, it was a place where ties were optional, and first names compulsory. In 1956, the same year that The Organization Man was published, William Shockley (1910–1989) took all his colleagues out to breakfast in Palo Alto to celebrate the fact that he had won the Nobel Prize for inventing the transistor: a photograph shows only two of them wearing ties, and nobody wearing a suit.23 Meritocracy was crucial: youth was promoted on ability alone, and the Valley was unusually open to immigrants. In 2001, one resident in three was foreign-born.

By the end of the twentieth century, you could see the gradual Siliconization of commerce. The hierarchies of big firms everywhere became looser. Manpower, a temporary worker agency, replaced General Motors as America’s biggest employer. Most economies relied on gazelle firms to produce jobs (by some counts, they provided three-quarters of the new jobs in the mid-1990s). Firms everywhere discovered the benefits of alliances, partnerships, joint ventures, and franchises. By one estimate, around a fifth of the revenues of America’s biggest one thousand companies in 1997 came from alliances of one sort or another. The border of the company, so rigidly defined under Alfred Sloan, became fuzzy—or, in the jargon of the time, “virtual.”

As the previously firm lines around companies began to blur, some old business ideas began to seem extremely modern. Business experts like Michael Porter pointed to the competitive advantages buried in the guildlike networks of German engineering firms. East Asia, the most exciting area of business geographically, was dominated by “bamboo networks” of overseas Chinese companies. Rather than trying to set up state companies, governments were now fixated by trying to foster entrepreneurial clusters of their own in places as far apart as the South of France, Malaysia, and Taiwan.

UNBUNDLED, FLAT, AND BORDERLESS

To ascribe everything that happened to the company in the final quarter of the twentieth century to Silicon Valley, Wall Street, and the Japanese would be an oversimplification. But this trio provided the discordant background music to a period of mounting uncertainty.

Nothing better symbolized the loss of confidence than the rise of the management-theory industry. As companies rushed to outsource everything in sight, many even outsourced their thinking to a growing number of “witch doctors.” By 2000, McKinsey had four thousand consultants, ten times the number in 1975. Other companies—notably IT firms and accountants—established consulting businesses. The accountants at Arthur Andersen were so jealous of the fees being charged by their colleagues at Andersen Consulting that they tried to invade the business themselves, a move that led to one of the most expensive divorces in corporate history. This was a time when fads such as business-process reengineering sped around the world at a dizzying pace, and when businesspeople rushed to buy books that distilled the management wisdom of Siegfried and Roy, the English rugby captain, Star Trek, and Jesus, CEO.

As the company jumped through these hoops, its relationship with the rest of society changed again. By the 1990s, companies had begun to move their headquarters out of city centers. Rather than displaying their might to the world, they preferred to retreat into low-slung business campuses in the suburbs. The cult of the bottom line was forcing companies to do away with what their bosses deemed to be unnecessary expenditure, even if it meant abandoning their old civic responsibilities.

Philadelphia, which had done so well out of the robber barons, got clobbered. Scott Paper had been a pillar of civic life in Philadelphia for decades. But in 1993 it posted a loss, and in 1994 it brought in Al Dunlap to boost its flagging performance. “Chainsaw Al” moved the headquarters to Florida, laid off thousands of workers, reneged on a promise to pay the final $50,000 of a $250,000 pledge to the Philadelphia Museum of Art, and finally sold the business to Kimberly-Clark.24 Another staple of local civic life, Drexel and Company, wound up as part of Drexel Burnham Lambert, and was forced out of business by federal prosecutors. SmithKline merged with a British company, Beecham.25 Meanwhile, many of Philadelphia’s new companies preferred the anonymity of Route 202 to the expensive amenities of downtown.

For Company Man, the period was brutal. All his basic values were under assault—loyalty, malleability, and willingness to put in face time. The new hero of the business world was the tieless entrepreneur rather than the man in the gray-flannel suit. Women began to provide competition, not just secretarial assistance. Jack Welch complained that lifetime employment produced a “paternal, feudal, fuzzy kind of loyalty”—and forced his employees to compete to keep their jobs.26 In IBM towns, like Endicott and Armonk, IBM men lost more than their jobs; they lost access to the cocoon of institutions, such as the IBM country club, with which the company had long protected them.

This devastation can be exaggerated. Some company towns, such as Redmond, boomed during the period. In Delaware, Du Pont may have faded (its workforce was slashed from 25,000 to 9,000), but its role in local society was partly assumed by MBNA, an uppity credit-card firm that employed 10,500 people in the state by 2002.27 Company Man did not so much die as enroll in a witness-protection program. Successful companies usually possessed powerful corporate cultures—something that is impossible to maintain without a hard core of loyal employees. Under Welch, General Electric might not have believed in lifetime employment, but it certainly hired a distinct brand of person wherever it went (broadly, a competitive male, with a keen interest in sports, usually from a second-tier university). The Microserfs in Redmond may not have worn blue suits, like IBM’s foot soldiers, but they still boasted a strong clannish spirit.

As for the much-mooted death of the career, the aggregate statistics suggest that workers in the 1990s were changing their jobs only slightly more frequently than they did in the 1970s.28 The biggest novelty was the sharp rise in temporary jobs for women. With the possible exception of Britain, where almost a quarter of the workforce was part-time, it would be hard to make the case that the job was disappearing; and even harder to make the case that workers wanted it to disappear.29 The biggest change was psychological: even if people were continuing to work at companies, the old certainties of employment and position had patently gone. People talked about employability, not lifetime employment. Career paths followed a more topsy-turvy route; and everyone began to work longer hours. Sociologists such as Richard Sennett (The Corrosion of Character, 1998) worried about the growth of anxiety even in those people who had done well in the system.

REGULATORY CAPITALISM

These changes began to pose questions about the company’s relationship with the state. By 2002, society’s attitude toward the corporate sector seemed two-faced. On the one hand, governments had set the company free, deregulating markets, loosening trade barriers, and privatizing state-owned companies. On the other hand, politicians and pressure groups were looking for ways to turn the company to social ends.

Many governments had given up power reluctantly anyway. The French, for instance, carefully rigged their privatizations so that they could preserve as much state planning as possible, selling packages of shares to friendly strategic investors. They thought nothing of introducing a thirty-five-hour week in 2000. Throughout the 1990s, European governments, both individually and through the European Union, increased red tape dramatically in the name of the common good. Consumers had to be protected, safety standards had to be met, and products (including, famously, the banana) had to be defined. According to the British government’s own regulatory impact assessments, the European working-time directive alone, which set a maximum forty-eight-hour week, was costing the country’s businesses more than £2 billion a year by 2001.30 According to the same figures, Tony Blair’s Labour government had added £15 billion worth of regulatory costs in its first five years.

The American government also increased its grip on the company through laws governing health, safety, the environment, employee and consumer rights, and affirmative action. Often the effect was not just more red tape but also more lawsuits. The 1991 Civil Rights Act, signed by George Bush senior, imposed huge regulatory burdens on businesses. It also created a litigation bonanza by increasing attorneys’ fees and allowing claims for “emotional injury.” American managers were more restricted than ever before in performing one of their most basic functions—hiring and firing. They could not ask about such things as an applicant’s family or his health. Bill Clinton was a still more fervent micromanager. By the end of the century, the cost to American firms of meeting social regulations was $289 billion a year, according to the Office of Management and Budget, a figure that other experts reckoned was only a third of the real amount.31 And there were other “costs” to America—particularly the ever-growing amount of time that companies put into political lobbying (both in Washington, D.C., and in various state capitals) to twist this sprawling thicket of rules to their own advantage.

Meanwhile, both the British and American governments began to niggle away at one of the tenets of “Anglo-Saxon” shareholder capitalism: the idea that companies should be run for their shareholders. During the 1980s, about half of America’s fifty states introduced laws that allowed managers to consider other stakeholders alongside shareholders. Connecticut even introduced a law that required them to do so. In Britain, the 1985 Companies Act took the same approach, forcing directors to consider the interests of employees as well as shareholders.

If the main thrust of regulatory capitalism was social, there was also a corporate-governance element as well. Worried by the buccaneering spirit that deregulation had unleashed and by the piratical excesses of some corporate captains, governments sporadically tried to call the bosses of companies more firmly to account. In some cases, regulators breached the corporate veil—holding directors personally responsible for their firms’ actions. In Britain, for instance, the 1986 Insolvency Act made directors liable for the debts a company incurred after the point when they might reasonably be expected to have closed it down. But the real onslaught occurred in America, after the New Economy bubble burst.

ENRON AND BEYOND

The 1990s was a decade of infatuation with companies. The number of magazines devoted to business multiplied, even as the ages of the smiling chief executives on their covers plummeted. But the adoration went well beyond young whippersnappers. When Roberto Goizueta, the veteran boss of Coca-Cola, tried to justify his $80 million pay packet to a shareholder meeting, he was interrupted four times—with applause. It was thus hardly surprising that, in January 2001, the new administration tried to capitalize on the prevailing probusiness mood by presenting George W. Bush as America’s first M.B.A. president; nor that he stuffed his cabinet with chief executives; nor even that he pursued a shamelessly probusiness policy, allowing companies to help craft a new national energy policy and hinting at repeal of some of Bill Clinton’s social regulation.

A year and a half later, everything had changed. By the summer of 2002, Bush had signed into law the Sarbanes-Oxley Act, arguably the toughest piece of corporate legislation since the 1930s. Meanwhile, many of the bosses who had once graced business covers were now facing criminal charges. The American people were furious: 70 percent of them said that they did not trust what their brokers or corporations told them and 60 percent called corporate wrongdoing “a widespread problem.”32 Even bosses who had not been caught doing anything wrong, such as Hank Paulson of Goldman Sachs and Andy Grove of Intel, felt obliged to apologize to the public for the sorry state of American capitalism.33 Meanwhile, in continental Europe, the two bosses who had most obviously proclaimed themselves disciples of the American way—Thomas Middelhoff of Germany’s Bertelsmann and Jean-Marie Messier of France’s Vivendi—were both sacked.

The general catalyst for this revolution was the bursting of America’s stock-market bubble. Between March 2000 and July 2002, this destroyed $7 trillion in wealth—a sum equivalent to a quarter of the financial assets owned by Americans (and an eighth of their total wealth). The spread of mutual funds and the change from defined-benefit to defined-contribution retirement plans meant that this was a truly democratic crash: most of the households in America lost money directly.

The specific catalyst was, ironically enough, one of the firms that Bush had turned to for help in designing his energy policy. Enron was a darling of the 1990s—a new form of energy company that did not rely on drilling and gas stations but on teams of financial traders. A Harvard Business School case study was approvingly titled “Enron’s Transformation: From Gas Pipelines to New Economy Powerhouse.”

Unfortunately, the energy trading company took its penchant for innovation just a little too far. Its managers used highly complicated financial engineering—convoluted partnerships, off-the-books debt, and exotic hedging techniques—to hide huge losses. And as those losses began to emerge, they sold millions of dollars’ worth of company stock while their employees were prohibited from doing likewise. All the corporate overseers who were employed to monitor Enron on behalf of its shareholders—the outside directors, auditors, regulators, and stockbroking analysts—were found wanting. Despite four centuries of corporate advancement, the hapless shareholders turned out to be no better protected or informed than the London merchants who dispatched Edward Fenton to the East Indies in 1582, only to see him head off to St. Helena, hoping to declare himself king.

Enron’s collapse led to the destruction of Arthur Andersen, a giant accounting firm that had signed off on its books (and had also made a mint providing consulting advice). The government charged Andersen with obstructing justice by willfully destroying Enron documents. Soon afterward, WorldCom followed. The telecom giant, it emerged, had perpetrated one of the most sweeping (and crude) bookkeeping deceptions in corporate history, overstating a key measure of earnings by more than $3.8 billion over five quarters, starting in January 2001. Meanwhile, its boss, Bernard Ebbers, had apparently treated the company as a piggy bank, borrowing hundreds of millions of dollars when it suited him. WorldCom’s stock, which peaked at $64.50 in 1999, stopped trading at 83 cents, costing investors about $175 billion—nearly three times what was lost in the implosion of Enron.

A stream of other scandals followed: Xerox and AOL Time Warner had to revise their accounts. The former boss of Tyco, who had taken home $300 million in just three years, was charged with evading $1 million in sales tax on paintings. The boss of ImClone was accused of insider trading; the founder of Adelphia was charged with defrauding investors. (Nobody was particularly surprised when a survey showed that 82 percent of chief executives admitted to cheating at golf.)34 Meanwhile, investors fumed when they discovered that Wall Street analysts had been misleading them with Orwellian doublespeak: to the cognoscenti, a “buy” recommendation meant “hold” and “hold” meant “run like hell.”

What had gone wrong? Two explanations emerged. The first, to which the Bush administration initially subscribed, might be described as the “bad apples” school: the scandals were the product of individual greed, not a flawed system. The bankruptcies and the arrests would be enough: the founder of Adelphia, John Rigas, was forced to do a “perp walk,” clamped into handcuffs and paraded in front of the cameras.

By contrast, those of the “rotten roots” school argued that the problems went much deeper. In their view, the 1990s had seen a dramatic weakening of proper checks and balances. Outside directors had compromised themselves by having questionable financial relationships with the firms that they were supposed to oversee. Too many government regulators had been recruited from the ranks of the industries that they were supposed to police. Above all, auditors had come to see themselves as corporate advisers, not the shareholders’ scorekeepers. In short, the agency problem—the question of how to align the interests of those who ran companies with the interests of those who owned them—had returned.

To begin with, it seemed that little would happen. As late as June 2002, Paul Sarbanes, the chairman of the Senate Banking, Housing, and Urban Affairs Committee and a longstanding critic of lax regulation in the boardroom, could not even muster sufficient votes on his own committee to pass a package of auditing reforms, in the face of frantic lobbying by accountants and skepticism from the White House. But as the scandals spread, the politicians realized they had to do something. In mid-July the Senate passed a toughened version of the Sarbanes bill by 97–0, and the president rapidly signed it into law. The law is particularly tough on auditors: the accounting partners who oversee the audit of a specific company have to be rotated every five years, and accounting firms are banned from providing consulting to companies they audit. The law also requires CEOs and chief financial officers to certify the accuracy of their financial reports, and it creates a new crime of securities fraud, making it punishable by up to twenty-five years in jail.

This was indeed a victory for the “rotten roots” school—probably the most important change in the oversight of American companies since the 1930s. But that claim merits two caveats. First, it was far less revolutionary than the barrage of laws that Roosevelt forced through (which, among other things, created the SEC and separated investment banking from commercial banking). The main contribution of Sarbanes was to tidy up the bit Roosevelt left out—by establishing clear standards and oversight for the accounting industry. Second, plenty of people in the “rotten roots” school thought that the company needed more tinkering: that a majority of its directors should be independent; that chief executives should be made still more responsible for their firms’ performance; that stock options should be formally curtailed; that auditing firms (not just the partners within them) should be rotated.

These shortcomings opened up the possibility that the backlash was only just beginning. Looking back through history, most periods of gaudy capitalism have been followed by a reaction—often an overreaction (the Bubble Act arguably did almost as much harm as the South Sea Bubble). Yet the chances were that the various seers who proclaimed that the American company would never be the same again were whistling in the wind.

To begin with, the “bad apples” school had been proved right in one respect: the market began to correct itself. Older faces began to appear at the tops of companies; large numbers of companies tried to improve the performance of their boards, not least because their directors were worried that they might be exposed to the sort of problems that had ruined the lives of Enron’s directors; many big companies, including Coca-Cola, announced that they would start expensing stock options; accountancy firms became tougher with their clients.

Next, politics was more on the side of the bosses than people realized: the fact that most Americans now owned stocks loaded the dice heavily against full-throated populism. Stock owners have a natural interest in improving the oversight of companies (through better accounting, more independent directorships, better regulation of corporate pensions); they have less interest in imposing constraints on companies’ performance.

Most fundamentally, despite the crowing in Europe, the fuss about Enron looked less like a revolution against American capitalism than a restatement of its basic principles. Forcing auditors and outside directors to represent shareholders was not a challenge to the company, but a reaffirmation of it. There was nothing particularly “corporate” about hiding debts in dodgy partnerships or spending $6,000 on a gold-and-burgundy floral-patterned shower curtain—and then charging it to the company that you ran (as the head of Tyco did). This was not a backlash against business but against bad business practices. Reform should ultimately increase the appeal of shareholder capitalism abroad.

All the same, it is hard to avoid the fact that the American corporation—the trendsetter for the company for most of the previous century—ended the period covered by this book not on a high note, but in a slough of self-doubt, with society peering questioningly at it. An old debate about whether companies were supposed merely to make money legally or to be active instruments for the common good had appeared once again (the difference this time being that while companies as a whole were vastly more powerful, individual companies were vastly more fragile). A full-scale backlash might be avoided, but there was every likelihood of continuing encroachment by government—of more rules, more obligations, more responsibilities. Many of the frustrations and expectations about what the company owed to society focused on one particular sort of company that is the focus of our next chapter: the multinational.
