
IN DEFENSE OF THE PAX AMERICANA

Small Wars in the Twenty-First Century

The past is an uncertain guide to the future, but it is the only one we have. What can it teach us about small wars and how to fight them?

The most obvious point, but one well worth making, is that the types of missions that the armed forces found themselves undertaking after the end of the Cold War—missions at odds with the military’s traditional conception of its own role, as embodied in the Powell Doctrine—are nothing new. Specifically, there is absolutely nothing novel about:

• Wars without a declaration of war. There is a myth prevalent in some quarters that the Korean War was America’s first “undeclared war,” and that presidents since then have been traducing the Constitution to deploy military forces abroad on their own initiative. Congress even passed the War Powers Act after the Vietnam War to limit presidential war-making authority.

True, Korea was America’s biggest undeclared war up to that point, but it was hardly the first. All the wars chronicled in this book were undeclared, starting with the Tripolitan War, when Thomas Jefferson initially sent a naval squadron to the Mediterranean without bothering to ask for congressional approval. As the Small Wars Manual states, “small wars are operations undertaken under executive authority.” On those few occasions when a president sought congressional authorization, he usually made clear that it was nonbinding. Woodrow Wilson, for instance, ordered the marines to land in Veracruz in 1914 before the Senate had finished debating the matter. The Philippine War, too, broke out before the Senate ratified the Treaty of Paris annexing the archipelago. Congress has generally voted a declaration of war only in the event of hostilities with another major industrialized power and sometimes not even then; witness the quasi-war with France in 1798–1800. Military operations in Third World nations have seldom been seen to require a formal declaration of war.

That doesn’t mean Congress has no role to play; it can cut off funding for an operation it finds objectionable. Various lawmakers tried to do precisely that during the U.S. intervention in Nicaragua from 1926 to 1933. Although these attempts never completely succeeded, they encouraged the Hoover administration to hasten the withdrawal of troops. Four decades later, Congress was more successful in cutting off funding for any further U.S. involvement in Indochina.

• Wars without exit strategies. The U.S. military stayed continuously in Haiti for 19 years, in Nicaragua for 23 years, in the Philippines for 44 years, in China for almost 100 years. These long-term deployments should be no surprise. After all, the U.S. still has not found an “exit strategy” from World War II or the Korean War; American troops remain stationed in Germany, Japan, Italy, and South Korea more than a half century after the end of the wars that brought them there. As of this writing, U.S. troops remain in the Middle East to contain Iraq a decade after the Gulf War ended, a mission that results in regular air attacks against Iraq and some U.S. casualties (in the Khobar Towers and USS Cole bombings).

This runs counter to the assumption implicit in the Powell Doctrine that U.S. troops should win a battle and go home. But home where? To bases in Europe and Northeast Asia—themselves the legacy of prior wars? Or to bases back in the U.S.A.? If the latter, why station troops on the mainland? This puts them farther from any likely action, makes life more boring (would you rather live in the Mojave Desert or Europe?), and does not save much money. Stationing troops abroad can actually be cheaper if Washington duns American allies, as it should, to pay part of the bill. Moreover, imposing an artificial exit deadline prior to any U.S. intervention severely limits its effectiveness by encouraging America’s enemies to await its departure.

• Wars that are fought less than “wholeheartedly.” Most of the wars chronicled in this book were fought for extremely limited objectives—to free hostages (Barbary War, Boxer Uprising), to exact revenge (Canton 1854, Korea 1871, Pershing Punitive Expedition), to overthrow a dictator (Veracruz 1914). Some had more ambitious objectives, such as annexing the Philippines or pacifying Haiti, but even where the ends were unlimited, the means remained severely circumscribed. Even in the biggest of the small wars chronicled here, the Philippine War, there was no mass mobilization. Only 70,000 soldiers were sent, and they were all volunteers. More typical was Haiti, which never had more than 3,000 U.S. Marines and little if any heavy weaponry.

The notion that U.S. troops must be deployed only in overwhelming force was alien to the armed forces of the past. In China, except for a brief period during the Boxer Uprising in 1900 and the Nationalist Revolution of 1927–1928 (when the U.S. had about 5,000 servicemen in the country), the U.S. never fielded forces remotely capable of dealing with any serious adversary.

• Wars in which U.S. soldiers act as “social workers.” This phenomenon, said to be prevalent in the 1990s, raises hackles among veterans who complain, “It wasn’t like that in my day. Our job was fighting wars, period.” It is true that during World War II and the Cold War the U.S. military concentrated for the most part on its conventional war-fighting capabilities, and for good reason. But this is far from the norm in American history, and it was not even exclusively true during the 1941–1989 period (witness the 101st Airborne Division escorting African-American students to school in Little Rock, Arkansas, in 1957).

Soldiers follow orders, and presidents have often found it convenient or necessary to order the armed services to perform functions far removed from conventional warfare. Throughout U.S. history, marines at home and abroad have found themselves providing disaster relief, quelling riots, even guarding mail trains. Soldiers also have often acted as colonial administrators—in the Philippines, Haiti, Nicaragua, and Veracruz, to say nothing of post–World War II Germany and Japan or the post–Civil War South.

In fact occupation duty is generally necessary after a big war in order to impose the victor’s will on the vanquished. If ground forces win a battle and go home, as the Powell Doctrine advocates and as actually happened in the Gulf War, the fruits of victory are likely to wither on the vine. Only boots on the ground can guarantee a lasting peace.

• Wars in which America gets involved in other countries’ internal affairs. It is often said that the Kosovo War marked a new departure: the first time the West tampered with the supposedly sacred principle of sovereignty, embodied in the 1648 Treaty of Westphalia, which held that countries should be free to run their own affairs without outside interference. In reality, Westphalian principles were often traduced in Europe and seldom considered binding by either Europeans or Americans in their dealings with less developed parts of the globe.

The U.S. has been involved in other countries’ internal affairs since at least 1805, when, during the Tripolitan War, William Eaton tried to topple the pasha of Tripoli and replace him with his pro-American brother. The rest of this book is a record of U.S. involvement in the internal affairs of other states, from Samoa to Nicaragua to China. Although the U.S. wanted to preserve the Open Door and protect China from being carved up into colonies by the Europeans and Japanese, American respect for Chinese sovereignty was far from absolute. What else explains America stationing troops in three of China’s largest cities and sending gunboats to patrol its largest river?

• Wars without a “vital national interest.” The search for a “vital national interest” was not much of an issue during the Cold War because almost every regional intervention (e.g., Grenada in 1983 or the Dominican Republic in 1965) could be linked, rightly or wrongly, to the fight against communism. But since the end of the Cold War, this has become a prime source of controversy. Various critics assert that U.S. armed forces should not be in Kosovo or Haiti or Somalia because, they claim, there is no national interest there. It all depends of course on how one defines “national interests.” Isolationists like Patrick Buchanan say that U.S. interests are not threatened unless someone attacks American territory. Interventionists like Richard Holbrooke assert that U.S. interests are threatened even by AIDS in Africa. Who is right? It’s a matter of individual judgment, but history suggests that the U.S. has never confined the use of force to situations that meet the narrow definition of American interests preferred by realpolitikers and isolationists.

A few small wars, like those against the Barbary pirates or the Boxers or Villistas, were fought to protect American nationals or territory and presumably would meet a narrow definition of self-interest. But it would be a stretch to claim that most of these interests were “vital.” How vital were most of the “wrongs” avenged by hot-blooded nineteenth-century naval captains? Or how vital was it to protect U.S. trade with China, which never amounted to more than 4 percent of the total? U.S. policy toward China was driven less by businessmen than by missionaries, predecessors of today’s human rights lobby.

What about American interventions in Latin America during the gunboat diplomacy era? Surely they were undertaken for material gain. A myth has flourished around these interventions—that they were conducted, as Smedley Butler later claimed, at the behest of Wall Street banks and banana companies. But in fact, as I argue in Chapters 6, 7, and 10, the desire to protect American economic interests was only one of the motives behind the Banana Wars, and often not the most important one. Strategic and, yes, moral concerns played a vital role, especially in the actions of Woodrow Wilson, who vowed “to teach the South American republics to elect good men.” “The interventions by U.S. Marines in Haiti, Nicaragua, the Dominican Republic, and elsewhere in those years,” writes the political scientist Samuel P. Huntington, “often bore striking resemblances to the interventions by Federal marshals in the conduct of elections in the American South in the 1960s: registering voters, protecting against electoral violence, ensuring a free vote and an honest count.”

The moral component is sometimes hard to discern from the vantage point of the twenty-first century because the terms in which it is expressed have changed. In the early twentieth century, Americans talked of spreading Anglo-Saxon civilization and taking up the “white man’s burden”; today they talk of spreading democracy and defending human rights. Whatever you call it, this represents an idealistic impulse that has always been a big part of America’s impetus for going to war. When its leaders try to present the case for war in purely strategic or economic terms—as Secretary of State James A. Baker III famously did when he said the Gulf War was about “jobs, jobs, jobs”—Americans are likely to be left cold. It is usually the moral component—in this case, Iraqi atrocities against Kuwaitis—that convinces Americans to take up arms.

Few wars are waged for purely humanitarian reasons. Most are fought for a combination of causes, moral, strategic, and economic. Sometimes the balance is heavily tilted toward moral factors—the Spanish-American and Kosovo Wars are good examples—but geopolitical considerations are rarely absent altogether. The U.S. fought Spain in part to assert an American leadership role in the Western Hemisphere, and it fought Serbia in part to assert an American leadership role in Europe.

• Wars without significant popular support. Few of the wars chronicled in this book aroused much enthusiasm among the American public. Just as few nineteenth-century Britons thought much about the thin red line safeguarding their interests on the Indian frontier, few Americans paid attention to what their troops were doing on the periphery of empire. Americans who glanced at their newspapers on July 29, 1915, might have noticed reports that the marines had landed in Port-au-Prince the day before. But if they were interested in overseas developments at all, they were probably more concerned with the war then raging in Europe. (The New York Times featured a small page-one item under the pithy headline, “Haitians Slay Their President; We Land Marines,” but that day’s lead story concerned the imminent execution of a policeman convicted of murder.)

When small wars aroused any notice back home, it was usually due to opposition mobilizing—particularly notable in the case of the Philippine War, the 1918 deployment to Russia, the occupation of Haiti and the Dominican Republic, and the Sandinista war in Nicaragua. But small numbers of professional soldiers were able to function well far from home even in the face of domestic sniping.

Mass mobilization of public opinion is needed for big wars, especially those like Vietnam that call on legions of conscripts. It is much less necessary when a relatively small number of professional soldiers are dispatched to some trouble spot. This has been especially true of marines. Whereas the presence of the army signaled to American and world opinion that a war was in progress, the marines were known as State Department troops and landed with little public fuss.

• Wars in which U.S. troops serve under foreign command. This became a sore point with those who tried to blame the United Nations for the debacle that occurred in Mogadishu on October 3, 1993, although the U.S. combat troops involved remained under U.S. control (only a small number of American logisticians served under U.N. auspices). In the past, however, there have been several instances of U.S. troops serving under the effective command of British officers—for instance, in Samoa (1899) and north Russia (1918–1919). History also offers examples of how unwillingness to subordinate to a multinational command structure can backfire—witness the disorganized march to Peking in the summer of 1900. The issue should not be whether American soldiers are serving under foreign officers, but whether they are serving under competent officers.

On-the-Job Training

It is clear, then, that many deeply held shibboleths about the American way of war—which can be summed up in the misconception that the job of the armed forces is limited to “fighting the nation’s wars” in defense of “vital national interests”—have little historical basis. Nor, it must be added, is history kind to the warnings of post-Vietnam alarmists that America risks disaster every time it asks the armed forces to stray into other types of duties. Not all the operations chronicled in this book were totally successful—U.S. troops never caught up with Pancho Villa or Augusto Sandino—but the only real military failure was the mission to Russia in 1918–1920, and it was a pretty small-scale failure, hardly comparable to the grand disaster that unfolded in Indochina.

In most cases the armed forces, however ill-prepared for the job at hand, quickly adapted, figured out what they had to do, and did it with great success. The Philippine War stands as a monument to the U.S. armed forces’ ability to fight and win a major counterinsurgency campaign—one that was bigger and uglier than any America is likely to confront in the future. Their experience in China showed that the armed forces are quite capable of muddling through even when they have to rely on bluff and swagger rather than overwhelming force.

In most of these operations the U.S. suffered relatively few casualties. The most costly was the Philippine War, which resulted in 4,200 American combat deaths—a tragedy but not on the scale of Vietnam, which cost 58,000 American lives. No casualties—the mantra of the 1990s—is almost impossible to achieve, but few casualties is the norm in these types of operations.

Perhaps this surface success was deceptive; perhaps fighting small wars incurred deeper, less obvious costs in lost military effectiveness. Surely these wars eroded the armed forces’ ability to fight big wars and hurt their ability to attract recruits. Such are the refrains often heard in the post–Cold War era from critics, in uniform and out, who complain about the supposedly excessive pace (the “ops tempo”) of small-scale deployments. History suggests we need not worry unduly.

In the early years of the twentieth century, the marines filled their ranks with those who wanted to be “soldiers in the sun.” “We have a class of men in our ranks far superior to those in any other service in the world and they are high-spirited and splendid in every way,” Brigadier General Smedley Butler wrote in 1928. “They joined because of our reputation for giving them excitement, and excitement from a marine’s standpoint can only be gained by the use of bullets and the proximity to danger.” Today the marines are having similar success filling their ranks with such recruits even as other services go begging. Across the armed services, reenlistment rates are actually higher in forward-deployed units (the army reports its highest reenlistment rates are in Bosnia and Kosovo), and many soldiers say they enjoy helping people in peace operations.

Such operations may cause disaffection among some who joined the armed forces during the Cold War and view peacekeeping duty as an unwelcome deviation from their preferred mission. But as the Cold War recedes further into history, the bulk of recruits and officers will have a clearer understanding that policing the Pax Americana is a central part of their mission.

Nor do such operations necessarily diminish U.S. readiness for a major war. Units engaged in peace operations may lose their edge at large-scale combined arms operations, says retired marine General Bernard Trainor, “but the service of young NCOs and lieutenants is enhanced. They’re not drilling; they’re dealing with real world situations.” One survey of U.S. troops in Kosovo found that only 14 percent thought this peacekeeping duty would degrade their combat capability; the rest said their battle skills would be either improved or unaffected. Experience bears out this belief. The marines in the 1920s and 1930s were able to fight banana wars and prepare for amphibious operations in World War II; fighting the Sandinistas in the jungles of Nicaragua turned out to be excellent preparation for fighting the Japanese in the jungles of the South Pacific. Likewise, Pershing prepared for World War I by chasing Mexican Villistas and Philippine Moros, while the naval heroes of the War of 1812 learned their trade battling the Barbary Pirates.

British history offers further confirmation of this point. Queen Victoria’s army spent most of its time on imperial policing duties. Then it was thrust into the biggest war the world had yet known. When British Tommies first landed in France and brushed up against advance units of the Kaiser’s army on August 23, 1914, the Germans were startled by the rat-tat-tat rapidity of the fire coming at them. Surely the British must be employing machine guns, a German officer decided. In fact it was nothing more than rifle fire from expert marksmen schooled in the little wars of empire. The British cut the Germans to pieces that day at the Battle of Mons. True, the peacetime British army was too small to engage in all-out warfare against European opponents and had to be rapidly expanded, but its quality was not in doubt.

What Force Can Achieve

Having established, I hope, that small-war missions are militarily doable—and in fact have been done by the U.S. armed forces since the earliest days of the Republic—we turn now to tougher questions: Are they politically desirable? What have small wars of the past achieved?

American success is easy enough to see when the U.S. goal was seeking a strategic or economic concession, whether trade treaties with China, Japan, and smaller nations, or naval bases at Subic Bay in the Philippines and Guantanamo Bay, Cuba, or the right to build an isthmian canal in Panama. So too with missions designed to protect American life and property, such as the navy’s efforts to suppress piracy in the Mediterranean and Caribbean in the early years of the nineteenth century, and the army’s effort to rescue the legations in Peking during the Boxer Uprising. There is no question that these operations produced results. Equally clear, on the other side of the ledger, is the failure of the mission to Russia in 1918–1920, in large measure because it was unclear why the doughboys were sent. Was it to protect Allied war supplies and the Czech Legion or to overthrow the Bolsheviks? Woodrow Wilson himself was not sure, and the mission achieved little.

It is more difficult to judge the long-term success of punitive expeditions. Whether the expedition was being undertaken by gunboats or gunships, it generally hit its target—but to what effect? In 1986 Ronald Reagan dispatched warplanes to bomb Libya in retaliation for a terrorist attack on American servicemen in West Berlin. Subsequently the State Department claimed that Libyan support for terrorism declined, but there is also considerable evidence that two years later Muammar Gadhafi’s agents blew up Pan Am flight 103 over Lockerbie, Scotland, killing 270 people. Was the 1986 raid on Libya really a success?

It is just as difficult to judge the consequences of past punitive expeditions, such as the 1871 mission to Korea. Rear Admiral John Rodgers had the forts around Inchon razed after one of them fired on his ships. Then his naval squadron sailed away. Eleven years later America became the first Western nation to sign a trade treaty with the Hermit Kingdom. The relationship between these two events must remain a matter of conjecture.

In some significant sense, however, such speculation is beside the point. Even if their ultimate consequences are unclear, punitive raids serve an important function. Much like punishments meted out by the criminal justice system, they satisfy the human impulse to see wrongdoers punished. The courts would not stop putting felons in jail even if this were proved to have no deterrent effect, and likewise Washington will not cease ordering retributive strikes for attacks on American citizens.

Finally, we come to those small wars whose success or failure is most difficult to assess—pacification campaigns. This book describes how the United States has occupied for varying lengths of time Samoa, Wake Island, Cuba, Puerto Rico, the Philippines, Panama, Nicaragua, Veracruz, Haiti, the Dominican Republic, and the Virgin Islands, to say nothing of Alaska and Hawaii. Most of these occupations achieved immediate American objectives—keeping the Europeans out, creating stability, protecting foreign lives and property, furthering free trade, and safeguarding strategically important real estate such as the approaches to the Panama Canal.

Many of these interventions also delivered tangible benefits to the occupied peoples. Although American imperial rule was subject to its share of abuses, U.S. administrators, whether civilian or military, often provided the most honest and efficient government these territories had ever seen. Haiti offers a particularly dramatic example. The 1920s, spent under marine occupation, were among the most peaceful and prosperous decades in the country’s long and troubled history. Of course U.S. administrators, in the manner of colonialists everywhere, usually received scant thanks afterward. As Kipling wrote:

Take up the White Man’s burden –
And reap his old reward:
The blame of those ye better,
The hate of those ye guard.

But the bulk of the people did not resist American occupation, as they surely would have done if it had been nasty and brutal. Many Cubans, Haitians, Dominicans, and others may secretly have welcomed U.S. rule—if not while the occupation was going on, then afterward, when local thugs took over from the Americans.

Although U.S. rule was generally benign, its effects often wore off quickly. The two most lasting legacies of American interventions in the Caribbean may be a Latin resentment of the Yanquis, now somewhat fading, and a love of baseball, still passionately felt, especially among Dominicans, who make up a disproportionate share of Major League Baseball rosters. Most of the physical manifestations of the American empire—roads, hospitals, telephone systems—began to crumble not long after the marines pulled out. This should be no surprise; it has been true whenever technologically advanced imperialists leave a less sophisticated area, whether it was the Romans pulling out of Britain or the British out of India. Veracruz provides a vivid illustration of this phenomenon. The U.S. Army cleaned up the city and reduced the death rate from infectious diseases. Before long the zopilotes (vultures) that had made the city their home disappeared. But as soon as the army left, the residents resumed throwing garbage into the streets and back came the vultures.

More significantly, the American track record of imposing liberal, democratic regimes by force is mixed. The most notable successes are Japan, Germany, and Italy. The U.S. tended to be less successful in what later became known as the Third World, with the intriguing exception of the Philippines, which has been for the most part free and democratic except for the 1972–1986 period when Ferdinand Marcos ruled by fiat. Intriguing because the Philippines was the site of one of the longer U.S. occupations, which suggests that if the army stays long enough it leaves a lasting impact. Short-term (or even medium-term) occupations, on the other hand, are unlikely to fundamentally alter the nature of a society. It is not true, as some critics later charged, that in the Caribbean and Central American countries the U.S. deliberately installed dictators such as Duvalier or Somoza. But it is true that the marines could not create institutions strong enough to prevent their being usurped by a Duvalier or Somoza in the absence of American intervention.

That does not mean that U.S. administration is entirely futile. American troops can stop the killing, end the chaos, create a breathing space, establish the rule of law. What the inhabitants do then is up to them. If the American goal is to re-create Ohio in Kosovo or Haiti, then the occupiers are doomed to be disappointed. But if the goals are more modest, then American rule can serve the interests of occupiers and occupied alike. Put another way, “nation building” is generally too ambitious a task, but “state building” is a more realistic objective. The apparatus of a functioning state can be developed much more quickly than a national consciousness.

Successful state building starts by imposing the rule of law—as the U.S. did in the Philippines and the British in India—as a precondition for economic development and the eventual emergence of democracy. Merely holding an election and leaving is likely to achieve little, as the U.S. discovered in Haiti after 1994.

An analogy may help illuminate what U.S. troops can and cannot do. No one expects a big city police department to win the “war on crime.” The police are considered successful if they reduce disorder, keep the criminal element at bay, and allow decent people a chance to live their lives in peace. In the process a few cops are likely to die, and while this is a tragedy to be mourned, no one suggests that as a result the police should go home and leave gangsters to run the streets.

It is easy to be fatalistic about the prospect that vigorous policing, at home or abroad, will do much good. But this cheap and easy cynicism has been repudiated by events time and again. In the early 1990s it was common wisdom that New York City was doomed to a downward slide. Crime and disorder were rampant and seemingly ineradicable. But Mayor Rudolph Giuliani showed that the situation was far from hopeless, that effective law enforcement could make a difference. And if it could work in New York City (sections of which, Rick Blaine warned a Nazi officer in Casablanca, “I wouldn’t advise you to try to invade”), then why not elsewhere?

The Price of Nonintervention

In considering whether, based on the lessons of the past, we should undertake small wars in the future, we ought to remember not only the price of a botched intervention—Vietnam, Beirut, Somalia—but also the price of not intervening, or not intervening with sufficient determination. Two examples come to mind: Nicaragua and Russia.

In the former case, President Coolidge in 1925 withdrew from Managua the legation guard of 100 marines that had helped preserve stability for 13 years. Within a few months, Nicaragua was once again embroiled in revolution, and many more marines returned for a much longer stay.

In revolutionary Russia, Woodrow Wilson and David Lloyd George missed a prime opportunity in 1918–1919 to help topple the nascent Bolshevik regime. There is reason to believe that with slightly more Western help the Whites could have won the civil war—and in all likelihood changed the course of twentieth-century history immeasurably for the better. These examples are worth balancing against the Vietnam analogies that inevitably, tiresomely pop up whenever the dispatch of American forces overseas is contemplated.

The Butcher’s Bill

So far I have considered mainly the benefits of intervention. But the costs must never be forgotten. Although it is a mistake to think that every intervention risks becoming another Vietnam-style disaster, it is equally wrong to suppose that every intervention can be as painless as the Kosovo War, which produced no casualties (and only two lost airplanes) among NATO forces. The “no casualties” mantra can work for purely punitive missions, where the job at hand consists of discharging cruise missiles from afar, just as in earlier days a gunboat could shell a native village at no risk to itself. But this no-risk mindset can be fatal for any mission more ambitious. Somalia presents the worst-case scenario: Americans arrive full of vim and vigor, only to leave, tails between their legs, after 18 soldiers die. This sends a message of American irresoluteness that can only cause trouble later.

Any nation bent on imperial policing will suffer a few setbacks. The British army, in the course of Queen Victoria’s little wars, suffered major defeats with thousands of casualties in the First Afghan War (1842) and the Zulu War (1879). This did not appreciably dampen British determination to defend and expand the empire; it made them hunger for vengeance.

If Americans cannot adopt a similarly bloody-minded attitude, then they have no business undertaking imperial policing. I am not suggesting that Americans prepare themselves to suffer thousands of casualties for the sake of ephemeral goals, but policymakers should recognize that all military undertakings involve risk and should not run away at the first casualty. More important, Washington should not structure these operations with the prime goal of producing no casualties. That is a recipe for ineffectiveness.

In any combat operation, blood will be shed, not only on the U.S. side but also among the enemy. This is an elementary truth, yet it was often overlooked in the 1990s when cruise missile strikes were planned to hit buildings after their occupants had gone home. Despite advances in modern weaponry, war can never be a clean, surgical business. Especially not urban and counterinsurgency warfare of the sort the U.S. finds itself undertaking in places ranging from Somalia to the former Yugoslavia. Sometimes, as in the Philippine War, it can be a very ugly business indeed. If the U.S. is not prepared to get its hands dirty, then it should stay home.

Policing the World

Most small wars the United States is likely to wage in the future are not highly controversial in principle, even if specific circumstances are likely to arouse debate. By this I mean punitive and protective missions, those designed to safeguard American citizens or punish those who attack them. This role for the armed forces is likely to grow in importance, since the world is littered with American targets—civilian, diplomatic, military—sure to tempt any young man with a gun and a grudge. It is likely that the September 11, 2001, attacks on the World Trade Center and the Pentagon are only a taste of what America can expect in the future. Few except the most obdurate isolationists would begrudge the U.S. the right to hit back or snatch citizens out of harm’s way. Humanitarian missions are not very controversial, either, as long as they involve no prospect of combat—providing hurricane relief in Venezuela in 2000, for instance, or earthquake relief in Turkey in 1999. It is pacification missions, especially those billed as “humanitarian,” that continue to cause controversy.

Should the United States involve itself in others’ civil wars? Should America try to save foreigners from the cruelties of their neighbors or rulers? I believe the answer is yes, at least under certain circumstances. But it is not an easy question to answer and, based on the evidence presented here, I would not be surprised if some readers drew a different conclusion.

Such interventions offend the sensibilities of those who argue, as former Secretary of State James Baker did about Yugoslavia, that “we don’t have a dog in that fight.” Self-styled realists want U.S. forces to keep their powder dry until North Korea invades the South, Saddam Hussein makes another lunge for Kuwait, or China goes for Taiwan. (Some isolationists think even these contingencies would not warrant American action.) Ignore two-alarm fires, the “realists” and isolationists advise, and await the five-alarm blaze that may (or may not) come.

This cautious attitude, shared by many at the Pentagon, flies in the face of recent history. Democracy, capitalism, and freedom have spread across North America, Europe, Latin America, and to many parts of Asia—everywhere except Africa. At the same time, many parts of the world are being ravaged by tribalism and gangsterism. It is, to borrow the title of a book, Jihad vs. McWorld. The United States has an obvious stake in promoting the latter and stymieing the former. A world of liberal democracies would be a world much more amenable to American interests than any conceivable alternative.

Contrary to the dreams of some economic determinists, capitalism and freedom do not spread inexorably on their own. The nineteenth-century free trade system was protected and expanded by the Royal Navy. The only power capable of playing a comparable role today is the United States of America. Like Britain in the nineteenth century, the United States in the twenty-first century has power to spare. In fact the U.S. has more power than Britain did at the height of its empire, more power than any other state in modern times. It deploys the world’s only blue-water navy of any significance and the world’s most powerful air force; its armed forces have expeditionary capability undreamed of by any other power; its economy, powered by unceasing technological innovation, is the biggest and most dynamic on earth; its language has achieved a ubiquity unrivaled by any tongue since Latin; its culture permeates distant lands; and its political ideals remain a beacon of hope for all those “yearning to breathe free.” The U.S. is so far ahead of any rival “in all the underlying components of power: economic, military, technological or geopolitical” that scholars describe the international scene as unipolar.

When it exercised a lesser degree of international hegemony, Britain battled the “enemies of all mankind,” such as pirates and slave traders, and took upon itself the responsibility of keeping the world’s oceans and seas open to navigation. Today America faces equivalent tasks—battling terrorists, narco-traffickers, and weapons proliferators, and ensuring open access not only to the oceans but also to the skies and space. Britain acted to preserve the balance of power wherever it was endangered, coming to the aid of weak nations (such as Belgium or Turkey) being bullied by the strong (Germany or Russia). America has played a similar role in protecting Bosnian and Kosovar Muslims against Serbs, and Kuwaitis against Iraq.

Many of the steps Britain took, such as stamping out the slave trade or the murderous thug cult in India, were hard to justify on a narrow calculus of self-interest. It acted simply out of a sense that it was the right thing to do. It is doubtful that American leaders can resist the call for similar humanitarian interventions in an age when the public back home knows far more about horrors being perpetrated in the far corners of the world than it did in the Victorian era. And why should America not “do something,” assuming the cost of action is not high? Why not use some of the awesome power of the U.S. government to help the downtrodden of the world, just as it is used to help the needy at home? As Theodore Roosevelt said, “A nation’s first duty is within its borders, but it is not thereby absolved from facing its duties in the world as a whole; and if it refuses to do so, it merely forfeits its right to struggle for a place among the people that shape the destiny of mankind.”

Roosevelt committed the United States “to the exercise of an international police power” in the Caribbean, the one region where America was then the predominant power. Today America has as much power in many parts of the world—for starters, in northeast Asia, Europe, and the Persian Gulf—as it once had only in the Caribbean. Many Americans cringe at the notion that their country should play globocop. But this is not a purely altruistic exercise. Without a benevolent hegemon to guarantee order, the international scene can degenerate quickly into chaos and worse. One scholar argues, with great plausibility, that the 1930s turned out as badly as they did because Britain abdicated its international leadership role, and Uncle Sam refused to don the mantle. Another scholar argues, with equal persuasiveness, that the post-1945 era turned out as well as it did in large measure because America was willing to underwrite the security of the global economy—and that the entire world will suffer if America fails to exert leadership in the years to come.

That does not mean that the armed services should be redirected exclusively toward a constabulary role. This would be as misguided as not preparing for such missions at all. Though no major-power threat confronted America at the dawn of the twenty-first century, the odds are that one will emerge in the years ahead; one always has. In the meantime, however, the military cannot simply turn its back on “peace” operations that hold out the promise of resolving small problems before they fester into major crises. The costs of engaging in places like Afghanistan are much lower than allowing them to become breeding grounds for terrorists such as those who struck America on September 11, 2001.

Some say the United States cannot afford to prepare for wars both big and small, that America might strangle its economy by the exertion needed to police the nether reaches of the globe. This is an enduring argument, stretching back to the days of Albert Gallatin, Jefferson’s Treasury secretary, who insisted that America would be stronger in peacetime if it paid off its debts rather than field a large standing military. This view made sense for a country like the Soviet Union, which spent perhaps 25 percent of its meager GDP on the armed forces—clearly an unsustainable level. But it does not apply to nineteenth-century Britain or twenty-first-century America.

The British Empire did not collapse because its defense drained the Exchequer. It collapsed because Great Britain exhausted itself in fighting and winning two world wars, conflicts that might have been averted if Whitehall had deployed a larger military capable of deterring German aggression. Britain spent an average of 3.1 percent of GDP on defense during its imperial heyday, 1870 to 1913, and kept about 1 percent of its population under arms—hardly a crippling burden. In 2001 America spent just 2.9 percent of GDP on defense, the lowest level since before Pearl Harbor, and approximately 0.5 percent of its population was on active duty. Of course Uncle Sam’s military is considerably larger in absolute numbers and infinitely more powerful than Queen Victoria’s. But if America needs to spend a little more to accomplish its dual military missions—policing its virtual “empire” and deterring major-power adversaries—then this is still a small price to pay, considering the alternatives.

No nation, no matter how rich, can afford to wage war without end. It makes sense for the U.S., wherever possible, to use non-military means to achieve its objectives. American culture and American economic strength are especially potent tools for spreading American influence abroad. When force is required, the U.S. can often seek help from ad hoc “coalitions of the willing,” as in Kosovo, or make use of the United Nations, especially to legitimate its interventions. It is doubtful, however, that this organization will be capable of fielding an effective, independent military force anytime soon. The really big challenges can be resolved only through American leadership, which means a willingness to commit troops if necessary. Often, however, the mere threat is enough. The Roman writer Vegetius’s advice, Qui desiderat pacem, praeparet bellum (Let him who desires peace, prepare for war), applies as much to small wars as to big.

When the United States does act, it ought to get it right. Sometimes U.S. troops can achieve a great deal by a quick in-and-out operation, as in Panama in 1989, when the U.S. toppled the drug thug Manuel Noriega. But when dealing with collapsing countries—and such states are increasingly common as the artificial boundaries drawn by European statesmen dissolve in places ranging from Indonesia to Congo to Yugoslavia—only a long-term commitment is likely to do much good. The 1994 U.S. intervention in Haiti achieved little beyond restoring the morally dubious Jean-Bertrand Aristide to power.

It is doubtful that America can or should unilaterally occupy countries for as long as it once did. Some system sanctioned by the U.N. may need to be worked out that will allow the international community to place failed states like Afghanistan or Sierra Leone into a state of receivership. A possible model can be found in the “mandates” granted by the League of Nations in the 1920s to European states to run various colonies, in theory for the good of their inhabitants. In the 1990s, Bosnia, Kosovo, Cambodia, and East Timor became wards of the international community; this may be a harbinger of things to come.

In debating the merits of intervention, Americans ought to be wary of arguments that say, “If you intervene in X, why not in Y?” The U.S. was accused of racism for going into Bosnia but not Rwanda. But there is nothing wrong with saving a few people in one place even if you cannot save everybody everywhere. The essence of statesmanship is making difficult choices, weighing costs and benefits, and committing limited resources where they can do the most good. No universal rule to govern American use of force—no alternative to the Powell Doctrine—is possible or desirable. Interventions must be decided on a case-by-case basis. In reaching their decisions, however, policymakers should keep in mind not only the U.S. interests at stake but also how much a proposed mission can accomplish, and at what cost. Even if the U.S. stake is not very great—and it seldom is great, when it comes to small wars—military intervention may still make sense if the costs are low enough or the potential good that American troops can accomplish great enough.

One final bit of advice, based on the lessons of history. In deploying American power, decisionmakers should be less apologetic, less hesitant, less humble. Yes, there is a danger of imperial overstretch and hubris—but there is an equal, if not greater, danger of undercommitment and lack of confidence. America should not be afraid to fight “the savage wars of peace” if necessary to enlarge the “empire of liberty.” It has done so before.