CHAPTER 1
War and military operations other than war will endure as viable policy options, though both are fraught with risks and costs. Technology can modify these realities, but it cannot eliminate them. As the state of the art changes, it is always important to consider the effects of new technologies on military options. Today's emerging technologies (ETs), such as advanced computing, nanotechnology, and biotechnology, are poised to change the world, and therefore the shape of international relations and military affairs as well.
Technological change in many areas is quickening. Exponentially growing computer power, government support for science in the United States and internationally, and the relatively new factor of consumer demand for high-technology goods are all accelerating technological development. Beyond improving familiar and comfortable goods and services, technological development can bring into being seemingly new and different technologies: ETs. In general, an ET is one that has the potential to disrupt society by changing existing norms. Those that successfully adapt to the norms that emerge from the aftermath will be positioned as the winners of the new age. ETs therefore represent an opportunity for some to advance rapidly in the world and, for those comfortable in their current status, a risk of falling. This potential for rapid change in comparative status applies to individuals, organizations, businesses, and nations. For individuals and companies, it can bring fortune or ruin. In the anarchical international system, however, changes in norms and comparative power are often accompanied by conflict, up to the level of systemic (global) war.
The present technological era is known alternatively as the digital, computer, or information age. Its foundations can be traced to various limited-function calculation devices of antiquity, but it is more commonly linked to the early programmable computers of World War II. Its impact on society in general, as well as on warfare, accelerated in the 1970s and 1980s with the rise of the semiconductor industry. Displacing fragile and expensive vacuum tubes, semiconductors allowed for not only reliable computing but also low-cost computing. This proliferation of computer power, in turn, spawned the digital age. At present, the world is still adapting to the changes brought on by the information technology revolution. Governments, industries, and individuals are still in the midst of tackling the many policy questions and societal challenges brought on by the seemingly overnight emergence of the information age. Policy issues such as digital privacy, cyber bullying, and digital rights management were not salient in the early 1990s. Just as there were societal winners and losers in the transition from an industrial society to an information society, the next ETs will have a similar capacity to change the rules of the world.
Advances in semiconductors and the other ETs needed to bring on the information revolution would also make possible the reliable and relatively low-cost precision-guided munitions (PGMs) of the last Revolution in Military Affairs (RMA). ET, in a defense context, appeared in an episode of the 1980s British satire Yes, Prime Minister as a prominent option for defending the realm, though the hapless fictional prime minister, Hacker, did initially confuse ET with the science fiction film of the same name. At the time, precision weapons were largely unproven in the eyes of military authorities, despite some success in the Vietnam War, and probably did sound like science fiction to policy makers unconvinced of their potential. Since the 1980s, precision weapons have become a major part of the conventional warfare paradigm, proving themselves to military officials, policy makers, and the general public in conflicts such as the 1991 Gulf War and the Kosovo conflict at the end of the last century.
Military ET, having a limited customer base, must compete with existing concepts and systems for limited funding. New technologies must not only be developed but also procured and maintained. The United States is the preeminent military power, but this lead has come as a result of large expenditures on research and development. Internationally, many of the technologies associated with U.S. military power have been replicated, often at much lower cost, because the United States has already borne the expense of proving the viability of these technologies. Both Russia and China are developing their own stealth aircraft, but neither had to go through the trials of proving the utility of the concept.
In addition to challenges to the preeminent power within the bounds of the dominant military paradigm, there are also those who will seek specific counters to the tactics and technology that represent the current state of the art. Stealth aircraft are an expensive technology, and possibly more expensive than their countermeasures: counter-stealth measures such as more sensitive radars coupled with improved analysis software, and guided surface-to-air missiles (SAMs), which are already generally cheaper than manned combat aircraft. Ironically, computers enabled the calculations needed to produce radar-defeating shapes (that could still fly), and computers are now poised to let lower-cost defenses challenge such invisibility. Insurgency tactics and terrorism, concepts central to today's definition of asymmetrical warfare, also by definition seek to challenge conventional military power indirectly.
Intentionally pushing technological change can itself jeopardize any comfortable advantage in military power a nation may possess. In the debate over the construction of H.M.S. Dreadnought, the world's first all-big-gun battleship among other innovations, it was feared that its appearance would render obsolete the Royal Navy's existing superiority in pre-dreadnought battleships. On the other hand, the basic technologies and operational knowledge that led to this new form of capital ship had already proliferated; Dreadnought had contemporaries under consideration in the United States and Japan.1 In the ongoing debate over the deployment of anti-satellite (ASAT) weapons, there is the perspective that overt weaponization of space would jeopardize current U.S. superiority in space-force-enhancement capabilities by inspiring others to weaponize space as well. A competing perspective is that the space capabilities on which present-day U.S. military might is built are natural targets, the recognition of which would invite an asymmetrical response2 through space weapons. Throughout history, seeming patterns of military innovation and imitation have been upset by asymmetrical responses and outright technological surprise.
Dual-Use and Spin-Ons
Dual-use refers to technologies and goods that may be used by both militaries and civilian industries. These include mundane items such as diesel engines and computer chips. Specialized technologies, such as those found in the civilian chemical and pharmaceutical industries, have military applications such as chemical and biological weapons production, a fact that has made limiting the proliferation of weapons of mass destruction (WMD) very difficult. The ongoing alarm over Iran's allegedly peaceful nuclear-enrichment activities is a reminder that nuclear technology is dual-use as well, despite the greater attention and fear this technology attracts. Many of today's ETs are dual-use in nature, meaning that they too may proliferate under the guise of “peaceful use,” despite their effect on the military balance of power.
Spin-offs of military technology are well known, with examples such as jet engines, computers, and duct tape on the list of dual-use products with military origins. Spin-on military technologies are not as well known. The first trains, airplanes, and automobiles were civilian machines that only later found strategic and tactical uses. New technologies often gain acceptance in the civilian world first because of the generally more benign operating environments found off the battlefield. The world's first tanks, another spin-on military technology, certainly had their share of mechanical difficulties and limitations, only made worse by having to be discovered and debugged in the misery of World War I. In military use, the horse was able to retain an edge over mechanization for a few years after World War I; it did have the advantage of millennia of operational experience over the relatively recent inventions of the internal combustion engine and the caterpillar track.
Many of today's latest technological developments are found in the civilian world with no real military equivalents. Deployment of personal digital communication devices in the military still lags behind the near-ubiquitous adoption of cell phones in the civilian world; military requirements and acquisition processes have led to a different marketplace for military digital radios. That said, there is interest in using off-the-shelf smart phones for some military applications.3 Among the ETs, there is a race between the military and medical communities to field the world's first production exoskeleton.
Technology and War
Since at least the shock of Sputnik in 1957, the measure of a nation's power has included its capacity for science. Technology is, however, more than science; it also encompasses the ability to produce something useful—an ability to convert abstract scientific discovery into something tangible. These two capabilities have intertwined throughout history to produce national wealth and security. Even with many high-technology industries being transnational, a nation's capacity to support a high-technology economy remains such a measure.
Definitions of technology include both tools and the practical use of tools. Weapons are also tools, and their practical employment is codified in doctrines, tactics, and strategies. While it is often useful to have new weapons fill existing roles, full exploitation of a new military technology requires the development of unique and specific doctrines suited to the opportunities created. The early tanks of World War I were of questionable value4 without the effective tactics developed in the interwar period. The correct usage of tanks did not become clear until World War II, during which the German blitzkrieg doctrine proved superior to the doctrines employed by France and Britain in 1940. The Allies and Soviets later employed many of the same tank doctrines to push Hitler's forces back into Germany.
New technology does not always lead to correct policies, as some technologies can be dead ends. Toward the latter part of World War II, German tanks such as the Tiger were more advanced as individual pieces of technology, but this sophistication hamstrung the war effort: they were too complex to be easily produced and maintained, arguably contributing greatly to Germany's defeat. By this time, the industrial capacities of the United States and the Soviet Union were producing large numbers of simpler tanks, such as the M4 Sherman and T-34, which simply overwhelmed the small numbers of German super-tanks.
World War II represents the beginning of the conscious and deliberate link between technology and military power. The scientist came to prominence with the wartime employment of revolutionary technologies such as radar, which the Royal Air Force (RAF) exploited during the Battle of Britain to coordinate its limited numbers of fighter aircraft and blunt the previously unstoppable Luftwaffe. On the other side, the Nazi leadership became enamored with the creation of Wunderwaffe, wonder weapons, to stave off defeat toward the end of the war. These included jet aircraft, advanced submarines, and guided missiles, all deployed in various stages of technological readiness (or the lack of it).
That is not to say that a connection between technology and war did not exist before World War II. In fiction, the original Sherlock Holmes story “The Adventure of the Bruce-Partington Plans” involved the theft of plans for an advanced-technology submarine. The submarine was an ET in the late 19th and early 20th centuries, with doctrines still under development and with fearful naval experts, as well as pacifist elements of civil society, calling for its prohibition. The long stalemate of World War I, characterized by the doctrine of trench warfare, was itself the product of industrial-revolution products such as the machine gun and barbed wire. Historians have reached back further to compare the dissimilar weapons and tactics employed by groups in conflict, perhaps as a reflection of the current preoccupation with comparing the capabilities of today's military forces. However, in these past conflicts, technology and scientists were not celebrated (or demonized) to the extent they were in the aftermath of World War II and in the present day.
In the United States, the Manhattan Project5 raised the profile of the scientist in war. Many Manhattan Project scientists became prominent, such as Edward Teller, who later became a vocal supporter of the Strategic Defense Initiative (SDI). Others, such as Leó Szilárd, became prominent in the disarmament movement. The initiation of the Manhattan Project itself was pushed along by the already prominent scientist Albert Einstein: the Einstein–Szilárd letter warning of the potential for a Nazi-developed atomic weapon was instrumental in convincing the United States to undertake atomic weapons development.
In the prelude to the Cold War, during the dismantlement of Nazi Germany, Allied and Soviet forces raced to collect German scientists and examples of military technology. Many of the weapons of the early Cold War were based on this captured body of work and expertise. Both the MiG-15 and the F-86 Sabre owe their swept-wing designs to German research. In the United States, former Nazi rocket scientist Wernher von Braun gained celebrity status promoting the U.S. space program, an offshoot of his work for the U.S. Army.
The Soviet Union had also infiltrated spies into the Manhattan Project, greatly accelerating its own efforts to develop the bomb. The espionage war between East and West highlighted the importance of military technology, as both sides attempted to collect as much information as possible on each other's latest weapons and platforms. U.S. missile tests were monitored by Soviet surveillance vessels sitting off the coast in international waters. Spy and counter-spy equipment pushed the limits of technology, with miniature cameras, passive electronic devices, and code-breaking supercomputers. Descendants of these secret devices are today's web cameras, radio frequency ID (RFID) tags, and home computers.
A defining moment of the Cold War technological race was the launch of Sputnik, the world's first artificial satellite, in 1957. Unlike the Soviet Union's first atomic bomb test, which could be attributed to espionage, the launch of Sputnik did, for a while, give the impression that the United States’ technological prowess had fallen behind: “There was a sudden crisis of confidence in American technology, values, politics, and the military.”6 This event was an ET “Pearl Harbor,” given the general surprise, shock, and subsequent mobilization of the United States that resulted. The response included massive investment by the United States not only in the space program, but also in many areas of fundamental scientific research, as well as in education. Education remains a national security concern, as without scientists and technicians, the nation would fall behind in other technology areas, including those critical to protecting the country. At great cost, both financially and in lives risked, the United States won the “space race” with the first manned landing on the Moon in 1969.
This climate of seemingly desperate technological competition sparked by Sputnik was the impetus for the creation of the Advanced Research Projects Agency (ARPA), later the Defense Advanced Research Projects Agency (DARPA). DARPA's mandate is to:
…maintain the technological superiority of the U.S. military and prevent technological surprise from harming our national security by sponsoring revolutionary, high-payoff research bridging the gap between fundamental discoveries and their military use.7
While the civilian spaceflight program has languished8 since the heyday of the space race, DARPA continues to foster technological development by funding breakthrough, often seemingly unbelievable, research and development programs. The workings of the Internet, an ARPA invention, remain a mystery to the majority of people outside computer tech support. All of the ETs discussed here have publicly known interest from DARPA.
The dominant military technology of the Cold War was the nuclear weapon, with both sides eventually acquiring tens of thousands of warheads. These warheads armed what seem by today's standards to have been several rapidly replaced generations of bombers, missiles, and even artillery. Military aeronautics in the early Cold War went through a period of rapid development, when procurement programs would buy small quantities of a system with the expectation, and usually the reality, that something better would be available in a few years. Service lives of less than a decade were not uncommon as systems rapidly became obsolete in the face of new technology. Today the development programs of aircraft such as the F-22 and F-35 are decades in length, with the latter still not in service despite being officially started in the early 1990s with the merger of several similar programs already in progress.
There were, however, limits to this trend; technology and finances often present barriers to further performance increases. The U.S. B-52 is one example, as is the long, drawn-out history of attempts to find a successor to the M16/M4 rifle. The United States is again considering a replacement bomber program for the B-52, an aircraft that first flew in 1954 and is now expected to remain in service beyond the year 2040.9 Consumer computer technology has arguably run into something similar: clock speeds are no longer jumping as rapidly as they did only 10 years ago, due to physical limitations such as heat dissipation, coupled with the average computer user not actually needing much more power for applications such as e-mail, web browsing, and word processing.
During the Cold War, there emerged a very public debate over the usability of weapons. Outside the disarmament movement, there was a degree of uncertainty even among defense academics over the utility of nuclear arms. At the time, the Soviet Union possessed a numerical advantage in armored forces, and the (conventional) warfare paradigm that emerged out of World War II emphasized numbers. The reality of Soviet armor superiority would have put the West in a precarious strategic position if not for its nuclear arsenal, including the controversial tactical nuclear weapon concept. Among the problems with tactical nuclear weapons was the unclear line between them and strategic weapons. It was feared that employing tactical nuclear weapons to stop the clichéd Soviet armored invasion of West Germany through the Fulda Gap would escalate into all-out nuclear war. Linked to this was the question of whether the United States would risk nuclear attack on itself to protect Western Europe. Additionally, there was the problem that defensive use of nuclear weapons would mean using them on the very territory being defended. The sheer power of nuclear weapons made them something of an “unusable” weapon, too powerful for the majority of post–World War II conflicts, but still needed to prevent others who did not share that view from using them with impunity.
While the United States and the North Atlantic Treaty Organization (NATO) were debating the merits of tactical nuclear weapons doctrine, and nuclear deterrence was becoming the core strategic reality governing East–West relations, the computer revolution was taking place, allowing the creation of unquestionably effective PGMs. The force-enhancement effect of precision-guided weapons gave the United States and NATO an additional counter to Soviet numerical superiority. While PGM technology was maturing, the West retained the option to go nuclear, and nuclear weapons remained the core military concern during this transition period. Presently, nuclear weapons are of diminished but still important status,10 and high-tech conventional warfare is an entirely usable option, as has been proven since the 1990s.
An important consideration in the choice to use military force is risk, something that technology can modify but cannot remove entirely. For some, the risks of nuclear technology, specifically mutually assured destruction (MAD), were a factor in superpower stability, preventing major systemic war from breaking out during the Cold War. Concerns were raised on both sides of the Iron Curtain over the potential for new technologies, such as improved missiles or antimissile systems, to be destabilizing, that is, to change the risk calculus of nuclear war and thereby make the nuclear option more likely. For others, the risk of global nuclear annihilation was simply unacceptable, leading to calls for disarmament.
Among the force enhancements conferred by precision-guided weapons is the survivability of the attacking force; the effective reduction in risk for individual soldiers arguably increases a society's tolerance for warfare. Precision weapons increase survivability by reducing the number of sorties required to achieve the same effects. Standoff precision weapons, such as drone aircraft, increase survivability by allowing a greater distance between the target and the attacker. However, society's tolerance, or intolerance, of risk now extends to the risk of unnecessary application of force: risk aversion toward collateral damage.
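The sortie arithmetic behind this claim can be sketched with a toy probability model. The kill probabilities below are purely illustrative assumptions, not figures from any weapons program; the point is only that raising the per-weapon hit probability collapses the number of weapons, and hence sorties, needed for a given level of confidence.

```python
import math

def weapons_needed(p_kill: float, confidence: float) -> int:
    """Weapons to deliver so the chance of at least one kill on a
    single target reaches `confidence`, assuming independent shots."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - p_kill))

# Notional, illustrative single-weapon kill probabilities.
for label, p in (("unguided bomb", 0.02), ("precision-guided munition", 0.60)):
    print(f"{label}: {weapons_needed(p, 0.95)} weapons for 95% confidence")
# unguided bomb: 149 weapons; precision-guided munition: 4 weapons
```

Under these notional numbers, one target ties down dozens of bomber sorties with unguided weapons but only a single strike aircraft with PGMs; fewer aircraft exposed to defenses is the survivability argument in quantitative form.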
To be clear, precision warfare was developed primarily as a force enhancer, to allow relatively small forces to accomplish tasks that in the past required much larger investments in force. The humanitarian aspect only came to the fore later, as these weapons were used in conflicts other than total war, and indeed in military operations other than officially declared war. Arguably, increased precision as a simple force-enhancement capability for use against conventional military foes has diminishing returns: hard military targets require a minimum amount of force to destroy, and the precision already available would, on the surface, appear to be sufficient. However, hard military targets are not the only threats, and the United States and its allies have continued to invest in ever smaller and more precise weapons to counter smaller threats, possibly, with the next wave of ETs, down to individuals in a crowd.
That is not to say that these efforts are always successful; as with all real-world implementations of concepts, there must be room for accidents, unforeseen events, and mistakes, such as the misidentification of targets. Indeed, the lower destructive power of today's focused weapons has led to charges that their very precision has made their use all too alluring to politicians: warfare made acceptable through the overhyped promise that war can be fought without collateral damage. The argument is that, unlike nuclear weapons, precision weapons are all too “usable,” though whether this is a good thing depends on one's personal view of war as an instrument of policy. However, in a reality where not all threats and international crises can be dealt with by discussion and negotiation, military options will remain viable. Whether by design, accidental by-product, or requirement forced by a media-saturated society, it is likely that there will be continued investment in increasingly precise weapons. This, in turn, can only benefit civilians caught in the crossfire. While not all collateral damage can be prevented, at the very least the paradigms of current and near-future military affairs will continue to emphasize greater discrimination between friend, foe, and bystander and will therefore provide more options for reducing such damage.
Today's Military Emerging Technologies
In the second decade of the 21st century, several emerging technological areas expected to reshape civilization have been identified. Three of these fields are advanced computing, nanotechnology, and biotechnology. All three are in general dual-use, meaning that the only certainty with these technologies is that there will be military applications. However, the history of technological prediction, futurism, is littered with both failed dreams and unexpected challenges that have slowed development. The long and troubled development of ubiquitous (military) space access and of directed energy weapons (DEWs) rounds out any discussion of emerging military technologies.
Chapter 2: Ubiquitous (Military) Space Access
Prior to the current information age was the space age, another era of technological optimism. The harsh financial and related technical challenges of space access have ended most of those dreams, though for the U.S. military this limited access has proven sufficient to build the foundations of contemporary U.S. military dominance. Exotic propulsion technologies have long promised to change the price of spaceflight, keeping many space age dreams alive across generations of enthusiasts. In the near term, however, streamlined operations, advanced materials, and control technologies offer to slash costs via old-fashioned rocketry. If either form of ubiquitous space access becomes viable, there will be security implications. Enhanced space access would not exist purely for its own sake, and unlike many space dreamers, the military has been funding space access research since the very beginning. Although results have come too slowly for many, a number of the technologies being invested in today are finally reaching maturity. Although space access is expected to remain expensive, any lowering of the cost will affect the military first. Despite the present-day constrained funding environment, space remains the “highest of the high ground.”
Chapter 3: Directed Energy Weapons
Probably the most unabashedly military of the ETs, DEWs have been stuck in emerging status for decades. Though demonstrated under laboratory conditions, DEW systems have been kept from deployment by many operational challenges. Convergence with other technologies may, however, produce a system robust enough to be worthy of being called a weapon. As with any ET, the eventual form of DEW systems and doctrines remains to be seen. Will they be a niche capability useful in only a few specialized roles, or will they become commonplace, as science fiction often predicts? There is also the reality that physical means of delivering destructive energy will be difficult to displace from the battlefield, possibly leaving DEW once again as nothing more than an overhyped light show.
Chapter 4: Advanced Computing
As computer technology advances, weapon systems have the potential to gain greater autonomy. Depending on what is included, robot weapons are either an ET or an old concept. While robotics has much to offer, there is also opposition to certain aspects of the weapon-bearing robot concept. The chapter goes beyond the many technical problems of entrusting an autonomous machine with lethal capacity to discuss the political, legal, and possibly ethical roadblocks. The existence of a potentially “disposable” soldier, in the form of a robot or remotely operated weapon, presents both opportunities and controversies.
Chapter 5: Nanotech
Nanotechnology, the application of engineering at the nanometer (one billionth of a meter) scale, is often touted as the next “big thing” for society at large, and it already has a host of emerging, short-term, and long-term military applications. Putting aside the hype, nanotechnology can be regarded as the basic sciences and engineering simply done at the finest levels of precision; many of the products of this field are refinements of existing goods and services. In the near term, nanotech represents a new form of materials science, one that in the 21st century must face more mundane scrutiny over its potential unintended side effects, such as environmental impact.
As with many of the ETs, there is international competition in this field, including in military applications. The forms of nanotechnology currently pursued are largely low-energy processes that may be easily hidden. Futurists even envision the proliferation of nanotechnology tools to empower those who previously had none. However, not everyone shares this vision of a bright, shiny future assembled by nanotechnology, and the United States may be surprised by who will be able to use this technology for nonpeaceful applications. While both nuclear technology and nanotechnology are in general dual-use, nanotechnology has not acquired the same stigma, though there are those who argue that nanotech represents as great an existential (extinction) threat to humanity as nuclear proliferation, or an even greater one. These worries range from nanotech contributing to humanity simply being replaced as the top species on earth to the now largely defunct “grey goo” scenario. The ease with which nanotech is expected to spread, essentially for economic and perhaps even charitable reasons, represents, in a security and defense mindset, a wide variety of potential new threats or new spins on old ones.
Chapter 6: Biotechnology
The emerging biotechnologies promise minute control over the functions of life and death. Between these extremes lie opportunities to enhance the human condition, and to enhance the individual soldier. Improved military health care will be only a starting point for the things that the convergence of biology, computers, and nanoscale manipulation will allow.
Biotechnology's opportunities will, however, raise many troubling questions about where this fine control over life itself will lead. The military will have different answers to the many societal questions raised by advanced biotechnology. In an all-volunteer force, would some applications, such as genetic screening, be more acceptable than they would be for, say, an insurance provider? Military service involves sacrifice in the defense of one's country, but will the opportunities provided by biotechnology ask too much of Western society? This is a proliferating technology, so the competitive and security challenges posed by other nations’ exploitation of biotechnology may take this decision out of the hands of the general populace.
Convergence
The seemingly unpredictable nature of technological development is reflected in perhaps the ultimate expectation for continued exponential growth in computer power combined with the other ETs: the “technological singularity.” Although it sounds like science fiction, a singularity denotes a point where the known laws of a field, such as physics, fail, a definition that in turn seems to tinge this real-world concept with fantasy. Physics can explain matters right up to the singularity of a black hole, the point where so much mass has collected that the escape velocity exceeds the speed of light; at and beyond this singularity, things become open to much more speculation. The many possibilities of the technological singularity have spawned their own subculture, a subset of posthumanist philosophy known as “transhumanism,” in which superior computer intelligence brought on by the convergence of many ETs is used to either augment or replace ordinary humans, often with all sorts of utopian outcomes for the human condition. Those who believe that this event will occur (and there is much debate over how much computer power is needed, and even whether computers are suitable for replicating the mind) expect an unimaginable amount of societal and technological change once humanity can produce something more capable than natural-born human intelligence. Alternatively, much of today's dystopian science fiction concerns a future where this and other ETs have run amok and the rise of artificial intelligence is not beneficial to humanity. Clearly, not everyone agrees that intentionally working toward creating humanity's replacement is a wise idea.
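The boundary in the black hole analogy falls out of a short, classical calculation: setting the Newtonian escape velocity equal to the speed of light recovers the Schwarzschild radius (the full general-relativistic treatment happens to yield the same value):

\[
v_{\text{esc}} = \sqrt{\frac{2GM}{r}}, \qquad v_{\text{esc}} = c \;\Longrightarrow\; r_s = \frac{2GM}{c^2},
\]

where \(G\) is the gravitational constant and \(M\) the collected mass. Within the radius \(r_s\), nothing, light included, escapes, and physics as observed from outside can say little more; the technological singularity borrows exactly this sense of a horizon beyond which prediction fails.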
Although the notions encompassed by the concept of the technological singularity are largely beyond the time frame under consideration here, not everyone would agree. A range of converging technologies encourages those who want to see the miracles of a postsingularity world soon. Computer capabilities, as measured by the number of transistors on an ordinary computer chip, are increasing at an exponential rate, with no end expected in the near term. Research into advanced computer capabilities, such as the ability to learn and reason, is being fueled by both business and international competition. Already, neural nets, software mimicking components of living brains, are in use to solve a range of computational tasks that other techniques cannot handle efficiently. Among the uses of neural nets is machine learning, the capacity for a computer to be trained and to gain experience. For both defense and civilian applications, the ability to learn is a possible route to producing autonomous machines capable of operating outside controlled environments.
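What “trained” means here can be made concrete with a minimal sketch. The toy program below (written in Python with the NumPy library, a choice of convenience rather than anything prescribed by the research discussed above) builds the smallest interesting neural net, a single hidden layer, and trains it by gradient descent to learn the XOR function, a task no single linear unit can solve.

```python
import numpy as np

# Toy neural net: one hidden layer, trained by gradient descent to
# learn XOR. Illustrative only; real systems differ mainly in scale.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))  # input -> hidden
W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))  # hidden -> output
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(10_000):
    # Forward pass: weighted sums squashed through the sigmoid.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: push the prediction error back to every weight.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * (h.T @ d_out); b2 -= 0.5 * d_out.sum(axis=0, keepdims=True)
    W1 -= 0.5 * (X.T @ d_h);   b1 -= 0.5 * d_h.sum(axis=0, keepdims=True)

print(out.round(2))  # trends toward [[0], [1], [1], [0]] as "experience" accumulates
```

The “experience” the text refers to is nothing more mysterious than these repeated weight adjustments; the defense-relevant question is what happens when the same mechanism, scaled up enormously, operates outside a controlled environment.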
The study of the human mind, aided by the rapid and ongoing development of computerized tools, is increasing our understanding of all aspects of intelligence. Successful technology inventor Ray Kurzweil has predicted that by 2030, still within the near to midterm, computer technology and biotech will have advanced to the point where it will be possible to scan and then replicate a functioning human brain as a computer simulation. A 2007 U.S. Congress study put the arrival of the technology necessary for a singularity sometime after 2020. In the context of the U.S. government's activities, there are, as mentioned earlier, military procurement programs that have taken longer to reach initial operating capability (IOC) than the estimated minimum of 13 years for the technology base needed for the singularity to appear. Quite clearly, much is expected from the Emerging Military Technologies.
Notes
1. Robert K. Massie, Dreadnought: Britain, Germany, and the Coming of the Great War (New York: Ballantine Books, 1991), 487.
2. The symmetrical response to U.S. space power is the replication of space-force-enhancement capabilities. This in turn would build a case for the United States to weaponize space, as it would no longer have an advantage and there would therefore be utility in being able to disrupt others' use of space.
3. United States Army, “Connecting Soldiers to Digital Applications,” Stand-To!, July 15, 2010, http://www.army.mil/standto/archive/2010/07/15/.
4. Keir A. Lieber, War and the Engineers (Ithaca: Cornell University Press, 2005), 101–3.
5. It must be noted that the Manhattan Project had British and Canadian involvement.
6. Paul Dickson, Sputnik: The Shock of the Century (New York: Berkley Books, 2007 edition), 4.
7. Defense Advanced Research Projects Agency, “About,” http://www.darpa.mil/About.aspx.
8. Notwithstanding the remote but real threat of impact by natural space debris, space exploration simply does not have the same priority as the practical needs of defending the state. This, however, does not stop spaceflight enthusiasts from bemoaning the large disparity between the Department of Defense's budget versus that of NASA. That said, it must also be remembered that NASA's mandate goes beyond spaceflight and includes research into flight phenomena wholly within the atmosphere.
9. United States Air Force, “Factsheets: B-52 STRATOFORTRESS,” http://www.af.mil/information/factsheets/factsheet.asp?id=83.
10. The United States, United Kingdom, and France maintain a policy of ambiguity over whether they would be the first to use nuclear weapons, though recent policy changes by the Obama administration, such as the April 2010 Nuclear Posture Review, have indicated that, with a few exceptions, nuclear attack on nonnuclear nations is not a policy option. This document, however, does not rule out first use of nuclear weapons given the appropriate adversary and circumstances. The People's Republic of China (PRC) and India have No First Use policies toward nuclear weapons; however, with closed societies such as the PRC, there is some uncertainty over the credibility of this policy.