5

Cities, Communications Technology, and the Feedback Dilemma

In the months and then years after a pair of nuclear bombs levelled Hiroshima and Nagasaki, killing hundreds of thousands of civilians, a growing contingent of scientists, among them leading American physicists connected to the Manhattan Project, began expressing their profound unease with the deadly military uses of scientific knowledge. The Bulletin of the Atomic Scientists became the focal point of their conversations about what physics had wrought in the service of geopolitical conflict.

One of the early running themes had to do with the future of America’s cities: in the context of an accelerating nuclear arms race, what were the social and economic implications of weapons that could wipe out entire urban centres – not just the residents, but also infrastructure and industrial capacity? ‘In an atomic war, congested cities would become deathtraps,’ opined Edward Teller, often described as the ‘father of the hydrogen bomb,’ and two collaborators in a 1948 essay in the Bulletin. ‘A country like the United States … is particularly vulnerable to the devastating impact of atomic bombs’ (qtd in Kargon & Molella 2004, 764–77). The article’s title revealed their solution: ‘The Dispersal of Cities and Industries.’ Others, including leading urban planners, soon widened this discussion, recommending radical de-urbanization strategies that encouraged cities of no more than 100,000 people, highly dispersed and intentionally decentralized. The notion of upending long-standing conventions of urban planning in response to the threat of nuclear war seems incomprehensible today, but was seen then as a rational response to potential annihilation.

Yet Norbert Wiener, a then-prominent MIT mathematician who had coined the word cybernetics to describe the complex interactions and feedback loops between humans and machines, had grown uncomfortable with the implications of these planning ideas. He did not participate in the top-secret development of nuclear weapons during WWII but quickly joined the chorus of leading scientists sounding alarm bells in the postwar period, especially as the U.S. government stepped up its nuclear arms program and then entered the Korean War (Kargon & Molella 2004).

Wiener began working on an alternative approach to the threat facing cities with two MIT colleagues, a political theorist and a philosopher of science. Their concept, dubbed ‘life belts,’ called for the creation of radial beltways encircling major American cities, which would contain vital institutions like hospitals, as well as backup infrastructure. They reckoned post-attack chaos would be as deadly as the bombs themselves. ‘These networks were designed to control and direct the flow of traffic towards safe areas at the urban periphery during the hours immediately following a nuclear detonation aimed at the concentration of people, goods, and services in the city centers, while also providing bypass routes for major railroads and highways,’ Columbia University architectural historian Reinhold Martin recounts in a 1998 essay about Wiener’s ideas and their implications for urban space (Martin 1998).

In a long feature published in December 1950, entitled ‘How U.S. Cities Can Prepare for Atomic War,’ Life magazine detailed Wiener’s proposal. The story included eye-catching visuals, among them photos of brutally congested downtown streets and god’s-eye-perspective renderings of this tidily reorganized radial city of the atomic age. ‘Life belts around cities would provide a place for bombed out refugees to go,’ the article claimed. (Wiener and his colleagues intended to publish their plan in a scientific journal but never got around to it.)

Though perhaps not a household name, Wiener in the postwar era was a well-known and influential academic whose 1948 treatise ‘Cybernetics: Or Control and Communication in the Animal and the Machine’ laid down some of the most far-reaching ideas behind technologies like artificial intelligence, process control, and computer vision. (His 1950 book, The Human Use of Human Beings, aimed to introduce these concepts to a general readership.) Anticipating a future that included computers, Wiener sought to establish a discipline that described the emerging relationship between human beings and machines, including those that might someday be capable of higher-order reasoning – a development he considered theoretically possible.

The acuity of his vision of the future is evident in many realms, not least in the highly individualized ways we use our laptop computers and smart phones. The unique configuration of settings, downloads, apps, online connections, and modes of storing files on our devices that each of us has developed, as well as the countless options provided through operating systems and software, attests to the existence of a symbiotic relationship that is almost as distinctive as a human fingerprint.

Wiener was also interested in feedback loops – iterative processes found in biology, technological systems, and social dynamics. Thermostats, for example, maintain steady heating or cooling levels through negative feedback loops, i.e., by constantly reading ambient temperature and adjusting the furnace or air conditioner accordingly. But positive feedback loops can create highly unstable outcomes – for example, when a microphone linked to a speaker picks up and re-amplifies its own sound, producing an escalating squeal. A handful of people panicking in a dense crowd may prompt a rapidly spreading sense of fear that feeds on itself and triggers a stampede.
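The distinction between the two kinds of loops can be made concrete in a few lines of code. The sketch below is purely illustrative – the temperature values, gains, and step counts are invented, and it is not drawn from anything Wiener wrote – but it shows a negative feedback loop nudging a simulated room toward a set point while a positive feedback loop amplifies a tiny signal until it runs away.

```python
# Toy illustration of negative vs. positive feedback (all values are invented).

def negative_feedback(setpoint=20.0, temp=10.0, gain=0.3, steps=10):
    """Thermostat-like loop: the error (setpoint - temp) drives a correction
    that shrinks the error each step, so the system settles near the setpoint."""
    history = []
    for _ in range(steps):
        error = setpoint - temp
        temp += gain * error          # heater output proportional to the error
        history.append(round(temp, 2))
    return history

def positive_feedback(signal=0.01, gain=1.5, steps=10):
    """Microphone-near-speaker loop: each pass re-amplifies its own output,
    so a tiny signal grows instead of settling."""
    history = []
    for _ in range(steps):
        signal *= gain                # the output is fed straight back in
        history.append(round(signal, 4))
    return history

if __name__ == "__main__":
    print("negative feedback ->", negative_feedback())  # converges toward 20.0
    print("positive feedback ->", positive_feedback())  # grows without bound
```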

Cities also serve as a locus for all sorts of negative and positive feedback loops, such as traffic light controllers that automatically adapt to vehicle volumes. But there are other urban feedback loops that create unintended outcomes. Consider the ‘discovery’ of an out-of-the-way and interesting neighbourhood. As word about its value as a hang-out or tourist destination spreads, the area attracts new restaurants, bars, and shops, and then more visitors, more selfies, more Yelp reviews, etc. The dynamic causes rents to rise, gentrification to accelerate, and ‘incumbent’ residents to move away. This positive feedback loop, in other words, gradually and then rapidly obliterates the neighbourhood’s perceived authenticity.

As he laid out cybernetic theory and considered the Cold War risks facing dense urban regions, Wiener saw analogies between biological processes such as the human circulatory system and the ways in which information moves through cities. ‘We have conceived the city as a net of communications and traffic,’ he and his collaborators wrote in their 1950 paper on life belts. ‘The danger of blocked communications in a city subject [to] emergency conditions is closely analogous to the danger of blocked communications in the body.’ If a nuclear attack was akin to a massive stroke, they argued, the city had to take preventative measures, such as ensuring that its residents could disperse quickly. As technology historians Robert Kargon and Arthur Molella note in a 2004 account of that Life article’s impact, ‘Wiener’s attempts to extend cybernetics into civil defense policy and urban planning may seem an act of scientific arrogance, but it was a genuine expression of his commitment to an interdisciplinary outlook.’ They add that he, like some other Cold War scientists, had become ‘skeptical that technological advancement would improve humanity’s state.’

Wiener wasn’t the only one thinking about nuclear warheads and cities. The Eisenhower administration launched its extensive interstate highway-building program in part to create transportation corridors that could be used for civil defence or military purposes. Defence analysts working at organizations like the RAND Corporation were also busy advising municipal officials on how to use the so-called scientific management techniques employed in the armed forces to plan for not just civil defence, but other ‘domestic security challenges,’ such as the growth of inner-city slums, according to MIT sociologist Jennifer Light, author of From Warfare to Welfare: Defense Intellectuals and Urban Problems in Cold War America.

‘If at first they used the language of cybernetics and scientific approaches to urban problem solving, soon their justifications would expand to include the language of military attack,’ writes Light, who notes how municipal officials in the postwar period also learned to employ aerial reconnaissance photo interpretation techniques developed for military uses in their land-use planning (Light 2005, 64).

Yet perhaps the most intensive effort to safeguard American assets from attack involved information technology. By the late 1950s, most American homes had a telephone; radio and television were mass communications media; and a growing number of large organizations, including universities, governments, and the military, had invested in mainframe computers, many of which were set up to share information and software through rudimentary networks.

‘In 1960, the RAND Corporation commissioned a study of methods for protecting telephone and data links between U.S. military bases and their command and control centres, in the event of a nuclear war,’ explained Philip Steadman, a professor emeritus of urban studies at University College London, in a 1999 anthology entitled American Cities and Technology: Wilderness to Wild City (Roberts & Steadman 1999, 251).

With the prospect of an unmanageably large bill for hardening all that communications infrastructure, a RAND electrical engineer named Paul Baran proposed a workaround. His idea involved breaking electronic messages into many smaller ‘message blocks,’ each of which would carry what today would be dubbed metadata: its point of origin, its destination, and the information needed to reassemble the blocks into the original message.

Baran, Steadman wrote, also devised a technique that would direct these blocks through telecommunications networks along a myriad of possible paths – a deliberately nimble approach. At almost the same time, a Welsh computer scientist named Donald Davies came up with a nearly identical system, which he called ‘packet switching.’ In both cases, the goal was to leverage the dispersed architecture of a network of computers to mitigate the risk to the entire communications system if any individual node was damaged or destroyed – i.e., it had survivability. Their ideas, in some ways, offered a mirror image to what Wiener had proposed for making cities more resilient in the face of a calamitous attack. These solutions, taken up first by the military and later by leading research universities, laid the groundwork for ARPANET, which would eventually evolve into the internet.
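A rough sense of how message blocks and packet switching work can be conveyed with a short sketch. The field names and block size below are invented for illustration, and real packet formats are far more elaborate, but the underlying idea is the same: each block carries its origin, destination, and a sequence number, the blocks can travel by different routes and arrive out of order, and the receiver reassembles them into the original message.

```python
# Toy sketch of packet switching: split, scramble (simulating many routes), reassemble.
# Field names and block size are invented for illustration.
import random

BLOCK_SIZE = 8  # characters per block, kept artificially small for the demo

def split_into_blocks(message, origin, destination):
    """Break a message into blocks, each carrying the 'metadata' described above:
    where it came from, where it is going, and its place in the sequence."""
    count = -(-len(message) // BLOCK_SIZE)  # ceiling division
    return [
        {"origin": origin, "destination": destination, "seq": i, "total": count,
         "payload": message[i * BLOCK_SIZE:(i + 1) * BLOCK_SIZE]}
        for i in range(count)
    ]

def reassemble(blocks):
    """The receiver sorts by sequence number; it does not care which route each block took."""
    ordered = sorted(blocks, key=lambda b: b["seq"])
    return "".join(b["payload"] for b in ordered)

if __name__ == "__main__":
    msg = "The goal was survivability: no single damaged node should silence the network."
    blocks = split_into_blocks(msg, origin="A", destination="B")
    random.shuffle(blocks)            # blocks arrive out of order via different paths
    assert reassemble(blocks) == msg  # the original message is recovered intact
    print(reassemble(blocks))
```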

For most of recorded history, information, with a few exceptions, travelled only as fast and as far as humans could propel themselves. In other words, communications technologies were, by default, transportation modes and technologies: horses, Roman roads, caravan routes, carriages, ships, and eventually railways. The modes of communication themselves ranged from letters or books to diplomatic pouches and commercial or financial documents, such as bills of lading, bearer bonds, promissory notes, or letters of credit.

Information, of course, had a very broad range of uses, including strategic purposes. In the early 1800s, members of the Rothschild family, which operated a far-flung banking and trading network throughout England and Europe, developed tactics for accelerating the pace at which they received the commercially sensitive information – e.g., decisions of foreign governments – that they needed for transactions. Instead of depending on mail service, they employed their own couriers and ships to speed their inter-company communications. The firm also used carrier pigeons to transmit stock prices, and colour-coded envelopes to signal shifts in exchange rates. Anticipating the resilience built into digital communications networks, the Rothschilds also sent multiple versions of the same message through different routes as a hedge against the risk of intercepted notes (Ferguson 1999).

In the 1840s, the commercialization of the electric telegraph rapidly altered long-distance communication. As Steadman points out, the earliest British customers were railway companies, which deployed telegraph service along their networks and used it to transmit information about schedules and even tips for police pursuing fleeing suspects. In the U.S., early adopters included news wire services, like the Associated Press, and stock exchanges. The Boston police department set up an internal telegraph network to connect precincts and allow beat cops to send messages to headquarters from police boxes located on city streets.

By the 1860s, tens of thousands of kilometres of telegraph cable had been laid, including transatlantic lines. Within cities, thousands of messengers physically delivered telegraph messages to recipients, meaning, as Steadman points out, that ‘communications between cities was at this stage much faster than communications within cities.’

Alexander Graham Bell upped the technology ante with the invention of the telephone in 1876. While his first public call was a long-distance one, between Brantford and Paris, Ontario, Bell envisioned a means of providing local service that was inspired by the underground networks built in previous periods by water and gas utilities. ‘In a similar manner,’ he explained to his investors in an 1878 letter, ‘it is conceivable that cables of telephone wires could be laid underground, or suspended overhead, communicating by branch wires with private dwellings, counting houses, shops, manufactories, etc., uniting them with a main cable to a central office where the wires could be connected as desired, establishing direct communications between any two places in the city’ (Roberts & Steadman 1999, 243). But as it happened, interest in telephone service was keener in remote rural areas. ‘This confounded the telephone companies, who anticipated only urban markets,’ says Steadman. Those differences soon evaporated. By the 1920s, he notes, telephone service had more or less eclipsed the telegraph to become the dominant form of interpersonal communications technology.

One can certainly argue that the intervening century, from a technological perspective, has been defined, if not utterly dominated, by rapid and relentless advances in information and communications technology (ICT) – from radio, TV, and record players to digital switching, fax machines, cellular service, personal computers, the World Wide Web, high-speed internet, smart phones, fibre optic cable, WiFi, e-commerce, CCTV, GPS, artificial intelligence, the internet of things, 3-D printing, 5G mobility, cryptocurrency, etc., etc., etc.

As is also true for air travel and the private automobile, these technologies have all served to collapse distance, whereas the civil engineering advances that emerged in earlier periods, and particularly during the Industrial Revolution, responded to the challenges posed by the perils of urban proximity: fire, infectious disease, waste, extreme overcrowding.

Examples of the telephone’s particular impact on cities and urban form surfaced from the device’s first days. Telephones seemed to be very well suited for large office buildings, both during the construction process, when they allowed crews working high up to communicate with site managers, and once the building was in use, so office employees on different floors could connect with one another. Steadman offers other early ‘use cases,’ such as room service in high-rise hotels, although he and others caution that telephone technology alone didn’t spur the development of high-rise buildings: the emergence of building technologies such as steel frame construction, curtain walls, and automatic elevators all played a critical role.

In the 1960s, American planners and defence analysts sought to adapt another Cold War–inspired technology, coaxial cable TV, for urban (i.e., civilian) applications, namely the creation of widely accessible communications networks that would connect poor inner-city residents to education and local government services via two-way video as a means of improving social welfare. As Jennifer Light comments of three lengthy government discussion papers that considered this ultimately stillborn idea, ‘[C]able was never discussed as the money-making entertainment system it eventually became. Rather, each report imagined an urban infrastructure developed to provide a variety of services to users in their homes or in neighbourhood telecommunications centres, an infrastructure with capacity for two-way communication as well as instruction’ (Light 2005, 181). The high capital costs, however, worked against this model for community-access cable, and the nascent cable industry pivoted to mainly commercial programming by the late 1970s.

The much wider question is what role ICT has played in shaping urban regions over the past fifty years, and whether the influence of one extremely powerful family of technologies can be prised apart from other city-shaping forces, from the Baby Boomers’ suburban exodus to mass migration, trade liberalization, and geopolitical conflict over natural resources. It is a puzzle that has preoccupied sociologists, geographers, planners, and policy makers for several decades. These problems, moreover, have also given rise to big-bang theories, such as the central role of networks in the Information Age and the cumulative impact on society of the so-called third and fourth industrial revolutions.5 Some analysts describe the modern, wired city as, essentially, a communications hub.

Long before Zoom and shared cloud-based files, Norbert Wiener recognized that there would come a day when professional services wouldn’t need to be co-located with more mechanical forms of labour and production. In The Human Use of Human Beings, he describes how an architect in Europe could work remotely on a building in the U.S., using existing technologies like a ‘teletypewriter’ and ‘ultrafax’ to transmit facsimile images of blueprints ‘in a fraction of a second’ to the construction manager. ‘The architect may be kept up to date with the progress of the work by photographic records taken every day,’ he mused. ‘In short, the bodily transmission of the architect and his documents may be replaced very effectively by the message transmission of communications which do not entail moving a particle of matter from one end of the line to the other’ (Martin 1998, 113).

Wiener’s prescience is remarkable, yet it skips ahead several steps in the emergence of the digitally connected city states that now dominate the global economy – mega-regions like Shanghai, Seoul, and Manila, whose eight-digit populations and economic heft rival those of small countries. From 1970 to 2021, for instance, metropolitan Jakarta’s population soared from about 4 million to 30 million people.

Many forces have driven the eye-popping growth of these places, including the waves of trade liberalization that encouraged manufacturers to relocate to low-wage, low-regulation regions, especially since the end of the Cold War. But a handful of pivotal developments in the way multinationals organize themselves and market their products served to turbo-charge the growth of megacities in the global south.

Beginning in the 1970s and accelerating in the 1980s, Japanese carmakers adopted two transformative changes in their manufacturing processes: a new approach to ensuring quality control, and what would become a revolution in the way they managed inventory and suppliers. The premise of ‘just-in-time’ (JIT) manufacturing was that large industrial operations, like auto assembly plants, shouldn’t warehouse the myriad parts they used to make products. Storing inventory represented a cost, and that cost could be pushed back to parts suppliers, which were told to deliver components only when the assembly plant needed them. Those components, in turn, had to satisfy the automakers’ exacting quality control standards. These process engineering changes meant higher expectations and increased competitive pressures for the suppliers, who could lose customers if they didn’t perform.

By the 1980s, the Big Three U.S. automakers were rapidly losing market share. German carmakers, with their long history of precision engineering and collaborative labour relations practices, retained their position, but Japanese and Korean manufacturers saw dramatic growth in their fortunes, including in North America, the birthplace of mass automobile production. U.S. carmakers had to scramble and play catch-up, and soon began building their own JIT systems.

Much more than conventional manufacturing, JIT depends on markedly improved communications, especially between firms. The system only works if a manufacturer knows precisely what it needs and when, and can then transmit this data to its suppliers in a timely way. Fundamental to the logic of this approach are seamless information-sharing and highly efficient distribution. JIT also had a pebble-in-the-pond effect. Parts suppliers began to push their own suppliers to adopt the same outlook, and so on down the line, meaning that the ethic of just-in-time manufacturing and distribution spread rapidly through the automotive food chain and then beyond, into virtually every other industrial sector.
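Stripped to its essentials, the information flow that makes JIT work resembles a reorder trigger: the plant continuously compares the parts it has on hand with its near-term production needs and signals a supplier only when a delivery is actually required. The sketch below is a simplified, hypothetical model – the part names, quantities, and buffer sizes are invented – rather than a description of any real automaker’s system.

```python
# Simplified, hypothetical just-in-time reorder signal.
# Part names, stock levels, and thresholds are invented for illustration.
from dataclasses import dataclass

@dataclass
class PartLine:
    name: str
    on_hand: int          # units currently at the assembly plant
    usage_per_shift: int  # units the next production run will consume
    safety_stock: int     # a small buffer instead of a warehouse full of parts

def delivery_orders(parts):
    """Emit an order only for parts that would otherwise run short next shift --
    the timely transmission of precise needs that JIT depends on."""
    orders = []
    for p in parts:
        shortfall = p.usage_per_shift + p.safety_stock - p.on_hand
        if shortfall > 0:
            orders.append({"part": p.name, "quantity": shortfall})
    return orders

if __name__ == "__main__":
    lines = [
        PartLine("brake caliper", on_hand=120, usage_per_shift=200, safety_stock=20),
        PartLine("door seal", on_hand=900, usage_per_shift=400, safety_stock=50),
    ]
    # Only the brake calipers trigger a delivery request to the supplier.
    print(delivery_orders(lines))
```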

The parallel development involved a seismic shift in corporate thinking about brand. As journalist Naomi Klein first explained in No Logo: No Space, No Choice, No Jobs, consumer-products giants recognized that branding meant far more than logos and other marketing communications. Brands could be used as a way of organizing production, with global brands like Nike reinventing themselves as decentralized postmodern corporations in which product design, strategic marketing, and promotion were managed centrally, but the actual production could be outsourced to suppliers operating in low-wage countries – a highly scalable formula that generated huge margins.

These disruptive changes in manufacturing occurred alongside – and by no means independently of – the equally dramatic upheavals in ICT: the introduction of desktop and then laptop computers, the rapid uptake of email and other intra-company digital networks, and finally the commercialization of the internet in the mid-1990s. Business software tools proliferated, as did both advanced manufacturing and corporate technologies, from robots to bar-code scanners, data mining algorithms, WiFi sensors, secure electronic funds transfers, biometric devices, logistics software, and all the other building blocks of what would become a highly optimized global economy driven by torrential flows of commercial and financial data.

In many post-industrial city regions, robust economic growth has come with increasing social disparities, income inequality, and housing shortages. Urbanization in the global south has produced vast slums, such as the favelas encircling some Latin American cities, as well as the rise of truly Dickensian forms of economic activity generated by global trade, such as teeming recycling depots sustained by the importation of mountains of consumer, packaging, and electronic waste from developed nations.

At the same time, many big cities, especially those anchored by diverse regional economies, post-secondary institutions, and major cultural facilities, have become magnets for what University of Toronto geographer Richard Florida dubbed the ‘creative class’ – a broad category that includes professionals, knowledge workers, and artists. Cities with a large number of ‘creatives,’ Florida found, tended to be more tolerant, denser, and more walkable, and they could also offer quality-of-life amenities, from theatres to trails to high-quality schools. These kinds of cities also fuelled the growth of so-called edge cities – satellite suburbs that attracted high-tech manufacturing in part because local municipalities made sure to equip these places with the digital and mobility infrastructure that such employers sought (Audirac 2022).

These city regions, as it turns out, have tended to attract the highly skilled people who work in the nerve centres of major financial institutions or global companies, as well as tourists and other categories of travellers, such as those attending trade conventions in the pre-pandemic era. Consequently, so-called creative class cities are also places that have been particularly susceptible to accelerating gentrification and real estate speculation.

While international trade is nothing new, Columbia University urban sociologist Saskia Sassen has argued that game-changing advances in information and communications technology have unleashed these huge transformations in cities, characterized by the dispersal of manufacturing, the ascendancy of international professional services and financial behemoths, and the concentration of capital and power in global cities – a widely used term she coined in the early 1990s (219).

Since then, many urban experts have sought to come up with ways to name, characterize, and classify these new types of cities. Loughborough University, in the U.K., maintains a detailed ranking of global cities based on a wide range of metrics; Sassen is one of the project’s founders. Another scholar, the Spanish sociologist Manuel Castells, has charted the global growth of the network society and the informational city. Some groups have promoted notions such as ‘intelligent communities,’ ‘innovation clusters,’ and even ‘mega-regions.’ Yet in the 2010s, a new label, meant to express the ethic of the digitized, cosmopolitan metropolis of the near future, began to gain currency. As an urban brand, it was at once descriptive and aspirational:

Smart.

5. Popularized by Klaus Schwab, the German engineer and economist who founded the World Economic Forum, the term ‘fourth industrial revolution’ – often used interchangeably with Industry 4.0 – describes the global move toward advanced automation and technologies like 3-D printing. It succeeded the third industrial revolution, which marked the widespread use of computing and digital technology.
