Conclusion


Science, Technology, and Pandemic Cities

About two weeks after the World Health Organization declared a global pandemic in March 2020, Google launched a digital service ostensibly designed to help public health officials around the world respond to the COVID-19 outbreak. The project provided data and data visualization tools that could allow health officials to ‘see how your community is moving around differently due to COVID-19.’ As the search giant explained, ‘we’ve heard from public health officials that the same type of aggregated, anonymized insights we use in products such as Google Maps could be helpful as they make critical decisions to combat COVID-19.’

The Community Mobility Reports ‘charted movement trends over time by geography, across different categories of places such as retail and recreation, groceries and pharmacies, parks, transit stations, workplaces, and residential’ (‘See How Your Community …’ 2022). For any geographical unit – from nations to states, provinces, counties, and cities – these reports would show how traffic in these six categories compared to a ‘baseline’ – the median value for the corresponding day of the week, calculated over a five-week stretch in early 2020, shortly before the pandemic was declared.

The idea, apparently, was to visually depict whether people were adhering to lockdown orders, travelling less, or changing their use of public spaces, like parks. (The reports were created by aggregating location data generated by smart phones with the Google Maps app – a technique developed for the company’s various mapping services. The data is available only if the user has enabled location tracking on their device.)
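
The arithmetic underlying the reports was straightforward percent-change-from-baseline. Here is a minimal sketch of that calculation, with hypothetical visit counts – Google’s actual aggregation and privacy machinery is proprietary and far more involved:

```python
# Percent-change-from-baseline arithmetic of the kind the Community
# Mobility Reports displayed. All numbers here are hypothetical.

# Median daily visits per category on pre-pandemic baseline days
baseline = {"retail": 1200, "parks": 300, "transit": 2500, "workplaces": 4000}

# Aggregated visits observed on a given day under lockdown
observed = {"retail": 540, "parks": 420, "transit": 800, "workplaces": 1600}

for category, base in baseline.items():
    change = 100 * (observed[category] - base) / base
    print(f"{category:>10}: {change:+.0f}% vs. baseline")
```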

Google’s public relations teams pushed out news of this initiative, and media organizations from around the world eagerly offered up coverage. ‘With the data, public officials can, for instance, recommend changes in the business hours of grocery stores to thin out crowds,’ reported Singapore’s Straits Times. ‘Similarly, crowding at certain transportation hubs might indicate the need to run more parallel routes for social distancing.’ Senior Google executives stressed that the data being used had been anonymized to protect privacy.

Noting Singapore’s strict COVID-19 containment policies, the Straits Times in the same article also mentioned another digital technology that emerged early in the pandemic – so-called ‘contact-tracing apps’ that were already in wide use in China and South Korea. Once installed and activated on a smart phone, these apps would hold information about whether or not the owner had had a positive COVID-19 test, and if so, when. If an infected individual came within 2 metres of anyone else carrying a smart phone with a contact-tracing app, the two devices would acknowledge each other through a Bluetooth signal. When that happened, the app automatically notified the local public health authority, whose staff would reach out to the potentially exposed individual to determine if the exposure required a follow-up or isolation.
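
Stripped to its essentials, the mechanism is a rolling log of anonymous identifiers overheard via Bluetooth, checked against identifiers later associated with positive tests. The sketch below is a simplification for illustration only – real protocols, such as the Google/Apple exposure-notification framework, use rotating cryptographic tokens and on-device matching – and every name in it is hypothetical:

```python
from datetime import datetime, timedelta

# Simplified sketch of digital contact tracing: each device logs the
# anonymous identifiers it detects nearby over Bluetooth, then checks
# that log against identifiers later reported as positive cases.

class ContactLog:
    RETENTION = timedelta(days=14)  # how long contacts are remembered

    def __init__(self):
        self.contacts = []  # (anonymous_id, timestamp) pairs

    def record(self, anonymous_id: str, when: datetime):
        """Called when another device is detected within ~2 metres."""
        self.contacts.append((anonymous_id, when))

    def check_exposure(self, positive_ids: set, now: datetime) -> bool:
        """Return True if any recent contact later tested positive."""
        cutoff = now - self.RETENTION
        return any(cid in positive_ids and t >= cutoff
                   for cid, t in self.contacts)

log = ContactLog()
log.record("a3f9", datetime(2020, 4, 1, 9, 30))
print(log.check_exposure({"a3f9"}, datetime(2020, 4, 5)))  # True
```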

Google and Apple collaborated on the technology and made it available to governments scrambling to find ways of limiting the spread. In a paper published in Science, the prestigious academic journal, a team of Oxford University big-data experts modelled the potential of ‘digital contact tracing’ and concluded that ‘a contact-tracing app that builds a memory of proximity contacts and immediately notifies contacts of positive cases would be sufficient to stop the epidemic if used by enough people, in particular when combined with other measures such as physical distancing and widespread testing.’ In interviews, the principal investigators predicted that widespread deployment of these apps would massively reduce transmission, prevent resurgence, and help limit the social and psychological impacts of lockdowns (‘Controlling Coronavirus Transmission’ 2020). The research was backed by the Bill and Melinda Gates Foundation and the Wellcome Trust (Ferretti et al. 2020).
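
The qualifier ‘if used by enough people’ hides unforgiving arithmetic: an exposure is only detected when both the infector and the exposed person carry the app, so the fraction of transmission pairs covered scales roughly with the square of uptake. A toy calculation – a crude simplification for illustration, not the Oxford team’s actual method, with assumed values for the reproduction number and the preventable fraction – shows why low uptake was fatal to the idea:

```python
# Toy model: coverage of transmission pairs scales with uptake squared,
# because both parties must have the app installed. R0 and the
# preventable fraction f are assumed values for illustration only.

R0 = 2.5  # assumed basic reproduction number
f = 0.8   # assumed share of onward transmission preventable by instant notification

for uptake in (0.2, 0.4, 0.6, 0.8):
    pair_coverage = uptake ** 2
    r_eff = R0 * (1 - f * pair_coverage)
    verdict = "declining" if r_eff < 1 else "still growing"
    print(f"uptake {uptake:.0%}: pairs covered {pair_coverage:.0%}, "
          f"R_eff = {r_eff:.2f} ({verdict})")
```

Even at 80 per cent uptake, these assumptions leave the effective reproduction number above one – consistent with the paper’s own caveat that the apps needed to be combined with physical distancing and widespread testing.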

While Science’s contact-tracing paper was downloaded over 21,000 times and governments around the world hustled to promote these apps, the miracle of automated contact tracing did not materialize. Whether due to technical glitches or an insufficient number of users, the hype around contact-tracing apps ebbed almost as swiftly as it crested. A few assessments, in the U.K. and Switzerland, found some positive results, but the proportion of the overall population that had downloaded the apps remained relatively low, limiting the effectiveness of a technology that seemed, well, overdetermined. By the end of the pandemic’s first year, few health officials were paying attention to these apps, and many people had deleted them (Lewis 2021).

Google’s community mobility reports met a similar fate. After a burst of publicity, and the occasional media report showing how traffic patterns had changed, these data visualizations didn’t appear to be doing much to assist public health officials, who were much more focused on collecting actionable data, such as differences in infection rates between wealthier and lower-income neighbourhoods, or the accessibility of vaccines.

One of the few academic studies to cite the mobility reports, a 2021 paper on the effectiveness of stay-at-home orders published in the American Journal of Public Health, used them in its analysis but cautioned that Google’s data came with significant caveats: it didn’t register movement by people who’d turned off location tracking or weren’t carrying their phones, and it lacked sufficient granularity to provide a crisp picture of how stay-at-home orders were affecting different neighbourhoods or socio-economic groups.

While neither of these technologies was developed as part of a smart city agenda, they had all the hallmarks: novel (and unanticipated) uses for big data and mobile devices within an urban context, pressed into action to support a public service. Though neither contact-tracing apps nor Google’s community mobility reports had an explicitly urban focus, these ‘solutions’ were oriented toward the ways in which people move around and encounter one another in public spaces and dense city regions. They were, in short, heavily hyped smart city technologies, and both failed to deliver when the chips were down.

Technologies falter and die, or, more commonly, are upstaged by better solutions. Some never gain traction: Betamax video. 8-track tapes. Zeppelins. Sony Discmans. Gas-powered refrigerators. Landfills are filled with stillborn examples. It seems likely that we can add contact-tracing apps to this list, and possibly other more complicated smart city technologies that have struggled, or will struggle, to lift off, like fully autonomous vehicles or hyperloops.

There’s nothing wrong with this picture; innovation requires imagination and a knack for problem solving, but also an appetite for risk. Sometimes great ideas don’t pan out as expected, and, as every solid scientist will agree, those failures provide crucial information and insights about the path forward in any field of discovery. In other cases, emergent technologies – holograms, for example – seem to be solutions in search of problems.

The global pandemic, however, presented the world with problems of almost unfathomable complexity, and these were not, of course, limited to cities. But the residents of city regions experienced the pandemic in very particular ways; among the many societal, economic, and public health responses to COVID-19 were an ever-widening family of technologies that sought to confront the urban experience of the crisis. Some involved digital solutions, but others drew on older civil and chemical engineering technologies, and scientific insights traceable to lethal pandemics and infectious disease outbreaks from much earlier periods. What’s more, a few of the smartest solutions had nothing to do with technology at all.

One of the most compelling, and least gimmicky, involved ‘wastewater surveillance,’ which has nothing to do with the kinds of surveillance condemned by technology critic Shoshana Zuboff. Early in the pandemic, a handful of local public health agencies – Ottawa was an early mover – decided to begin systematically testing sewage for minute traces of the COVID-19 virus. Epidemiologists knew by that point that infected people ‘shed’ fragments of the virus through their feces, and do so before they experience symptoms.

Bits of COVID-19 in poop, in other words, represented a potential early-warning system. Across a population, those traces, if measured in a timely and accurate way, could let public health officials know about pre-symptomatic community transmission before people began showing up in long lines, feeling lousy and waiting to be tested. When this kind of sampling was conducted for an entire city region – i.e., the whole so-called ‘sewershed’ – the results could provide public health agencies with a short but valuable head start for putting in place additional precautions, e.g., ordering limits on visitors to seniors’ homes. The results were also more comprehensive than clinical testing because they captured asymptomatic infections and infections among people who didn’t bother getting tested.
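
In operational terms, the early-warning signal is a time series of viral concentrations, with an alert raised when the trend turns sharply upward. Here is a minimal sketch of that flagging logic with invented numbers – real programs normalize readings by sewage flow or fecal-indicator levels and use far more robust statistics:

```python
# Minimal sketch of wastewater early-warning logic: compare the mean
# of the most recent days against the preceding days and flag a
# sustained upward trend. All numbers and thresholds are illustrative.

daily_copies_per_litre = [120, 110, 130, 125, 150, 190, 240, 310]

def rising_trend(series, window=3, threshold=1.25):
    """Flag if the mean of the last `window` days exceeds the mean
    of the preceding `window` days by more than `threshold` times."""
    recent = sum(series[-window:]) / window
    prior = sum(series[-2 * window:-window]) / window
    return recent > threshold * prior

if rising_trend(daily_copies_per_litre):
    print("Alert: community transmission may be increasing")
```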

This approach drew on generations – even centuries – of advances in microbiology, chemistry, and virology, as well as the generations of technical innovations that enable modern labs to not only detect, but ‘read,’ the tiniest fragments of genetic material. Yet the foundation of wastewater surveillance – the ‘platform,’ to use the tech world’s term – is the hard infrastructure designed by civil engineers through trial and error during the eighteenth and nineteenth centuries in response to two of the most basic needs of rapidly growing cities: fresh water and sewage disposal, both provided at scale.

Wastewater infrastructure has evolved massively, of course: wood drains were replaced by brick-lined sewers, which in turn have been eclipsed by concrete. Unlike in nineteenth-century London, raw sewage is now treated before being discharged into water bodies. And in many newer parts of cities, drains handling stormwater (i.e., rain and runoff) and sanitary sewage (from toilets) function as separate networks, with differing treatment requirements.

The notion of testing wastewater for evidence of infectious disease is not new, dating back, interestingly enough, to the years immediately following the introduction of the polio vaccine in the 1950s. Polio was a highly infectious and much-feared childhood disease, and research to find a vaccine culminated with Jonas Salk’s discovery of an effective agent in 1953. As with the COVID-19 vaccines, its introduction involved some significant obstacles. At least one of the early polio vaccine brands, made by a firm called Cutter, produced alarming side-effects (paralysis) in some patients and prompted the U.S. government to set up a system of intensive regulation and oversight (‘Historical Vaccine Safety Concerns’ 2020).

As the polio vaccine rolled out, epidemiologists wanted to know whether it was actually reducing the incidence of the disease, and developed techniques for testing sewage samples for residual evidence of the virus. (One study found that a single toilet flush containing poliovirus could be detected at a nearby treatment plant for more than four days [Sinclair et al. 2008].)

The sewage data confirmed that an effective vaccination campaign did produce measurable declines, tracked across a city or community. Over the following several decades, however, municipal works departments took over the practice of sampling sewage, using the tests to determine treatment levels and whether an excess of raw sewage was finding its way into water bodies, rather than providing data for public health purposes.

As the opioid epidemic gained momentum, public health officials in some jurisdictions revived the practice of wastewater surveillance, testing raw sewage to determine trace levels of these drugs. They could then triangulate the sampling data with other locally sourced information – e.g., the volumes of properly prescribed opioids – to gauge how much of the drug was entering a community through illicit channels. The sewage sampling on its own also provided public health authorities with real-time information on the extent of an epidemic of addiction that reached into every corner of society. These initiatives in particular prompted health authorities to begin looking for traces of the COVID-19 virus in sewage when the pandemic began in early 2020.

Testing sewage for traces of virus is the proverbial needle-in-a-haystack challenge, and it is not especially automated – at least not yet. Samples of wastewater have to be physically removed from sewer mains or drawn from holding tanks at wastewater treatment plants, then shipped to a lab for testing. Treatment plants are fitted out with equipment that automatically draws samples, but epidemiologists who worked on this approach were also interested in collecting ‘upstream’ data – for example, samples taken near hospitals or long-term care homes – to gain more granular information on transmission that might be taking place in these kinds of settings. Obtaining such samples, however, meant prying open manhole covers and inserting tubes into the pipes running beneath the streets.

The epidemiologists pushing the frontiers of wastewater surveillance are also drawing on insights about human behaviour, the nature of the labour force, and the extent of sewer infrastructure. A 2021 study published by the U.S. National Institutes of Health pointed out that morning samples seem to produce the best results, because that’s when most people use the bathroom. But, the authors cautioned, such tests might miss shift workers, who were among the groups more vulnerable to COVID-19 exposure due to the nature of their work, inflexible employers, or other factors. (People living in rural areas with septic tanks are also absent from the data.)

‘Prioritizing vulnerable communities could catch upticks in transmission early and prevent health care systems from being overwhelmed,’ recommended the authors, all scholars at Mathematica, a 1,600-employee data analytics firm in Washington, D.C. ‘As transmission begins to decelerate, sampling might prioritize key transmission nodes, such as international airports, shipping ports, travel hubs, and public gathering spaces’ (Keshaviah et al. 2021).

Besides the mechanics of sampling and testing, the other complication that emerged had to do with bureaucratic silos. In Toronto, for example, works officials charged with running sewage treatment plants didn’t have much involvement with public health officials. To create an effective system, the two entities would need to work together to gather, transport, and test samples, so information about changing viral loads could be pressed into service. In sum, the potential benefits of this kind of testing depended on forging new bureaucratic connections, additional funding, and more lab capacity. The promise of this approach, in other words, didn’t just involve repurposing data that already existed in some giant server farm; the innovation extended to new approaches that municipalities had to adopt in order to deliver service.

The potential benefits, however, proved to be so compelling that governments in many parts of the world began investing significant sums to transform wastewater surveillance from an academic exercise into an operational reality. In the U.S., the Centers for Disease Control and Prevention announced in 2020 its plan to establish a national wastewater surveillance system involving state and local partners. As Nature reported in 2021, the number of such projects grew from a few dozen in 2020 to over two hundred a year later, the vast majority in affluent countries. The approaches varied widely, from the testing of sewage in commercial airliners to samples gathered near apartment buildings, correctional facilities, and campuses (Kreier 2021).

A Boston-based biotech firm, Biobot Analytics, developed ‘sampling kits’ for workplaces and pitched its product to employers: ‘Return to the office with confidence,’ its marketing pitch read. ‘Analyzing your building’s sewage generates powerful health data. Biobot tells you how to leverage that information to keep your employees safe and productive.’ Yet the company, founded by two MIT scientists, has also conducted scientific analyses of the advantages and limitations of the various approaches. ‘As it becomes an important pillar of pandemic and epidemic preparedness and response systems,’ a Biobot study published in 2021 concluded, ‘wastewater-based epidemiology can support public health policy as an established surveillance tool for emerging infectious diseases and biological threats’ (Sharara et al. 2021).

Yet some data and privacy experts counsel caution about these technologies, for reasons that will be familiar to anyone who’s been paying attention to the thorny ethical questions raised by the deployment of digital sensors in public space – especially if these systems are used to identify other trace substances in wastewater. For example, samples taken very close to specific buildings or communities could yield data used to profile those areas, notes a 2022 study on the ethical, legal, and civic implications of the ‘datafication of wastewater.’ The authors, the University of Ottawa’s Teresa Scassa and Toronto Metropolitan University planning scholar Pamela Robinson, point out that because wastewater surveillance depends on municipal infrastructure, decisions around how, where, and what to sample need to be considered more broadly and inclusively than is presently the case. As they conclude, ‘An otherwise harmless social necessity – the disposal of human waste – has become an opportunity for increased technological surveillance’ (Scassa et al. 2022).

In the conclusion of their own evaluation of wastewater surveillance, the Mathematica research team included a shout-out to a pair of health practitioners from a very different era, pioneers whose insights continue to inform public health practice.

One was Florence Nightingale, the British nurse who tended to soldiers in military hospitals during the Crimean War, in 1854. An amateur mathematician, Nightingale collected statistics from those hospitals on the causes of death of the patients and published them as novel and highly influential data visualizations. They showed that soldiers were far more likely to die from infections picked up in hospital settings than from the injuries they sustained on the battlefield.

The other reference was to Dr. John Snow, a London physician who found himself treating the victims of a cholera outbreak that had killed over five hundred people in Soho around the same time. Cholera, long considered to be spread through ‘miasma,’ or foul-smelling air, had been an urban scourge for generations, yet had become more pronounced with the crowding that accompanied industrial-era urbanization. Snow began marking the location of the homes of victims on what came to be known as ‘the ghost map’ – a seemingly modest form of data visualization that revolutionized epidemiology. He supplemented his findings by knocking on doors and talking to the people who lived and worked in the afflicted neighbourhood.

Gradually, two curious details emerged from his fact-finding: one, that many of the victims lived or worked near a water pump over a well on Broad Street; and two, that workers at a nearby brewery seemed less likely to fall ill or die. As Snow probed further, he realized the victims had depended on water from that Broad Street pump, which turned out to have been located close to several leaking underground privies. As for those brewery workers, they tended to drink beer while on the job, not well water from the pump. Armed with his data, Snow recommended to local authorities that they remove the handle of the Broad Street pump. That decision, immortalized in the annals of urban public health, halted the outbreak.

‘Taking lessons from those who battled pandemics in centuries past, we see the need to take a big, bold approach to adapting and advancing our infrastructure for disease surveillance now, while the crisis window is open,’ the Mathematica team wrote. ‘If we do so strategically, with a view toward the next epidemic, we may revolutionize public health once more.’

The public health narrative that connects these breakthroughs from the middle of the nineteenth century to the early twenty-first century and the pandemic tells a story about how a handful of scientific and technological advances made urban life safer and more livable, and thus set the stage for the mass urbanization that occurred after World War II.

Some of those discoveries had nothing to do with cities at all. One of the most consequential occurred in Vienna during the late 1840s. Ignaz Semmelweis, a Hungarian obstetrician working in a Viennese hospital, observed unusually high rates of mortality among the women in one particular ward. When he began to probe further, he observed that physicians and medical students who were treating those patients had often come from the morgue, where they had been performing autopsies. By contrast, women in the wards staffed by midwives had very low mortality rates. Though the scientific understanding of bacteria was still in its infancy, Semmelweis reasoned that the physicians had something on their hands that they transmitted to their patients. ‘Wash your hands,’ he told them, and that simple, non-technological requirement profoundly altered the spread of infectious diseases (Flynn 2020).

Then, in the early 1860s, two scientists, Louis Pasteur, from France, and Robert Koch, from Germany, confirmed that microbes transmitted some food- and water-borne diseases – the so-called ‘germ theory.’ Pasteur, who also invented some early vaccines, including the rabies vaccine, developed a method for killing harmful bacteria in wine and beer. Known as pasteurization, the technique involved rapidly heating and then cooling the liquids to prevent them from spoiling. A few years later, a German scientist figured out how to pasteurize milk, a process soon commercialized in Germany. A chemist, in turn, invented equipment for treating bottled milk. Though it met with some resistance, including from some scientists, the practice of milk pasteurization rapidly spread through the dairy industry in northern Europe and Scandinavia, and eventually reached North America, where activists urged dairies and local authorities to adopt the technique (Currier & Widness 2018).

In some cities, including Toronto, some public health leaders zealously advocated pasteurization to rein in local dairies that diluted their milk with contaminated water – a practice that contributed to high infant mortality rates. Dr. Charles Hastings, the city’s crusading medical officer of health from 1909 to 1929, ‘engaged in friendly persuasion, modern publicity methods (such as publishing lists of “first-class dairies” in the monthly Health Bulletin), and hard-nosed negotiating,’ according to public health historian Heather MacDougall. ‘His efforts ensured that by October 1915 … the city’s well-inspected and bacteriologically tested supply had become a model for other urban centres. The proof of the effectiveness of these changes was a sharp decline in outbreaks of milk-borne disease and a drop of roughly one third in the number of infant deaths’ (MacDougall 2018).

A parallel development, and one that had a more direct relationship with municipal infrastructure, involved chlorination for disinfecting the water supply. Though chlorine was isolated as a chemical in 1810, chemists and municipal engineers only began experimenting with the compound as a potential disinfectant around the turn of the century – first, experimentally, in Louisville, Kentucky, and then in Belgium and in municipalities like Jersey City, which became the first place in the U.S. to continuously add chlorine to its drinking water, stored in the 700-acre Boonton Reservoir, located about 40 kilometres northwest of New York City (American Water Works Association 2006).

The introduction of chlorine led to dramatic declines in the prevalence of water-borne illnesses such as typhoid, dysentery, and cholera by the 1920s. Typhoid mortality in the U.S. fell from thirty deaths per 100,000 in 1900 to virtually zero by 1940. Yet the use of the chemical had unanticipated consequences. Chlorine interacted with some organic compounds in water to produce traces of a by-product chemical identified early on as a potential carcinogen. By the 1970s, federal laws in the U.S. and elsewhere required local water authorities to ensure not only a disinfected water supply, but also one free of harmful secondary substances. As the Centers for Disease Control and Prevention put it, the advent of chlorination marked ‘one of the ten greatest public health achievements of the twentieth century’ (‘History of Drinking Water Treatment’ 2012).

Amid the mounting public concern about the environment that welled up during the 1960s following the publication of Rachel Carson’s 1962 bestseller, Silent Spring, governments began to pass parallel laws regulating air pollution, lead additives in gasoline and paint, smokestack emissions, hazardous waste disposal, and, in places like England, the use of coal for home heating. These laws were typically national in scope, but their public health impacts tended to be urban – e.g., focused on reducing smog in big cities or countering respiratory illnesses afflicting children living near highways or downwind from factories.

Not coincidentally, the regulatory changes spurred investment in a wide array of engineering-driven technologies, from more fuel-efficient engines to scrubbers, high-efficiency gas furnaces, recycling facilities as alternatives to municipal incinerators, and remediation techniques for heavily contaminated brownfields. None of these innovations were driven by information and communications technology or anything that might now be described as smart city systems, yet they all contributed hugely to the quality of urban life.

Strangely, there was one very large category of shared space that did not benefit from the waves of laws aimed at improving public health: the interior atmosphere of buildings. Jeffrey Siegel, a University of Toronto professor of civil engineering, describes indoor air quality (IAQ) as the poor cousin of the world of sustainable architecture – an ‘incredibly neglected’ issue in terms of public health policy, despite the fact that most people spend the lion’s share of their time inside, and increasingly in sealed environments (Lorinc 2020).

Stale recycled air in hermetically sealed office buildings contributes to headaches, drowsiness, and colds. Landlords or property managers frequently neglect the maintenance of HVAC equipment. Those problems are, in turn, exacerbated by the widespread use of synthetic fabrics for carpets or upholstery. These fossil fuel–based materials emit volatile organic compounds and give rise to a condition dubbed ‘sick building syndrome.’ In residential dwellings, poorly ventilated or damp homes and apartments are susceptible to mould and contribute to respiratory ailments like asthma or allergies. And in the most extreme example, odourless radon gas, which is sequestered in certain types of soil, leaks into basements, causing respiratory illnesses and lung cancer.

However, beyond smoking restrictions and general health and safety standards for industrial workplaces, Canada doesn’t really regulate indoor air quality, although other jurisdictions are more activist. ‘One reason EPAs (in many countries) did not focus on indoor air was that the indoor environment, especially in the home, was thought of as a private concern with which governments should not interfere,’ says Jan Sundell, one of the world’s leading IAQ researchers, who notes that serious building science research on IAQ only began in the 1980s (Sundell 2017).

The obscure work of building science researchers was amplified by a surge of media coverage. ‘Indoor Air Pollution’ was the alarmist headline on an extensive 1986 Washington Post feature. The building industry began to respond. In the years prior to the pandemic, HVAC companies were seeing a general increase in private demand for products that claim to improve IAQ, including more robust filters for residential furnaces, air purifiers that use ‘germicidal’ ultraviolet light to kill airborne microbes, and so-called ‘bipolar ionization’ air cleaners, which are installed in HVAC equipment, although the efficacy of this latter technology remains controversial.

Yet COVID-19 radically altered the IAQ market. Early in the pandemic, the prevailing scientific consensus was that the virus spread through contact with contaminated surfaces and through droplets expelled when individuals cough, sneeze, or speak loudly – conclusions that informed public health directives about mask wearing and social distancing. But as the pandemic progressed, a growing number of public health experts began to find evidence that the virus spread via airborne aerosols that were much smaller and lighter than droplets, and that circulated in closed environments, from elevators to movie theatres.

The response, in many jurisdictions, involved both low-tech and higher-tech solutions, although there were none that relied specifically on AI, big data, or the other tools in the smart city arsenal. In terms of the former, public health officials directed institutions such as school boards and child care centres to significantly boost existing ventilation rates, so that air in closed environments would be replaced much more frequently than had been the case pre-pandemic. Private building owners were also encouraged to take these measures.

As for the more technologically advanced measures, some jurisdictions began requiring institutions such as school boards to install so-called HEPA (for ‘high efficiency particulate air’) filters. These devices, which capture 99.97 per cent of airborne particles at the hardest-to-trap size of 0.3 microns, are commonly used in operating rooms and industrial settings such as computer-chip manufacturing.
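
The benefit of adding a portable HEPA unit to a room is often expressed in equivalent air changes per hour (ACH): the unit’s clean-air delivery rate divided by the room’s volume, added to whatever ventilation the room already receives. A rough worked example, with purely illustrative figures:

```python
# Combining existing ventilation with a portable HEPA unit, expressed
# as equivalent air changes per hour (ACH). All figures illustrative.

room_volume_m3 = 6 * 8 * 3      # a 6 m x 8 m classroom with a 3 m ceiling
ventilation_ach = 2.0           # assumed existing mechanical ventilation
hepa_cadr_m3_per_h = 400        # assumed clean-air delivery rate of the unit

hepa_ach = hepa_cadr_m3_per_h / room_volume_m3
total_ach = ventilation_ach + hepa_ach
print(f"HEPA unit adds {hepa_ach:.1f} ACH; total = {total_ach:.1f} ACH")
```

On these assumptions, a single unit more than doubles the room’s effective air exchange – the kind of improvement pandemic-era guidance for classrooms was aiming at.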

These devices have a curious backstory, because the technology, though sophisticated, is not new. HEPA filters date back to the 1940s, invented as part of the Manhattan Project, the top-secret U.S. nuclear weapons R&D venture. The filters were intended to protect the thousands of employees working in a secret facility from inhaling radioactive particles. Their design centres on a tangled weave of ultra-fine fibres made from fibreglass or other substances that trap some of the most minute particles in the air.

Until the second year of the pandemic, HEPA filters were specialized HVAC equipment; suddenly, they were available at hardware stores and through building supply chains. Governments distributed thousands to schools and other congregate settings. Although estimates vary, market researchers projected that the HEPA filter sector could expand at a compound annual growth rate of 7.4 per cent through 2026 – a surge few saw coming before COVID-19.

It’s worth noting that niche approaches to sustainable/energy-efficient building design, which considerably predate the pandemic, relied heavily on significant investments in ventilation technology as part of an overall philosophy for reducing carbon and providing fresh air in indoor environments. Some of these, in turn, involved ideas that date back to the earliest forms of architecture – designs that allow for windows that open and the virtuous use of cross-drafts for cooling, features that fell by the wayside in so much twentieth-century architecture.

What’s more, some smart city tech companies had also identified IAQ as a potential market, developing specialized IoT sensors that were, in turn, integrated into large smart building systems that also controlled lighting, energy consumption, heating and cooling, etc.

Yet it took a global crisis – and not fads from green building architecture or the smart city industry – to change public thinking about indoor air. In that regard, COVID-19’s wake-up call evokes another city-transforming public health disaster: the Great Smog of London in December 1952, which killed around 12,000 people. In the aftermath, the British government enacted tough restrictions on the use of coal for home heating and electricity generation, as well as imposing strict air pollution standards on industrial emitters. As the New York Times commented, ‘The Great Smog is considered a turning point in environmental history’ (Nagourney 2003).
