3

Planning Smart Cities

In 2018, after years spent working in Toronto’s development and infrastructure sectors, Monika Jaroszonek and Erin Morrow decided to open a small consulting shop, Ratio City, whose mission was to inject some twenty-first-century data visualization and computer modelling magic into the practices of municipal planning.

Anyone who has worked in land-use planning understands that it is a highly bureaucratic profession tasked, unenviably, with taming or at least containing the torrents of capital that flow in and through cities, reshaping the way they look, function, and are experienced.

Planners both produce and consume thickets of geographical information: official plans, maps, zoning regulations, land-use policies, surveys, schematics, just to name a few. They have to understand everything from land-use economics to transportation demand patterns to the reasons why pedestrians may not follow a paved path through a park. In most cities, planners serve as a kind of buffer between developers, architects, and landowners on one side, and the general public and local politicians on the other. They exist in the modern world because, in the past, the messy realities of cities became intolerable, inspiring both radical politics and utopian reforms – an iterative dynamic that persists to this day in planning fads like the fifteen-minute city movement.

As an architect who dealt extensively with planners, Jaroszonek was interested in how complex planning policies contribute to built form. Morrow, a planner and Ratio City’s Chief Product Officer, wanted a better feel for ‘the underlying system, the bones of the city,’ as he put it to me in 2021. ‘Those are really hard problems. You need a lot of data, and it’s shocking how unavailable it is.’

They combined huge existing databases of public information to create a suite of web-based mapping software that layers regional and local land-use policies and zoning rules for every address, as well as density allowances, property lines, geographic features, and even the locations of heritage buildings. They fed all this location-specific detail into a software system programmed to create highly granular maps and 3-D schematics of any neighbourhood, allowing users to game out how various types of projects may alter a community.

Ratio City’s platform – a quintessential smart city technology – generates nifty visualizations, checks to see if projects conform to existing zoning bylaws, and estimates how changes in the number of proposed units, for example, might affect the form of an apartment complex. It also allows users to swoop around these models electronically, as if they were video games, studying projects from different angles and scales.
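The bylaw-conformance check described above can be sketched in a few lines of code. Everything here – the zoning fields, the limits, and the sample numbers – is hypothetical and vastly simpler than a real platform, which would pull these values from layered GIS data for each parcel:

```python
from dataclasses import dataclass

@dataclass
class ZoningRule:
    # Hypothetical, simplified zoning envelope for one parcel
    max_height_m: float
    max_density_fsi: float   # floor space index: gross floor area / lot area
    max_coverage: float      # fraction of the lot the footprint may occupy

@dataclass
class Proposal:
    lot_area_m2: float
    footprint_m2: float
    storeys: int
    storey_height_m: float = 3.0

def check_conformance(rule: ZoningRule, p: Proposal) -> list[str]:
    """Return a list of human-readable violations (empty = conforming)."""
    issues = []
    height = p.storeys * p.storey_height_m
    if height > rule.max_height_m:
        issues.append(f"height {height:.1f} m exceeds limit {rule.max_height_m:.1f} m")
    fsi = (p.footprint_m2 * p.storeys) / p.lot_area_m2
    if fsi > rule.max_density_fsi:
        issues.append(f"density {fsi:.2f} FSI exceeds limit {rule.max_density_fsi:.2f}")
    if p.footprint_m2 / p.lot_area_m2 > rule.max_coverage:
        issues.append("lot coverage exceeds limit")
    return issues

# An eight-storey proposal on a lot zoned for mid-rise
rule = ZoningRule(max_height_m=20.0, max_density_fsi=3.0, max_coverage=0.5)
proposal = Proposal(lot_area_m2=1000, footprint_m2=450, storeys=8)
for issue in check_conformance(rule, proposal):
    print(issue)
```

The value of a real platform lies less in rules like these than in assembling the correct inputs for every address, but the basic logic – compare a proposal against a parcel’s zoning envelope and report the gaps – is the same.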

Eventually, Jaroszonek and Morrow want to layer on estimates of how projects will affect traffic or transit use. Because it draws on historical data, the company’s software can depict trends in how neighbourhoods have changed in response to new land-use policies. However, there are no plans to use it to predict future trends. ‘It offers a complete view of what’s going on,’ says Jaroszonek. Adds Morrow: ‘We can see all these different policies impact a particular site.’

The firm’s clients are developers, but its goal is to remove public ambiguity in the approvals process by providing residents and municipal officials with accessible depictions, showing how their neighbourhoods will change. ‘A lot of the problem in the development process is information asymmetry,’ says Morrow. ‘People assume they’re not being told everything.’

Ratio City has a formidable competitor: Esri, an international firm that virtually invented spatial analytics and the wide array of geographical information systems (GIS) mapping software tools employed by planners working for governments, large corporations, and entities like busy container ports.7 The company – which is based outside Redlands, California, and has offices around the world – describes its service as ‘the science of where.’ Its website even contains a gallery of ‘maps we love.’

Founded in the U.S. in 1969 by Jack and Laura Dangermond, Esri is one of those rare tech companies to have successfully navigated the rapids and shoals of the computer industry. ‘Esri is now being used by some 350,000 businesses, government agencies and NGOs around the world who collectively create some 150 million new maps every day,’ according to a 2015 profile in Forbes magazine. ‘Over the years, the company embraced computer workstations, PCs, servers, the Web and mobile devices. As it surfed wave after technological wave, Esri also managed to defy predictions that it would be crushed by Google, whose multi-billion dollar investments in “geo” technology made it synonymous with digital mapping.’

Since the early 2000s, the company has effectively locked up the Canadian public sector market for GIS software, which is used for all sorts of applications, from emergency management to mapping municipal infrastructure and, during the COVID-19 pandemic, mapping the trajectory of the outbreak and then vaccinations. Another example: online interactive mapping tools that show the locations and dates of pedestrian and cyclist fatalities and injuries over time, and that can be used to support traffic safety improvements, such as traffic calming, speed reductions, or the installation of bike lanes. The idea is that this kind of time-series data should not only inform decision-making, but also make it possible to determine whether the actions taken have produced a benefit in terms of road safety.
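The evaluation step mentioned above – checking whether an intervention produced a road-safety benefit – amounts, in its simplest form, to a before-and-after comparison of collision rates. The records, dates, and corridor below are invented for illustration; a real analysis would also control for traffic volumes and random year-to-year variation:

```python
from datetime import date

# Hypothetical collision records for one corridor: (date, severity)
collisions = [
    (date(2015, 3, 1), "injury"), (date(2015, 9, 12), "injury"),
    (date(2016, 5, 4), "fatality"), (date(2016, 11, 2), "injury"),
    (date(2018, 6, 30), "injury"),
]

# Assume a bike lane was installed at the start of 2017
def yearly_rate(records, start_year, end_year):
    """Average collisions per year over an inclusive range of years."""
    years = end_year - start_year + 1
    count = sum(1 for d, _ in records if start_year <= d.year <= end_year)
    return count / years

before = yearly_rate(collisions, 2015, 2016)  # 4 collisions over 2 years
after = yearly_rate(collisions, 2017, 2018)   # 1 collision over 2 years
print(f"before: {before:.1f}/yr, after: {after:.1f}/yr")
```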

As a 2021 City of Toronto staff report noted in its recommendation to extend the company’s licence, ‘Esri software is used extensively across the City for business support and service delivery. Other vendors in the market do not have the same software service offerings that meet the scale and complexity of the City’s use’ (City of Toronto 2021).

The software developed by Esri and Ratio City can be described as ‘smart’ in the sense that these companies – and others like them – have devised innovative ways to compile and visualize technical data so it can be used to support policy or operational decisions that run the gamut from land-use planning to public health.

These kinds of analytical tools have become essential for cities seeking to deal with the fallout of other digital technologies that have severely disrupted city planning.

The case of Airbnb offers a vivid illustration. Originally conceived as a convenient app that would provide travellers with a broader array of accommodation choices, the evolution of Airbnb – and its imitators – revealed the staggeringly disruptive power of the digital platform economy. At its pre-pandemic peak, Airbnb was driving developers’ marketing plans, stoking the financialization of the global housing market, accelerating gentrification, and distorting, in some cases, local retail scenes in ‘authentic’ neighbourhoods that had become destination hot spots – a dynamic fuelled by social media and amplified by the viral spread of traveller selfies. The traffic incentivized merchants, landlords, and retailers to reorient themselves to tourists with money to spend, thereby pricing out their traditional customers.

In some destination cities, like Barcelona, municipal officials banned or severely limited Airbnb because it was destabilizing older working-class neighbourhoods. In other places, such as Toronto, city officials were forced to respond to the rise of the so-called ‘ghost hotel’ – apartment buildings overrun by units purchased for short-term rentals, some of which served as party palaces. Before COVID struck, the platform’s popularity also ratcheted up rents as condo owners and landlords discovered they could earn more in a month with short-term stays than from leases signed with long-term tenants.

With municipalities struggling to get ahead of these market distortions, Esri has promoted its 3-D mapping tools to visualize how changes in short-term rental rules and zoning could impact affordability. In a 2019 blog post, the company demonstrated how its platform could generate graphics depicting how these regulatory changes could impact development in Honolulu, a city that was struggling to figure out how to protect housing affordability given the runaway popularity of short-term rentals. The post explained,

According to a 2015 report, 66,000 housing units will be needed in Hawaii by 2025 to meet demand, with nearly 26,000 of those dwellings required in Honolulu. A number of proposed zoning changes aim to address this looming housing deficit. [These] include restrictions on the square footage of residential units to combat monster homes; an easing of height restrictions on low-rise apartments to allow five-story walk-ups rather than the existing three-story limits; and a proposal allowing home owners to build and rent accessory dwelling units.

Interestingly, Esri is working both sides of the short-term rental street; the company also promotes another of its software products, ArcGIS, to investors looking to purchase Airbnb apartments in optimal locations. As it explains in a 2020 ‘storymap’ posted on its website, the company scoped out a potential strategy that someone with $1 million to invest might undertake with its software. Using Yelp data on restaurants, GIS information on proximity to desirable neighbourhoods, and rental rate data from a website called Inside Airbnb, Esri’s software could direct clients to optimal locations in San Diego, which, it noted, was ‘the most profitable city for 2-bedroom listings.’

The short-term rental sector is just one of the balls that city planners must now juggle as they seek to manage the so-called ‘wicked problems’ of twenty-first-century urban regions. Migration and the provision of affordable housing are dramatically out of sync. Sprawl and sprawl-fuelled congestion drive up both the cost of living (due to driving expenses for homeowners) and the cost of municipal services, all while pumping carbon into the atmosphere. Yet the moral imperative to confront climate change using planning tools such as intensification and green building techniques is easier said than done, and meets resistance in multiple forms.

Orbiting all these more policy-oriented matters are tenacious questions about uprooting the income inequality and racism that became so embedded in municipal governance; intense debates over the importance of architecture, urban design, and green space in civic life; and deeply conflicting views on how to accommodate children and seniors in a bustling metropolis. In the 1960s, Henri Lefebvre, the radical urban philosopher, asked: Who has a right to the city? We must also ask: Who plans? Who governs? And what constitutes sustainable, equitable planning in a complex urban world that’s constantly being redefined by capital, migration, and climate change?

These are the problems that pile up relentlessly on the desks of city planners.

Even before the advent of state-of-the-art mapping software, municipal planners had to synthesize all sorts of information: census data, maps, surveys, real estate values, statistics on communicable diseases, and so on. Transportation planners conducted traffic counts or built statistical models predicting road usage. Transit authorities used data such as labour market statistics as well as economic models to test the relationship between fares and ridership rates to make decisions about new routes and frequency of service, among other things. According to MIT science and technology historian Jennifer S. Light, planners learned how to combine census and household survey data with aerial photography and military satellite sensing to map growth, housing conditions, and urban ‘blight.’

Beginning in the 1960s, municipalities began investing in costly mainframe computer systems to perform tasks like monitoring traffic. By the mid-1970s, Light wrote in her 2003 book, From Warfare to Welfare: Defense Intellectuals and Urban Problems in Cold War America, analysts working for the City of Los Angeles were developing mathematical models that combined information from databases of digitized aerial images, census statistics, and building inspection reports to make predictions about future housing development scenarios – a precursor of today’s smart city analytics.

Cities gradually started adding more devices or systems capable of providing a wider array of real-time data, including automatic licence plate readers, closed-circuit TVs, and intelligent transportation systems (ITS) that could automatically adjust traffic lights depending on the volume of traffic sensed by electronic loops, or sensors, in the pavement. Local utilities added electronic energy meters. Transit agencies adopted smart cards instead of tickets and tokens. These devices were generating rivers of data.

In 2010, following a devastating landslide that left hundreds dead and thousands homeless, municipal officials in Rio de Janeiro realized they’d have to be better prepared for intensely disruptive events in anticipation of the 2014 World Cup and the 2016 summer Olympics. As the Center for Public Impact (CPI), a think tank supported by the Boston Consulting Group’s foundation, noted, the transit system barely worked, violence was rampant, and the tens of thousands of poor residents living in the favelas perched on the slopes of the steep hills surrounding Rio were particularly exposed to natural calamities, especially landslides. Citing the leadership of then mayor Eduardo Paes, CPI explained, ‘The city needed an initiative which would help it bring together information concerning the environment, transport, crime and medical services to establish a sense of coordination and control.’ In short, an operations centre.

A New York Times correspondent visited the newly built hub in 2012 and sent back an admiring dispatch on the eve of Carnaval. ‘City employees in white jumpsuits work quietly in front of a giant wall of screens – a sort of virtual Rio, rendered in real time,’ Natasha Singer wrote. ‘Video streams in from subway stations and major intersections. A sophisticated weather program predicts rainfall across the city. A map glows with the locations of car accidents, power failures and other problems.’

The facility had been constructed by IBM at Paes’s request and was touted as a game changer for complex cities like Rio. ‘There is nothing quite like it in the world’s other major cities,’ Singer noted. ‘I.B.M. has created similar data centers elsewhere for single agencies like police departments. But never before has it built a citywide system integrating data from some 30 agencies, all under a single roof.’ The technology, she added, would dismantle the silos between municipal departments because the operations centre’s analysts could use all sorts of software tools to combine real-time data about the city and even predict how development patterns might influence future disasters. As Paes told the Times, the act of combining data would help the whole city.

But there was another more corporate objective at play, one that anticipated, by about five years, the goals that Sidewalk Labs had when it launched its ill-fated Toronto smart city venture. IBM was positioning the Rio project as a kind of calling card, a highly integrated and presumably costly investment that could benefit other cities looking to make themselves smarter. The market opportunity seemed huge and had become an integral part of IBM’s strategy to boost global revenues to US$150 billion.

Rio’s operations centre caught the eye of Rob Kitchin, a professor of geography at the National University of Ireland, Maynooth, who discussed it in a widely cited study entitled ‘The real time city? Big Data and smart urbanism.’ Kitchin,8 who has become one of the world’s leading authorities on smart cities, noted that similar hubs had cropped up in New York, Dublin, and London, which had created a dashboard. The dashboard, he explains, is a data visualization tool that ‘tracks the performance of the city with respect to twelve key areas – jobs and economy, transport, environment, policing and crime, fire and rescue, communities, housing, health, and tourism.’ Unlike Rio’s facility, the dashboard data isn’t updated in real time.

Kitchin pointed out that the network of operations centres, apps, and dashboards can be regarded as a new species of urban information-gathering system in that they collectively ‘provide a powerful means for making sense of, managing and living in the city in the here-and-now, and for envisioning and predicting future scenarios.’ The benefit for municipal planners in particular is that these technologies enabled evidence-based decision-making. ‘Rather than basing decisions on anecdote or intuition or clientelist politics or periodic/partial evidence,’ he observed, ‘it is possible to assess what is happening at any one time and to react and plan accordingly.’

There is some truth to this conceptualization of the utility of real-time municipal data for planning purposes. For example, when city officials can track cycling activity using apps installed on cyclists’ smart phones, they can ‘see’ where bike lanes are used and needed. Similarly, if transportation or transit planners can track daily traffic or ridership volumes over an extended period, using data from cellphone signals or tap-on/tap-off fare cards, they can add service or identify areas experiencing increases in work-related car trips. Such insights could lead to planning that informs infrastructure and private investment, as well as choices about programming public spaces.

Planners are not relying solely on municipally generated data and analytics, either. Numerous planning apps have also emerged, such as Walk Score, which rates neighbourhood walkability in cities around the world. The website Inside Airbnb ‘scrapes’ address, rate, and other host details from Airbnb’s main site, cross-references this information with housing and rental market data, and then maps it all. The site was created by a handful of New York City affordable housing activists. Visitors can see the density and locations of Airbnb units in any given neighbourhood in any city. The site, in effect, is a data visualization tool that gives planners and residents valuable housing market information (and policy insights) into phenomena such as condo towers that have become overrun by short-term rental investors and ghost hotel operators.

Advances in computing power and coding tools, as well as the long-anticipated maturation of artificial intelligence, are creating entirely new ways of leveraging data in order to observe what’s happening in a city. For example, the City of Stockholm, through a research partnership with MIT and Sweden’s KTH Royal Institute of Technology, has set up a project to install solar-powered sensors on buses, garbage trucks, and taxis to gather data on noise, air, and road quality. ‘Building an opportunistic sensing platform that can be deployed and configured on-demand,’ the investigators say, ‘we provide cities with denser spatiotemporal data about the urban environment, enabling decision-making and fostering public engagement on environmental issues.’

Some of the most compelling case studies in the use of data for planning involve the combination of new forms of technology and non-digital ways of thinking about the quality of urban spaces.

In the early 1960s, Danish architect and planner Jan Gehl began meticulously documenting pedestrian activity in a new car-free zone in central Copenhagen to prove to area merchants that they weren’t going to lose business. Gehl’s ‘public life surveys’ tracked pedestrian and cyclist activity, bench usage, sidewalk café seating, and so on, with the results painting a picture of how and when people used their streets. Those surveys were carried out by volunteers, not machines, and thus were grounded in subjective observations about the rhythms of city life.

In the late 2000s, New York City hired Gehl to conduct similar surveys and analysis on Times Square and several of Broadway’s skewed intersections. The street-life surveys revealed a conspicuous dearth of younger and older pedestrians – a detail non-video sensors wouldn’t pick up – while an analysis of the chronically congested intersection showed the road allowance occupied almost 90 per cent of all the open space in the Square.

In 2008, NYC’s transportation commissioner, Janette Sadik-Khan, used Gehl’s findings to order a radical remake of Times Square, closing large segments of the road and creating public spaces fitted out with tables and chairs. The model has been replicated elsewhere in the city, reclaiming hundreds of thousands of square feet of space in Manhattan from traffic.

In Toronto, the King Street Transit Pilot Project, which launched in 2017, offered a similarly compelling example of how city officials succeeded in integrating technology and planning judgment to improve public services and public space. In 2015, the city set up a big data innovation team to tease out insights from information generated by electronic traffic counters, cycling apps, vehicle detectors, and other sources that produced continuous flows of digitized transportation information.

The plan envisioned significantly restricted private vehicle use on King Street, a downtown arterial, in order to improve streetcar service. The city’s data analysts used low-resolution cameras installed in traffic signal controllers at intersections to monitor pedestrian and vehicle volumes, and then drew on anonymized Bluetooth signals from smart phones to calculate how much time riders spent on streetcars traversing the area. Project officials also tracked daily revenues through point-of-sale payment devices to assess how declines in vehicle traffic impacted King Street businesses.
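The Bluetooth travel-time technique described above boils down to matching anonymized device identifiers at two detection points and differencing the timestamps. The identifiers and times below are invented, and hashing is one plausible anonymization approach rather than a description of Toronto’s actual system:

```python
import hashlib
from statistics import median

def anonymize(device_id: str) -> str:
    # Hash the identifier so raw device addresses are never stored
    return hashlib.sha256(device_id.encode()).hexdigest()[:12]

# Hypothetical detections: {hashed_id: timestamp_seconds} at the two
# sensors bounding the corridor
entry = {anonymize("aa:01"): 100, anonymize("aa:02"): 130, anonymize("aa:03"): 200}
exit_ = {anonymize("aa:01"): 820, anonymize("aa:02"): 790, anonymize("aa:04"): 900}

# Match devices seen at both ends, then compute per-trip travel times
times = [exit_[h] - entry[h] for h in entry.keys() & exit_.keys()]
print(f"median corridor travel time: {median(times) / 60:.1f} minutes")
```

Using the median rather than the mean keeps one device that lingered in a café from skewing the corridor estimate.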

The city published a monthly dashboard of key metrics to demonstrate changes in travel times, cyclist and pedestrian activity, and commerce. Restaurants, in turn, were allowed to build partially enclosed patios extending into the street – a move that laid the groundwork for the city’s CaféTO pandemic program, which allowed scores of eateries to expand into cordoned-off street-parking spaces.

The metrics affirmed the experiences of commuters, residents, and local businesses: that streetcars were moving faster, pedestrian and cycling activity was up, and merchants hadn’t seen the drop in business some had feared. In 2019, council voted to make King’s transit priority corridor permanent.

As with downtown Copenhagen and Times Square, the King Street project illustrated how planners and analytics experts can make innovative uses of granular urban data in order to deliver city-building goals, and that it is possible to do so without compromising privacy or directing scarce funds to expensive smart city tech firms.

Over the course of the decade separating Rio’s decision to install its cutting-edge operations centre and Sidewalk Labs’ decision to abandon its Toronto venture, the narrative about the novel uses of digitized urban data changed. Increasingly, it seemed as if the technology tail was wagging the planning dog.

In some cases, smart city solutions to planning problems have been dramatically oversold, both by municipal officials and tech giants. In 2012, Eduardo Paes told the New York Times, ‘We want to put Rio ahead of every city in the world concerning operations of daily life and emergency response.’ Rio did manage to deliver both the World Cup and the Olympics, but IBM’s much-touted operations centre did not lead to reductions in crime, poverty, police violence, or income inequality, according to a Rio-based watchdog group set up to monitor social conditions in advance of the 2016 games.

In other places, the tech solutions just seem somewhat far-fetched. The party district of downtown Eindhoven, a tech hub of 220,000 people in the southeastern Netherlands near the German border, is a case in point. The area, a pedestrian-only street known as the Stratumseind, is about 225 metres long, 15 metres wide, and crammed with bars. On some nights, as many as 15,000 people stream into this strip, and fights often break out; rising violence had become a significant public safety issue.

In 2015, Tinus Kanter, an Eindhoven municipal official, began working on a smart city solution to the safety issues on that stretch. In partnership with Stratumseind businesses, police, and the lighting giant Philips, which is headquartered in Eindhoven, the city transformed the stretch into a ‘living lab’ with a range of technologies designed to drain some of the negative energy out of Saturday-night revelries.

Video cameras were installed at each end to track how many people were entering or leaving the area, without capturing facial images. The project team then developed and installed a series of audio sensors programmed to detect aggressive sounds, while an off-site AI algorithm scanned and interpreted social media for posts that mentioned Stratumseind or contained geotagged images of the strip. Software devised by the city and its tech partners combined these data streams and sent red flags, including to the police, when trouble was detected. Depending on the signals received about the crowd’s behaviour, lighting provided by Philips shifted to softer hues meant to calm things down.
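The kind of data fusion used on the Stratumseind can be sketched as a simple rule-based combiner. The thresholds, scores, and responses below are invented for illustration; the actual system’s logic has not been published in this form:

```python
from dataclasses import dataclass

@dataclass
class StreetReading:
    crowd_count: int         # from the entry/exit cameras
    aggression_score: float  # 0-1, from the audio sensors
    social_flags: int        # geotagged posts flagged by the off-site model

def assess(r: StreetReading) -> tuple[str, str]:
    """Combine the three streams into an alert level and a lighting cue."""
    score = 0
    if r.crowd_count > 10_000:       # near the strip's peak capacity
        score += 1
    if r.aggression_score > 0.7:     # audio classifier hears trouble
        score += 2
    if r.social_flags > 5:           # unusual volume of flagged posts
        score += 1
    if score >= 3:
        return "alert police", "soft warm hue"   # de-escalation lighting
    if score >= 1:
        return "monitor", "dimmed"
    return "normal", "standard"

print(assess(StreetReading(crowd_count=12_000, aggression_score=0.8, social_flags=2)))
```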

Kanter comes by his interest in crowd control honestly: before joining the civil service, he ran a heavy-metal music festival. He stresses that the city insisted on ‘privacy by design,’ so the systems do not capture personal information. The municipality also took more conventional steps, adding planters, terraces, and seating to break up the space. ‘What I see now is that the street is becoming nicer and more open,’ says Kanter, who adds that Eindhoven has been carefully tracking project data. ‘We think that gathering numbers is a good thing because [they] provide scientific proof.’

However, what the data shows in terms of safety isn’t especially clear. Kanter insists there’s less fighting, although he can’t prove the lighting and the sensors are the reason. Albert Meijer, a University of Utrecht professor of public innovation who has studied Stratumseind, says the technology alone didn’t markedly improve safety. What did change, he adds, is that media coverage of the area shifted from focusing on the brawling to focusing on the devices, which, in turn, has attracted municipal delegations from around the world, and which may have been the point all along. ‘Philips,’ he says, ‘wanted to show its new street lighting to sell around the world.’

Other municipalities and smart tech firms have gone even further with the use of external sources of social media data and analytics. An Israeli company called Zencity, for example, developed a software-based smart city service using AI. Its platform gathers and assesses citizen feedback on local planning matters – a process it calls ‘sentiment analysis.’ The feedback comes from online surveys as well as social media chatter, tourist ratings, complaints to the municipal 311 lines, etc. ‘It’s representing the voices of the silent majority,’ says former Zencity manager Nir Zernyak. The company presents its system as a decision-making tool for municipal politicians and officials.
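At its crudest, sentiment analysis can be done with a word list: score each comment by counting positive and negative terms. The tiny lexicon and sample comments below are invented, and Zencity’s actual system would use trained language models rather than anything this simple, but the sketch shows the basic shape of turning unstructured feedback into numbers:

```python
# Tiny illustrative lexicon; a production system would use a trained model
POSITIVE = {"great", "love", "support", "safe", "helpful"}
NEGATIVE = {"noise", "unsafe", "oppose", "traffic", "worried"}

def sentiment(comment: str) -> int:
    """Positive-word count minus negative-word count."""
    words = set(comment.lower().split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

feedback = [
    "I support the new bike lanes, they feel safe",
    "worried about traffic and noise from the development",
    "love the park upgrade",
]
scores = [sentiment(c) for c in feedback]
net = sum(scores)
print(f"scores={scores}, net sentiment={net}")
```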

Zernyak cites an Oregon example. Beaverton, a suburb of Portland, has attempted to manage so-called ‘car camping’ by setting up two ‘safe parking’ locations for homeless people who live in their cars. The city or partner agencies help the users to access social services. In its marketing materials, Zencity claims its ‘actionable, data-based insights’ revealed that Beaverton may have to change its zoning rules to allow for these sites and that not all neighbourhoods wanted one – conclusions that may not have required this kind of outsourced data-crunching.

Perhaps the most contentious source of new urban data comes from sensors designed to monitor and manage the use of public spaces. Often, these types of applications seem benign. In New South Wales, for example, Street Furniture Australia, an industry group, recruited academic planners and landscape architects to evaluate street furniture fitted out with wirelessly connected sensors that perform tasks such as monitoring when a trash can needs to be emptied or how park benches are used. Other applications included park-based workstations equipped with wifi and USB ports, as well as tables that can be booked in advance via a smart phone app.

The data generated from these ‘smart social spaces’ is aggregated on a ‘smart asset management dashboard’ that municipal officials use to monitor how these hubs are used. The idea behind the pilot, explains Nancy Marshall, a University of Sydney planning professor who is part of the evaluation team, is to find ways to encourage people to use public spaces, but the group also wants to conduct ‘behaviour mapping.’ She says none of the sensors gather personal information.

How this intel gets used is an open question. Information that flows from park bench or picnic table sensors could prompt planners to add amenities if heavy traffic is indicated. But it’s not difficult to imagine less positive applications. For example, if the data shows a lot of late-night traffic, local residents worried about crime might use the information as fodder to press municipal officials to remove benches, or as tips for police to increase patrols. Marshall stresses that the data from the pilot projects isn’t shared with NSW law enforcement officials, but such assurances hardly guarantee that the municipalities that eventually purchase these systems will be as restrained.

New York University planning and urban analytics expert Constantine Kontokosta offers another caution. Trash bin sensors designed to monitor when a container needs emptying could, in theory, provide data that allows city officials to apply algorithms to optimize collection routes, by using GPS mapping tools to direct trucks only to full bins, thus saving money on fuel and labour.
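The route optimization Kontokosta describes can be sketched with a greedy nearest-neighbour heuristic: visit only the bins reported as full, always driving to the closest one next. The bin locations, fill levels, and threshold below are invented, and real routing software would use road networks and more sophisticated solvers:

```python
import math

# Hypothetical bins: (x_km, y_km, fill_fraction) from sensor reports
bins = [(0.0, 1.0, 0.9), (2.0, 0.5, 0.3), (1.0, 2.0, 0.8), (3.0, 3.0, 0.95)]
DEPOT = (0.0, 0.0)
THRESHOLD = 0.75  # only visit bins reported at least this full

def dist(a, b):
    # Straight-line distance; a real router would use the street network
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Greedy nearest-neighbour route over the full bins only
to_visit = [(x, y) for x, y, fill in bins if fill >= THRESHOLD]
route, here = [], DEPOT
while to_visit:
    nxt = min(to_visit, key=lambda b: dist(here, b))
    route.append(nxt)
    to_visit.remove(nxt)
    here = nxt
print(route)
```

The half-full bin is skipped entirely, which is where the fuel and labour savings come from – and also where the friction Kontokosta flags arises, since the resulting routes change daily rather than following fixed, negotiated rounds.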

But in a 2018 paper in the Journal of Planning Education and Research, Kontokosta writes that such analysis might come into conflict with other municipal goals and practices, such as the need to abide by collective agreement rules. ‘The computing challenges are solvable,’ he notes. ‘[T]he real uncertainty lies with how to integrate data-driven processes into public sector management.’

7. Originally conceptualized by Ottawa geographer Roger Tomlinson, GIS are densely layered digital maps that contain a wide range of information associated with a particular place – natural features, buildings, boundaries, infrastructure, businesses, land-use and zoning rules, census data, aerial photos, pollution sources, etc.

8. Kitchin maintains a clearing house of scholarship on smart cities, available here: https://progcity.maynoothuniversity.ie/2022/01/smart-city-cases-reading-lists
