Planning April 2015


Leveraging the potential of big data for planning.

By Mary Hammon

Nine miles south of downtown Chicago, the derelict site of the former U.S. Steel South Works plant has served as a reminder of the nation's industrial revolution — and for the last 20 years, its decline. The steel plant — once America's largest — produced steel beams for some of Chicago's famous skyscrapers and employed 20,000 area residents at its peak.

But the days when smokestacks were a sign of progress and prosperity are gone. Today, on the site of this former gem of the industrial revolution, the developers and designers of an innovative 589-acre project called Chicago Lakeside are using big data to help create a new age of urban prosperity on Chicago's South Side.

They are not alone in their quest to leverage big data to improve urban environments and the quality of life of the people who live in them. Planners, designers, developers, and citizens in cities from coast to coast — and across the world — are putting big data to use.

Bigger than the Chicago Loop, the Chicago Lakeside Development — the vision of developer Dan McCaffery — has been called a "micro-city," "city within a city," and "city of the future."

At build-out, which will likely take decades, the $4 billion Chicago Lakeside community could contain more than 500 buildings, including housing for 50,000 people, 17.5 million square feet of retail space, and a new high school, plus a 1,500-slip boat marina and 125 acres of open space and parks with bike paths to reconnect Lakeside and surrounding neighborhoods to Lake Michigan. An extension of Lake Shore Drive and new commuter rail and bus services will link Lakeside and vicinity to downtown Chicago.

Also envisioned for Lakeside is what the developer and design firm Skidmore, Owings & Merrill call next-generation infrastructure: clean energy sources, state-of-the-art internet technology, and storm- and wastewater management strategies and green technologies that will help the community achieve a net-positive water balance.

But turning a nearly 600-acre brownfield into a sustainable, mixed-use community means thinking outside the box. Developed through a collaboration between the University of Chicago, Argonne National Laboratory, SOM, and McCaffery Interests, a new computational tool called LakeSIM is taking planning and data visualization to a whole new level. LakeSIM uses data — and lots of it — to run millions of detailed simulations, allowing designers to model the complex relationships between elements of urban design such as energy, transportation, waste and water infrastructure, climate, and other environmental factors.

Watching these systems interact over time will ultimately help designers and planners to determine the size and shape of the development's infrastructure, which is currently nonexistent. "The scale and complexity of the Lakeside project called for the creation of an innovative new decision-making tool for designers, developers, and infrastructure engineers to manage risk, holistically evaluate phasing and cost, and provide all involved a comprehensive look at what seems like an infinity of variables. Only big data and advanced computation can do this kind of work," says Doug Voigt, AICP, the director of urban design and planning for SOM.
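The kind of exhaustive scenario sweep described above can be sketched in a few lines of Python. This is a toy illustration only: the design variables, coefficients, and energy model below are invented for the example and bear no relation to LakeSIM's actual simulations.

```python
from itertools import product

def annual_energy_demand(units, retail_sqft, efficiency):
    """Toy model: annual demand in MWh from housing units and retail
    floor area. Coefficients are illustrative, not Lakeside figures."""
    return units * 3.5 * efficiency + retail_sqft / 1000 * 12 * efficiency

def simulate(scenarios):
    """Sweep every combination of design variables and score each one."""
    results = []
    for units, retail_sqft, efficiency in scenarios:
        demand = annual_energy_demand(units, retail_sqft, efficiency)
        results.append({"units": units, "retail_sqft": retail_sqft,
                        "efficiency": efficiency,
                        "demand_mwh": round(demand, 1)})
    return results

# A small design space: housing units x retail square feet x efficiency
design_space = product([10_000, 25_000], [5_000_000, 17_500_000], [0.8, 1.0])
runs = simulate(design_space)
best = min(runs, key=lambda r: r["demand_mwh"])  # lowest-demand scenario
```

A real urban-scale model would sweep far more variables (phasing, climate, water balance), but the structure is the same: enumerate scenarios, evaluate each, and compare outcomes.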

Voigt notes that big data offers quantitative support (statistics, models, maps, and simulations) that planners can present to city officials, developers, and others to show how they reached their decisions, which he believes is extremely valuable.

The big deal about big data

Planners are comfortable with data — and have always used numbers to inform planning decisions. According to the Chicago Architecture Foundation's big data exhibition (on display at the organization's headquarters since May 2014), when creating the 1909 Plan of Chicago, Daniel Burnham and Edward Bennett gathered massive amounts of data on the city and used the information to inform their design decisions.

By today's standards, however, the data and tools for analysis traditionally used in planning are relatively "small." While sample population or transportation data allows planners to draw broader conclusions about people who live in a city and their habits, large datasets known as big data give far more detailed information about a larger portion of a population, which can then lead to better decisions regarding the needs of a particular community or the urban population at large.

But what exactly is big data?

As a society, we are producing and capturing more data than ever before: 2.5 billion gigabytes a day in 2012, according to IBM. It's estimated that 90 percent of the world's data has been produced in the last few years. And this resource is only getting bigger. Experts predict that the total amount of data in the world will more than double every two years.

Massive amounts of data are being generated by sensors, cell phones, tablets, GPS devices, retail rewards cards, and credit cards — as well as digital interactions like swiping a public transit card, texting, emailing, watching and posting online videos, shopping online, and using Facebook, Twitter, and YouTube.

Welcome to the world of big data. Simply defined, it is an accumulation of data that is too large and complex for processing with traditional database management tools. What really sets big data apart, for planners, is the variety and complexity of the data and the speed at which the information is collected, which can allow for near- or real-time analysis.

"[With big data] there is more precision available," says Joel Gurin, CEO and founder of the Center for Open Data Enterprise and author of the book Open Data Now. "[Big Data] is very timely because it's collected constantly. ... It gives you a much, much more precise tool to manage the things that are important to a city."

As a tool, big data offers significant benefits for planners and others in the business of planning, managing, and improving cities:

  • More detailed data for enhanced, data-driven decisions.
  • A more efficient way to manage resources and evaluate existing programs and policies.
  • Near- or real-time analysis that allows users to see what's happening in cities (e.g., transportation) second by second.
  • Quantitative support in the form of statistics, models, maps, and simulations that planners can show city officials, developers, and others to demonstrate how and why they came to certain decisions.
  • New ways to cross-analyze and visualize datasets to gain new insights and identify new relationships.
  • Enhanced public-private partnerships to obtain, aggregate, analyze, and apply data to cities in ways that benefit all parties involved and ultimately improve the quality of urban life.

"Big data lets you see new things and justify design decisions in ways that you were never able to before," says Matt Shaxted, a computational designer with SOM.

The roads most (and least cycled) — this visualization of Oregon Strava data illustrates which roads in the Portland metro region were traveled the most (red) and least (blue) by cyclists over a 12-month period. Image courtesy Strava.

'Up-cycling' big data in Oregon

Home to one of America's most bike-friendly cities (Portland), the state of Oregon actively promotes and supports its bicycling culture. But when it came to expanding and improving the state's bicycle infrastructure, the Oregon Department of Transportation faced a challenge: insufficient data to know how Oregon cyclists were actually getting around.

While bicycle counts provide a snapshot of what happens at one location for a short period, they don't provide data about bicyclist behavior or bicyclists' "unofficial routes" and shortcuts. To solve this lack-of-data problem, ODOT purchased a repackaged dataset from Strava, a competitive cycling app that enables cyclists to track their miles and times, map routes, compete with other users, and share their activities with friends. The data contained information on routes taken by 35,000 Oregon cyclists.

Gathering that data on its own would have been too formidable — and too expensive — a task for ODOT, says Talia Jacobson, the agency's active transportation policy lead. "[Using others' data] is a tremendously cost-effective way to get a really rich statewide dataset," she says.

Using Strava data, ODOT can identify patterns of use across the state, instead of just numbers of riders riding over a particular counting strip. "Strava can tell us where there are routes that cut through paths that riders are taking that aren't part of our models of the system," says Jacobson.
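The pattern Jacobson describes, aggregating many individual rides into per-segment usage counts, can be sketched simply. The route and segment names here are hypothetical; real Strava data would arrive as map-matched GPS traces rather than labeled segments.

```python
from collections import Counter

def segment_counts(rides):
    """Count how many rides traversed each road segment.
    `rides` is a list of routes, each a list of segment IDs (toy data
    standing in for map-matched cycling GPS traces)."""
    counts = Counter()
    for route in rides:
        counts.update(set(route))  # one ride counts a segment once
    return counts

rides = [
    ["broadway", "river_path", "bridge"],
    ["broadway", "shortcut_alley", "bridge"],
    ["broadway", "river_path", "bridge"],
]
counts = segment_counts(rides)
# Segments used by riders but absent from the official network model
# would surface here with nonzero counts:
unofficial = [seg for seg, n in counts.items() if seg == "shortcut_alley"]
```

Mapping the counts onto road geometry (red for heavy use, blue for light) yields exactly the kind of visualization shown in the Strava image above.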

While ODOT is still in the early stages of analyzing the data, it has already put the information to work. So far, the data has helped inform decisions about ODOT's placement of three permanent bike counters on Oregon's scenic bikeways, confirming the placement of two and making the agency rethink placement of the third. In the third instance, Strava data indicated that cyclists were avoiding a particular section of the highway, which would have led to underrepresented counts.

The Strava data is also making ODOT rethink the placement of rumble strips on certain roads. While rumble strips improve auto safety, they can be unsafe for cyclists. Jacobson says the Strava data is showing ODOT which roads are popular with cyclists — leading to better decisions about spots where rumble strips could ensure the safety of both motor vehicle drivers and cyclists.

ODOT wants to enlist big data in another way as well. In partnership with Portland State University, ODOT is working with an app called ORcycle that allows users to record their trips and report locations where they experience crashes, near collisions, or other unsafe or uncomfortable conditions, all of which are often underreported, according to Jacobson. The app and data were the products of a pilot project, and ODOT is currently looking at several possible next steps.

Fighting blight in Detroit

After getting its start mapping Detroit's foreclosure auction catalog, Loveland Technologies, a Detroit-based software data company, is using big data and technology to help fight the blight that has long plagued the Motor City.

Loveland's latest creation, Motor City Mapping, is a critical component of the city's $1.5 million antiblight campaign. The initiative started in September 2013, when the Obama administration convened the Detroit Blight Removal Task Force to develop a way to remove every blighted structure and clear every blighted vacant lot in the city — and to get everything done as quickly and in as environmentally conscious a manner as possible.

According to Dexter Slusarski, senior data manager for Loveland Technologies and an urban planner, the task force couldn't tackle the problem until it understood its entire scope. That would mean collecting information on every building and vacant lot within the 142 square miles of the city. That's just what they did. With funding from The Kresge Foundation, Skillman Foundation, JPMorgan Chase Foundation, Michigan State Housing Development Authority, and Rock Ventures, Loveland created Motor City Mapping to capture, store, and visualize all the data collected from the property surveys.

In December 2013 and January 2014, a smartphone app created by Loveland enabled 150 volunteers to survey Detroit's 375,000 land parcels over the course of about 30 working days by driving around, photographing properties, and texting photos and descriptions to a database in a process called "blexting" (blight + texting). Any Detroit resident can download the Blexting app and take an active role in surveying neighborhoods and keeping the dataset up to date.

Once blexted, the data undergoes a quality review by Loveland staff and is posted to the Motor City Mapping website, resulting in a rich and usable online interactive map and dataset that allows a variety of stakeholders and citizens to understand the city — down to the neighborhood and street level — and take action as never before.
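A field-survey record of this kind can be sketched as a small data structure with a quality-review flag. The field names and review step below are illustrative assumptions, not Motor City Mapping's actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ParcelSurvey:
    """One 'blext': a field-survey record for a single parcel.
    Field names are illustrative, not the project's real schema."""
    parcel_id: str
    condition: str        # e.g., "good", "fair", "poor"
    occupied: bool
    photo_path: str
    surveyed_at: datetime = field(default_factory=datetime.now)
    qa_reviewed: bool = False  # flipped after staff quality review

def pass_quality_review(record: ParcelSurvey) -> ParcelSurvey:
    """Mark a record as reviewed so it can be published to the map."""
    record.qa_reviewed = True
    return record

record = pass_quality_review(
    ParcelSurvey("44-001-023", "poor", occupied=False,
                 photo_path="photos/44-001-023.jpg"))
```

Keeping the review flag on the record itself means the public map can filter to verified surveys while residents continue submitting new ones.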

Motor City Mapping has proven an excellent complement to Loveland's other data-gathering and visualization efforts, which include working with a wide variety of datasets from the Detroit Water and Sewerage Department, DTE Energy, and the United States Postal Service. Loveland worked with its partner, Data Driven Detroit, to help develop a "vacancy index" that combines these datasets.

The vacancy index, along with the Motor City Mapping survey information, gives city departments a deeper look into vacancy numbers. The city water department is using these datasets to help identify unoccupied properties or vacant lots with running water. The Detroit Land Bank is using the data to more efficiently target properties for demolition.
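A vacancy index of this sort might combine several binary signals into a single score. The signals and weights below are invented for illustration; the article does not describe how Data Driven Detroit's actual index is computed.

```python
def vacancy_score(water_active, energy_active, mail_delivered, surveyed_occupied):
    """Combine utility and survey signals into a 0-1 vacancy likelihood.
    Weights are illustrative, not the real index's values."""
    signals = {
        "no_water":      (not water_active, 0.30),
        "no_energy":     (not energy_active, 0.25),
        "no_mail":       (not mail_delivered, 0.25),
        "survey_vacant": (not surveyed_occupied, 0.20),
    }
    return round(sum(weight for flag, weight in signals.values() if flag), 2)

# All signals point to vacancy:
score = vacancy_score(False, False, False, False)    # -> 1.0
# A parcel that looks vacant but has running water is exactly the
# anomaly the water department would want to investigate:
anomaly = vacancy_score(True, False, False, False)   # -> 0.7
```

Cross-referencing the score against the survey data is what lets departments flag cases like running water on an unoccupied parcel.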

"The ramifications of having this dataset and updating it constantly are massive ... because the entire region now has a dataset that they can really point to and start to make smart decisions, analyze patterns, and have real conversations about things that are a reality as opposed to things that are [unsubstantiated estimates and assumptions]... . That kind of ability allows people to make really great decisions," says Slusarski.

As a resident of the city, and a planner, he is also excited about how the project is empowering citizens and neighborhoods to engage in the process and take action on problems.

Screenshots courtesy; photo courtesy Detroit Blight Removal Task Force.

Downsides of big data

While the planning potential of big data is substantial, and exciting, there are challenges as well:

PRIVACY, CONTROL, AND ACCESS. MIT researchers recently found that it requires only four data points (each containing the date and time of a purchase, and the location and name of the store) from a 30-day database of credit card purchases by 1.1 million people to identify 90 percent of them; even with just two data points, 40 percent of the people in the dataset could be identified.
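The MIT finding can be illustrated with a toy version of the same attack: intersect the set of people whose records match each known data point, and watch the candidate pool collapse. The names and transactions are made up for the example.

```python
from collections import defaultdict

def reidentify(transactions, known_points):
    """Return the person IDs whose records match every known point.
    Each transaction is (person_id, (date, store)). With enough
    points, the candidate set shrinks until one person remains."""
    index = defaultdict(set)
    for pid, point in transactions:
        index[point].add(pid)
    candidates = None
    for point in known_points:
        matches = index.get(point, set())
        candidates = matches if candidates is None else candidates & matches
    return candidates

transactions = [
    ("alice", ("3/12", "cafe")), ("alice", ("3/13", "grocery")),
    ("bob",   ("3/12", "cafe")), ("bob",   ("3/14", "pharmacy")),
    ("carol", ("3/13", "grocery")), ("carol", ("3/14", "pharmacy")),
]
# One known purchase leaves two candidates; a second singles Alice out:
who = reidentify(transactions, [("3/12", "cafe"), ("3/13", "grocery")])
```

In a dataset of millions, real spatiotemporal points are so distinctive that, per the MIT study, four of them suffice for 90 percent of people.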

As the use and analysis of big data increase and more city governments open their data to the public, the question of what information is made available will become even more of a concern.

"Open data is a good thing, but [governments] also collect a lot of private information that people don't perceive could become public," says Jennifer Evans-Cowley, AICP, a professor of city and regional planning with The Ohio State University. She points to smart parking meters as an example. "Could someone request that information and know who you are and where you are parking?" A big question to ask, especially in the wake of the movement to open government data to the public, she says, is to what degree information can be anonymized or not.

FALLIBILITY AND QUALITY OF DATA. Another risk is assuming all data is good data and all data-based conclusions are inherently accurate. "Big data can start to seem infallible or perfect," says Laura Schewel, CEO of Streetlight Data, a data analysis start-up that uses cell phone data to study how people use cities. "That is something smart mayors and planners should really be aware of. Biases in the data, errors in analysis, and low-quality datasets can all lead to incorrect or misguided conclusions."

For example, she says, big data may be biased against low-income people and non-native English speakers. Because big data relies heavily on smartphones and social networking, it has a tendency to underrepresent disadvantaged parts of the population. "If you leave those people out of your measurements, no matter what your intentions are, decisions will start to be biased against them," Schewel explains.

Therefore it is important to be aware of potential weaknesses and holes in the data being collected. To eliminate bias in her company's transportation data samples, Streetlight takes extra steps to collect and analyze data from all mobile phones, not just smartphones. "It's not easy," she says. "But we think it's really critical."
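One standard statistical correction for this kind of sampling bias is post-stratification reweighting, sketched below. This is a generic technique, not a description of Streetlight's method (the article says the company instead broadens collection to all phones), and the income groups and shares are invented for the example.

```python
def poststratify(sample_counts, population_shares):
    """Reweight observations so underrepresented groups count fully.
    Weight per group = true population share / observed sample share."""
    total = sum(sample_counts.values())
    return {group: population_shares[group] / (n / total)
            for group, n in sample_counts.items()}

# Smartphone-based sampling over-captures high-income travelers:
sample = {"high_income": 80, "low_income": 20}        # observed trips
population = {"high_income": 0.5, "low_income": 0.5}  # true shares
weights = poststratify(sample, population)
# Each low-income trip now counts 2.5x; each high-income trip 0.625x.
```

Reweighting only corrects for groups the sample sees at all; if a group is entirely absent, no weight can recover it, which is why broader collection matters.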

STRUCTURE AND COMPATIBILITY. For big data to be of any use, it has to be collected in a way that allows sorting and analysis. And in order to aggregate and cross-analyze datasets — another exciting potential of big data — those datasets must be compatible. Otherwise, the data is worthless. "City, state, and the federal governments have a lot of data, but it has been collected in often outmoded ways, technology has changed, it can be inaccurate — all kinds of different problems," says the Center for Open Data Enterprise's Gurin.

On the bright side, one thing that demonstrates the high value of that data, he adds, is that private companies are investing a lot of money, time, and energy to modernize the data and clean it up so they can use it. Moving forward, that means more thought and planning is needed to identify and implement consistent ways to collect and organize data.

LEARNING CURVE AND SKILLS GAP. Data skills, special software, and technical capacity are necessary in order to process, sort through, and analyze big data. According to Evans-Cowley, this is a limitation for more wide-scale adoption of its use in planning.

"While it would be great to use it, the average planner doesn't know how to go about obtaining cell phone data, the tools needed to analyze it, and what would you use it for," she says. "I think we're on the cusp of a lot of technology innovation. And following that I think we'll see all of the tools. And then there is the retraining [needed] for planners to learn the new skills ... to use and analyze the data that's out there."

A role for planners

Most of the players driving big data and the smart cities movement right now are computer scientists, data analysts, technology experts, and companies like IBM and Cisco. They know data and technology and have the ability to use and apply them.

But when it comes to using big data to positively impact cities and quality of life, planners have something the others don't: extensive, holistic knowledge of what makes cities work, their challenges, and the needs of the people who live in them.

"There's a trap of thinking about big data as being smart on its own," says Mary Sue Barrett, president of the Metropolitan Planning Council in Chicago. "Its potential really comes from the people — planners and more — who step back and ask good questions about what's there, what's missing, and what the potential is."

Kate McMahon, AICP, chair of the American Planning Association's technology division, agrees: "Big data is still in its infancy. It is a new, new field, and it's going to evolve and change a lot. ... There needs to be somebody at the table that is talking about the public interest. ... That is planners."

Mary Hammon is Planning's assistant editor.


Motor City Mapping tutorial:

Blexting app how-to: