Chicago and Big Data

Editor’s note: Nick Rojas is a business consultant and writer who lives in Los Angeles, Calif.

As a civilization, we may not be getting smarter. However, the technologies we use certainly are. Since the introduction of the smartphone, we’ve witnessed the emergence of smart homes, smart power grids, and even smart football stadiums that help fans find parking spaces. Lately, the trend has been to go big: infrastructure applications keep growing in size and scope, transmitting, collecting, and processing big data in an effort to help us better understand and navigate our surroundings.

Smart technology is now scaling up to entire “smart cities,” and Chicago is a prime example. The third largest city in the United States is poised to become a leading model for collecting and managing big data from sensor nodes and cell phones throughout the area.

That’s 237 square miles and 2.7 million residents participating (knowingly or not) in an experiment expected to yield some impressive enhancements. Examples include:

  1. Traffic control systems: improving the flow of traffic at intersections by detecting changes related to work and holiday schedules.
  2. Maintenance management: extending the life of fleet vehicles and other city equipment.
  3. Smart grids: adjusting power distribution in response to environmental metrics like temperature.
  4. Monitoring criminal activity: tracking car thefts, and even the stolen cars themselves, more efficiently.
  5. Public works: matching the storage, transport, and application of road salt to winter snow accumulation.

Chicago’s calling it “The Array of Things” (AoT). Temperature, humidity, light, sound, carbon monoxide, nitrogen dioxide, motion, low-resolution infrared, cell phone signals, and foot traffic (via Bluetooth) are among the metrics that will be used to better understand the city and the interactions among its residents and visitors. The plan calls for about 500 sensor nodes installed over a period of three years. The data, hardware, and software will be open source, which is expected to promote transparency and invite independent developers to come up with their own applications. No surveillance devices will be incorporated into the array.
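To make that concrete, here is a rough sketch in C++ of the kind of record a single node might report. The structure and field names are purely illustrative assumptions, not the AoT’s actual data format.

    #include <cstdint>
    #include <iostream>
    #include <string>

    // Hypothetical snapshot of one node's readings; every field name is illustrative only.
    struct NodeReading {
        std::string nodeId;      // made-up node identifier, e.g. "lakeshore-017"
        int64_t     timestamp;   // Unix time of the observation
        double      tempC;       // air temperature, degrees Celsius
        double      humidityPct; // relative humidity, percent
        double      lightLux;    // ambient light
        double      soundDb;     // sound pressure level
        double      coPpm;       // carbon monoxide concentration
        double      no2Ppm;      // nitrogen dioxide concentration
        int         pedestrians; // foot-traffic estimate (e.g. count of Bluetooth devices seen)
    };

    // Emit one reading as a CSV row, the sort of flat record an open data feed might publish.
    void printCsv(const NodeReading& r) {
        std::cout << r.nodeId << ',' << r.timestamp << ',' << r.tempC << ','
                  << r.humidityPct << ',' << r.lightLux << ',' << r.soundDb << ','
                  << r.coPpm << ',' << r.no2Ppm << ',' << r.pedestrians << '\n';
    }

    int main() {
        NodeReading sample{"lakeshore-017", 1409572800, 22.4, 61.0, 5400.0, 68.2, 0.9, 0.021, 134};
        printCsv(sample);
        return 0;
    }

A city-wide, open feed would simply be millions of rows like this, each tied to a node’s location and time of day.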


Of course, data alone has little value until it’s analyzed. This is where we see some overlap between AoT and more conventional models of processing data mined from the Internet. The Predictive Analytics Initiative and the SmartData Platform will be the avenues through which the raw data is processed into data-driven predictions for many public departments and services (and no, there won’t be minority reports or a pre-crime division of the Chicago Police Department).

In a press release, the city of Chicago stated that “Chicago’s SmartData Platform is a tool that will provide leaders the ability to analyze millions of lines of data in real-time; this helps make smarter, earlier decisions to address a wide range of urban challenges.”

How did we get here? The concept of utilizing big data to augment a city’s infrastructure required some technological evolution. When we include the communications technology that we now take for granted, we can appreciate AoT as an amalgam of seemingly countless technologies combined in a concerted effort to create an information infrastructure.

Once phones began to digitize the human voice, they expanded to include ways to collect, store, and process data, and the term “smart” was coined. The developments in integrated circuits, wireless communications, and software that make smartphones possible are represented by more than 250,000 patents (80,000 related to telecommunications connectivity alone).

As smartphone technology developed, so did sensors. Sensor nodes, mesh networks, and similar techniques for collecting and transmitting environmental data have been quietly maturing in fields such as agriculture. Those advances, along with open-source controllers like Arduino, have not only provided user-friendly, off-the-shelf building blocks for projects like AoT; they also make both the data and the means of collecting it accessible to the public, right down to the circuit board.

Go into any Radio Shack and you’ll see Arduino boards on the shelf, complete with peripherals and user instructions. That’s the kind of technology being used in this endeavor.
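As a rough illustration of how approachable that hardware is, here is a minimal sketch in the Arduino dialect of C++. It assumes a TMP36-style analog temperature sensor on pin A0 and a generic gas sensor on pin A1; both part choices are hypothetical, and this is a hobbyist example, not the firmware an AoT node actually runs.

    // Minimal Arduino-style sketch: sample two analog sensors and log them over serial.
    // Assumes a TMP36-style temperature sensor on A0 and a generic gas sensor on A1;
    // both parts are illustrative choices, not what the AoT nodes use.

    const int TEMP_PIN = A0;
    const int GAS_PIN  = A1;

    void setup() {
      Serial.begin(9600);                 // open the serial port for logging
    }

    void loop() {
      int rawTemp = analogRead(TEMP_PIN); // 0..1023 from the 10-bit ADC
      int rawGas  = analogRead(GAS_PIN);

      // Convert the TMP36 reading to Celsius (5 V reference, 10 mV per degree, 500 mV offset).
      float volts = rawTemp * (5.0 / 1023.0);
      float tempC = (volts - 0.5) * 100.0;

      Serial.print("temp_c=");
      Serial.print(tempC);
      Serial.print(" gas_raw=");
      Serial.println(rawGas);             // raw value; calibration would be needed for real units

      delay(60000);                       // one reading per minute
    }

Add a wireless module and a weatherproof enclosure and you have, conceptually at least, a distant cousin of the nodes going up around Chicago.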

Patents expire and enter the public domain, and as with other products, economies of scale show how advanced technologies mature and become more accessible to the public. IT has become so pervasive that kits offering open-source platforms to hobbyists, capable of supporting numerous applications, are now commonplace. Wireless communication and networking devices geared toward data acquisition are just as accessible. It’s actually within the budget of an individual with modest resources to build their own “smart home,” nearly from scratch.

This is an indication that the technology has matured to the point where local and state governments can build infrastructure on a much larger scale to collect big data across a broad array of data types. More importantly, it can be done with transparency, so that everyone has access to the data, hardware, and software. Anyone who wanted to could develop their own devices and use AoT data directly in ways that serve their own needs.

The least predictable aspects of smart city models are the policies that govern them. Will AoT be implemented in a way that explicitly adheres to the laws of the land? There may also be competition over where resources are directed and which problems should be solved first, if and when the project proves successful.

One size fits all? Whether Chicago will become a model for the world to follow remains to be seen. Every city is different, and so are its analytics. Dholera, India, for example, will not be tracking road salt as it pursues its own ambitious plans for a smart city.

No doubt, people interested in smart cities will be keeping a watchful eye on Chicago. It has one main attribute that is sure to promote public acceptance and even participation: an emphasis on open source that is not found in most cities, and that makes it an example for others as the concept matures and develops.

As trends point to ever-greater concentrations of people in urban areas, smart cities as an industry are projected to become a $1.5 trillion global phenomenon.

What about civil liberties? Relatively speaking, AoT is not much of a threat. Data types like temperature and gas composition are not exactly the stuff of Orwellian conspiracies.

We live in a time when most people are unaware of the privacy they voluntarily forfeit, or of how book burnings are made obsolete by simply erasing data from Google search results. The open-source Chicago model, if accompanied by robust policies and oversight, is a much smaller threat than what happens with cell phones every day. Raising the question in the context of this article is necessary, but in the grand scheme of things, there are certainly bigger fish to fry.

All things considered, we should look forward to a time when a mobile device can find a good parking space, or big data can reduce spending in government. It may be the first time in history that your government doesn’t ask you to wait in line at the motor vehicle department. OK…so that may be a stretch. In any event, it’s sure to change the way we think of cities.