Article | Transportation and Power Systems

Making sense of cities

How sensory technologies can help us understand urban systems and make them smarter

In more and more homes across America, sensors are taking over — from waking us up in the morning to locking our doors at night. But the impact of these technologies is not limited to our own lives and the mundane tasks we do every day — they reverberate across cities throughout the country.

We expect a lot from these cities – they need to keep us safe, provide services, conserve energy and stay clean. And in the same way that sensors help us monitor conditions in our homes, they can help cities track their important systems, including transportation, energy and the environment.

By capturing detailed information about these various systems — such as the amount of pollution in the air, the number of cars on the road and the amount of energy a particular building uses every hour — sensors serve as useful tools for problem-solving and decision-making for city planners, scientists and policymakers as well as residents.

At Argonne, researchers are using sensory technologies to create research platforms that can help stakeholders understand how their cities are performing. These efforts unite research experts in energy, transportation, buildings, computing and machine learning, and leverage a wide range of sensing devices.

“The purpose of developing exploratory sensing research platforms is to enable others to perform research and collect insights previously unavailable due to lack of information,” said Argonne Center for Transportation Research (CTR) engineer Eric Rask.

“For example, if a research or municipal group is looking for data regarding freeway conditions that are not currently accessible, we can work to create an exploratory platform with a set of sensors that attempts to get that information in a new or more effective way.”

Learning from connected and autonomous vehicles

The Laboratory’s toolkit includes sensors common among vehicle technologies, including Light Detection and Ranging (Lidar), radar and camera vision systems. Used for vehicle automation, situational awareness and collision avoidance, these emerging systems help vehicles detect objects in the environment, like pedestrians, traffic lights, lampposts and surrounding cars. These technologies help form the foundation of driverless cars, a technology projected to spread across cities in years to come.

“With automated cars, you need to be aware of a lot of things, not just the vehicle in front of you but also where the traffic lights are, where the lines on the road are, what else is behind you and more,” Rask said. “As a result, the sensors in these vehicles are gathering lots of information about the surrounding environment.”

Rask and fellow CTR engineers are experimenting with these suites of sensors with two motives in mind. The first is to learn what it takes to make automated driving possible, and more efficient: understanding how the sensors are configured and how they work together as a system to enable control of the vehicle and provide information for improved operation. The second is to figure out what other opportunities these sensor suites enable beyond autonomous driving.
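To make that cooperation concrete, here is a deliberately tiny Python sketch of the pattern: a camera contributes object labels, lidar contributes ranges, and the fused picture feeds a simple control decision. Every class, label and threshold below is hypothetical, for illustration only; a production driving stack is vastly more sophisticated.

```python
# Toy illustration of fused sensing informing vehicle control.
# All names and thresholds are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # what the camera's vision system saw
    range_m: float      # distance reported by lidar
    bearing_deg: float  # angle off the vehicle's heading

def brake_needed(detections, stop_range_m=20.0):
    # Brake if anything that matters is roughly ahead and close.
    return any(
        d.label in {"pedestrian", "car", "red_light"}
        and abs(d.bearing_deg) < 15.0
        and d.range_m < stop_range_m
        for d in detections
    )

scene = [Detection("pedestrian", 12.0, 3.0), Detection("lamppost", 8.0, 40.0)]
print(brake_needed(scene))  # True: a pedestrian is ahead and within range
```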

“What we’re exploring is not just how sensory information is being used for vehicle automation, but also how it can be used for evaluating the environment surrounding a vehicle, for example localized traffic or curb-space usage,” Rask said.

Data gathered and shared by vehicles have the potential to help city planners and developers better understand things like travel habits, traffic patterns and road conditions. This information could support decisions on infrastructure and operations, such as where to put new traffic lights, when to run salt trucks and snow plows, and where to deploy traffic patrollers. Moreover, as autonomous vehicles make these sensing capabilities more prevalent, a wide range of stakeholders is only beginning to explore the use cases and newly available data, and access to state-of-the-art sensing data is not always easy to obtain.

Learning through wireless sensing

Along with leveraging sensors in cars, Argonne utilizes a mobile platform known as Waggle to support urban studies. Waggle is a novel wireless sensor system built from modular nodes that can hold various sensors, including ones that measure air temperature, humidity, surface temperature and ambient light.

Developed at Argonne, the platform combines sensors with edge computing capabilities to actively analyze and respond to sensory data. Environmental data is collected and transmitted wirelessly to the cloud. Camera and audio data is processed in situ, with only the results sent to the cloud. With nodes distributed across various parts of a city, researchers can tap into the environmental and urban activity data collected in near-real time.
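That in-situ pattern is easy to picture in code. The sketch below imitates it with stand-in functions (the names are hypothetical and do not reflect Waggle’s actual plugin interface): the image is analyzed on the node, and only a small derived record is transmitted.

```python
# Sketch of edge processing: raw frames never leave the node; only
# compact results do. All function names here are hypothetical stand-ins.
import json
import random
import time

def capture_frame():
    # Stand-in for grabbing one image from the node's camera.
    return [[random.random() for _ in range(64)] for _ in range(64)]

def count_cars(frame):
    # Stand-in for an on-node detector; a real node runs a trained model.
    return random.randint(0, 12)

def publish(record):
    # Stand-in for the wireless uplink; only this small record is sent.
    print(json.dumps(record))

for _ in range(3):  # a real node would loop indefinitely
    frame = capture_frame()  # the raw image stays on the node
    publish({"t": time.time(), "cars": count_cars(frame)})
    time.sleep(30)
```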

Among the most well-known applications of Waggle is the Array of Things (AoT), a large-scale urban sensing project in Chicago. Funded by the National Science Foundation, the project uses sensors to measure factors that affect livability in the city.

The project has many goals, one of which is to support research into more intelligent infrastructure.

“If someone wants to test a new technology, such as a safer traffic signal light, they would need a programmable platform. Array of Things is intentionally designed to allow for this type of work,” said Argonne senior computer scientist Charlie Catlett, founding director of the Urban Center for Computation and Data. “You can select and program the sensory devices in it to do what you need; there’s really no other open resource out there like it.”

Another goal of the project is to deliver sufficiently detailed data on cities to scientists and engineers, policymakers and the public. This includes insights on transportation, energy and air quality, all of which are closely linked to traffic, power demand and health-related issues such as asthma.

“Air quality, along with noise and traffic and pedestrian flow, is among the key areas of common interest to scientists, governments and residents, and we can measure those using Array of Things,” said Catlett.

University of Dallas researchers have adopted Waggle to study air quality in Chattanooga, Tenn., a city that consistently ranks high in childhood asthma incidence. Through a pilot project funded by the National Science Foundation, these researchers are deploying Waggle nodes on lampposts across the metropolitan area. The sensors will detect asthma-aggravating particulate matter and pollen and gather data related to air quality, such as location, temperature, pressure, humidity and levels of six other pollutants.

Waggle is also equipped with camera sensors that can be programmed to identify various objects, including birds, cars, bicycles and even types of clouds. In a project funded by the Illinois Department of Transportation, Argonne researchers will use these sensors to detect cars at railroad crossings. Statistics gathered will help transportation planners identify crossings that are most disruptive and prioritize the development of underpasses or overpasses to alleviate the problem.

“Our goal in this project is to be able to, on the node, recognize the cars and count them quickly enough to create reliable statistics,” said Argonne computer scientist Nicola Ferrier.

Along with detecting and counting the cars at these crossings, researchers will also use the cameras to detect when the crossing arm goes up and down, and to calculate the pass-through rate of vehicles. To do so, researchers are combining sensory data collection with machine learning techniques, which are powerful tools for object recognition.

“In the past, humans would have to write specialized code to extract features from images to try to infer that an image of a car is indeed a car,” said Ferrier. “Machine learning does all this for us, more robustly and with greater accuracy. The only caveat is that you have to have a lot of data to train the system properly, and that’s where these sensors come into play.”
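The article does not say which model the Argonne team uses, but an off-the-shelf pretrained detector illustrates how little hand-written feature code modern object recognition requires. In this sketch, torchvision’s pretrained Faster R-CNN stands in purely for illustration:

```python
# Counting cars in a single frame with a pretrained object detector.
# The model choice is an assumption for illustration, not Argonne's method.
import torch
import torchvision

COCO_CAR = 3  # "car" in the COCO label set this pretrained model uses

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(pretrained=True)
model.eval()

def count_cars(frame, threshold=0.5):
    # frame: float tensor of shape (3, H, W) with values in [0, 1]
    with torch.no_grad():
        pred = model([frame])[0]
    keep = (pred["labels"] == COCO_CAR) & (pred["scores"] > threshold)
    return int(keep.sum())

# Example with a synthetic frame; a node would feed real camera images.
print(count_cars(torch.rand(3, 480, 640)))
```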

Camera sensors will process images within the nodes. The edge computing capabilities of Waggle allow images to be examined around the clock, at brief intervals. This approach to monitoring crossings defies convention: traditionally, hired workers visit crossings and count vehicles using handheld machines, for a limited number of hours and days. That manual approach yields mostly anecdotal information on which to base decision-making, rather than a full picture of the process, Catlett says.
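Once observations flow in around the clock, turning them into planning statistics is a matter of aggregation. A minimal sketch, with hypothetical field names based on the quantities described above (gate state and queued vehicles, sampled at an assumed fixed interval):

```python
# Aggregate fixed-interval node observations into per-hour statistics:
# seconds the gate was down and total vehicles seen queued. Field names
# are hypothetical; the quantities follow the description above.
from collections import defaultdict
from datetime import datetime

SAMPLE_SECONDS = 30  # assumed sampling interval

def hourly_stats(observations):
    # observations: iterable of (unix_time, gate_is_down, queued_cars)
    down_seconds = defaultdict(float)
    queued_total = defaultdict(int)
    for t, gate_is_down, queued in observations:
        hour = datetime.fromtimestamp(t).strftime("%Y-%m-%d %H:00")
        if gate_is_down:
            down_seconds[hour] += SAMPLE_SECONDS
            queued_total[hour] += queued
    return {h: {"down_s": down_seconds[h], "queued": queued_total[h]}
            for h in down_seconds}

obs = [(1700000000, True, 4), (1700000030, True, 6), (1700000060, False, 0)]
print(hourly_stats(obs))  # one hour: 60 s of gate-down time, 10 queued cars
```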

“Without devices like Waggle that can process images quickly, it’s hard to get the statistics needed to do traffic planning effectively, but we’re helping to slowly change things,” he said.