
Gauging Social Distancing in Urban Environments

Argonne is designing and coding a “social distancing” detector using Python and OpenCV to determine the percentage of people following social distancing guidelines

The world is in the midst of a historic pandemic. Our “new normal” involves mask wearing, more frequent hand washing, and “social distancing” — maintaining a distance of at least 6 feet from others to reduce the spread of the coronavirus. Argonne is designing and coding a “social distancing” detector using Python and OpenCV to determine the percentage of people following social distancing guidelines.

The process involves analyzing video of pedestrians — typically from surveillance camera footage — frame by frame, calculating the distance between each pair of people, and indicating whether two people are standing less than 6 feet apart. We used OpenCV, a computer vision function library, because it greatly simplifies the process of loading a video, separating it into individual frames for analysis and editing, and generating final results.

Analysts identify a region of interest and compensate for camera distortion by creating a bird’s-eye-view image, which makes the conversion rate between physical distance and pixel distance constant. They then select two points in the original image that are known to be six feet apart, warp their coordinates using the same function used to warp the image, and apply the distance formula to find the number of pixels between them. This distance is the number of pixels that make up six feet: the minimum safe distance. Because the points and the image are warped using the same function, this pixel distance is the same throughout the bird’s-eye-view image.

The first step of the operation loop is person detection, accomplished using YOLO, a real-time object detection program that recognizes a wide variety of objects. Argonne’s program includes a filter that retains only person detections; each detected person is represented by a “bounding box.” The program takes a single point from each bounding box, warps it using the same function used in the setup, and maps the warped coordinates onto the bird’s-eye-view image. Because everything is now mapped onto the bird’s-eye-view image, the distance formula can be used to calculate the distance between each pair of points. These distances are then compared to the minimum safe distance, which was also calculated in the setup.
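Given the warp matrix from the setup, the per-frame comparison can be sketched without the detector itself. Here the boxes stand in for YOLO's person detections, and `warp_point` and `find_violations` are hypothetical helper names; the bottom-center of each box is one reasonable choice of "single point," since it sits roughly on the ground plane the homography maps correctly:

```python
import numpy as np
from itertools import combinations

def warp_point(M, point):
    """Apply a 3x3 perspective (homography) matrix to a single (x, y) point."""
    x, y = point
    v = M @ np.array([x, y, 1.0])
    return v[:2] / v[2]  # divide out the homogeneous coordinate

def find_violations(boxes, M, min_safe_px):
    """boxes: list of (x, y, w, h) person bounding boxes from the detector.
    Returns index pairs of people closer than the minimum safe pixel distance."""
    # Bottom-center of each box: approximately where the person's feet
    # touch the ground.
    feet = [warp_point(M, (x + w / 2.0, y + h)) for x, y, w, h in boxes]
    violations = []
    for i, j in combinations(range(len(feet)), 2):
        if np.linalg.norm(feet[i] - feet[j]) < min_safe_px:
            violations.append((i, j))
    return violations
```

The pairwise loop is quadratic in the number of detections, which is fine for the handful of pedestrians in a typical surveillance frame.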

Argonne will add the ability to detect groups of people walking together (such as family groups, which do not violate social distancing guidelines). An algorithm that associates objects across multiple frames and assigns a unique ID to each detected person will let the program recognize groups walking together by tracking their object IDs and disregard them as violators even when they stand less than six feet apart.
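One minimal way to associate detections across frames is greedy nearest-centroid matching. This sketch illustrates the idea of persistent IDs; it is not Argonne's actual tracking algorithm, and the distance threshold is an assumed parameter:

```python
import numpy as np

class CentroidTracker:
    """Assign persistent IDs to detections by matching each new centroid
    to the nearest tracked centroid from the previous frame."""

    def __init__(self, max_dist=50.0):
        self.next_id = 0
        self.objects = {}      # id -> last known centroid
        self.max_dist = max_dist  # beyond this, treat as a new person

    def update(self, centroids):
        """Return one ID per input centroid, in order."""
        ids, claimed = [], set()
        for c in centroids:
            c = np.asarray(c, dtype=float)
            # Find the closest existing object not yet claimed this frame.
            best_id, best_d = None, self.max_dist
            for oid, pos in self.objects.items():
                if oid in claimed:
                    continue
                d = np.linalg.norm(c - pos)
                if d < best_d:
                    best_id, best_d = oid, d
            if best_id is None:          # nothing close enough: new person
                best_id = self.next_id
                self.next_id += 1
            self.objects[best_id] = c
            claimed.add(best_id)
            ids.append(best_id)
        return ids
```

With stable IDs, a pair flagged as too close in many consecutive frames can be classified as a group walking together and excluded from the violation count.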

The research is part of the SAGE project to design and build a national-scale, reusable cyberinfrastructure to enable AI at the edge. Funded by the National Science Foundation, the research team includes Northwestern University, the University of Chicago, George Mason University, the University of California San Diego, Northern Illinois University, the University of Utah, and the Lincoln Park Zoo.