
Tackling a Big Data Challenge

ALCF supercomputers help meet the Large Hadron Collider’s growing computing needs.

CERN’s Large Hadron Collider (LHC), the world’s most powerful particle accelerator, generates a colossal amount of data. Each year, scientists must analyze about 30 petabytes (30 million gigabytes) of data from a wide range of physics experiments, including studies of the Higgs boson and dark matter.

As part of a multiyear collaboration, Argonne researchers used the Mira system at the Argonne Leadership Computing Facility (ALCF) to run large-scale simulations of particle collision experiments on a massively parallel supercomputer for the first time. This work shed light on a path forward for interpreting future data from the LHC.

The team developed a workflow that enabled researchers on the ATLAS experiment to run jobs automatically through their workflow management system, simplifying the integration of ALCF resources as production compute endpoints for LHC experiments. The effort demonstrated that such supercomputers can help drive future discoveries at the LHC by accelerating the pace at which simulated data can be produced.
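To make the idea concrete, the sketch below shows one way such an integration could look in Python: a pilot-style script that polls an experiment's workflow management system for pending simulation jobs and resubmits them to the facility's local batch scheduler. This is a minimal illustration under stated assumptions, not the team's actual code; the endpoint URL, the JSON job fields, and the Cobalt-style `qsub -n <nodes> -t <minutes>` command line are all hypothetical.

```python
# Hypothetical pilot-style broker (illustrative only): poll an experiment
# job queue for pending simulation tasks and hand each one to the local
# HPC batch scheduler. Endpoint URL, job fields, and the qsub-style CLI
# are assumptions, not details of the actual ATLAS/ALCF production setup.
import json
import shlex
import subprocess
import time
import urllib.request

JOB_QUEUE_URL = "https://example.org/atlas/jobs/pending"  # hypothetical endpoint


def fetch_pending_jobs():
    """Pull pending simulation jobs from the workflow management system."""
    with urllib.request.urlopen(JOB_QUEUE_URL) as resp:
        # Assumed payload: [{"cmd": "...", "nodes": 512, "walltime_min": 60}, ...]
        return json.loads(resp.read())


def submit_to_scheduler(job):
    """Wrap one job as a local batch submission (Cobalt-style qsub assumed)."""
    subprocess.run(
        ["qsub",
         "-n", str(job["nodes"]),         # number of nodes to allocate
         "-t", str(job["walltime_min"]),  # requested walltime in minutes
         *shlex.split(job["cmd"])],       # simulation executable + arguments
        check=True,
    )


if __name__ == "__main__":
    # Poll periodically so new LHC simulation work flows in automatically,
    # with no manual intervention at the computing facility.
    while True:
        for job in fetch_pending_jobs():
            submit_to_scheduler(job)
        time.sleep(300)  # wait five minutes between polls
```

The key design point this sketch captures is direction of control: the facility-side script pulls work from the experiment's queue rather than requiring the experiment to push jobs through facility-specific interfaces, which is what makes a supercomputer usable as just another production compute endpoint.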

Argonne researchers are now gearing up to use the lab’s Aurora exascale supercomputer to help accelerate the search for new physics at the ATLAS experiment.

Learn more about Argonne’s work with the LHC.