
HEP Data Analytics on HPC

Developing and deploying new tools and algorithms to enable HPC facilities to meet new data analysis demands

Understanding the agreement between data and theory is critical for new discoveries, and the growing volume and complexity of HEP data analysis are presenting important computing challenges to the community. New capabilities at leadership computing facilities drive us to rethink which problems in computational physics can be practically addressed within an HEP scientific workflow. We will develop and deploy new tools and algorithms that enable high-performance computing (HPC) facilities to meet the new data analysis demands on both the energy and intensity frontiers and allow computationally expensive physics studies to be completed on time scales that are not currently feasible.

Specifically, we will transform how physics tasks are carried out in three major areas: high-dimensional parameter fitting, workflow automation, and the introduction of HPC data resources into applications. The tools and techniques developed here will enable high-level control of analyses that involve optimization steering and the scheduling of high-dimensional parameter estimation.
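
To make the first of these areas concrete, below is a minimal Python sketch of a parameter fit in which the likelihood evaluation is split across parallel workers while an optimizer steers the fit, mimicking how an HPC resource could be driven from an analysis workflow. The toy Gaussian model, the data chunking, and the use of ProcessPoolExecutor and SciPy are illustrative assumptions, not the project's actual tools or workflow, and a real HEP fit would involve far more parameters and data.

```python
"""Illustrative sketch (not project code): an optimizer steers a fit whose
per-chunk likelihood evaluations are farmed out to parallel workers."""

from concurrent.futures import ProcessPoolExecutor

import numpy as np
from scipy.optimize import minimize


def chunk_nll(params, data_chunk):
    """Negative log-likelihood of one data chunk under a toy Gaussian model."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)          # parameterize log(sigma) to keep sigma > 0
    z = (data_chunk - mu) / sigma
    return float(np.sum(0.5 * z**2 + np.log(sigma)))


def total_nll(params, chunks, pool):
    """Sum the chunk NLLs evaluated in parallel; the optimizer calls this repeatedly."""
    futures = [pool.submit(chunk_nll, params, c) for c in chunks]
    return sum(f.result() for f in futures)


if __name__ == "__main__":
    rng = np.random.default_rng(seed=0)
    data = rng.normal(loc=1.5, scale=0.7, size=100_000)
    chunks = np.array_split(data, 8)   # stand-ins for partitions handled by HPC workers

    with ProcessPoolExecutor(max_workers=8) as pool:
        result = minimize(total_nll, x0=np.zeros(2), args=(chunks, pool),
                          method="Nelder-Mead")

    mu_hat, sigma_hat = result.x[0], np.exp(result.x[1])
    print(f"fitted mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}")
```

In this sketch the optimizer runs in a single driver process and only the likelihood evaluations are parallelized; in an HPC setting the same pattern would distribute those evaluations across nodes and add scheduling of many such fits.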