Seminar | Mathematics and Computer Science

Practical Primal-Dual Hybrid Gradient for Large-Scale Linear Programming using Restarts and Other Enhancements

LANS Seminar

Abstract: Traditionally, linear programming (LP) is solved using the simplex method or interior-point methods, whose core computational operation is matrix factorization. Recently, there has been a push in the optimization community toward methods whose core computational operation is instead matrix-vector multiplication. Compared with factorization, matrix-vector multiplications are less likely to run out of memory on large-scale problems and are easily parallelized. However, these methods have a major drawback: they are often slow at finding high-accuracy solutions. This talk addresses that issue.

Our method, PDLP, is based on the primal-dual hybrid gradient (PDHG) method popularized by Chambolle and Pock (2011), and adds several important enhancements. One of these enhancements, which will be the focal point of the talk, is restarts. Restarts are already a popular technique in unconstrained optimization; we show they are also extremely useful, both in theory and in practice, for primal-dual methods. In particular, one can prove that PDHG's runtime improves from $O(\kappa^2 \log(1/\epsilon))$ to $O(\kappa \log(1/\epsilon))$ using an adaptive restart scheme, where $\kappa$ is the condition number and $\epsilon$ is the desired accuracy.
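To make the idea concrete, here is a minimal NumPy sketch of PDHG for an LP in the standard form $\min c^\top x$ s.t. $Ax = b$, $x \ge 0$, with a simple adaptive restart rule: maintain a running average of the iterates since the last restart, and restart from that average whenever its KKT residual has decayed by a fixed factor. This is an illustration only, not PDLP itself; the actual solver adds step-size selection, primal weight updates, presolve, and other enhancements, and the particular restart test shown here (halving of the residual) is a simplified stand-in for the adaptive scheme analyzed in the talk.

```python
import numpy as np

def kkt_residual(c, A, b, x, y):
    """Norm of primal infeasibility, dual infeasibility, and duality gap."""
    return np.linalg.norm(np.concatenate([
        A @ x - b,                         # primal feasibility: Ax = b
        np.minimum(c - A.T @ y, 0.0),      # dual feasibility:  c - A'y >= 0
        [c @ x - b @ y],                   # duality gap
    ]))

def pdhg_restarted(c, A, b, iters=10000, check_every=40):
    """PDHG for  min c'x  s.t.  Ax = b, x >= 0,  with averaged restarts.
    Illustrative sketch; step sizes satisfy tau * sigma * ||A||^2 < 1."""
    m, n = A.shape
    eta = 0.9 / np.linalg.norm(A, 2)       # tau = sigma = eta
    x, y = np.zeros(n), np.zeros(m)
    xbar, ybar, count = x.copy(), y.copy(), 0
    best = kkt_residual(c, A, b, x, y)     # residual at the last restart point
    for _ in range(iters):
        x_new = np.maximum(x - eta * (c - A.T @ y), 0.0)  # primal step + projection
        y_new = y + eta * (b - A @ (2 * x_new - x))       # dual step (extrapolated)
        x, y = x_new, y_new
        # running average of iterates since the last restart
        xbar = (count * xbar + x) / (count + 1)
        ybar = (count * ybar + y) / (count + 1)
        count += 1
        if count % check_every == 0:
            res = kkt_residual(c, A, b, xbar, ybar)
            if res <= 0.5 * best:          # sufficient decay: restart from average
                x, y = xbar.copy(), ybar.copy()
                count, best = 0, res
    return x, y
```

On a toy instance such as $\min -x_1$ s.t. $x_1 + x_2 = 1$, $x \ge 0$ (optimum $x = (1, 0)$, $y = -1$), the raw PDHG iterates spiral toward the solution, and restarting from the average cuts off that spiral, which is the mechanism behind the improved $O(\kappa \log(1/\epsilon))$ rate.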

Finally, we evaluate the performance of PDLP. Compared with SCS, an ADMM-based solver, with a target of $10^{-8}$ relative accuracy and a 1-hour time limit on the Netlib benchmark set, our Julia prototype of PDLP achieves a 19x reduction in the geometric mean of solve times and reduces the fraction of unsolved instances from 55% to 6%. A C++ implementation of our code is available now in Google OR-Tools. At $10^{-4}$ relative accuracy, our implementation achieves almost half the geometric mean solve time of the next-fastest open-source solver on Mittelmann's LP benchmark set for barrier methods.

This talk is based on joint work with: David Applegate, Mateo Diaz, Haihao Lu, Miles Lubin, Brendan O’Donoghue, and Warren Schudy.