Seminar | Mathematics and Computer Science

Limited-Memory Structured Quasi-Newton Methods

LANS Seminar

Abstract: For large optimization problems, limited-memory compact quasi-Newton methods use low-rank updates to efficiently approximate the Hessian matrix of second derivatives. However, when additional second-derivative information is available, it is desirable to exploit that information.

This presentation describes the compact representation of two "structured" BFGS quasi-Newton update formulas, which combine available Hessian information with quasi-Newton updates. The compact representations enable effective structured limited-memory techniques and the computation of search directions using the Sherman-Morrison-Woodbury identity. Implementations of two limited-memory structured BFGS algorithms are compared on a set of benchmark (CUTEst) problems and show promising improvements.
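To illustrate the kind of computation a compact representation enables (not the speakers' implementation), the following minimal Python sketch applies the Sherman-Morrison-Woodbury identity to a Hessian approximation of the assumed form B = B0 + U M U^T, where B0 is a diagonal seed matrix and U, M come from a low-rank (limited-memory) update; all names and dimensions here are hypothetical.

```python
# Minimal sketch: computing a search direction p from B p = -g when
# B = diag(b0_diag) + U M U^T, using the Sherman-Morrison-Woodbury identity.
# This is an illustrative assumption, not the method presented in the talk.
import numpy as np

def smw_solve(b0_diag, U, M, g):
    """Solve (diag(b0_diag) + U @ M @ U.T) p = -g for the search direction p."""
    B0_inv_g = -g / b0_diag                      # B0^{-1} (-g)
    B0_inv_U = U / b0_diag[:, None]              # B0^{-1} U
    small = np.linalg.inv(M) + U.T @ B0_inv_U    # k-by-k system, k = memory size
    correction = B0_inv_U @ np.linalg.solve(small, U.T @ B0_inv_g)
    return B0_inv_g - correction

# Hypothetical example: n variables, memory of k low-rank pairs.
n, k = 1000, 5
rng = np.random.default_rng(0)
b0_diag = np.full(n, 2.0)                        # simple positive-definite B0
U = rng.standard_normal((n, k))
M = np.eye(k)                                    # placeholder middle matrix
g = rng.standard_normal(n)
p = smw_solve(b0_diag, U, M, g)
# Residual of B p + g should be near zero.
print(np.linalg.norm((np.diag(b0_diag) + U @ M @ U.T) @ p + g))
```

The point of the sketch is that the only dense linear solve involves a small k-by-k matrix, so the cost per iteration stays linear in the number of variables, which is what makes limited-memory compact forms attractive for large problems.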

Attend via BlueJeans