A Lyapunov Analysis of Accelerated Methods in Optimization

Ashia C. Wilson, Ben Recht, Michael I. Jordan.

Year: 2021, Volume: 22, Issue: 113, Pages: 1–34


Abstract

Accelerated optimization methods, such as Nesterov's accelerated gradient method, play a significant role in the theory and practice of optimization. Several accelerated methods are provably optimal under standard oracle models. Such optimality results are obtained using a technique known as "estimate sequences," which yields upper bounds on rates of convergence. The technique of estimate sequences has long been considered difficult to understand and deploy, leading many researchers to develop alternative, more intuitive methods and analyses. We show there is an equivalence between the technique of estimate sequences and a family of Lyapunov functions in both continuous and discrete time. This connection allows us to develop a unified analysis of many existing accelerated algorithms, introduce new algorithms, and strengthen the connection between accelerated algorithms and continuous-time dynamical systems.
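To make the Lyapunov perspective concrete, the following is a minimal numerical sketch, not code from the paper: it runs Nesterov-style accelerated gradient descent on a convex quadratic and verifies at every iteration that a standard discrete-time Lyapunov function,

    E_k = A_k (f(x_k) - f(x*)) + (1/2) ||z_k - x*||^2,

is nonincreasing. Monotonicity of E_k immediately gives f(x_k) - f(x*) <= E_0 / A_k = O(1/k^2). The three-sequence form of the method and the weights a_k = (k+1)/(2L) are common textbook choices and may differ from the paper's exact parameterization.

    import numpy as np

    # Minimal sketch (assumed parameterization, not the paper's code):
    # accelerated gradient descent in three-sequence form, with the discrete
    # Lyapunov function E_k = A_k*(f(x_k) - f*) + 0.5*||z_k - x*||^2
    # checked numerically to be nonincreasing at every step.

    rng = np.random.default_rng(0)
    n = 20
    M = rng.standard_normal((n, n))
    Q = M.T @ M + np.eye(n)            # positive definite Hessian
    b = rng.standard_normal(n)

    def f(x):
        return 0.5 * x @ Q @ x - b @ x

    def grad(x):
        return Q @ x - b

    L = np.linalg.eigvalsh(Q).max()    # smoothness constant of f
    x_star = np.linalg.solve(Q, b)     # exact minimizer of this quadratic
    f_star = f(x_star)

    x = z = np.zeros(n)
    A = 0.0                            # running weight A_k = sum of a_i
    E_prev = np.inf
    for k in range(1, 201):
        a = (k + 1) / (2 * L)          # chosen so that L * a_k^2 <= A_k
        A += a
        tau = a / A
        y = (1 - tau) * x + tau * z    # coupling of the two sequences
        g = grad(y)
        x = y - g / L                  # gradient (primal) step
        z = z - a * g                  # momentum (mirror) step
        E = A * (f(x) - f_star) + 0.5 * np.linalg.norm(z - x_star) ** 2
        assert E <= E_prev + 1e-9      # the Lyapunov function never increases
        E_prev = E

    print(f"f(x_200) - f* = {f(x) - f_star:.3e}")  # decays like O(1/k^2)

In continuous time, the analogous Lyapunov function for the limiting dynamics X''(t) + (3/t) X'(t) + ∇f(X(t)) = 0 is E(t) = t^2 (f(X(t)) - f(x*)) + 2 ||X(t) + (t/2) X'(t) - x*||^2, whose monotone decrease along trajectories yields the matching O(1/t^2) rate.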
