Accelerated Zeroth-Order and First-Order Momentum Methods from Mini to Minimax Optimization
Feihu Huang, Shangqian Gao, Jian Pei, Heng Huang.
Year: 2022, Volume: 23, Issue: 36, Pages: 1–70
Abstract
In this paper, we propose a class of accelerated zeroth-order and first-order momentum methods for both nonconvex mini-optimization and minimax optimization. Specifically, we propose a new accelerated zeroth-order momentum (Acc-ZOM) method for black-box mini-optimization where only function values can be obtained. Moreover, we prove that our Acc-ZOM method achieves a lower query complexity of $\tilde{O}(d^{3/4}\epsilon^{-3})$ for finding an $\epsilon$-stationary point, which improves the best known result by a factor of $O(d^{1/4})$, where $d$ denotes the variable dimension. In particular, our Acc-ZOM does not require the large batches used in existing zeroth-order stochastic algorithms. Meanwhile, we propose an accelerated zeroth-order momentum descent ascent (Acc-ZOMDA) method for black-box minimax optimization, where again only function values can be obtained. Our Acc-ZOMDA obtains a low query complexity of $\tilde{O}((d_1+d_2)^{3/4}\kappa_y^{4.5}\epsilon^{-3})$ without requiring large batches for finding an $\epsilon$-stationary point, where $d_1$ and $d_2$ denote the variable dimensions and $\kappa_y$ is the condition number. Moreover, we propose an accelerated first-order momentum descent ascent (Acc-MDA) method for minimax optimization where explicit gradients are accessible. Our Acc-MDA achieves a low gradient complexity of $\tilde{O}(\kappa_y^{4.5}\epsilon^{-3})$ without requiring large batches for finding an $\epsilon$-stationary point. In particular, our Acc-MDA can obtain a lower gradient complexity of $\tilde{O}(\kappa_y^{2.5}\epsilon^{-3})$ with a batch size of $O(\kappa_y^4)$, which improves the best known result by a factor of $O(\kappa_y^{1/2})$. Extensive experimental results on black-box adversarial attacks on deep neural networks and poisoning attacks on logistic regression demonstrate the efficiency of our algorithms.
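To make the zeroth-order ingredients the abstract refers to concrete (a function-value-only gradient estimate combined with a momentum update), here is a minimal NumPy sketch. It is an illustrative assumption, not the Acc-ZOM update rule from the paper: the two-point Gaussian-smoothing estimator, the plain exponential-moving-average momentum buffer, and all step-size parameters (`eta`, `beta`, `mu`) are placeholders rather than the paper's tuned schedules.

```python
import numpy as np

def zo_gradient(f, x, mu=1e-3, rng=None):
    """Two-point zeroth-order gradient estimate (Gaussian smoothing).

    Uses only function values, as in black-box optimization:
        g = (f(x + mu*u) - f(x - mu*u)) / (2*mu) * u,  with u ~ N(0, I).
    """
    rng = rng or np.random.default_rng()
    u = rng.standard_normal(x.shape)
    return (f(x + mu * u) - f(x - mu * u)) / (2.0 * mu) * u

def zo_momentum_descent(f, x0, steps=1000, eta=0.01, beta=0.9, mu=1e-3, seed=0):
    """Hypothetical zeroth-order momentum descent (sketch only).

    Combines the estimator above with a simple momentum buffer;
    a single estimate per step, so no large batches are needed.
    """
    rng = np.random.default_rng(seed)
    x = x0.copy()
    m = np.zeros_like(x)                  # momentum buffer
    for _ in range(steps):
        g = zo_gradient(f, x, mu, rng)    # function-value-only estimate
        m = beta * m + (1.0 - beta) * g   # momentum update
        x = x - eta * m                   # descent step
    return x

if __name__ == "__main__":
    # Usage: minimize a simple quadratic treated as a black box.
    f = lambda x: np.sum((x - 1.0) ** 2)
    x_star = zo_momentum_descent(f, x0=np.zeros(10))
    print(np.round(x_star, 2))  # approaches the minimizer at all-ones
```

Note that each iteration queries the objective only twice, which is the sense in which query complexity (rather than gradient complexity) is the relevant measure for the zeroth-order methods above.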