Stochastic Proximal Methods for Non-Smooth Non-Convex Constrained Sparse Optimization
Michael R. Metel, Akiko Takeda; 22(115):1−36, 2021.
Abstract
This paper focuses on stochastic proximal gradient methods for optimizing a smooth non-convex loss function with a non-smooth non-convex regularizer and convex constraints. To the best of our knowledge, we present the first non-asymptotic convergence bounds for this class of problems. We present two simple stochastic proximal gradient algorithms for general stochastic and finite-sum optimization problems. In a numerical experiment, we compare our algorithms with the current state-of-the-art deterministic algorithm and find that they exhibit superior convergence.
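As context for the algorithm class summarized above, here is a minimal sketch of a stochastic proximal gradient iteration on a toy instance: a least-squares loss with a non-convex, non-smooth ℓ0 regularizer and box constraints. This is an illustration under assumed problem data, not the paper's actual algorithms; all function names and parameters below are hypothetical.

```python
import numpy as np

def prox_l0_box(v, thresh, lo=-1.0, hi=1.0):
    """Per-coordinate prox of thresh*||x||_0 plus the indicator of [lo, hi]^n.

    Solves min_x 0.5*(x - v)^2 + thresh*1{x != 0} over x in [lo, hi]
    coordinate-wise by comparing the clipped point against zero.
    """
    clipped = np.clip(v, lo, hi)
    cost_nonzero = 0.5 * (clipped - v) ** 2 + thresh
    cost_zero = 0.5 * v ** 2
    return np.where(cost_nonzero < cost_zero, clipped, 0.0)

def stochastic_prox_grad(A, b, lam=0.1, step=0.01, batch=32, iters=2000, seed=0):
    """Minibatch proximal gradient for
    0.5/batch * ||A_B x - b_B||^2 + lam*||x||_0, subject to x in [-1, 1]^n."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(iters):
        idx = rng.choice(m, size=batch, replace=False)  # sample a minibatch
        grad = A[idx].T @ (A[idx] @ x - b[idx]) / batch  # stochastic gradient
        x = prox_l0_box(x - step * grad, thresh=step * lam)  # proximal step
    return x
```

The key step is the proximal map applied after the stochastic gradient step; here it has a closed form because the ℓ0 penalty and box constraint separate across coordinates, whereas the paper treats general non-smooth non-convex regularizers and convex constraint sets.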