Optimal Complexity in Byzantine-Robust Distributed Stochastic Optimization with Data Heterogeneity
Qiankun Shi, Jie Peng, Kun Yuan, Xiao Wang, Qing Ling; 26(268):1−58, 2025.
Abstract
In this paper, we establish tight lower bounds for Byzantine-robust distributed first-order stochastic methods in both strongly convex and non-convex stochastic optimization. We reveal that when the distributed nodes have heterogeneous data, the convergence error comprises two components: a non-vanishing Byzantine error and a vanishing optimization error. We establish lower bounds on the Byzantine error and on the minimum number of queries to a stochastic gradient oracle required to achieve an arbitrarily small optimization error. However, we also identify significant discrepancies between these lower bounds and the existing upper bounds. To fill this gap, we leverage Nesterov's acceleration and variance reduction to develop novel Byzantine-robust distributed stochastic optimization methods that provably match the lower bounds, up to at most logarithmic factors, implying that our lower bounds are tight.
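As a hedged, schematic illustration of the two-component error decomposition described above (the symbols here are illustrative assumptions, not the paper's exact statement: $\delta$ denotes the fraction of Byzantine nodes, $\zeta^2$ a bound on inter-node gradient heterogeneity, and $T$ the number of stochastic gradient oracle queries), the convergence error splits as:

% Schematic sketch only: \delta, \zeta^2, and T are assumed notation for
% exposition; the exact constants and rates are given in the paper itself.
\mathbb{E}\big[f(\hat{x}_T) - f(x^\star)\big]
  \;=\; \underbrace{\Theta\!\left(\delta\,\zeta^2\right)}_{\text{Byzantine error (non-vanishing)}}
  \;+\; \underbrace{\epsilon(T)}_{\text{optimization error, } \epsilon(T)\,\to\, 0 \text{ as } T \to \infty}

Intuitively, the first term cannot vanish because, under data heterogeneity, Byzantine nodes can submit gradients that are indistinguishable from those of honest nodes with atypical local data, so no aggregation rule can filter them out entirely; only the second term can be driven to zero by issuing more oracle queries.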
[pdf][bib] © JMLR 2025.
