New Sequence Results and Improved Algorithmic Guarantees for Asynchronous Iterations in Optimization
Authors: Hamid Reza Feyzmahdavian, Mikael Johansson; Published in Journal of Machine Learning Research, 24(158):1–75, 2023.
Abstract
This paper presents novel convergence results for asynchronous iterations that arise in the analysis of parallel and distributed optimization algorithms. The results are easy to apply and provide explicit estimates of how the level of asynchrony affects the convergence rates of the iterates. Our findings simplify, streamline, and strengthen existing convergence proofs for various asynchronous optimization methods, and they enable us to establish convergence guarantees for popular algorithms that previously lacked a complete theoretical understanding. Specifically, we leverage our results to derive improved iteration complexity bounds for proximal incremental aggregated gradient methods. For the asynchronous stochastic gradient descent method, we obtain tighter guarantees that depend on the average delay rather than the maximum delay. We also provide less conservative analyses of the speedup conditions for asynchronous block-coordinate implementations of Krasnoselskii–Mann iterations. Lastly, we quantify the convergence rates of totally asynchronous iterations under different assumptions on communication delays and update rates.
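To make the delay model underlying these results concrete, the following is a minimal sketch (not the authors' code) of asynchronous stochastic gradient descent with stale gradients: update k applies a gradient evaluated at a delayed iterate x_{k−τ_k}. The quadratic test problem, step size, and uniform delay distribution are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch of the delayed-iterate model behind asynchronous SGD:
# update k applies a gradient evaluated at a stale iterate x_{k - tau_k}.
# The objective, step size, and delay distribution are illustrative
# assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical least-squares test problem f(x) = 0.5 * ||A x - b||^2.
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)
x_star = np.linalg.lstsq(A, b, rcond=None)[0]  # minimizer, for reference

def grad(x):
    return A.T @ (A @ x - b)

L = np.linalg.norm(A.T @ A, 2)  # Lipschitz constant of the gradient
step = 0.1 / L                  # conservative step size
max_delay = 5                   # assumed worst-case staleness tau_max

iterates = [np.zeros(10)]       # keep the history so stale reads are possible
for k in range(2000):
    tau = int(rng.integers(0, max_delay + 1))  # random delay tau_k <= tau_max
    tau = min(tau, len(iterates) - 1)          # cannot read before x_0
    stale_x = iterates[-1 - tau]               # worker read an old iterate
    iterates.append(iterates[-1] - step * grad(stale_x))

print("distance to minimizer:", np.linalg.norm(iterates[-1] - x_star))
```

In this model, guarantees stated in terms of the maximum delay use only tau_max, whereas the average-delay results highlighted in the abstract can remain tight even when occasional updates are very stale.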