Stochastic Optimization under Distributional Drift

Authors: Joshua Cutler, Dmitriy Drusvyatskiy, Zaid Harchaoui; Volume 24, Issue 147, Pages 1-56, 2023.

Abstract

This study addresses the problem of minimizing a convex function that evolves according to unknown and possibly stochastic dynamics, which may depend jointly on time and on the decision variable itself. Such problems arise frequently in machine learning and signal processing under names such as concept drift, stochastic tracking, and performative prediction. We derive new convergence guarantees for stochastic algorithms with iterate averaging, focusing on bounds that hold both in expectation and with high probability. Our estimates decouple the contributions of optimization error, gradient noise, and time drift. Notably, we identify a low drift-to-noise regime in which the tracking efficiency of the proximal stochastic gradient method benefits significantly from a step decay schedule. Numerical experiments illustrate our results.
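
To make the algorithmic setting concrete, the sketch below shows a proximal stochastic gradient loop with a step decay schedule (constant stepsize within each epoch, geometrically decreased between epochs) and iterate averaging, applied to a drifting composite objective. This is an illustrative toy implementation, not the paper's exact procedure; the names `prox_l1`, `proximal_sgd_step_decay`, and `drifting_grad`, as well as the stepsize and drift parameters, are assumptions chosen for the example.

```python
import numpy as np

def prox_l1(v, lam):
    """Proximal operator of lam * ||x||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def proximal_sgd_step_decay(grad_oracle, prox, x0, eta0=1.0, decay=0.5,
                            epoch_len=50, n_epochs=10):
    """Proximal SGD with a step decay schedule: the stepsize is held
    constant within each epoch and multiplied by `decay` between epochs.
    Returns the average of the last epoch's iterates (iterate averaging)."""
    x = np.asarray(x0, dtype=float)
    eta = eta0
    last_epoch = []
    t = 0
    for _ in range(n_epochs):
        last_epoch = []
        for _ in range(epoch_len):
            g = grad_oracle(x, t)           # stochastic gradient of the drifting loss
            x = prox(x - eta * g, eta)      # proximal gradient update
            last_epoch.append(x.copy())
            t += 1
        eta *= decay                        # geometric step decay between epochs
    return np.mean(last_epoch, axis=0)

# Example (hypothetical): track f_t(x) = 0.5 * ||x - m_t||^2 + 0.1 * ||x||_1,
# where the minimizer m_t drifts slowly over time and gradients are noisy.
rng = np.random.default_rng(0)

def drifting_grad(x, t, sigma=0.5, drift=0.01):
    m_t = drift * t * np.ones_like(x)       # unknown, slowly moving target
    return (x - m_t) + sigma * rng.standard_normal(x.shape)

x_bar = proximal_sgd_step_decay(drifting_grad,
                                prox=lambda v, eta: prox_l1(v, 0.1 * eta),
                                x0=np.zeros(5))
print(x_bar)
```

In the low drift-to-noise regime described in the abstract, decreasing the stepsize in stages lets the method first average out gradient noise and only later trade accuracy against tracking the drift, which is the behavior the step decay schedule is meant to exploit.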

