Online Optimization over Riemannian Manifolds

Authors: Xi Wang, Zhipeng Tu, Yiguang Hong, Yingyi Wu, Guodong Shi; Volume 24, Issue 84, Pages 1-67, 2023.

Abstract

In recent years, online optimization has attracted growing research interest. This paper studies online gradient descent and online bandit algorithms over Riemannian manifolds, under both full-information and bandit feedback, for geodesically convex and strongly geodesically convex loss functions. Upper bounds on the regret of these algorithms over Hadamard manifolds are established, together with a universal lower bound on the achievable regret. The analysis shows how the time horizon, the dimension, and the sectional curvature bounds of the manifold enter the regret bounds. Moreover, when the manifold admits positive sectional curvature, a similar regret bound is proven to hold by handling the resulting non-contractive projection maps. Numerical studies on problems over the symmetric positive definite matrix manifold, hyperbolic spaces, and Grassmann manifolds validate the theoretical findings on synthetic and real-world data.
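As a concrete illustration of the full-information setting described above, the sketch below runs Riemannian online gradient descent on the manifold of symmetric positive definite (SPD) matrices with the affine-invariant metric: at each round the learner plays a point, receives a loss, converts the Euclidean gradient to a Riemannian one, and steps along a geodesic via the exponential map. This is a minimal sketch, not the paper's implementation; the quadratic losses, step size, and helper names are illustrative assumptions.

```python
import numpy as np

def _sym_fun(M, f):
    # Apply a scalar function f to a symmetric matrix via eigendecomposition:
    # f(M) = Q diag(f(w)) Q^T, where M = Q diag(w) Q^T.
    w, Q = np.linalg.eigh(M)
    return (Q * f(w)) @ Q.T

def spd_exp(X, V):
    # Exponential map on the SPD manifold under the affine-invariant metric:
    # Exp_X(V) = X^{1/2} expm(X^{-1/2} V X^{-1/2}) X^{1/2}.
    S = _sym_fun(X, np.sqrt)
    Si = _sym_fun(X, lambda w: 1.0 / np.sqrt(w))
    return S @ _sym_fun(Si @ V @ Si, np.exp) @ S

def riemannian_ogd(egrads, X0, eta):
    # Riemannian online gradient descent: play X_t, observe the Euclidean
    # gradient of the round-t loss, convert it to the Riemannian gradient
    # grad f_t(X) = X G X (affine-invariant metric), and move along the
    # geodesic in the negative gradient direction.
    X = X0
    played = []
    for egrad in egrads:
        played.append(X)
        G = egrad(X)
        rgrad = X @ ((G + G.T) / 2) @ X  # symmetrize, then pull back
        X = spd_exp(X, -eta * rgrad)
    return played
```

For example, with losses f_t(X) = ||X - A||_F^2 / 2 for a fixed SPD target A (so the Euclidean gradient is X - A), the iterates remain SPD by construction of the exponential map and drift toward A, so the cumulative loss of the played points grows slower than that of any fixed suboptimal comparator.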
