On the Geometry of Stein Variational Gradient Descent
By Andrew Duncan, Nikolas Nüsken, Lukasz Szpruch; Published in 2023, Volume 24, Issue 56.
Abstract
Sampling from or approximating high-dimensional probability distributions is a fundamental task in Bayesian inference. This paper focuses on the Stein variational gradient descent methodology, a class of algorithms that perform iterated steepest-descent steps with respect to a reproducing kernel Hilbert space norm. This approach leads to interacting particle systems whose mean-field limit can be described as a gradient flow on the space of probability distributions, equipped with a specific geometric structure. Leveraging this perspective, we obtain insight into the convergence properties of the algorithm, in particular in the context of choosing an appropriate positive definite kernel function. Our analysis leads us to consider certain nondifferentiable kernels with adjusted tails, which we demonstrate through a range of numerical experiments to significantly improve performance.
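To make the particle-system view concrete, here is a minimal sketch of the standard SVGD update (not the paper's adjusted-tail kernels): each particle moves along an empirical average of kernel-weighted score terms plus a kernel-gradient repulsion term. The Gaussian RBF kernel, the bandwidth `h`, the step size, and the one-dimensional standard Gaussian target (score `-x`) are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(x, h):
    # Pairwise RBF kernel matrix K[i, j] = exp(-(x_i - x_j)^2 / (2 h))
    # for 1-D particles x, with bandwidth h (an assumed, fixed choice).
    diff = x[:, None] - x[None, :]            # diff[i, j] = x_i - x_j
    K = np.exp(-diff**2 / (2.0 * h))
    # gradK[i, j] = d/dx_j k(x_j, x_i) = (x_i - x_j) / h * K[i, j]
    gradK = diff * K / h
    return K, gradK

def svgd_step(x, step, h=1.0):
    # One SVGD update for a standard Gaussian target, whose score is
    # grad log p(x) = -x (illustrative target, not from the paper).
    n = x.shape[0]
    K, gradK = rbf_kernel(x, h)
    score = -x
    # phi(x_i) = (1/n) * sum_j [ k(x_j, x_i) * score(x_j)
    #                            + grad_{x_j} k(x_j, x_i) ]
    # First term drives particles toward high-density regions; the second
    # is a repulsion term that keeps them spread out.
    phi = (K @ score + gradK.sum(axis=1)) / n
    return x + step * phi
```

Iterating `svgd_step` moves an initial particle cloud toward an approximation of the target; the repulsion term is what prevents all particles from collapsing onto the mode.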