Bayes-Newton Methods for Approximate Bayesian Inference with Positive Semi-Definite Guarantees
Authors: William J. Wilkinson, Simo Särkkä, Arno Solin; Volume 24, Issue 83, Pages 1-50, 2023.
Abstract
We formulate natural gradient variational inference (VI), expectation propagation (EP), and posterior linearization (PL) as extensions of Newton's method for optimizing the parameters of a Bayesian posterior distribution. This viewpoint explicitly casts these inference algorithms as instances of numerical optimization. We show that common approximations to Newton's method from the optimization literature, namely Gauss-Newton and quasi-Newton methods (e.g., the BFGS algorithm), remain valid within this 'Bayes-Newton' framework. This leads to a suite of novel algorithms that are guaranteed to produce positive semi-definite (PSD) covariance matrices, unlike standard VI and EP. Our unified viewpoint provides new insights into the connections between various inference schemes. All the presented methods apply to any model with a Gaussian prior and non-conjugate likelihood, which we demonstrate with (sparse) Gaussian processes and state space models.
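To make the PSD guarantee concrete, the following is a minimal JAX sketch, not the authors' implementation, of the Gauss-Newton idea applied to a Gaussian approximate posterior under a Gaussian prior and a non-conjugate likelihood of the form p(y | h(f)). The measurement function `h`, the noise variance `obs_var`, the function name `gauss_newton_step`, and the toy data are all illustrative assumptions, not definitions from the paper.

```python
import jax
import jax.numpy as jnp

def gauss_newton_step(m, mu0, K, y, h, obs_var):
    """One Gauss-Newton-style update of q(f) = N(m, C) under prior N(mu0, K).

    Replacing the likelihood Hessian with the outer product J^T J / obs_var
    (J = dh/dm) keeps the posterior precision K^{-1} + J^T J / obs_var PSD
    by construction, unlike a raw Newton step whose Hessian may be indefinite.
    """
    r = y - h(m)                                   # residual at the current mean
    J = jax.jacfwd(h)(m)                           # Jacobian of the measurement function
    K_inv = jnp.linalg.inv(K)
    precision = K_inv + (J.T @ J) / obs_var        # sum of PSD terms, hence PSD
    C = jnp.linalg.inv(precision)                  # updated posterior covariance
    grad = J.T @ r / obs_var - K_inv @ (m - mu0)   # gradient of the log joint at m
    m_new = m + C @ grad                           # Newton-like step with PSD curvature
    return m_new, C

# Toy usage: a scalar nonlinear observation of a 2-d latent variable.
h = lambda f: jnp.array([jnp.sin(f[0]) + f[1] ** 2])
m0, K, y = jnp.zeros(2), jnp.eye(2), jnp.array([0.5])
m, C = m0, K
for _ in range(10):
    m, C = gauss_newton_step(m, m0, K, y, h, obs_var=0.1)
```

Because each update adds a PSD curvature term to the prior precision, the resulting covariance stays valid at every iteration; a full Newton step would instead use the exact likelihood Hessian, which can be indefinite for non-conjugate likelihoods.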