Monotonic Alpha-divergence Minimisation for Variational Inference
Kamélia Daudel, Randal Douc, François Roueff; 24(62):1−76, 2023.
Abstract
This paper introduces a new family of iterative algorithms for alpha-divergence minimisation in a Variational Inference context. These algorithms guarantee a systematic decrease at each step in the alpha-divergence between the variational and the posterior distributions. In its most general form, the variational distribution is a mixture model, and the framework allows the weights and component parameters of this mixture to be optimised simultaneously. The approach builds on methods previously proposed for alpha-divergence minimisation, such as Gradient or Power Descent schemes, and also sheds new light on an integrated Expectation Maximization algorithm. Empirical evidence shows that the methodology yields improved results on several multimodal target distributions and a real data example.
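For intuition about the objective being minimised, below is a minimal, illustrative sketch of alpha-divergence minimisation by stochastic gradient descent, written in JAX. It is not the paper's algorithm: it uses a single Gaussian variational distribution rather than a mixture, and plain reparameterised gradient steps rather than the monotonic updates the paper derives; all function and variable names (log_target, alpha_objective, and so on) are invented for illustration.

```python
# Sketch: minimise D_alpha(q || p) for a Gaussian q and a bimodal 1-D target.
# For alpha in (0, 1), D_alpha(q||p) = (1 - E_q[(p/q)^(1-alpha)]) / (alpha(1-alpha)),
# so minimising D_alpha amounts to maximising the Monte Carlo estimate of
# E_q[w^(1-alpha)] with importance weights w = p(x)/q(x). Using an
# unnormalised target only rescales this objective by a constant.
import jax
import jax.numpy as jnp

ALPHA = 0.5  # alpha in (0, 1); alpha -> 1 recovers the usual KL(q || p) setting

def log_target(x):
    # Unnormalised log-density of a bimodal target: mixture of N(-2,1) and N(2,1).
    return jnp.logaddexp(-0.5 * (x + 2.0) ** 2, -0.5 * (x - 2.0) ** 2)

def alpha_objective(params, eps):
    # Negated sample estimate of E_q[w^(1-alpha)], to be minimised.
    mu, log_sigma = params
    sigma = jnp.exp(log_sigma)
    x = mu + sigma * eps  # reparameterisation trick: x ~ N(mu, sigma^2)
    log_q = -0.5 * ((x - mu) / sigma) ** 2 - log_sigma - 0.5 * jnp.log(2 * jnp.pi)
    log_w = log_target(x) - log_q  # log importance weights
    return -jnp.mean(jnp.exp((1.0 - ALPHA) * log_w))

grad_fn = jax.jit(jax.grad(alpha_objective))

params = (jnp.array(0.0), jnp.array(0.0))  # (mu, log_sigma)
key = jax.random.PRNGKey(0)
for step in range(500):
    key, sub = jax.random.split(key)
    eps = jax.random.normal(sub, (256,))
    grads = grad_fn(params, eps)
    params = tuple(p - 0.05 * g for p, g in zip(params, grads))

print("mu =", params[0], "sigma =", jnp.exp(params[1]))
```

Unlike this plain gradient scheme, the algorithms in the paper guarantee that the alpha-divergence does not increase at any step, and they extend the update to the weights and component parameters of a full mixture model.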