Alpha-divergence Variational Inference Meets Importance Weighted Auto-Encoders: Methodology and Asymptotics
Authors: Kamélia Daudel, Joe Benton, Yuyang Shi, Arnaud Doucet; 24(243):1−83, 2023.
Abstract
Several algorithms based on the Variational Rényi (VR) bound have been proposed to minimize the alpha-divergence between a target posterior distribution and a variational distribution. While these algorithms have shown promising empirical results, they rely on biased stochastic gradient descent procedures and lack theoretical guarantees. In this paper, we introduce and analyze the VR-IWAE bound, a generalization of the importance weighted auto-encoder (IWAE) bound. We demonstrate that the VR-IWAE bound enjoys several desirable properties and, notably, admits unbiased gradient estimators that, in the reparameterized case, lead to the same stochastic gradient descent procedure as the VR bound. Additionally, we provide two complementary theoretical analyses of the VR-IWAE bound and of the standard IWAE bound. These analyses shed light on the advantages and limitations of both bounds. Finally, we validate our theoretical findings through experiments on both toy and real-data examples.