Unbiased Multilevel Monte Carlo Methods for Intractable Distributions: MLMC Meets MCMC

Tianze Wang, Guanyang Wang; 24(249):1−40, 2023.

Abstract

The construction of unbiased estimators from Markov chain Monte Carlo (MCMC) outputs is a challenging problem that has recently gained significant attention in the statistics and machine learning communities. However, the existing unbiased MCMC framework applies only when the quantity of interest is an expectation, which limits its practicality in many applications. In this paper, we propose a general method for constructing unbiased estimators of functions of expectations and extend it to construct unbiased estimators of nested expectations. Our approach combines and generalizes the unbiased MCMC and Multilevel Monte Carlo (MLMC) methods. Unlike traditional sequential methods, our estimator can be implemented on parallel processors. We show that our estimator has finite variance and finite expected computational cost, and that it achieves $\varepsilon$-accuracy at the optimal $O(1/\varepsilon^2)$ computational cost under mild conditions. Numerical experiments validate our theoretical findings and showcase the advantages of unbiased estimators in the massively parallel regime.
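To illustrate the general idea of combining MLMC with randomization to debias a nonlinear function of an expectation, here is a minimal sketch of a single-term randomized MLMC estimator of $f(\mathbb{E}[X])$ in the i.i.d. setting. This is a hypothetical simplification for intuition only, not the paper's MCMC-based construction: a random level `L` is drawn with geometric probabilities, and the level-`L` correction uses an antithetic split of $2^L$ samples, so that the importance-weighted correction is unbiased for $f(\mathbb{E}[X])$ under smoothness and integrability assumptions.

```python
import numpy as np

def single_term_mlmc(f, sample, rng, p=0.6):
    """One unbiased replicate of f(E[X]) via randomized MLMC.

    Hypothetical illustration (not the paper's exact estimator):
    - f: smooth real function applied to the mean,
    - sample(n, rng): returns n i.i.d. draws of X,
    - p: success probability of the geometric level distribution.
    """
    L = rng.geometric(p) - 1             # P(L = l) = p * (1 - p)**l, l = 0, 1, ...
    prob = p * (1.0 - p) ** L
    n = 2 ** L
    x = sample(n, rng)                   # n i.i.d. draws at level L
    if L == 0:
        delta = f(x.mean())              # base level: f of a single draw
    else:
        half = n // 2
        fine = f(x.mean())               # fine estimate: all 2**L samples
        coarse = 0.5 * (f(x[:half].mean()) + f(x[half:].mean()))  # antithetic split
        delta = fine - coarse            # level-L correction, E[sum] telescopes
    return delta / prob                  # importance-weight by the level probability
```

Averaging many independent replicates targets $f(\mathbb{E}[X])$ without the $O(1/n)$ plug-in bias of $f(\bar{X}_n)$; for example, with $f(x) = x^2$ and $X \sim N(1, 1)$, the replicates average to $(\mathbb{E}[X])^2 = 1$ rather than $\mathbb{E}[X^2] = 2$. Because the replicates are independent, they can be generated on parallel processors, which is the regime the paper emphasizes.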
