Doubly Robust Stein-Kernelized Monte Carlo Estimator: Simultaneous Bias-Variance Reduction and Supercanonical Convergence
Authors: Henry Lam, Haofeng Zhang; Journal of Machine Learning Research, 24(85):1−58, 2023.
Abstract
The standard Monte Carlo computation is well known for its canonical square-root convergence rate in the sample size. Recently, two techniques have been proposed to improve the convergence speed beyond this canonical rate by incorporating reproducing kernels and Stein's identity. These techniques are based on control variates and importance sampling, respectively. However, these methods are not effective when the sample generator is biased and subject to noise corruption. In this paper, we introduce a more general framework, called the doubly robust Stein-kernelized estimator, which combines both techniques and performs better in terms of mean squared error rates across various scenarios. We provide numerical examples to demonstrate the superior performance of our proposed method.
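As a rough illustration of the combination the abstract describes, the sketch below contrasts plain Monte Carlo under a biased sampler with a simple doubly-robust-style estimator that pairs self-normalized importance weights with an ordinary (non-kernelized) control variate. The target, the biased sampler, and the control function h(x) = x are invented for the example; this does not reproduce the paper's Stein-kernelized construction or its supercanonical rates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: E_p[f(X)] with p = N(0, 1) and f(x) = x**2, so the true value is 1.0.
# The available sampler q = N(0.3, 1) is biased relative to p (illustrative choice).
f = lambda x: x ** 2
TRUE_VALUE = 1.0


def plain_mc(n):
    # Naive Monte Carlo using the biased sampler: converges to the wrong value.
    x = rng.normal(0.3, 1.0, n)
    return f(x).mean()


def doubly_robust_mc(n):
    # Illustrative doubly robust combination (not the paper's kernelized weights):
    # self-normalized importance weights correct the sampler bias, and a simple
    # control variate h(x) = x (known mean 0 under p) reduces variance.
    x = rng.normal(0.3, 1.0, n)
    log_w = -0.5 * x ** 2 + 0.5 * (x - 0.3) ** 2   # log p(x) - log q(x)
    w = np.exp(log_w - log_w.max())
    w /= w.sum()                                   # self-normalized weights
    h = x                                          # control variate, E_p[h] = 0
    c = np.cov(f(x), h)                            # estimate regression coefficient
    beta = c[0, 1] / c[1, 1]
    return np.sum(w * (f(x) - beta * h))


for n in (10 ** 3, 10 ** 5):
    print(n, plain_mc(n), doubly_robust_mc(n), "target:", TRUE_VALUE)
```

Run as a script, the doubly robust estimate approaches the target as the sample size grows even though the raw sampler is biased, which is the qualitative behavior the abstract attributes to combining the two techniques.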