Enhancing parallel computing in multiple-try Metropolis with local balancing
Philippe Gagnon, Florian Maire, Giacomo Zanella; 24(248):1−59, 2023.
Abstract
Multiple-try Metropolis (MTM) is a widely used Markov chain Monte Carlo method that can be effectively implemented using parallel computing. During each iteration, it generates multiple candidates for the next state of the Markov chain and randomly selects one based on a weight function. However, the conventional weight function, which is proportional to the target density, can lead to problematic behaviors, especially in high dimensions and during the convergence phase. In this study, we propose using weight functions similar to the locally-balanced proposal distributions introduced by Zanella (2020) to address these issues. The resulting MTM algorithms exhibit improved performance and avoid the previously observed pathological behaviors. To analyze these algorithms, we examine the performance of ideal schemes that represent MTM algorithms sampling an infinite number of candidates per iteration, as well as the discrepancy between these schemes and the MTM algorithms that sample a finite number of candidates. Our analysis reveals a significant distinction between the convergence and stationary phases: local balancing plays a crucial role in achieving fast convergence during the former, while both the conventional and novel weight functions yield similar performance during the latter. We present numerical experiments, including an application in precision medicine that involves a computationally-expensive forward model, highlighting the benefits of parallel computing in MTM iterations.
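To make the setup concrete, the following is a minimal sketch of one MTM iteration with a symmetric Gaussian proposal, comparing the conventional weight function (proportional to the target density) with a locally-balanced alternative of the kind discussed here, here taken to be the square-root weight w(y|x) = sqrt(pi(y)/pi(x)). All function and parameter names are illustrative, not the paper's implementation.

```python
import numpy as np

def logsumexp(a):
    """Numerically stable log(sum(exp(a)))."""
    m = a.max()
    return m + np.log(np.exp(a - m).sum())

def mtm_step(x, log_target, n_candidates=50, step=0.5, weight="sqrt", rng=None):
    """One multiple-try Metropolis step (illustrative sketch).

    weight:
      "density" - conventional w(y|x) proportional to pi(y)
      "sqrt"    - locally-balanced w(y|x) = sqrt(pi(y)/pi(x))
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[0]
    # Draw N candidates from a symmetric proposal centred at the current state.
    # These density evaluations are independent, hence parallelizable.
    ys = x + step * rng.standard_normal((n_candidates, d))
    log_pi_x = log_target(x)
    log_pi_ys = np.array([log_target(y) for y in ys])
    if weight == "sqrt":
        log_w = 0.5 * (log_pi_ys - log_pi_x)   # locally-balanced weights
    else:
        log_w = log_pi_ys                       # conventional weights
    # Select one candidate with probability proportional to its weight.
    probs = np.exp(log_w - logsumexp(log_w))
    j = rng.choice(n_candidates, p=probs)
    y = ys[j]
    # Reference set: N-1 fresh draws around y, plus the current state x.
    zs = y + step * rng.standard_normal((n_candidates - 1, d))
    log_pi_zs = np.append([log_target(z) for z in zs], log_pi_x)
    if weight == "sqrt":
        log_wz = 0.5 * (log_pi_zs - log_target(y))
    else:
        log_wz = log_pi_zs
    # Generalized Metropolis-Hastings ratio of summed weights.
    log_acc = logsumexp(log_w) - logsumexp(log_wz)
    if np.log(rng.random()) < min(0.0, log_acc):
        return y
    return x
```

The key computational point the abstract makes is visible above: the candidate (and reference-set) density evaluations within an iteration are mutually independent, so an expensive `log_target` can be evaluated in parallel; only the weight function changes between the conventional and locally-balanced variants.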