[2502.10328] Accelerated Parallel Tempering via Neural Transports
arXiv:2502.10328 (stat) — Statistics > Machine Learning
[Submitted on 14 Feb 2025 (v1), last revised 25 Mar 2026 (this version, v4)]

Title: Accelerated Parallel Tempering via Neural Transports
Authors: Leo Zhang, Peter Potaptchik, Jiajun He, Yuanqi Du, Arnaud Doucet, Francisco Vargas, Hai-Dang Dau, Saifuddin Syed

Abstract: Markov Chain Monte Carlo (MCMC) algorithms are essential tools in computational statistics for sampling from unnormalised probability distributions, but can be fragile when targeting high-dimensional, multimodal, or complex target distributions. Parallel Tempering (PT) enhances MCMC's sample efficiency through annealing and parallel computation, propagating samples from tractable reference distributions to intractable targets via state swapping across interpolating distributions. The effectiveness of PT is limited by the often minimal overlap between adjacent distributions in challenging problems, which requires increasing the computational resources to compensate. We introduce a framework that accelerates PT by leveraging neural samplers -- including normalising flows, diffusion models, and controlled diffusions -- to reduce the required overlap. Our approach utilises neural samplers in parallel, circumventing the computational burden of neural samplers while preserving the asymptotic consistency of classical PT. ...
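To make the mechanism the abstract refers to concrete, here is a minimal sketch of *classical* Parallel Tempering (not the paper's neural-transport acceleration): several chains target tempered densities pi_b(x) proportional to exp(-b * E(x)) for a bimodal energy E, and adjacent chains periodically attempt state swaps with the standard Metropolis acceptance ratio. All function names, the example energy, and the temperature ladder are illustrative choices, not taken from the paper.

```python
import numpy as np

def energy(x):
    # Bimodal double-well energy with modes near x = -2 and x = +2;
    # a purely illustrative target, not from the paper.
    return (x**2 - 4.0)**2 / 4.0

def parallel_tempering(betas, n_steps, step=0.5, seed=0):
    """Classical PT: local random-walk Metropolis moves at each
    inverse temperature, plus swap attempts between adjacent chains."""
    rng = np.random.default_rng(seed)
    xs = rng.normal(size=len(betas))   # one state per temperature
    samples = []                       # draws from the beta = 1 chain
    for _ in range(n_steps):
        # Local move within each tempered chain.
        for i, b in enumerate(betas):
            prop = xs[i] + step * rng.normal()
            if np.log(rng.uniform()) < b * (energy(xs[i]) - energy(prop)):
                xs[i] = prop
        # Swap attempt between a random adjacent pair; acceptance
        # log-prob is (b_i - b_j) * (E(x_i) - E(x_j)).
        i = rng.integers(len(betas) - 1)
        log_a = (betas[i] - betas[i + 1]) * (energy(xs[i]) - energy(xs[i + 1]))
        if np.log(rng.uniform()) < log_a:
            xs[i], xs[i + 1] = xs[i + 1], xs[i]
        samples.append(xs[-1])         # record the target (beta = 1) chain
    return np.array(samples)

# Geometric-style ladder from a flat reference toward the target.
betas = np.array([0.05, 0.2, 0.5, 1.0])
draws = parallel_tempering(betas, n_steps=5000)
# With working swaps, the beta = 1 chain visits both modes.
```

The "overlap" limitation the abstract mentions corresponds to the swap acceptance rate: when adjacent tempered distributions barely overlap, `log_a` is typically very negative, swaps rarely succeed, and one must add more intermediate temperatures. The paper's proposal replaces this brute-force densification with neural transports between levels.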