[2603.22741] Algorithmic warm starts for Hamiltonian Monte Carlo
Computer Science > Data Structures and Algorithms

arXiv:2603.22741 (cs) [Submitted on 24 Mar 2026]

Title: Algorithmic warm starts for Hamiltonian Monte Carlo
Authors: Matthew S. Zhang, Jason M. Altschuler, Sinho Chewi

Abstract: Generating samples from a continuous probability density is a central algorithmic problem across statistics, engineering, and the sciences. For high-dimensional settings, Hamiltonian Monte Carlo (HMC) is the default algorithm across mainstream software packages. However, despite the extensive line of work on HMC and its widespread empirical success, it remains unclear how many iterations of HMC are required as a function of the dimension $d$. On one hand, a variety of results show that Metropolized HMC converges in $O(d^{1/4})$ iterations from a warm start close to stationarity. On the other hand, Metropolized HMC is significantly slower without a warm start, e.g., requiring $\Omega(d^{1/2})$ iterations even for simple target distributions such as isotropic Gaussians. Finding a warm start is therefore the computational bottleneck for HMC. We resolve this issue for the well-studied setting of sampling from a probability distribution satisfying strong log-concavity (or isoperimetry) and third-order derivative bounds. We prove that \emph{non-Metropolized} HMC generates a warm start in $\tilde{O}(d^{1/4})$ iterations, after w...
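To make the distinction in the abstract concrete, the following is a minimal sketch of non-Metropolized (un-adjusted) HMC on an isotropic Gaussian target: each iteration draws a fresh momentum and runs a leapfrog trajectory, with no Metropolis accept/reject step. The step size, trajectory length, and iteration counts below are illustrative choices for this toy example, not the tuned parameters analyzed in the paper.

```python
import numpy as np

def grad_log_density(x):
    # log pi(x) = -||x||^2 / 2 for the standard Gaussian target,
    # so the gradient of the log-density is simply -x.
    return -x

def leapfrog(x, p, step_size, n_steps):
    """One HMC trajectory via the leapfrog integrator."""
    p = p + 0.5 * step_size * grad_log_density(x)  # half step for momentum
    for _ in range(n_steps - 1):
        x = x + step_size * p                      # full step for position
        p = p + step_size * grad_log_density(x)    # full step for momentum
    x = x + step_size * p
    p = p + 0.5 * step_size * grad_log_density(x)  # closing half step
    return x, p

def unadjusted_hmc(x0, n_iters, step_size=0.1, n_steps=10, seed=0):
    """Non-Metropolized HMC: leapfrog trajectories with momentum refresh,
    and no accept/reject correction for discretization error."""
    rng = np.random.default_rng(seed)
    x = x0
    for _ in range(n_iters):
        p = rng.standard_normal(x.shape)           # fresh momentum draw
        x, _ = leapfrog(x, p, step_size, n_steps)  # no Metropolis filter
    return x

d = 100
x_far = np.full(d, 5.0)  # a feasible start far from stationarity
sample = unadjusted_hmc(x_far, n_iters=200)
print(float(np.mean(sample**2)))  # close to 1 for the N(0, I_d) target
```

Because the accept/reject step is omitted, the chain carries a small discretization bias (controlled by the step size), which is the price paid for the faster burn-in that the paper quantifies.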