[2402.08151] Perturbative adaptive importance sampling for Bayesian LOO cross-validation
Statistics > Methodology
arXiv:2402.08151 (stat)
[Submitted on 13 Feb 2024 (v1), last revised 25 Mar 2026 (this version, v4)]

Title: Perturbative adaptive importance sampling for Bayesian LOO cross-validation
Authors: Joshua C Chang, Xiangting Li, Tianyi Su, Shixin Xu, Hao-Ren Yao, Julia Porcino, Carson Chow

Abstract: Importance sampling (IS) is an efficient stand-in for model refitting when performing leave-one-out (LOO) cross-validation (CV) on a Bayesian model. IS inverts the Bayesian update for a single observation by reweighting posterior samples. The so-called importance weights have high variance -- we resolve this issue through adaptation by transformation. We observe that removing a single observation perturbs the posterior by $\mathcal{O}(1/n)$, motivating bijective transformations of the form $T(\theta)=\theta + h Q(\theta)$ for $0<h\ll 1$. We introduce several such transformations: partial moment matching, which generalizes prior work on affine moment matching with a tunable step size; log-likelihood descent, which partially inverts the Bayesian update for an observation; and gradient flow steps that minimize the KL divergence or IS variance. The gradient flow and likelihood descent transformations require Jacobian determinants, which are available via auto-differentiation; we additionally derive closed-form ...
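The mechanics described in the abstract -- raw IS-LOO weights, a perturbative map $T(\theta)=\theta+hQ(\theta)$, and reweighting with a Jacobian correction -- can be sketched on a toy conjugate Gaussian model where the full-data posterior is known in closed form. This is a minimal illustration, not the paper's algorithm: the step size `h`, the held-out index `i`, and the affine (moment-matching-style) form of `Q` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy conjugate model: y_j ~ N(theta, sigma^2), prior theta ~ N(0, tau^2).
sigma, tau = 1.0, 2.0
y = rng.normal(1.0, sigma, size=30)
n = len(y)

# Exact full-data posterior (conjugate), sampled to stand in for MCMC draws.
post_var = 1.0 / (1.0 / tau**2 + n / sigma**2)
post_mean = post_var * y.sum() / sigma**2
S = 4000
theta = rng.normal(post_mean, np.sqrt(post_var), size=S)

def log_norm(x, m, s):
    return -0.5 * np.log(2 * np.pi * s**2) - 0.5 * ((x - m) / s) ** 2

i = 0  # held-out observation (illustrative choice)

# Plain IS-LOO: inverting the update for y_i gives w_s ∝ 1 / p(y_i | theta_s).
logw = -log_norm(y[i], theta, sigma)
w = np.exp(logw - logw.max())
w /= w.sum()
ess_plain = 1.0 / np.sum(w**2)  # effective sample size of raw weights

# A partial, affine moment-matching-style step: T(theta) = theta + h*Q(theta),
# nudging samples toward the weighted (LOO) mean and scale. h is a tunable
# step size; h = 0 recovers the identity map.
h = 0.7  # hypothetical step size
mu, s = theta.mean(), theta.std()
mu_w = np.sum(w * theta)
s_w = np.sqrt(np.sum(w * (theta - mu_w) ** 2))
a = 1.0 + h * (s_w / s - 1.0)            # constant Jacobian of the 1-D affine map
theta_T = mu + h * (mu_w - mu) + a * (theta - mu)

# Re-weight: target is p(theta'|y)/p(y_i|theta'); the proposal density of the
# transformed draws is p(theta|y)/|a|, so log|a| enters the log-weights.
logw_T = (log_norm(theta_T, post_mean, np.sqrt(post_var))
          - log_norm(y[i], theta_T, sigma)
          + np.log(a)
          - log_norm(theta, post_mean, np.sqrt(post_var)))
w_T = np.exp(logw_T - logw_T.max())
w_T /= w_T.sum()
ess_adapted = 1.0 / np.sum(w_T**2)

# Self-normalized IS estimate of the LOO predictive density for y_i.
elpd_i = np.log(np.sum(w_T * np.exp(log_norm(y[i], theta_T, sigma))))
print(ess_plain, ess_adapted, elpd_i)
```

In this 1-D affine case the Jacobian determinant is the constant `a`; for the paper's gradient-flow and likelihood-descent maps it would instead come from auto-differentiation.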