[2506.07816] Accelerating Constrained Sampling: A Large Deviations Approach
Statistics > Machine Learning
arXiv:2506.07816 (stat)
[Submitted on 9 Jun 2025 (v1), last revised 5 Apr 2026 (this version, v3)]

Title: Accelerating Constrained Sampling: A Large Deviations Approach
Authors: Yingli Wang, Changwei Tu, Xiaoyu Wang, Lingjiong Zhu

Abstract: The problem of sampling a target probability distribution on a constrained domain arises in many applications including machine learning. For constrained sampling, various Langevin algorithms, such as projected Langevin Monte Carlo (PLMC), based on the discretization of reflected Langevin dynamics (RLD), and more generally skew-reflected non-reversible Langevin Monte Carlo (SRNLMC), based on the discretization of skew-reflected non-reversible Langevin dynamics (SRNLD), have been proposed and studied in the literature. This work focuses on the long-time behavior of SRNLD, where a skew-symmetric matrix is added to RLD. Although acceleration for SRNLD has been studied, it is not clear how one should design the skew-symmetric matrix in the dynamics to achieve good performance in practice. We establish a large deviation principle (LDP) for the empirical measure of SRNLD when the skew-symmetric matrix is chosen such that its product with the outward unit normal vector field on the boundary is zero. By explicitly characterizing the rate functions, we show that th...
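To make the objects in the abstract concrete, the sketch below shows a projected Euler–Maruyama update with a non-reversible drift, the generic form that schemes like SRNLMC and PLMC take: the drift is (I + J) ∇f for a skew-symmetric J, followed by a Euclidean projection onto the constraint set K. This is a minimal sketch under those assumptions, not the paper's exact algorithm; the names (`srnlmc_step`, `proj_K`) and the illustrative choice of J are ours, and the demo's J is not claimed to satisfy the paper's boundary condition that J times the outward unit normal vanish on the boundary. Setting J = 0 recovers a PLMC-style update.

```python
import numpy as np

def srnlmc_step(x, grad_f, proj_K, J, step_size, rng):
    """One SRNLMC-style update: non-reversible drift (I + J) grad f, Gaussian noise, then projection onto K."""
    d = x.shape[0]
    noise = rng.standard_normal(d)
    drift = (np.eye(d) + J) @ grad_f(x)
    return proj_K(x - step_size * drift + np.sqrt(2.0 * step_size) * noise)

def sample(grad_f, proj_K, J, x0, step_size=1e-3, n_iters=10_000, seed=0):
    """Run the constrained Langevin chain and collect iterates; J = 0 gives a PLMC-style chain."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    samples = np.empty((n_iters, x.shape[0]))
    for k in range(n_iters):
        x = srnlmc_step(x, grad_f, proj_K, J, step_size, rng)
        samples[k] = x
    return samples

# Illustrative run: standard Gaussian target (density proportional to exp(-||x||^2/2))
# restricted to the unit ball. The skew-symmetric J is a generic 2-D choice for
# illustration only; it is not guaranteed to satisfy the boundary condition studied in the paper.
if __name__ == "__main__":
    grad_f = lambda x: x                                   # gradient of f(x) = ||x||^2 / 2
    proj_ball = lambda x: x / max(1.0, np.linalg.norm(x))  # Euclidean projection onto the unit ball
    J = np.array([[0.0, 1.0], [-1.0, 0.0]])                # skew-symmetric: J = -J^T
    out = sample(grad_f, proj_ball, J, x0=np.zeros(2))
    print(out.mean(axis=0))
```

In this form, the design question raised by the abstract is which skew-symmetric J to supply: the paper's LDP analysis applies when J annihilates the outward unit normal on the boundary, and the empirical measure of the resulting chain is what the rate functions describe.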