[2603.00772] Initialization-Aware Score-Based Diffusion Sampling
Statistics > Machine Learning
arXiv:2603.00772 (stat)
[Submitted on 28 Feb 2026]

Title: Initialization-Aware Score-Based Diffusion Sampling
Authors: Tiziano Fassina, Gabriel Cardoso, Sylvan Le Corff, Thomas Romary

Abstract: Score-based generative models (SGMs) aim to generate samples from a target distribution by approximating the reverse-time dynamics of a stochastic differential equation. Despite their strong empirical performance, classical samplers initialized from a Gaussian distribution require a long noising time horizon, typically inducing a large number of discretization steps and a high computational cost. In this work, we present a Kullback-Leibler convergence analysis of Variance Exploding diffusion samplers that highlights the critical role of the backward-process initialization. Based on this result, we propose a theoretically grounded sampling strategy that learns the reverse-time initialization, directly minimizing the initialization error. The resulting procedure is independent of the specific score training procedure, network architecture, and discretization scheme. Experiments on toy distributions and benchmark datasets demonstrate competitive or improved generative quality while using significantly fewer sampling steps.

Subjects: Machine Learning (stat.ML); Machine Learning (cs.LG)
Cite as: arXiv:2603.00772 ...
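To make the role of the backward-process initialization concrete, below is a minimal numpy sketch, not the paper's method: a Variance Exploding reverse-time Euler-Maruyama sampler in which the only change between runs is the distribution the reverse process starts from. The 1-D Gaussian toy data, the closed-form score, and the "learned-style" initialization (here simply the exact noised marginal at sigma_max) are illustrative assumptions; in the paper the initialization is learned and the score comes from a trained network.

```python
import numpy as np

# Toy 1-D setting: data ~ N(mu, s^2), so the score of the VE-perturbed
# marginal N(mu, s^2 + sigma^2) is available in closed form.
mu, s = 2.0, 0.5
sigma_min, sigma_max = 0.01, 10.0

def score(x, sigma):
    """Exact score of the noised marginal for the Gaussian toy data."""
    return -(x - mu) / (s**2 + sigma**2)

def ve_sample(n_samples, n_steps, init_sampler, rng):
    """Reverse-time Euler-Maruyama sampler for the VE SDE.

    init_sampler(n, sigma_max, rng) draws the reverse-process initialization;
    swapping it is the only difference between the classical Gaussian start
    and an initialization fitted to the noised data distribution.
    """
    # Geometric noise schedule from sigma_max down to sigma_min.
    sigmas = sigma_min * (sigma_max / sigma_min) ** np.linspace(1.0, 0.0, n_steps + 1)
    x = init_sampler(n_samples, sigmas[0], rng)
    for i in range(n_steps):
        s_hi, s_lo = sigmas[i], sigmas[i + 1]
        step = s_hi**2 - s_lo**2                                  # discretized g(t)^2 dt
        x = x + step * score(x, s_hi)                             # reverse-SDE drift
        x = x + np.sqrt(step) * rng.standard_normal(n_samples)    # diffusion term
    return x

# Classical initialization: the VE prior N(0, sigma_max^2).
gaussian_init = lambda n, sig, rng: sig * rng.standard_normal(n)
# Initialization-aware stand-in: the exact noised marginal N(mu, s^2 + sigma_max^2).
# In the paper this distribution is learned rather than assumed known.
aware_init = lambda n, sig, rng: mu + np.sqrt(s**2 + sig**2) * rng.standard_normal(n)

rng = np.random.default_rng(0)
for name, init in [("Gaussian init", gaussian_init), ("learned-style init", aware_init)]:
    xs = ve_sample(20_000, 10, init, rng)  # deliberately few sampling steps
    print(f"{name}: mean={xs.mean():.3f} (target {mu}), std={xs.std():.3f} (target {s})")
```

With very few steps, the run started from the noised marginal tracks the target mean and standard deviation more closely than the one started from the Gaussian prior, illustrating (under these toy assumptions) why reducing the initialization error can allow shorter sampling trajectories.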