[2602.22505] Sharp Convergence Rates for Masked Diffusion Models
Summary
This paper presents a theoretical analysis of masked diffusion models, focusing on convergence rates for two samplers: the Euler method and the First-Hitting Sampler (FHS).
Why It Matters
Understanding the convergence rates of masked diffusion models is crucial for improving their performance in machine learning applications. This research addresses gaps in theoretical analysis, providing tighter bounds and more robust guarantees, which can enhance the reliability of these models in practical scenarios.
Key Takeaways
- Introduces a total-variation based analysis for the Euler method, improving convergence guarantees.
- Establishes the first convergence lower bound for the Euler sampler, enhancing theoretical understanding.
- Analyzes the First-Hitting Sampler (FHS) and shows that it samples efficiently without incurring significant error.
- Addresses limitations of previous Kullback-Leibler-divergence-based analyses, which yield loose parameter dependencies and require strong assumptions on score estimation.
- Presents a novel error decomposition approach that may have broader implications in the field.
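To make the object of study concrete, here is a minimal sketch of an Euler-discretized reverse (unmasking) process for a masked diffusion model. This is not the paper's implementation: the linear schedule `alpha(t) = 1 - t`, the toy vocabulary, and the uniform `denoiser` are placeholder assumptions standing in for a learned network, and the reveal probability follows one common discretization of the absorbing-rate reverse process.

```python
import random

# Hypothetical toy setup: token 0 is the mask symbol, tokens 1..3 the vocabulary.
MASK = 0
VOCAB = [1, 2, 3]

def alpha(t):
    """Linear masking schedule (an assumption for illustration): probability
    that a token is still unmasked at time t; alpha(0) = 1, alpha(1) = 0."""
    return 1.0 - t

def denoiser(x, t):
    """Stand-in for the learned denoiser/score network: for each position,
    a distribution over the non-mask vocabulary. Uniform here for illustration."""
    return [[1.0 / len(VOCAB)] * len(VOCAB) for _ in x]

def euler_sample(seq_len, num_steps, seed=0):
    """Euler discretization of the reverse (unmasking) process.

    Stepping from time t to t - dt, each still-masked position is revealed
    independently with probability
        (alpha(t - dt) - alpha(t)) / (1 - alpha(t)),
    and the revealed token is drawn from the denoiser's distribution.
    """
    rng = random.Random(seed)
    x = [MASK] * seq_len
    dt = 1.0 / num_steps
    t = 1.0
    for _ in range(num_steps):
        probs = denoiser(x, t)
        p_reveal = (alpha(t - dt) - alpha(t)) / (1.0 - alpha(t))
        for i, tok in enumerate(x):
            if tok == MASK and rng.random() < p_reveal:
                x[i] = rng.choices(VOCAB, weights=probs[i])[0]
        t -= dt
    return x
```

Under the linear schedule the final step has reveal probability 1, so the sampler always returns a fully unmasked sequence; the paper's question is how fast the law of this output approaches the data distribution (here, in total variation) as `num_steps` grows.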
Computer Science > Machine Learning
arXiv:2602.22505 (cs)
[Submitted on 26 Feb 2026]
Title: Sharp Convergence Rates for Masked Diffusion Models
Authors: Yuchen Liang, Zhiheng Tan, Ness Shroff, Yingbin Liang
Abstract
Discrete diffusion models have achieved strong empirical performance in text and other symbolic domains, with masked (absorbing-rate) variants emerging as competitive alternatives to autoregressive models. Among existing samplers, the Euler method remains the standard choice in many applications, and more recently, the First-Hitting Sampler (FHS) has shown considerable promise for masked diffusion models. Despite their practical success, the theoretical understanding of these samplers remains limited. Existing analyses are conducted in Kullback-Leibler (KL) divergence, which often yields loose parameter dependencies and requires strong assumptions on score estimation. Moreover, these guarantees do not cover the recently developed high-performance FHS sampler. In this work, we first develop a direct total-variation (TV) based analysis for the Euler method that overcomes these limitations. Our results relax assumptions on score estimation, improve parameter dependencies, and establish convergence guarantees without requiring any surrogate initialization. Also for this setting, we provide the first convergence lower bound for the Euler samp...
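For contrast with the Euler method, the First-Hitting Sampler analyzed in the abstract can be sketched as follows. This is a hedged illustration, not the paper's code: it assumes a linear schedule, under which the time of the next unmasking event starting from time t with n masked tokens can be drawn as t * u**(1/n) with u uniform on (0, 1); the vocabulary and uniform `denoiser` are again placeholders for a trained model.

```python
import random

# Hypothetical toy setup matching the sketch above.
MASK = 0
VOCAB = [1, 2, 3]

def denoiser(x, t):
    """Stand-in for the learned model; uniform over the vocabulary for illustration."""
    return [[1.0 / len(VOCAB)] * len(VOCAB) for _ in x]

def first_hitting_sample(seq_len, seed=0):
    """First-Hitting Sampler sketch under an assumed linear schedule.

    Instead of stepping on a fixed time grid (where many Euler steps reveal
    nothing), jump directly to the random time of the next unmasking event:
    with n tokens still masked at time t, draw the first-hitting time as
    t * u**(1/n), u ~ Uniform(0, 1), then fill one uniformly chosen masked
    position from the denoiser evaluated at that time.
    """
    rng = random.Random(seed)
    x = [MASK] * seq_len
    t = 1.0
    while MASK in x:
        n = sum(1 for tok in x if tok == MASK)
        t = t * rng.random() ** (1.0 / n)  # time of the next reveal event
        i = rng.choice([j for j, tok in enumerate(x) if tok == MASK])
        probs = denoiser(x, t)
        x[i] = rng.choices(VOCAB, weights=probs[i])[0]
    return x
```

Note the design difference the abstract highlights: this sampler makes exactly one denoiser call per revealed token and introduces no time-discretization grid, which is why its error analysis requires different tools than the Euler method's.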