[2511.19473] WavefrontDiffusion: Dynamic Decoding Schedule for Improved Reasoning
Computer Science > Machine Learning
arXiv:2511.19473 (cs)
[Submitted on 22 Nov 2025 (v1), last revised 2 Mar 2026 (this version, v3)]

Title: WavefrontDiffusion: Dynamic Decoding Schedule for Improved Reasoning
Authors: Haojin Yang, Rui Hu, Zequn Sun, Rui Zhou, Yujun Cai, Yiwei Wang

Abstract: Diffusion Language Models (DLMs) have shown strong potential for text generation and are becoming a competitive alternative to autoregressive models. The denoising strategy plays an important role in determining the quality of their outputs. Mainstream denoising strategies include Standard Diffusion and BlockDiffusion. Standard Diffusion performs global denoising without restricting the update range, often finalizing incomplete context and causing premature end-of-sequence predictions. BlockDiffusion updates fixed-size blocks in a preset order, but its rigid structure can break apart coherent semantic units and disrupt reasoning. We present WavefrontDiffusion, a dynamic decoding approach that expands a wavefront of active tokens outward from finalized positions. This adaptive process follows the natural flow of semantic structure while keeping computational cost equal to block-based methods. Across four benchmarks in reasoning and code generation, WavefrontDiffusion achieves state-of-the-art performance while producing outputs...
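The wavefront idea described in the abstract can be illustrated with a toy schedule: starting from one or more finalized seed positions, the undecoded neighbors of finalized tokens form the active frontier, and a fixed number of them are finalized per step so the per-step cost matches a block-based schedule. This is only a hedged sketch of the scheduling concept under those assumptions; the names (`wavefront_schedule`, `seeds`, `step_budget`) and the neighbor-based frontier rule are illustrative, not the authors' actual implementation.

```python
def wavefront_schedule(seq_len, seeds, step_budget):
    """Toy decoding schedule that grows outward from finalized positions.

    seq_len:     number of token positions in the sequence
    seeds:       positions assumed already finalized (hypothetical input)
    step_budget: max tokens finalized per step, mimicking a fixed block size
    Returns the list of positions finalized at each step.
    """
    finalized = set(seeds)
    order = [sorted(seeds)]
    while len(finalized) < seq_len:
        # Active frontier: undecoded positions adjacent to a finalized one.
        frontier = {p for f in finalized for p in (f - 1, f + 1)
                    if 0 <= p < seq_len and p not in finalized}
        # Cap per-step work so cost stays comparable to block-based decoding.
        chosen = sorted(frontier)[:step_budget]
        finalized.update(chosen)
        order.append(chosen)
    return order

# Example: seeding at position 3 in an 8-token sequence, 2 tokens per step,
# the schedule expands symmetrically around the seed:
# [[3], [2, 4], [1, 5], [0, 6], [7]]
```

In contrast, a BlockDiffusion-style schedule would always sweep fixed windows (e.g. positions 0-1, then 2-3, ...) regardless of where context has already been finalized, which is the rigidity the paper argues can split coherent semantic units.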