[2602.18647] Information-Guided Noise Allocation for Efficient Diffusion Training
Summary
The paper presents InfoNoise, a data-adaptive noise scheduling method for diffusion training that uses information theory to allocate noise levels, improving both training efficiency and model performance.
Why It Matters
This research addresses inefficiencies in traditional noise scheduling for diffusion models, which can hinder performance across various datasets. By introducing an information-guided approach, it has the potential to significantly improve training speed and model quality, making it relevant for researchers and practitioners in machine learning and AI.
Key Takeaways
- InfoNoise optimizes noise allocation using conditional entropy rates.
- The method achieves a training speedup of about 1.4x on CIFAR-10.
- It reduces the need for manual tuning of noise schedules across datasets.
- InfoNoise enhances performance on discrete datasets with fewer training steps.
- The approach is grounded in information theory, offering a principled framework.
Computer Science > Machine Learning — arXiv:2602.18647 (cs) [Submitted on 20 Feb 2026]
Title: Information-Guided Noise Allocation for Efficient Diffusion Training
Authors: Gabriel Raya, Bac Nguyen, Georgios Batzolis, Yuhta Takida, Dejan Stancevic, Naoki Murata, Chieh-Hsin Lai, Yuki Mitsufuji, Luca Ambrogioni
Abstract: Training diffusion models typically relies on manually tuned noise schedules, which can waste computation on weakly informative noise regions and limit transfer across datasets, resolutions, and representations. We revisit noise-schedule allocation through an information-theoretic lens and propose the conditional entropy rate of the forward process as a theoretically grounded, data-dependent diagnostic for identifying suboptimal noise-level allocation in existing schedules. Based on this insight, we introduce InfoNoise, a principled, data-adaptive training noise schedule that replaces heuristic schedule design with an information-guided noise sampling distribution derived from entropy-reduction rates estimated from denoising losses already computed during training. Across natural-image benchmarks, InfoNoise matches or surpasses tuned EDM-style schedules, in some cases with a substantial training speedup (about $1.4\times$ on CIFAR-10). On discrete datasets, where standard image-tuned schedules exhibit...
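The abstract describes replacing a fixed noise schedule with a sampling distribution over noise levels derived from signals already computed during training. The following is a minimal illustrative sketch of that general idea, not the paper's actual algorithm: it maintains an exponential moving average of the denoising loss per noise-level bin (used here as a stand-in for the paper's entropy-reduction-rate estimate) and samples training noise levels in proportion to it. All class and parameter names are hypothetical.

```python
import numpy as np

class InfoGuidedNoiseSampler:
    """Hypothetical sketch: sample training noise levels in proportion to a
    per-bin loss signal, as a crude proxy for an information-guided schedule.
    This is NOT the InfoNoise algorithm from the paper, only an illustration."""

    def __init__(self, num_bins=64, ema_decay=0.99, sigma_min=0.002, sigma_max=80.0):
        # Log-spaced noise levels, as is common in EDM-style training.
        self.sigmas = np.geomspace(sigma_min, sigma_max, num_bins)
        self.ema_loss = np.ones(num_bins)  # running per-bin loss estimate
        self.decay = ema_decay

    def update(self, bin_idx, loss_value):
        # Fold in a denoising loss that the training loop computed anyway.
        self.ema_loss[bin_idx] = (
            self.decay * self.ema_loss[bin_idx] + (1 - self.decay) * loss_value
        )

    def probs(self):
        # Normalize the per-bin signal into a valid sampling distribution.
        w = np.maximum(self.ema_loss, 1e-8)
        return w / w.sum()

    def sample(self, rng):
        # Draw a noise level: bins with higher recent loss are sampled more.
        idx = rng.choice(len(self.sigmas), p=self.probs())
        return idx, self.sigmas[idx]
```

In a training loop one would call `update` with each batch's loss and `sample` to pick the next noise level, so the schedule adapts to the data rather than being fixed in advance.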