[2604.01552] ZEUS: Accelerating Diffusion Models with Only Second-Order Predictor
Computer Science > Machine Learning
arXiv:2604.01552 (cs) [Submitted on 2 Apr 2026]

Title: ZEUS: Accelerating Diffusion Models with Only Second-Order Predictor
Authors: Yixiao Wang, Ting Jiang, Zishan Shao, Hancheng Ye, Jingwei Sun, Mingyuan Ma, Jianyi Zhang, Yiran Chen, Hai Li

Abstract: Denoising generative models deliver high-fidelity generation but remain bottlenecked by inference latency, owing to the many iterative denoiser calls required during sampling. Training-free acceleration methods reduce this latency either by sparsifying the model architecture or by shortening the sampling trajectory. Current training-free methods are more complex than necessary: higher-order predictors amplify error under aggressive speedups, and architectural modifications hinder deployment. Beyond 2x acceleration, step skipping creates structural scarcity -- at most one fresh evaluation per local window -- leaving the computed output and its backward difference as the only causally grounded information. Based on this observation, we propose ZEUS, an acceleration method that replaces skipped denoiser evaluations with predictions from a second-order predictor and stabilizes aggressive consecutive skipping with an interleaved scheme that avoids back-to-back extrapolations. ZEUS adds essentially zero overhead and requires no feature caches and no architectural modifications.
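To make the mechanism described in the abstract concrete, here is a minimal sketch of step skipping with an interleaved scheme. It is an illustration under stated assumptions, not the paper's implementation: the extrapolation rule (last output plus its backward difference), the skip pattern, and the `denoiser` and state-update conventions are all hypothetical placeholders.

```python
def extrapolate(prev, prev_prev):
    """Predict the next denoiser output from the last computed output
    and its backward difference (hypothetical second-order-style rule)."""
    return prev + (prev - prev_prev)

def sample(denoiser, x, steps):
    """Toy sampling loop: every other step after warm-up is skipped and
    replaced by an extrapolated output, so no two extrapolations are
    ever back-to-back (the interleaved scheme, sketched)."""
    outputs = []
    for t in range(steps):
        if t >= 2 and t % 2 == 1:
            # skipped step: reuse causally grounded information only
            eps = extrapolate(outputs[-1], outputs[-2])
        else:
            # fresh denoiser evaluation
            eps = denoiser(x, t)
        outputs.append(eps)
        x = x - eps  # placeholder state update, not the paper's solver step
    return x
```

With a constant denoiser `lambda x, t: 1.0` and 4 steps, only 3 of the 4 evaluations are fresh; the fourth is extrapolated at zero extra cost, which is the kind of saving the abstract describes.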