[2603.04005] Training-Free Rate-Distortion-Perception Traversal With Diffusion
Computer Science > Information Theory
arXiv:2603.04005 (cs)
[Submitted on 4 Mar 2026]

Title: Training-Free Rate-Distortion-Perception Traversal With Diffusion
Authors: Yuhan Wang, Suzhi Bi, Ying-Jun Angela Zhang

Abstract: The rate-distortion-perception (RDP) tradeoff characterizes the fundamental limits of lossy compression by jointly considering bitrate, reconstruction fidelity, and perceptual quality. While recent neural compression methods have improved perceptual performance, they typically operate at fixed points on the RDP surface, requiring retraining to target different tradeoffs. In this work, we propose a training-free framework that leverages pre-trained diffusion models to traverse the entire RDP surface. Our approach integrates a reverse channel coding (RCC) module with a novel score-scaled probability flow ODE decoder. We theoretically prove that the proposed diffusion decoder is optimal for the distortion-perception tradeoff under AWGN observations and that the overall framework with the RCC module achieves the optimal RDP function in the Gaussian case. Empirical results across multiple datasets demonstrate the framework's flexibility and effectiveness in navigating the ternary RDP tradeoff using pre-trained diffusion models. Our results establish a practical and theoretically grounded approach to adapt...
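The paper itself is not reproduced here, but the idea of a score-scaled probability flow ODE decoder can be illustrated on a toy case. The sketch below is an assumption-laden illustration, not the authors' method: it uses a 1-D standard-normal data distribution under a variance-exploding diffusion, for which the score of the noised marginal is available in closed form, and integrates the probability flow ODE by Euler steps. The `scale` knob that multiplies the score is a hypothetical stand-in for the distortion-perception control described in the abstract.

```python
import numpy as np

def score_gaussian(x, sigma_t, sigma_data=1.0):
    """Closed-form score of N(0, sigma_data^2) data convolved with
    N(0, sigma_t^2) Gaussian noise: grad_x log p_t(x)."""
    return -x / (sigma_data**2 + sigma_t**2)

def pf_ode_decode(x_T, scale=1.0, n_steps=1000, sigma_max=10.0, sigma_min=1e-3):
    """Euler integration of a (score-scaled) probability flow ODE for a
    variance-exploding diffusion, parameterized by the noise level sigma:
        dx/dsigma = -sigma * scale * score(x, sigma).
    scale=1.0 recovers the standard PF-ODE, which transports samples of the
    noisy marginal back to the data distribution (perception-oriented);
    other values of `scale` are a hypothetical tradeoff knob."""
    sigmas = np.geomspace(sigma_max, sigma_min, n_steps + 1)
    x = np.array(x_T, dtype=float, copy=True)
    for i in range(n_steps):
        s, s_next = sigmas[i], sigmas[i + 1]
        x = x + (s_next - s) * (-s * scale * score_gaussian(x, s))
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sigma_max = 10.0
    # Samples from the noisy marginal N(0, 1 + sigma_max^2)
    x_T = np.sqrt(1.0 + sigma_max**2) * rng.standard_normal(20000)
    x0 = pf_ode_decode(x_T, scale=1.0, sigma_max=sigma_max)
    print(f"output std with scale=1.0: {x0.std():.3f}")  # close to 1 (data std)
```

With `scale=1.0` the decoded samples recover (approximately) the unit-variance data distribution, matching the perception-oriented endpoint; shrinking the score scaling contracts the samples less per step and moves the output away from that distribution, which is the kind of one-parameter traversal the abstract describes in the Gaussian case.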