[2602.19506] Relational Feature Caching for Accelerating Diffusion Transformers

arXiv - Machine Learning · 4 min read

Summary

This paper introduces Relational Feature Caching (RFC) to enhance the efficiency of diffusion transformers by improving feature prediction accuracy and reducing computational redundancy.

Why It Matters

As machine learning models, particularly diffusion transformers, become more complex, optimizing their performance is crucial. RFC addresses significant prediction errors in existing caching methods, potentially leading to more efficient AI applications in various fields, including computer vision and generative AI.

Key Takeaways

  • RFC improves the accuracy of feature predictions by leveraging input-output relationships.
  • The framework introduces relational feature estimation (RFE) to better predict output changes.
  • Relational cache scheduling (RCS) minimizes full computations based on expected prediction errors.
  • Extensive experiments show RFC significantly outperforms previous caching methods.
  • This approach can lead to more efficient AI models, enhancing their practical applications.
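The relational feature estimation idea in the takeaways above can be illustrated with a toy sketch. This is not the paper's actual estimator; `expensive_module`, `rfe_predict`, and the norm-ratio scaling rule are hypothetical stand-ins assumed here purely to show the general pattern of predicting how much a cached output should change from how much the input changed.

```python
import numpy as np

def expensive_module(x: np.ndarray) -> np.ndarray:
    """Stand-in for a costly DiT block (hypothetical toy mapping)."""
    return np.tanh(x) * 2.0

def rfe_predict(cached_out: np.ndarray,
                cached_in: np.ndarray,
                new_in: np.ndarray) -> np.ndarray:
    """Toy relational estimate: scale the cached output by the change in
    input magnitude, assuming output change tracks input change."""
    eps = 1e-8
    scale = np.linalg.norm(new_in) / (np.linalg.norm(cached_in) + eps)
    return cached_out * scale

# Cache at timestep t, then reuse at t+1 with a cheap relational correction
# instead of rerunning the expensive module.
x_t = np.ones(4)
y_t = expensive_module(x_t)           # full computation, cached
x_t1 = 0.9 * x_t                      # slightly changed input at next step
y_pred = rfe_predict(y_t, x_t, x_t1)  # relational estimate, no full compute
```

The point of the sketch is only the data flow: the input at the new timestep informs the prediction of the new output, rather than extrapolating from past outputs alone.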

Computer Science > Computer Vision and Pattern Recognition

arXiv:2602.19506 (cs) [Submitted on 23 Feb 2026]

Title: Relational Feature Caching for Accelerating Diffusion Transformers

Authors: Byunggwan Son, Jeimin Jeon, Jeongwoo Choi, Bumsub Ham

Abstract: Feature caching approaches accelerate diffusion transformers (DiTs) by storing the output features of computationally expensive modules at certain timesteps, and exploiting them for subsequent steps to reduce redundant computations. Recent forecasting-based caching approaches employ temporal extrapolation techniques to approximate the output features with cached ones. Although effective, relying exclusively on temporal extrapolation still suffers from significant prediction errors, leading to performance degradation. Through a detailed analysis, we find that 1) these errors stem from the irregular magnitude of changes in the output features, and 2) an input feature of a module is strongly correlated with the corresponding output. Based on this, we propose relational feature caching (RFC), a novel framework that leverages the input-output relationship to enhance the accuracy of the feature prediction. Specifically, we introduce relational feature estimation (RFE) to estimate the magnitude of changes in the output features from the inputs, enabling more accurate fea...
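The relational cache scheduling (RCS) component decides at which timesteps a full computation is actually needed. The loop below is a hypothetical sketch of that kind of scheduling, not the paper's method: it uses relative input drift since the last full compute as a stand-in proxy for the expected prediction error, and `expensive_module` and `err_threshold` are assumed names for illustration.

```python
import numpy as np

def expensive_module(x: np.ndarray) -> np.ndarray:
    """Stand-in for a costly DiT block (hypothetical toy mapping)."""
    return np.tanh(x) * 2.0

def run_with_scheduling(inputs, err_threshold=0.1):
    """Toy scheduling loop: recompute fully only when the proxy error
    (relative input drift since the last full compute) exceeds a
    threshold; otherwise reuse the cached output."""
    cached_in, cached_out = None, None
    full_computes = 0
    outputs = []
    for x in inputs:
        if cached_in is None:
            drift = float("inf")  # nothing cached yet: must compute
        else:
            drift = (np.linalg.norm(x - cached_in)
                     / (np.linalg.norm(cached_in) + 1e-8))
        if drift > err_threshold:
            cached_out = expensive_module(x)  # full computation
            cached_in = x
            full_computes += 1
        outputs.append(cached_out)            # cached or fresh output
    return outputs, full_computes
```

With inputs that drift slowly and then jump, such a scheduler would skip the intermediate steps and recompute only when the proxy error crosses the threshold, which is the general trade-off the abstract describes.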
