[2604.03803] Rényi Attention Entropy for Patch Pruning
Computer Science > Computer Vision and Pattern Recognition

arXiv:2604.03803 (cs) [Submitted on 4 Apr 2026]

Title: Rényi Attention Entropy for Patch Pruning
Authors: Hiroaki Aizawa, Yuki Igaue

Abstract: Transformers are strong baselines in both vision and language because self-attention captures long-range dependencies across tokens. However, the cost of self-attention grows quadratically with the number of tokens. Patch pruning mitigates this cost by estimating per-patch importance and removing redundant patches. To identify informative patches for pruning, we introduce a criterion based on the Shannon entropy of the attention distribution. Low-entropy patches, which receive selective and concentrated attention, are kept as important, while high-entropy patches with attention spread across many locations are treated as redundant. We also extend the criterion from Shannon to Rényi entropy, which emphasizes sharp attention peaks and supports pruning strategies that adapt to task needs and computational limits. In experiments on fine-grained image recognition, where patch selection is critical, our method reduces computation while preserving accuracy. Moreover, adjusting the pruning policy through the Rényi entropy measure yields further gains and improves the trade-off between accuracy and computation.
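The entropy-based criterion described in the abstract can be illustrated with a short sketch. This is not the authors' implementation: the function names (`renyi_entropy`, `prune_patches`), the order `alpha = 2`, and the fixed keep ratio are illustrative assumptions; the sketch only shows the general idea that rows of a row-stochastic attention matrix with low (Rényi) entropy correspond to selectively attended patches that should be kept.

```python
import numpy as np

def renyi_entropy(p, alpha=2.0, eps=1e-12):
    """Rényi entropy of order alpha along the last axis.

    As alpha -> 1 this recovers the Shannon entropy, so the
    Shannon criterion is handled as a special case.
    """
    p = p / p.sum(axis=-1, keepdims=True)  # ensure rows are distributions
    if abs(alpha - 1.0) < 1e-6:
        return -(p * np.log(p + eps)).sum(axis=-1)  # Shannon limit
    return np.log((p ** alpha).sum(axis=-1) + eps) / (1.0 - alpha)

def prune_patches(attn, keep_ratio=0.5, alpha=2.0):
    """Keep the lowest-entropy (most selectively attended) patches.

    attn: (num_patches, num_patches) row-stochastic attention matrix.
    Returns the sorted indices of the kept patches.
    """
    ent = renyi_entropy(attn, alpha=alpha)
    k = max(1, int(keep_ratio * attn.shape[0]))
    return np.sort(np.argsort(ent)[:k])

# Toy example: 4 patches; patch 0 attends very sharply to one location,
# so its attention row has low entropy and it survives pruning.
rng = np.random.default_rng(0)
attn = rng.dirichlet(np.ones(4), size=4)
attn[0] = np.array([0.97, 0.01, 0.01, 0.01])  # sharply peaked row
kept = prune_patches(attn, keep_ratio=0.5, alpha=2.0)
```

Larger `alpha` weights the sharpest attention peaks more heavily, which is the lever the abstract describes for adapting the pruning policy to task needs and computational limits.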