[2604.01339] Regularizing Attention Scores with Bootstrapping
Computer Science > Computer Vision and Pattern Recognition arXiv:2604.01339 (cs) [Submitted on 1 Apr 2026] Title: Regularizing Attention Scores with Bootstrapping Authors: Neo Christopher Chung, Maxim Laletin Abstract: Vision transformers (ViT) rely on the attention mechanism to weigh input features, and attention scores have therefore naturally been considered as explanations for their decision-making process. However, attention scores are almost always non-zero, resulting in noisy and diffuse attention maps and limiting interpretability. Can we quantify the uncertainty of attention scores and obtain regularized attention scores? To this end, we place the attention scores of a ViT in a statistical framework in which independent noise would lead to insignificant yet non-zero scores. Leveraging statistical learning techniques, we introduce bootstrapping for attention scores, which generates a baseline distribution of attention scores by resampling input features. This bootstrap distribution is then used to estimate the significance and posterior probabilities of attention scores. On natural and medical images, the proposed \emph{Attention Regularization} approach demonstrates a straightforward removal of spurious attention arising from noise, drastically improving shrinkage and sparsity. Quantitative evaluations are conducted usin...
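The resampling idea in the abstract can be illustrated with a minimal sketch. The code below is an assumption-laden toy, not the paper's implementation: it uses a single query against a small key matrix, builds a bootstrap null distribution by resampling feature dimensions with replacement (one plausible way to break the query-key alignment while preserving marginal statistics), and then zeroes out attention scores whose empirical p-value is not significant. The function name `attention_scores`, the resampling scheme, and the threshold `alpha` are all hypothetical choices for illustration.

```python
import numpy as np

def attention_scores(q, K):
    # Softmax over scaled dot products: one query against all keys.
    logits = K @ q / np.sqrt(q.shape[0])
    e = np.exp(logits - logits.max())
    return e / e.sum()

rng = np.random.default_rng(0)
d, n = 16, 10                       # feature dimension, number of tokens
q = rng.normal(size=d)              # query vector
K = rng.normal(size=(n, d))         # key matrix (one row per token)

observed = attention_scores(q, K)

# Bootstrap null: resample the key feature dimensions with replacement,
# collecting a baseline distribution of attention scores under noise.
B = 2000
null = np.empty((B, n))
for b in range(B):
    K_star = K[:, rng.integers(0, d, size=d)]
    null[b] = attention_scores(q, K_star)

# Empirical p-value per token: how often a null score meets or exceeds
# the observed score.
pvals = (null >= observed).mean(axis=0)

# Regularized attention map: keep only scores significant at alpha,
# zeroing out the rest (shrinkage toward a sparse map).
alpha = 0.05
regularized = np.where(pvals < alpha, observed, 0.0)
```

In this toy setting the regularized map is at least as sparse as the raw softmax output, since every score is either kept as-is or set exactly to zero; the paper's actual procedure for estimating posterior probabilities may differ.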