[2603.21928] The Golden Subspace: Where Efficiency Meets Generalization in Continual Test-Time Adaptation
Computer Science > Computer Vision and Pattern Recognition
arXiv:2603.21928 (cs)
[Submitted on 23 Mar 2026]

Title: The Golden Subspace: Where Efficiency Meets Generalization in Continual Test-Time Adaptation
Authors: Guannan Lai, Da-Wei Zhou, Zhenguo Li, Han-Jia Ye

Abstract: Continual Test-Time Adaptation (CTTA) aims to enable models to adapt online to unlabeled data streams under distribution shift without accessing source data. Existing CTTA methods face an efficiency-generalization trade-off: updating more parameters improves adaptation but severely reduces online inference efficiency. An ideal solution is to achieve comparable adaptation with minimal feature updates; we call this minimal subspace the golden subspace. We prove its existence in a single-step adaptation setting and show that it coincides with the row space of the pretrained classifier. To enable online maintenance of this subspace, we introduce the sample-wise Average Gradient Outer Product (AGOP) as an efficient proxy for estimating the classifier weights without retraining. Building on these insights, we propose Guided Online Low-rank Directional adaptation (GOLD), which uses a lightweight adapter to project features onto the golden subspace and learns a compact scaling vector while the subspace is dynamically updated vi...
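
As a rough illustration of the mechanism the abstract describes, the following minimal PyTorch sketch estimates a low-rank subspace from the sample-wise Average Gradient Outer Product (AGOP) of a frozen linear head and applies a lightweight adapter that rescales features only within that subspace. The function and class names (`estimate_agop_subspace`, `GoldAdapter`), the choice of gradient target, and the residual-style update are illustrative assumptions, not the authors' implementation of GOLD.

```python
# Minimal sketch (assumptions noted above), not the paper's code.
import torch
import torch.nn as nn


def estimate_agop_subspace(classifier: nn.Linear, feats: torch.Tensor, rank: int) -> torch.Tensor:
    """Estimate a rank-`rank` basis from the sample-wise AGOP.

    For a linear head, per-sample gradients of the outputs with respect to
    the features lie in the row space of the classifier weights, so the top
    eigenvectors of the averaged gradient outer product approximate that
    subspace without retraining the head.
    """
    feats = feats.detach().requires_grad_(True)
    logits = classifier(feats)                          # (n, num_classes)
    probs = logits.softmax(dim=-1)
    # Gradient of the (soft) predicted score w.r.t. the features, per sample.
    grads = torch.autograd.grad(probs.max(dim=-1).values.sum(), feats)[0]  # (n, d)
    agop = grads.T @ grads / feats.shape[0]             # (d, d) averaged outer product
    _, eigvecs = torch.linalg.eigh(agop)                # ascending eigenvalues
    return eigvecs[:, -rank:]                           # (d, rank) top directions


class GoldAdapter(nn.Module):
    """Project features onto the estimated subspace and learn a compact scaling vector."""

    def __init__(self, basis: torch.Tensor):
        super().__init__()
        self.register_buffer("basis", basis)            # (d, rank), refreshed online
        self.scale = nn.Parameter(torch.ones(basis.shape[1]))

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        coords = feats @ self.basis                     # coordinates within the subspace
        delta = (coords * self.scale - coords) @ self.basis.T
        return feats + delta                            # features change only inside the subspace
```

In this sketch, only the `rank`-dimensional scaling vector is trainable at test time; the backbone and classifier stay frozen, and the buffered basis can be periodically re-estimated from incoming batches, which is one plausible reading of the online subspace maintenance the abstract mentions.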