[2604.02659] Low-Rank Compression of Pretrained Models via Randomized Subspace Iteration
Computer Science > Machine Learning
arXiv:2604.02659 (cs) [Submitted on 3 Apr 2026]

Title: Low-Rank Compression of Pretrained Models via Randomized Subspace Iteration
Authors: Farhad Pourkamali-Anaraki

Abstract: The massive scale of pretrained models has made efficient compression essential for practical deployment. Low-rank decomposition based on the singular value decomposition (SVD) provides a principled approach for model reduction, but its exact computation is expensive for large weight matrices. Randomized alternatives such as randomized SVD (RSVD) improve efficiency, yet they can suffer from poor approximation quality when the singular value spectrum decays slowly, a regime commonly observed in modern pretrained models. In this work, we address this limitation from both theoretical and empirical perspectives. First, we establish a connection between low-rank approximation error and predictive performance by analyzing softmax perturbations, showing that deviations in class probabilities are controlled by the spectral error of the compressed weights. Second, we demonstrate that RSVD is inadequate, and we propose randomized subspace iteration (RSI) as a more effective alternative. By incorporating multiple power iterations, RSI improves spectral separation and provides a controllable mechanism for enhancing a...
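The abstract's first claim, that deviations in class probabilities are controlled by the spectral error of the compressed weights, can be checked numerically with the standard Lipschitz-type bound for softmax: since softmax is 1-Lipschitz in the Euclidean norm, perturbing a weight matrix W by a compression error E changes the output probabilities for an input x by at most ||E||_2 ||x||_2. The sketch below is illustrative only; all variable names are hypothetical and the exact bound in the paper may be tighter or differently stated.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a logit vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical setup: a classifier head W, a compression error E, and an input x.
rng = np.random.default_rng(0)
W = rng.standard_normal((10, 64))
E = 0.01 * rng.standard_normal((10, 64))  # stand-in for the low-rank residual
x = rng.standard_normal(64)

# Change in class probabilities vs. the spectral-norm bound on the logit shift.
lhs = np.linalg.norm(softmax(W @ x) - softmax((W + E) @ x))
rhs = np.linalg.norm(E, 2) * np.linalg.norm(x)
```

Here `lhs <= rhs` holds because the logit perturbation is `E @ x`, whose norm is at most `||E||_2 ||x||_2`, and softmax cannot amplify it.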
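The randomized subspace iteration the abstract describes is, in its standard form (Halko, Martinsson, and Tropp), randomized SVD augmented with power iterations: each iteration multiplies the sketch by A Aᵀ (with QR re-orthonormalization for stability), which raises the singular values to a higher power and sharpens the spectral separation that plain RSVD lacks when the spectrum decays slowly. A minimal NumPy sketch of the generic algorithm, not the paper's exact implementation, with illustrative function and parameter names:

```python
import numpy as np

def randomized_subspace_iteration(A, rank, n_iter=4, oversample=10, seed=0):
    """Rank-`rank` approximation of A via randomized subspace iteration (RSI).

    Setting n_iter=0 recovers plain randomized SVD (RSVD); each extra power
    iteration effectively replaces the spectrum s_i with s_i**(2*n_iter + 1),
    improving accuracy when singular values decay slowly.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    k = min(rank + oversample, min(m, n))
    # Sketch the range of A with a Gaussian test matrix.
    Q, _ = np.linalg.qr(A @ rng.standard_normal((n, k)))
    # Power iterations, re-orthonormalizing at each step for stability.
    for _ in range(n_iter):
        Z, _ = np.linalg.qr(A.T @ Q)
        Q, _ = np.linalg.qr(A @ Z)
    # Project onto the subspace and take an exact SVD of the small matrix.
    B = Q.T @ A
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :rank], s[:rank], Vt[:rank]
```

On a matrix with a slowly decaying spectrum, a few power iterations typically bring the spectral reconstruction error close to the optimal truncated-SVD error, whereas the n_iter=0 (RSVD) error is noticeably larger, which matches the comparison the abstract draws.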