[2603.26048] Asymptotic Optimism for Tensor Regression Models with Applications to Neural Network Compression
arXiv:2603.26048 [stat.ML] (Statistics > Machine Learning)
[Submitted on 27 Mar 2026]
Title: Asymptotic Optimism for Tensor Regression Models with Applications to Neural Network Compression
Authors: Haoming Shi, Eric C. Chi, Hengrui Luo

Abstract: We study rank selection for low-rank tensor regression under a random covariate design. Under a Gaussian random-design model and mild conditions, we derive population expressions for the expected training-testing discrepancy (optimism) for both CP and Tucker decompositions. We further show that the optimism is minimized at the true tensor rank for both CP and Tucker regression. This yields a prediction-oriented rank-selection rule that aligns with cross-validation and extends naturally to tensor-model averaging. We also discuss conditions under which under- or over-ranked models may appear preferable, thereby clarifying the scope of the method. Finally, we showcase the method's practical utility on a real-world image regression task and extend it to tensor-based compression of neural networks, highlighting its potential for model selection in deep learning.

Subjects: Machine Learning (stat.ML); Machine Learning (cs.LG); Statistics Theory (math.ST)
Cite as: arXiv:2603.26048 [stat.ML] (or arXiv:2603.26048v1 [stat.ML] for this version) https://d...
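To make the rank-selection idea concrete, below is a minimal numpy sketch under a Gaussian random design. It fits a low-rank coefficient matrix (the simplest tensor case, a rank-R CP model B = U V^T) by alternating least squares and picks the rank minimizing held-out prediction error, which optimism approximates in expectation; the paper's closed-form optimism expressions are not reproduced here, and all function names and dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
p, q, true_rank, n = 6, 5, 2, 400

# ground-truth low-rank coefficient matrix B0 = U0 V0^T (rank 2)
U0 = rng.normal(size=(p, true_rank))
V0 = rng.normal(size=(q, true_rank))
B0 = U0 @ V0.T

X = rng.normal(size=(n, p, q))                      # Gaussian random design
y = np.einsum("ipq,pq->i", X, B0) + 0.1 * rng.normal(size=n)

def fit_low_rank(X, y, rank, n_iter=50):
    """Alternating least squares for y_i = <X_i, U V^T> + noise."""
    p, q = X.shape[1:]
    V = rng.normal(size=(q, rank))
    for _ in range(n_iter):
        # with V fixed, the model is linear in vec(U): row i is vec(X_i V)
        A = np.einsum("ipq,qr->ipr", X, V).reshape(len(y), -1)
        U = np.linalg.lstsq(A, y, rcond=None)[0].reshape(p, rank)
        # with U fixed, row i is vec(X_i^T U)
        A = np.einsum("ipq,pr->iqr", X, U).reshape(len(y), -1)
        V = np.linalg.lstsq(A, y, rcond=None)[0].reshape(q, rank)
    return U @ V.T

# rank selection by held-out error (a proxy for training error + optimism)
train, test = slice(0, 300), slice(300, 400)
errors = {}
for r in range(1, 5):
    B_hat = fit_low_rank(X[train], y[train], r)
    resid = y[test] - np.einsum("ipq,pq->i", X[test], B_hat)
    errors[r] = np.mean(resid**2)
best_rank = min(errors, key=errors.get)
```

An under-ranked fit (r = 1) incurs an irreducible bias, while over-ranked fits pay a variance penalty, so the held-out error curve is typically minimized near the true rank, mirroring the paper's population result for optimism.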