[2603.20388] From Cross-Validation to SURE: Asymptotic Risk of Tuned Regularized Estimators
Mathematics > Statistics Theory

arXiv:2603.20388 (math) [Submitted on 20 Mar 2026]

Title: From Cross-Validation to SURE: Asymptotic Risk of Tuned Regularized Estimators

Authors: Karun Adusumilli, Maximilian Kasy, Ashia Wilson

Abstract: We derive the asymptotic risk function of regularized empirical risk minimization (ERM) estimators tuned by $n$-fold cross-validation (CV). The out-of-sample prediction loss of such estimators converges in distribution to the squared-error loss (risk function) of shrinkage estimators in the normal means model, tuned by Stein's unbiased risk estimate (SURE). This risk function provides a more fine-grained picture of predictive performance than uniform bounds on worst-case regret, which are common in learning theory: it quantifies how risk varies with the true parameter. As key intermediate steps, we show that (i) $n$-fold CV converges uniformly to SURE, and (ii) while SURE typically has multiple local minima, its global minimum is generically well separated. Well-separation ensures that uniform convergence of CV to SURE translates into convergence of the tuning parameter chosen by CV to that chosen by SURE.

Subjects: Statistics Theory (math.ST); Machine Learning (cs.LG); Econometrics (econ.EM); Machine Learning (stat.ML)

Cite as: arXiv:2603.20388 [math.ST] (or arXiv:2603.20388v1 [math.ST])
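The benchmark object in the abstract, SURE-tuned shrinkage in the normal means model, can be illustrated with a minimal sketch. The code below assumes a simple ridge-style shrinker $\hat\theta_\lambda = y/(1+\lambda)$ (an illustrative choice, not necessarily the estimator class studied in the paper), for which SURE has a closed form; it then tunes $\lambda$ by minimizing SURE over a grid and compares the result with the oracle $\lambda$ that minimizes the true risk:

```python
import numpy as np

rng = np.random.default_rng(0)

# Normal means model: y_i = theta_i + eps_i, eps_i ~ N(0, sigma^2).
n, sigma = 500, 1.0
theta = rng.normal(size=n)               # true means (illustrative choice)
y = theta + sigma * rng.normal(size=n)   # observations

def sure_ridge(lam, y, sigma):
    """SURE for the linear shrinker theta_hat = y / (1 + lam).

    SURE(lam) = -n*sigma^2 + ||theta_hat - y||^2
                + 2*sigma^2 * div(theta_hat),
    where div(theta_hat) = n / (1 + lam) for this shrinker.
    """
    n = y.size
    resid_sq = (lam / (1.0 + lam)) ** 2 * np.sum(y ** 2)
    return -n * sigma ** 2 + resid_sq + 2.0 * sigma ** 2 * n / (1.0 + lam)

# Tune lam by minimizing SURE over a grid.
grid = np.linspace(0.0, 5.0, 501)
sure_vals = np.array([sure_ridge(lam, y, sigma) for lam in grid])
lam_hat = grid[np.argmin(sure_vals)]

# Oracle lam minimizing the true risk
#   E||theta_hat - theta||^2 = lam^2 ||theta||^2 / (1+lam)^2
#                              + n sigma^2 / (1+lam)^2
# is n*sigma^2 / ||theta||^2.
lam_oracle = n * sigma ** 2 / np.sum(theta ** 2)
print(f"SURE-tuned lam: {lam_hat:.3f}, oracle lam: {lam_oracle:.3f}")
```

At $\lambda = 0$ the estimator is the unshrunk $y$, and SURE reduces exactly to $n\sigma^2$, the risk of the maximum-likelihood estimator, which is a quick sanity check on the formula. The paper's result concerns the analogous tuning done by $n$-fold CV on regularized ERM, whose selected tuning parameter converges to the SURE-selected one.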