[2603.03401] Beyond Cross-Validation: Adaptive Parameter Selection for Kernel-Based Gradient Descents
Statistics > Machine Learning

arXiv:2603.03401 (stat) [Submitted on 3 Mar 2026]

Title: Beyond Cross-Validation: Adaptive Parameter Selection for Kernel-Based Gradient Descents

Authors: Xiaotong Liu, Yunwen Lei, Xiangyu Chang, Shao-Bo Lin

Abstract: This paper proposes a novel parameter selection strategy for kernel-based gradient descent (KGD) algorithms, integrating bias-variance analysis with the splitting method. We introduce the concept of empirical effective dimension to quantify iteration increments in KGD, from which we derive an implementable adaptive parameter selection strategy. Theoretical verification is provided within the framework of learning theory. Using the recently developed integral operator approach, we rigorously demonstrate that KGD, equipped with the proposed adaptive parameter selection strategy, achieves the optimal generalization error bound and adapts effectively to different kernels, target functions, and error metrics. Consequently, this strategy offers significant advantages over existing parameter selection methods for KGD.

Subjects: Machine Learning (stat.ML); Machine Learning (cs.LG); Methodology (stat.ME)

Cite as: arXiv:2603.03401 [stat.ML] (or arXiv:2603.03401v1 [stat.ML] for this version)

DOI: https://doi.org/10.48550/arXiv.2603.03401
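To make the abstract's ingredients concrete, below is a minimal sketch of kernel gradient descent with an early-stopping rule driven by the empirical effective dimension N(lambda) = trace(K(K + n*lambda*I)^{-1}), the standard quantity from learning theory the abstract refers to. The specific stopping criterion (comparing a training-residual bias proxy against an effective-dimension variance proxy), the constant c, and the function names are assumptions for illustration, not the paper's actual strategy.

```python
import numpy as np

def gaussian_kernel(X, Z, bandwidth=1.0):
    """Gaussian (RBF) kernel matrix between rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def empirical_effective_dimension(K, lam):
    """N(lambda) = trace(K (K + n*lambda*I)^{-1}): a standard
    complexity measure for kernel methods."""
    n = K.shape[0]
    return np.trace(K @ np.linalg.inv(K + n * lam * np.eye(n)))

def kgd_adaptive(X, y, bandwidth=1.0, eta=1.0, max_iter=1000, c=1.0):
    """Kernel gradient descent on the empirical squared loss.
    Stopping rule is a hypothetical placeholder: stop once the
    bias proxy (mean squared training residual) falls below a
    variance proxy proportional to N(lambda_t)/n."""
    n = len(y)
    K = gaussian_kernel(X, X, bandwidth)
    alpha = np.zeros(n)
    for t in range(1, max_iter + 1):
        residual = K @ alpha - y
        alpha -= (eta / n) * residual       # functional gradient step in coefficient space
        lam_t = 1.0 / (eta * t)             # effective regularization after t steps
        n_eff = empirical_effective_dimension(K, lam_t)
        bias_proxy = np.mean(residual ** 2)
        variance_proxy = c * n_eff / n      # assumed threshold, not the paper's rule
        if bias_proxy <= variance_proxy:    # stop when variance dominates bias
            break
    return alpha, t

# Toy usage: noisy sine regression
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(100)
alpha, stop_iter = kgd_adaptive(X, y)
print(f"stopped at iteration {stop_iter}")
```

The key design point this sketch mirrors is that running KGD for t steps with step size eta behaves like regularization at level roughly 1/(eta*t), so the effective dimension at that level can serve as the variance side of a bias-variance stopping rule, replacing the held-out data that cross-validation would require.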