[2603.25221] Gap Safe Screening Rules for Fast Training of Robust Support Vector Machines under Feature Noise
Computer Science > Machine Learning
arXiv:2603.25221 (cs)
[Submitted on 26 Mar 2026]

Title: Gap Safe Screening Rules for Fast Training of Robust Support Vector Machines under Feature Noise
Authors: Tan-Hau Nguyen, Thu-Le Tran, Kien Trung Nguyen

Abstract: Robust Support Vector Machines (R-SVMs) address feature noise by adopting a worst-case robust formulation that explicitly incorporates uncertainty sets into training. While this robustness improves reliability, it also increases computational cost. In this work, we develop safe sample screening rules for R-SVMs that reduce training complexity without affecting the optimal solution. To the best of our knowledge, this is the first study to apply safe screening techniques to worst-case robust models in supervised machine learning. Our approach safely identifies training samples whose uncertainty sets are guaranteed to lie entirely on either side of the margin hyperplane, thereby reducing the problem size and accelerating optimization. Owing to the nonstandard structure of R-SVMs, the proposed screening rules are derived from Lagrangian duality rather than the Fenchel-Rockafellar duality commonly used in recent methods. Based on this analysis, we first establish an ideal screening rule, and then derive a practical rule b...
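The abstract's core idea, certifying at some dual bound which samples are provably (non-)support vectors before the solver converges, can be illustrated on a standard (non-robust) linear hinge-loss SVM. The sketch below is an assumption-laden illustration of the classical gap safe sphere argument, not the paper's Lagrangian-duality derivation for R-SVMs: the function name, the unit box constraint on the dual variables, and the primal/dual scaling are all choices made here for concreteness.

```python
import numpy as np

def gap_safe_screen_svm(X, y, w, lam):
    """Illustrative gap safe sample screening for a linear hinge-loss SVM.

    Primal: P(w) = lam/2 ||w||^2 + sum_i max(0, 1 - y_i <x_i, w>)
    Dual:   D(a) = sum_i a_i - 1/(2 lam) ||sum_i a_i y_i x_i||^2,  a_i in [0, 1]
    """
    margins = y * (X @ w)                                  # m_i = y_i <x_i, w>
    primal = 0.5 * lam * (w @ w) + np.maximum(0.0, 1.0 - margins).sum()

    # Dual-feasible candidate from the hinge subgradient (a_i in {0, 1}).
    a = (margins < 1.0).astype(float)
    v = X.T @ (a * y)                                      # sum_i a_i y_i x_i
    dual = a.sum() - (v @ v) / (2.0 * lam)

    gap = max(primal - dual, 0.0)
    # P is lam-strongly convex, so the primal optimum w* lies in a ball of
    # radius r = sqrt(2 * gap / lam) around the current iterate w.
    r = np.sqrt(2.0 * gap / lam)

    norms = np.linalg.norm(X, axis=1)
    # Margin provably > 1 over the whole ball: a_i* = 0, sample can be dropped.
    inactive = margins - r * norms > 1.0
    # Margin provably < 1 over the whole ball: a_i* = 1, sample is saturated.
    saturated = margins + r * norms < 1.0
    return inactive, saturated
```

At a primal-dual optimal pair the gap is zero, the safe ball collapses to a point, and every sample off the margin is screened one way or the other; the robust setting in the paper replaces the per-sample margin test with a test on the entire uncertainty set.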