[2602.18946] Exponential Convergence of (Stochastic) Gradient Descent for Separable Logistic Regression
Summary
This paper proves that gradient descent with a simple, non-adaptive increasing step-size schedule achieves exponential convergence for separable logistic regression, while the optimization trajectory remains entirely within a stable regime.
Why It Matters
Recent work attributes accelerated convergence of gradient descent to large step sizes that push trajectories to the edge of stability, where they become difficult to analyze. By showing that instability is not necessary for acceleration, this work gives practitioners a fast, stable, and anytime training scheme for separable logistic regression, and clarifies when large-step-size instability can be avoided altogether.
Key Takeaways
- Gradient descent achieves exponential convergence for separable logistic regression under a margin condition, using a stable, non-adaptive increasing step-size schedule.
- The method is anytime: it requires no prior knowledge of the optimization horizon or target accuracy.
- Stochastic gradient descent also achieves exponential convergence via a lightweight adaptive step-size rule.
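To make the first takeaway concrete, here is a minimal, hypothetical sketch: gradient descent on a toy separable logistic-regression problem with a linearly increasing step size. The dataset, the schedule eta_t = eta0 * (t + 1), and the constants are illustrative assumptions of this sketch, not the paper's actual schedule or margin condition.

```python
import numpy as np

# Toy separable dataset: labels in {-1, +1}, with every point given
# margin >= 0.5 along the direction (1, 0). (Illustrative data, not
# from the paper.)
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.where(X[:, 0] > 0, 1.0, -1.0)
X[:, 0] += 0.5 * y

def logistic_loss(w):
    # Numerically stable mean logistic loss: mean log(1 + exp(-y <w, x>)).
    return np.logaddexp(0.0, -y * (X @ w)).mean()

def gradient(w):
    # Gradient of the mean logistic loss: mean of -y * sigmoid(-y <w, x>) * x,
    # with sigmoid(-m) computed stably as 0.5 * (1 - tanh(m / 2)).
    s = -y * 0.5 * (1.0 - np.tanh(0.5 * y * (X @ w)))
    return (s[:, None] * X).mean(axis=0)

w = np.zeros(2)
eta0 = 0.1  # base step size (a hypothetical choice for this sketch)
losses = []
for t in range(300):
    w -= eta0 * (t + 1) * gradient(w)  # step size grows linearly with t
    losses.append(logistic_loss(w))
```

On separable data the minimum is at infinity, so the iterate norm grows without bound while the loss decays toward zero; the intuition behind an increasing schedule is that the gradient shrinks as the margin grows, so ever-larger steps keep the loss decaying fast without destabilizing the trajectory.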
Computer Science > Machine Learning
arXiv:2602.18946 (cs)
[Submitted on 21 Feb 2026]
Title: Exponential Convergence of (Stochastic) Gradient Descent for Separable Logistic Regression
Authors: Sacchit Kale, Piyushi Manupriya, Pierre Marion, Francis Bach, Anant Raj
Abstract: Gradient descent and stochastic gradient descent are central to modern machine learning, yet their behavior under large step sizes remains theoretically unclear. Recent work suggests that acceleration often arises near the edge of stability, where optimization trajectories become unstable and difficult to analyze. Existing results for separable logistic regression achieve faster convergence by explicitly leveraging such unstable regimes through constant or adaptive large step sizes. In this paper, we show that instability is not inherent to acceleration. We prove that gradient descent with a simple, non-adaptive increasing step-size schedule achieves exponential convergence for separable logistic regression under a margin condition, while remaining entirely within a stable optimization regime. The resulting method is anytime and does not require prior knowledge of the optimization horizon or target accuracy. We also establish exponential convergence of stochastic gradient descent using a lightweight adaptive step-size rule...