[2602.14789] On the Stability of Nonlinear Dynamics in GD and SGD: Beyond Quadratic Potentials
Summary
This paper studies the stability of the nonlinear dynamics of gradient descent (GD) and stochastic gradient descent (SGD), showing that linearized analysis can misrepresent the stability and convergence behavior of these algorithms.
Why It Matters
Understanding the stability of optimization algorithms like GD and SGD is crucial for improving machine learning models. This research challenges traditional linear analysis, suggesting that nonlinear dynamics play a significant role in convergence, which could lead to better training strategies in complex models.
Key Takeaways
- Stable solutions in GD correspond to flat minima, which are desirable for optimization.
- Linearized dynamics may not accurately capture the stability of the full nonlinear dynamics.
- Nonlinear SGD dynamics can diverge in expectation even if only a single batch is unstable.
- A new criterion for stable oscillations in GD is established, relying on high-order derivatives.
- If all batches in SGD are linearly stable, the overall nonlinear dynamics remain stable in expectation.
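The stable-oscillation phenomenon the takeaways describe can be illustrated with a toy 1D example (a minimal sketch with illustrative values, not the paper's construction): for f(x) = x²/2 − x⁴/4, the minimum at 0 has curvature f''(0) = 1, so GD with step size η > 2 is linearly unstable there, yet the negative fourth-order term traps the iterates in a stable period-2 oscillation, exactly the kind of behavior a purely linearized analysis misses.

```python
def gd(grad, x0, eta, steps):
    """Run gradient descent and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] - eta * grad(xs[-1]))
    return xs

eta = 2.1  # above the linear stability threshold 2 / f''(0) = 2

# Pure quadratic f(x) = x^2/2: these are the linearized dynamics,
# and they diverge geometrically for eta > 2.
quad = gd(lambda x: x, x0=0.1, eta=eta, steps=100)

# Quartic correction f(x) = x^2/2 - x^4/4: same curvature at 0, but the
# high-order term produces a bounded, stable period-2 oscillation instead.
quart = gd(lambda x: x - x**3, x0=0.1, eta=eta, steps=500)

print(abs(quad[-1]))   # large: linear analysis predicts divergence
print(abs(quart[-1]))  # ~0.218: bounded oscillation near the minimum

# The 2-cycle amplitude solves eta * x^2 = eta - 2, i.e. x = sqrt(0.1/2.1).
print(((eta - 2) / eta) ** 0.5)
```

In this toy model the stability of the oscillation is governed by the fourth-order term, mirroring the paper's point that the exact criterion depends on high-order derivatives rather than curvature alone.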
Computer Science > Machine Learning
arXiv:2602.14789 (cs) [Submitted on 16 Feb 2026]
Title: On the Stability of Nonlinear Dynamics in GD and SGD: Beyond Quadratic Potentials
Authors: Rotem Mulayoff, Sebastian U. Stich
Abstract: The dynamical stability of the iterates during training plays a key role in determining the minima obtained by optimization algorithms. For example, stable solutions of gradient descent (GD) correspond to flat minima, which have been associated with favorable features. While prior work often relies on linearization to determine stability, it remains unclear whether linearized dynamics faithfully capture the full nonlinear behavior. Recent work has shown that GD may stably oscillate near a linearly unstable minimum and still converge once the step size decays, indicating that linear analysis can be misleading. In this work, we explicitly study the effect of nonlinear terms. Specifically, we derive an exact criterion for stable oscillations of GD near minima in the multivariate setting. Our condition depends on high-order derivatives, generalizing existing results. Extending the analysis to stochastic gradient descent (SGD), we show that nonlinear dynamics can diverge in expectation even if a single batch is unstable. This implies that stability can be dictated by a single batch that oscillate...
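The claim that one unstable batch can dominate SGD's behavior in expectation can be sketched with a toy two-batch linearized model (a simplified illustration with made-up curvatures, not the paper's nonlinear analysis): even when full-batch GD on the mean curvature is stable, a single batch with a large per-step multiplier makes the mean-square size of the iterates grow.

```python
# Toy two-batch SGD model: x_{k+1} = (1 - eta * a_i) * x_k, with batch i
# drawn uniformly at random. Curvatures are illustrative values only.
eta = 1.0
a = [0.5, 2.5]  # batch 2 is unstable on its own: |1 - eta * 2.5| = 1.5 > 1

# Full-batch GD sees the mean curvature and is stable:
full_batch = abs(1 - eta * sum(a) / len(a))  # |1 - 1.5| = 0.5 < 1

# With independent uniform batch sampling, the second moment E[x_k^2]
# contracts or expands by the mean squared multiplier each step:
second_moment = sum((1 - eta * ai) ** 2 for ai in a) / len(a)  # 1.25 > 1

print(full_batch, second_moment)  # stable full-batch, growing second moment
```

Here the stable batch cannot compensate for the unstable one, so E[x_k²] grows even though the averaged dynamics contract, echoing the abstract's observation that stability can be dictated by a single batch.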