[2602.12384] Why Deep Jacobian Spectra Separate: Depth-Induced Scaling and Singular-Vector Alignment
Summary
This paper explores the mechanisms behind the implicit bias of gradient-based training in deep networks, focusing on depth-induced scaling of singular values and the alignment of singular vectors in deep Jacobians.
Why It Matters
Understanding the dynamics of deep Jacobians is crucial for improving deep learning models. This research provides insights into how depth influences model behavior, which can lead to better training techniques and model architectures in machine learning.
Key Takeaways
- Ordered singular values of deep Jacobians scale exponentially with depth, governed by Lyapunov exponents, which shapes training dynamics.
- Sufficiently strong spectral separation forces singular-vector alignment in matrix products, yielding an approximately shared singular basis for intermediate Jacobians.
- The study proposes a new approximation regime for understanding singular-value dynamics.
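The first takeaway can be illustrated with a toy experiment in the fixed-gates spirit: within one activation region, the Jacobian of a ReLU network is a product of masked linear maps D_l W_l. This is a minimal sketch, not the paper's model; the width `n`, depth, and Gaussian initialization are illustrative assumptions, and the Lyapunov exponents are estimated by the finite-depth slope of the log singular values.

```python
import numpy as np

rng = np.random.default_rng(0)
n, depth = 50, 40

# Fixed-gates view: within one activation region, the Jacobian is a
# product of masked linear maps D_l W_l (D_l a fixed 0/1 ReLU gate mask).
J = np.eye(n)
log_sv = []
for _ in range(depth):
    W = rng.normal(size=(n, n)) / np.sqrt(n)               # variance-scaled init
    D = np.diag(rng.integers(0, 2, size=n).astype(float))  # random fixed gates
    J = D @ W @ J
    s = np.linalg.svd(J, compute_uv=False)
    log_sv.append(np.log(s[:3]))  # track the top three ordered singular values

log_sv = np.array(log_sv)
# Finite-depth Lyapunov-exponent estimates: slope of log sigma_i versus depth.
exponents = log_sv[-1] / depth
print("estimated top-3 Lyapunov exponents:", exponents)
```

Plotting each column of `log_sv` against depth would show the near-linear growth (here, decay) in log scale that the depth-induced exponential scaling predicts, with the gaps between the lines reflecting spectral separation.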
Computer Science > Machine Learning
arXiv:2602.12384 (cs) [Submitted on 12 Feb 2026]
Title: Why Deep Jacobian Spectra Separate: Depth-Induced Scaling and Singular-Vector Alignment
Authors: Nathanaël Haas, François Gatine, Augustin M Cosse, Zied Bouraoui
Abstract: Understanding why gradient-based training in deep networks exhibits strong implicit bias remains challenging, in part because tractable singular-value dynamics are typically available only for balanced deep linear models. We propose an alternative route based on two theoretically grounded and empirically testable signatures of deep Jacobians: depth-induced exponential scaling of ordered singular values and strong spectral separation. Adopting a fixed-gates view of piecewise-linear networks, where Jacobians reduce to products of masked linear maps within a single activation region, we prove the existence of Lyapunov exponents governing the top singular values at initialization, give closed-form expressions in a tractable masked model, and quantify finite-depth corrections. We further show that sufficiently strong separation forces singular-vector alignment in matrix products, yielding an approximately shared singular basis for intermediate Jacobians. Together, these results motivate an approximation regime in which singular-value d...
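The alignment claim in the abstract can also be probed numerically: as the spectral gap of the partial products grows with depth, their top singular directions should stabilize. The sketch below is a hedged illustration under the same toy masked-product assumptions as above (random gates, Gaussian weights), not a reproduction of the paper's experiments; it measures the overlap between the top right singular vectors of consecutive partial products J_{1:l} and J_{1:l+1}.

```python
import numpy as np

rng = np.random.default_rng(1)
n, depth = 30, 30

# Deep product of masked linear maps; record the top right singular
# direction of each partial product J_{1:l}.
J = np.eye(n)
top_dirs = []
for _ in range(depth):
    W = rng.normal(size=(n, n)) / np.sqrt(n)
    D = np.diag(rng.integers(0, 2, size=n).astype(float))
    J = D @ W @ J
    _, _, Vt = np.linalg.svd(J)
    top_dirs.append(Vt[0])  # top right singular vector of J_{1:l}

# Sign-invariant alignment between consecutive top singular directions.
align = [abs(top_dirs[l] @ top_dirs[l + 1]) for l in range(depth - 1)]
print("mean alignment over the last 10 layers:", np.mean(align[-10:]))
```

If the separation mechanism is at work, the alignment values should approach 1 at large depth, consistent with an approximately shared singular basis across intermediate Jacobians.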