[2603.21502] Quotient Geometry, Effective Curvature, and Implicit Bias in Simple Shallow Neural Networks
Computer Science > Machine Learning
arXiv:2603.21502 (cs) [Submitted on 23 Mar 2026]
Title: Quotient Geometry, Effective Curvature, and Implicit Bias in Simple Shallow Neural Networks
Authors: Hang-Cheng Dong, Pengcheng Cheng
Abstract: Overparameterized shallow neural networks admit substantial parameter redundancy: distinct parameter vectors may represent the same predictor due to hidden-unit permutations, rescalings, and related symmetries. As a result, geometric quantities computed directly in the ambient Euclidean parameter space can reflect artifacts of representation rather than intrinsic properties of the predictor. In this paper, we develop a differential-geometric framework for analyzing simple shallow networks through the quotient space obtained by modding out parameter symmetries on a regular set. We first characterize the symmetry and quotient structure of regular shallow-network parameters and show that the finite-sample realization map induces a natural metric on the quotient manifold. This leads to an effective notion of curvature that removes degeneracy along symmetry orbits and yields a symmetry-reduced Hessian capturing intrinsic local geometry. We then study gradient flows on the quotient and show that only the horizontal component of parameter motion contributes to fi...
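The symmetries the abstract invokes can be made concrete with a minimal sketch (not taken from the paper; the architecture, variable names, and symmetry choices are illustrative assumptions): for a one-hidden-layer ReLU network, permuting hidden units, or scaling a unit's incoming weights by c > 0 while dividing its outgoing weight by c, leaves the predictor unchanged even though the parameter vector changes.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))   # hidden-layer weights: 4 units, 3 inputs
v = rng.standard_normal(4)        # output weights

def predict(W, v, x):
    """f(x) = v . ReLU(W x), a simple shallow network."""
    return v @ np.maximum(W @ x, 0.0)

x = rng.standard_normal(3)

# Permutation symmetry: relabel hidden units consistently in both layers.
perm = np.array([2, 0, 3, 1])
W_perm, v_perm = W[perm], v[perm]

# Positive rescaling symmetry of ReLU: ReLU(c*z) = c*ReLU(z) for c > 0,
# so scaling a unit's input weights by c and its output weight by 1/c
# realizes the same function.
c = np.array([2.0, 0.5, 3.0, 1.0])
W_scale, v_scale = c[:, None] * W, v / c

print(np.allclose(predict(W, v, x), predict(W_perm, v_perm, x)))    # True
print(np.allclose(predict(W, v, x), predict(W_scale, v_scale, x)))  # True
```

Both checks print True: the three parameter vectors are distinct points in Euclidean parameter space but represent a single point in the quotient space the paper studies.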