[2602.22937] MSINO: Curvature-Aware Sobolev Optimization for Manifold Neural Networks
Summary
The paper presents MSINO, a curvature-aware optimization framework for training neural networks on Riemannian manifolds; it improves stability and convergence through a covariant Sobolev loss, parallel-transported gradient alignment, and a curvature-dependent convergence analysis.
Why It Matters
Traditional optimization methods for machine learning ignore the curvature of the underlying space. By making training curvature-aware, MSINO enables more efficient and stable training of neural networks in complex geometric spaces, with implications for applications such as robotics and surface imaging.
Key Takeaways
- MSINO replaces standard derivative supervision with a covariant Sobolev loss.
- The framework provides curvature-aware convergence guarantees for neural training.
- Applications include surface imaging and robotics on Lie groups.
- It improves stability through Laplace-Beltrami smoothness regularization.
- It derives geometry-dependent constants that yield explicit step-size bounds and convergence rates.
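The covariant Sobolev loss in the first takeaway can be made concrete on a simple manifold. The sketch below is hypothetical and not from the paper: it assumes the unit sphere S² as the manifold, a scalar target whose Riemannian gradient is supplied at a nearby reference point, and the closed-form parallel transport along sphere geodesics to move that gradient into the tangent space where the model's gradient lives before comparing the two. The function names (`project_tangent`, `parallel_transport`, `covariant_sobolev_loss`) are illustrative, not the paper's API.

```python
import numpy as np

def project_tangent(x, v):
    """Project an ambient vector v onto the tangent space of S^2 at x."""
    return v - np.dot(x, v) * x

def parallel_transport(x, y, v):
    """Parallel-transport a tangent vector v from T_x S^2 to T_y S^2
    along the minimizing geodesic (closed form on the unit sphere)."""
    u = project_tangent(x, y)            # geodesic direction at x
    norm_u = np.linalg.norm(u)
    if norm_u < 1e-12:                   # x == y: transport is the identity
        return v
    u = u / norm_u
    theta = np.arccos(np.clip(np.dot(x, y), -1.0, 1.0))
    # The component of v along u rotates in the (x, u) plane;
    # the orthogonal component is carried along unchanged.
    a = np.dot(u, v)
    return v + a * ((np.cos(theta) - 1.0) * u - np.sin(theta) * x)

def covariant_sobolev_loss(x, model_val, model_grad,
                           x_ref, target_val, target_grad_ref, lam=0.1):
    """H^1-style pointwise loss at x: squared value error plus squared
    Riemannian gradient error, with the target gradient transported
    from the reference point x_ref into T_x S^2 before comparison."""
    g_t = parallel_transport(x_ref, x, target_grad_ref)
    value_err = (model_val - target_val) ** 2
    grad_err = np.sum((project_tangent(x, model_grad) - g_t) ** 2)
    return value_err + lam * grad_err
```

Because parallel transport is an isometry, the gradient penalty is measured in the same metric no matter where the target derivative was recorded; a Laplace-Beltrami smoothness term would be added on top of this pointwise loss.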
Computer Science > Machine Learning
arXiv:2602.22937 (cs) [Submitted on 26 Feb 2026]
Title: MSINO: Curvature-Aware Sobolev Optimization for Manifold Neural Networks
Authors: Suresan Pareth
Abstract: We introduce Manifold Sobolev Informed Neural Optimization (MSINO), a curvature-aware training framework for neural networks defined on Riemannian manifolds. The method replaces standard Euclidean derivative supervision with a covariant Sobolev loss that aligns gradients using parallel transport, and it improves stability via a Laplace-Beltrami smoothness regularization term. Building on classical results in Riemannian optimization and Sobolev theory on manifolds, we derive geometry-dependent constants that yield (i) a Descent Lemma with a manifold Sobolev smoothness constant, (ii) a Sobolev Polyak-Łojasiewicz inequality giving linear convergence guarantees for Riemannian gradient descent and stochastic gradient descent under explicit step-size bounds, and (iii) a two-step Newton-Sobolev method with local quadratic contraction in curvature-controlled neighborhoods. Unlike prior Sobolev training in Euclidean space, MSINO provides training-time guarantees that explicitly track curvature and transported Jacobians. Applications include surface imaging, physics-informed learning settings, and robotics on Lie groups such as SO(3) and SE(3). ...
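The abstract's convergence claims concern Riemannian gradient descent under explicit step-size bounds. As a minimal, hypothetical illustration (not the paper's algorithm), the sketch below runs Riemannian gradient descent on the unit sphere for the toy objective f(x) = -⟨x, p⟩: the Euclidean gradient is projected onto the tangent space, a fixed-step tangent update is taken, and the iterate is retracted onto the manifold by normalization. For this objective the iterates contract linearly toward the minimizer p, mirroring the kind of PL-type linear-rate guarantee described above.

```python
import numpy as np

def riemannian_gd(p, x0, step=0.2, iters=200):
    """Riemannian gradient descent on the unit sphere S^2 for the toy
    objective f(x) = -<x, p>, whose minimizer is x* = p / |p|."""
    p = p / np.linalg.norm(p)
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        egrad = -p                            # Euclidean gradient of f
        rgrad = egrad - np.dot(x, egrad) * x  # project onto T_x S^2
        x = x - step * rgrad                  # step in the tangent space
        x = x / np.linalg.norm(x)             # retract back onto the sphere
    return x
```

For a start point not antipodal to p, e.g. `riemannian_gd(np.array([0., 0., 1.]), np.array([1., 0., 0.]))`, the angle to the minimizer shrinks by a roughly constant factor per iteration, the discrete analogue of a linear convergence rate.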