[2602.14086] Neural Optimal Transport in Hilbert Spaces: Characterizing Spurious Solutions and Gaussian Smoothing
Summary
This paper studies Neural Optimal Transport in infinite-dimensional Hilbert spaces, characterizing when spurious solutions arise and proposing a Brownian-motion-based Gaussian smoothing strategy that restores well-posedness and improves how accurately target distributions are captured.
Why It Matters
Understanding spurious solutions in Neural Optimal Transport is crucial for improving machine learning models in infinite-dimensional settings such as functional and time-series data. This research provides both a theoretical characterization of the failure mode and a practical smoothing remedy, making it relevant for researchers and practitioners working with optimal transport in high- or infinite-dimensional spaces.
Key Takeaways
- Spurious solutions in Neural Optimal Transport can misrepresent target distributions.
- The paper introduces a Gaussian smoothing strategy to resolve ill-posedness.
- Theoretical contributions include proving the well-posedness of the formulation under regular source measures.
- Empirical results show the proposed method outperforms existing baselines in suppressing spurious solutions.
- Whether smoothing yields a regular measure depends strictly on the kernel of the covariance operator.
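To make the smoothing idea concrete, here is a minimal sketch (not the authors' implementation; function names and the noise scale `eps` are illustrative assumptions). It perturbs discretized functional samples with Gaussian noise whose covariance is that of Brownian motion, K(s, t) = min(s, t), which is the kind of Gaussian smoothing the paper builds its semi-dual extension on:

```python
import numpy as np

def brownian_covariance(grid):
    # Covariance of standard Brownian motion on a time grid: K(s, t) = min(s, t).
    return np.minimum.outer(grid, grid)

def gaussian_smooth(samples, grid, eps=0.1, rng=None):
    # Add Gaussian noise with covariance eps * K to each discretized
    # functional sample, producing samples from a smoothed measure.
    rng = np.random.default_rng() if rng is None else rng
    cov = eps * brownian_covariance(grid)
    noise = rng.multivariate_normal(np.zeros(len(grid)), cov, size=len(samples))
    return samples + noise

# Illustrative usage: smooth 128 copies of a discretized function on [0, 1].
grid = np.linspace(1e-3, 1.0, 64)  # start slightly above 0, where variance vanishes
samples = np.tile(np.sin(2 * np.pi * grid), (128, 1))
smoothed = gaussian_smooth(samples, grid, eps=0.05, rng=np.random.default_rng(0))
```

The point of the sketch is the paper's takeaway above: the smoothing only regularizes the source measure when the covariance operator's kernel is suitably trivial; a degenerate covariance (e.g. noise supported on a strict subspace) would leave the ill-posedness intact.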
Computer Science > Machine Learning
arXiv:2602.14086 (cs) [Submitted on 15 Feb 2026]
Title: Neural Optimal Transport in Hilbert Spaces: Characterizing Spurious Solutions and Gaussian Smoothing
Authors: Jae-Hwan Choi, Jiwoo Yoon, Dohyun Kwon, Jaewoong Choi
Abstract: We study Neural Optimal Transport in infinite-dimensional Hilbert spaces. In non-regular settings, Semi-dual Neural OT often generates spurious solutions that fail to accurately capture target distributions. We analytically characterize this spurious solution problem using the framework of regular measures, which generalize Lebesgue absolute continuity in finite dimensions. To resolve ill-posedness, we extend the semi-dual framework via a Gaussian smoothing strategy based on Brownian motion. Our primary theoretical contribution proves that under a regular source measure, the formulation is well-posed and recovers a unique Monge map. Furthermore, we establish a sharp characterization for the regularity of smoothed measures, proving that the success of smoothing depends strictly on the kernel of the covariance operator. Empirical results on synthetic functional data and time-series datasets demonstrate that our approach effectively suppresses spurious solutions and outperforms existing baselines.