[2602.14086] Neural Optimal Transport in Hilbert Spaces: Characterizing Spurious Solutions and Gaussian Smoothing

arXiv - Machine Learning · 3 min read

Summary

This paper studies Neural Optimal Transport in infinite-dimensional Hilbert spaces, characterizes when spurious solutions arise, and proposes a Gaussian smoothing strategy that restores well-posedness and improves how accurately target distributions are captured.

Why It Matters

Understanding spurious solutions in Neural Optimal Transport is crucial for improving machine learning models, particularly in infinite-dimensional settings such as functional and time-series data. This research provides theoretical insights and a practical remedy that can improve model reliability across applications, making it relevant for both researchers and practitioners in machine learning.

Key Takeaways

  • Spurious solutions in Neural Optimal Transport can misrepresent target distributions.
  • The paper introduces a Gaussian smoothing strategy to resolve ill-posedness.
  • Theoretical contributions include proving the well-posedness of the formulation under regular source measures.
  • Empirical results show the proposed method outperforms existing baselines in suppressing spurious solutions.
  • Whether smoothing yields a regular measure depends strictly on the kernel of the covariance operator.

Computer Science > Machine Learning — arXiv:2602.14086 (cs) [Submitted on 15 Feb 2026]

Title: Neural Optimal Transport in Hilbert Spaces: Characterizing Spurious Solutions and Gaussian Smoothing

Authors: Jae-Hwan Choi, Jiwoo Yoon, Dohyun Kwon, Jaewoong Choi

Abstract: We study Neural Optimal Transport in infinite-dimensional Hilbert spaces. In non-regular settings, semi-dual Neural OT often generates spurious solutions that fail to accurately capture target distributions. We analytically characterize this spurious-solution problem using the framework of regular measures, which generalize Lebesgue absolute continuity in finite dimensions. To resolve ill-posedness, we extend the semi-dual framework via a Gaussian smoothing strategy based on Brownian motion. Our primary theoretical contribution proves that, under a regular source measure, the formulation is well-posed and recovers a unique Monge map. Furthermore, we establish a sharp characterization of the regularity of smoothed measures, proving that the success of smoothing depends strictly on the kernel of the covariance operator. Empirical results on synthetic functional data and time-series datasets demonstrate that our approach effectively suppresses spurious solutions and outperforms existing baselines.
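The role of the covariance operator's kernel can be illustrated in finite dimensions. The sketch below is a hypothetical analogy, not the paper's Hilbert-space construction: a singular source measure concentrated on a line in R^2 is convolved with Gaussian noise (sampling x + ε is equivalent to smoothing the law of x). A full-rank covariance spreads mass in every direction, while a degenerate covariance whose kernel contains the second coordinate leaves the measure singular along that direction. The function name `gaussian_smooth` and the covariance choices are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "non-regular" source measure: all mass on the x-axis in R^2.
n = 10_000
source = np.zeros((n, 2))
source[:, 0] = rng.normal(size=n)  # variation only along the first coordinate

def gaussian_smooth(samples, cov_diag, rng):
    """Convolve the empirical measure with N(0, diag(cov_diag)) by adding noise."""
    noise = rng.normal(size=samples.shape) * np.sqrt(cov_diag)
    return samples + noise

# Full-rank covariance: noise spreads mass in every direction.
smoothed_full = gaussian_smooth(source, np.array([0.25, 0.25]), rng)

# Degenerate covariance whose kernel contains e2: no smoothing along e2.
smoothed_degenerate = gaussian_smooth(source, np.array([0.25, 0.0]), rng)

print(smoothed_full[:, 1].var())        # positive: mass now spreads along e2
print(smoothed_degenerate[:, 1].var())  # zero: still singular along e2
```

In this toy picture, "success of smoothing" corresponds to the noise covariance having trivial kernel; directions inside the kernel receive no noise and remain degenerate.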
