[2602.12706] Physics-Informed Laplace Neural Operator for Solving Partial Differential Equations

arXiv - Machine Learning · 4 min read

Summary

The paper presents the Physics-Informed Laplace Neural Operator (PILNO), a novel approach to solving partial differential equations (PDEs) that integrates governing physics into neural network training, enhancing performance in small-data scenarios.

Why It Matters

This research addresses the limitations of traditional data-driven models in solving PDEs, particularly in small-data regimes. By incorporating physics into the training process, PILNO improves accuracy and generalization, making it a significant advancement for applications in engineering and scientific computing.

Key Takeaways

  • PILNO enhances the Laplace Neural Operator (LNO) by embedding governing physics (PDE, boundary-condition, and initial-condition residuals) into training.
  • The model improves accuracy in small-data settings and reduces run-to-run variability.
  • It generalizes better to out-of-distribution input functions not represented in the training data.
  • PILNO uses virtual inputs and temporal-causality weighting to make physics-informed training data-efficient and robust.
  • The approach performs well across benchmarks including Burgers' equation and Darcy flow.
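The temporal-causality weighting in the takeaways can be sketched as follows. This is a minimal illustration in the style of causal PINN training, not the paper's exact loss: each time slice's residual loss is down-weighted until earlier slices are resolved. The `eps` value and the per-slice losses below are illustrative assumptions.

```python
import numpy as np

def causal_weights(slice_losses, eps=1.0):
    """w_i = exp(-eps * sum_{j<i} L_j): later time slices only contribute
    once the cumulative residual of earlier slices is small."""
    # Cumulative loss of all *earlier* slices (zero for the first slice).
    cum_prev = np.concatenate(([0.0], np.cumsum(slice_losses)[:-1]))
    return np.exp(-eps * cum_prev)

# Illustrative per-time-slice PDE residual losses.
losses = np.array([0.5, 0.4, 0.3, 0.2])
w = causal_weights(losses)
weighted_loss = np.sum(w * losses)   # causally weighted training objective
print(w)
```

The first slice always receives full weight, and weights decay monotonically as unresolved residual accumulates, so training effectively proceeds forward in time.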

Computer Science > Machine Learning
arXiv:2602.12706 (cs) [Submitted on 13 Feb 2026]

Title: Physics-Informed Laplace Neural Operator for Solving Partial Differential Equations
Authors: Heechang Kim, Qianying Cao, Hyomin Shin, Seungchul Lee, George Em Karniadakis, Minseok Choi

Abstract: Neural operators have emerged as fast surrogate solvers for parametric partial differential equations (PDEs). However, purely data-driven models often require extensive training data and can generalize poorly, especially in small-data regimes and under unseen (out-of-distribution) input functions that are not represented in the training data. To address these limitations, we propose the Physics-Informed Laplace Neural Operator (PILNO), which enhances the Laplace Neural Operator (LNO) by embedding governing physics into training through PDE, boundary condition, and initial condition residuals. To improve expressivity, we first introduce an Advanced LNO (ALNO) backbone that retains a pole-residue transient representation while replacing the steady-state branch with an FNO-style Fourier multiplier. To make physics-informed training both data-efficient and robust, PILNO further leverages (i) virtual inputs: an unlabeled ensemble of input functions spanning a broad spectral range that provides abundant physics-only supervision ...
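The core idea of the physics-informed residuals in the abstract can be illustrated with a small sketch. This uses finite differences on the steady viscous Burgers' equation (one of the paper's benchmarks) rather than the autodiff machinery or operator architecture of PILNO itself; the grid, viscosity, and candidate functions are illustrative assumptions. The point is that the PDE residual provides a supervision signal without any labeled solution data: a true solution drives it to zero, while an arbitrary candidate does not.

```python
import numpy as np

def burgers_residual(u, x, nu):
    """Residual of steady viscous Burgers' equation, u*u_x - nu*u_xx,
    computed with second-order central finite differences."""
    h = x[1] - x[0]
    u_x = np.gradient(u, h, edge_order=2)
    u_xx = np.gradient(u_x, h, edge_order=2)
    return u * u_x - nu * u_xx

nu = 0.5
x = np.linspace(-3.0, 3.0, 601)

exact = -2.0 * nu * np.tanh(x)   # known steady solution of Burgers' equation
wrong = np.sin(x)                # arbitrary candidate, not a solution

# Physics-only "loss": no labeled data needed, only the PDE itself.
r_exact = np.max(np.abs(burgers_residual(exact, x, nu)))
r_wrong = np.max(np.abs(burgers_residual(wrong, x, nu)))
print(r_exact, r_wrong)
```

In PILNO-style training this residual (evaluated by automatic differentiation on the operator's output, together with boundary and initial-condition residuals) is what gets minimized, including on unlabeled "virtual" input functions.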
