[2602.21551] From Basis to Basis: Gaussian Particle Representation for Interpretable PDE Operators

arXiv - Machine Learning

Summary

This article summarizes a paper that introduces a Gaussian particle representation for PDE operators, improving both interpretability and efficiency in neural-operator modeling of fluid dynamics.

Why It Matters

The research addresses limitations of current neural-operator approaches to modeling PDE dynamics, in particular their limited interpretability and their quadratic cost in the number of spatial samples. By representing fields in a Gaussian basis, the authors obtain a method that is both efficient and interpretable, which matters for fluid dynamics simulation and for applications across other scientific fields.

Key Takeaways

  • Introduces a Gaussian basis representation for PDE operators.
  • Achieves near-linear complexity in the number of spatial samples for a fixed modal budget, improving efficiency.
  • Supports irregular geometries and seamless transitions between 2D and 3D.
  • Provides intrinsic interpretability, addressing a key limitation of neural operators.
  • Demonstrates state-of-the-art accuracy on standard PDE benchmarks.
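The complexity claim in the takeaways above can be illustrated with a toy calculation: measuring a field sampled at N points against M fixed Gaussian modal windows costs O(N·M) work, which is linear in N once the modal budget M is fixed. This is a minimal NumPy sketch under that assumption; the function name and shapes are illustrative, not the paper's implementation.

```python
import numpy as np

def gaussian_window_measure(points, values, centers, scales):
    """Project field samples onto M Gaussian windows: O(N*M) for a fixed M.

    points:  (N, d) sample locations
    values:  (N,)   field values at those locations
    centers: (M, d) window centers
    scales:  (M, d) per-axis window widths (anisotropic)
    Returns (M,) modal coefficients.
    """
    # Anisotropic squared distance from every sample to every window center
    diff = points[:, None, :] - centers[None, :, :]        # (N, M, d)
    q = np.sum((diff / scales[None, :, :]) ** 2, axis=-1)  # (N, M)
    w = np.exp(-0.5 * q)                                   # window responses
    # Window-weighted average of the field values
    return (w * values[:, None]).sum(axis=0) / (w.sum(axis=0) + 1e-12)

rng = np.random.default_rng(0)
N, M, d = 4096, 16, 2
pts = rng.uniform(-1.0, 1.0, size=(N, d))
vals = np.sin(3 * pts[:, 0]) * np.cos(2 * pts[:, 1])
ctrs = rng.uniform(-1.0, 1.0, size=(M, d))
scls = np.full((M, d), 0.3)
coeffs = gaussian_window_measure(pts, vals, ctrs, scls)
print(coeffs.shape)  # (16,)
```

Doubling N doubles the work, while any subsequent attention among the M modal coefficients is independent of N, which is the source of the near-linear overall cost.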

Computer Science > Machine Learning

arXiv:2602.21551 (cs) [Submitted on 25 Feb 2026]

Title: From Basis to Basis: Gaussian Particle Representation for Interpretable PDE Operators

Authors: Zhihao Li, Yu Feng, Zhilu Lai, Wei Wang

Abstract: Learning PDE dynamics for fluids increasingly relies on neural operators and Transformer-based models, yet these approaches often lack interpretability and struggle with localized, high-frequency structures while incurring quadratic cost in spatial samples. We propose representing fields with a Gaussian basis, where learned atoms carry explicit geometry (centers, anisotropic scales, weights) and form a compact, mesh-agnostic, directly visualizable state. Building on this representation, we introduce a Gaussian Particle Operator that acts in modal space: learned Gaussian modal windows perform a Petrov-Galerkin measurement, and PG Gaussian Attention enables global cross-scale coupling. This basis-to-basis design is resolution-agnostic and achieves near-linear complexity in N for a fixed modal budget, supporting irregular geometries and seamless 2D-to-3D extension. On standard PDE benchmarks and real datasets, our method attains state-of-the-art competitive accuracy while providing intrinsic interpretability.

Subjects: Machine Learning (cs.LG); Artificial Intelligence (...
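The abstract describes fields as sums of Gaussian atoms with explicit geometry (centers, anisotropic scales, weights). A field in that form can be evaluated at arbitrary query points, which is what makes the state mesh-agnostic. The sketch below is a hedged illustration of that representation only, assuming the common form u(x) = Σ_k w_k · exp(-½ ‖(x − μ_k)/σ_k‖²); it is not the paper's code, and the names are hypothetical.

```python
import numpy as np

def eval_gaussian_field(x, centers, scales, weights):
    """Evaluate u(x) = sum_k w_k * exp(-0.5 * ||(x - mu_k) / sigma_k||^2).

    x:       (N, d) query points -- any point set, no mesh required
    centers: (K, d) atom centers mu_k
    scales:  (K, d) anisotropic per-axis scales sigma_k
    weights: (K,)   atom weights w_k
    Returns (N,) field values.
    """
    diff = x[:, None, :] - centers[None, :, :]             # (N, K, d)
    q = np.sum((diff / scales[None, :, :]) ** 2, axis=-1)  # (N, K)
    return np.exp(-0.5 * q) @ weights                      # (N,)

# Single isotropic atom at the origin: the field at its center equals its weight.
x = np.array([[0.0, 0.0], [1.0, 0.0]])
c = np.array([[0.0, 0.0]])
s = np.array([[1.0, 1.0]])
w = np.array([2.0])
u = eval_gaussian_field(x, c, s, w)
print(u[0])  # 2.0 at the center
```

Because each atom's geometry (center, scales, weight) is an explicit parameter rather than a hidden activation, the state can be plotted directly, which is the interpretability the abstract refers to.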
