[2602.21551] From Basis to Basis: Gaussian Particle Representation for Interpretable PDE Operators
Summary
This article summarizes a Gaussian particle representation for building interpretable PDE operators, a method that improves both the interpretability and the efficiency of neural-operator modeling in fluid dynamics.
Why It Matters
The research addresses limitations of current neural operator approaches to PDE dynamics, particularly their poor interpretability and their quadratic cost in the number of spatial samples. By introducing a Gaussian basis representation, the authors provide a method that is both efficient and interpretable, which matters for fluid dynamics simulation and for applications across other scientific fields.
Key Takeaways
- Introduces a Gaussian basis representation for PDE operators.
- Achieves near-linear complexity in the number of spatial samples N for a fixed modal budget, enhancing efficiency.
- Supports irregular geometries and seamless transitions between 2D and 3D.
- Provides intrinsic interpretability, addressing a key limitation of neural operators.
- Attains accuracy competitive with the state of the art on standard PDE benchmarks and real datasets.
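To make the "Gaussian basis" takeaway concrete, here is a minimal sketch of evaluating a field stored as a sum of weighted, anisotropic Gaussian atoms. The function name, the diagonal (per-axis) anisotropy, and all array shapes are illustrative assumptions, not the paper's actual parameterization; the point is that the state is a small set of (center, scale, weight) triples that can be queried at any set of points, which is what makes it mesh-agnostic and resolution-agnostic.

```python
import numpy as np

def eval_gaussian_field(points, centers, inv_scales, weights):
    """Evaluate a field stored as a weighted sum of anisotropic Gaussian atoms.

    Shapes (hypothetical, for illustration):
      points:     (N, d) query locations -- any grid, mesh, or point cloud
      centers:    (K, d) atom centers
      inv_scales: (K, d) per-axis inverse length-scales (diagonal anisotropy)
      weights:    (K,)   atom weights
    Returns (N,) field values.
    """
    # (N, K, d) displacement from every query point to every atom center
    diff = points[:, None, :] - centers[None, :, :]
    # Anisotropic squared distance, scaled per axis and per atom
    sq = np.sum((diff * inv_scales[None, :, :]) ** 2, axis=-1)
    # Weighted sum of Gaussian bumps at each query point
    return np.exp(-0.5 * sq) @ weights
```

Because the atoms, not a grid, carry the state, the same K atoms can be rendered on a coarse 2D grid, a fine 3D grid, or the nodes of an irregular mesh without any interpolation step, and each atom is directly visualizable as a blob with an explicit position and extent.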
Computer Science > Machine Learning
arXiv:2602.21551 (cs) [Submitted on 25 Feb 2026]
Title: From Basis to Basis: Gaussian Particle Representation for Interpretable PDE Operators
Authors: Zhihao Li, Yu Feng, Zhilu Lai, Wei Wang
Abstract: Learning PDE dynamics for fluids increasingly relies on neural operators and Transformer-based models, yet these approaches often lack interpretability and struggle with localized, high-frequency structures while incurring quadratic cost in spatial samples. We propose representing fields with a Gaussian basis, where learned atoms carry explicit geometry (centers, anisotropic scales, weights) and form a compact, mesh-agnostic, directly visualizable state. Building on this representation, we introduce a Gaussian Particle Operator that acts in modal space: learned Gaussian modal windows perform a Petrov-Galerkin measurement, and PG Gaussian Attention enables global cross-scale coupling. This basis-to-basis design is resolution-agnostic and achieves near-linear complexity in N for a fixed modal budget, supporting irregular geometries and seamless 2D-to-3D extension. On standard PDE benchmarks and real datasets, our method attains state-of-the-art competitive accuracy while providing intrinsic interpretability.
Subjects: Machine Learning (cs.LG); Artificial Intelligence (...
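The near-linear complexity claim in the abstract can be illustrated with a generic modal-attention sketch: project N point features onto a fixed budget of M modal measurements (the Petrov-Galerkin step), run attention among the M modes only, then map back to the points. All names, shapes, and the plain dot-product softmax below are assumptions for illustration, not the paper's PG Gaussian Attention layer; the structural point is that the cost is O(NMd + M^2 d), so it grows near-linearly in N when M is held fixed.

```python
import numpy as np

def modal_attention(features, test_fns, trial_fns):
    """Sketch of attention in a fixed modal space (shapes are hypothetical).

      features:  (N, d)  per-point field features
      test_fns:  (M, N)  test windows that measure the field (Petrov-Galerkin step)
      trial_fns: (N, M)  trial windows that reconstruct back onto the points

    Only the M x M block is quadratic; everything touching N is linear in N.
    """
    modes = test_fns @ features                        # (M, d) modal measurements, O(N*M*d)
    scores = modes @ modes.T / np.sqrt(modes.shape[1]) # (M, M) cross-mode similarities
    scores = np.exp(scores - scores.max(axis=-1, keepdims=True))
    scores /= scores.sum(axis=-1, keepdims=True)       # softmax over the M modes
    mixed = scores @ modes                             # (M, d) global cross-scale coupling
    return trial_fns @ mixed                           # (N, d) back to point space, O(N*M*d)
```

Contrast this with standard Transformer attention over the points themselves, whose N x N score matrix is what produces the quadratic cost the abstract criticizes; fixing the modal budget M decouples the attention cost from the spatial resolution.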