[2603.27929] Physics-Guided Transformer (PGT): Physics-Aware Attention Mechanism for PINNs
Computer Science > Machine Learning
arXiv:2603.27929 (cs)
[Submitted on 30 Mar 2026]

Title: Physics-Guided Transformer (PGT): Physics-Aware Attention Mechanism for PINNs
Authors: Ehsan Zeraatkar, Rodion Podorozhny, Jelena Tešić

Abstract: Reconstructing continuous physical fields from sparse, irregular observations is a central challenge in scientific machine learning, particularly for systems governed by partial differential equations (PDEs). Existing physics-informed methods typically enforce governing equations as soft penalty terms during optimization, often leading to gradient imbalance, instability, and degraded physical consistency under limited data. We introduce the Physics-Guided Transformer (PGT), a neural architecture that embeds physical structure directly into the self-attention mechanism. Specifically, PGT incorporates a heat-kernel-derived additive bias into the attention logits, encoding diffusion dynamics and temporal causality within the representation. Query coordinates attend to these physics-conditioned context tokens, and the resulting features are decoded by a FiLM-modulated sinusoidal implicit network that adaptively controls spectral response. We evaluate PGT on the one-dimensional heat equation and on two-dimensional incompressible Navier-Stokes systems. In sparse 1D reconstructi...
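The abstract describes the core mechanism only at a high level: an additive bias on the attention logits derived from the heat kernel, combined with a causality constraint so that queries attend only to earlier context tokens. A minimal NumPy sketch of that idea is below. All function names, the diffusivity `nu`, and the exact parameterization are assumptions for illustration; the paper's actual formulation is not given in the abstract.

```python
import numpy as np

def heat_kernel_bias(xq, tq, xc, tc, nu=0.1, eps=1e-6):
    """Additive attention bias from the 1D heat kernel (illustrative sketch).

    For each (query, context) pair, the bias is the log of the heat kernel
    G(dx, dt) = exp(-dx^2 / (4*nu*dt)) / sqrt(4*pi*nu*dt), evaluated at the
    spatial offset dx and the causal time lag dt. Non-causal pairs (context
    at or after the query time) are masked to -inf, enforcing temporal
    causality inside the attention itself.
    """
    dx = xq[:, None] - xc[None, :]        # (Q, C) spatial offsets
    dt = tq[:, None] - tc[None, :]        # (Q, C) time lags
    causal = dt > 0                       # context must precede query
    dt_safe = np.where(causal, dt, 1.0)   # avoid div-by-zero off the mask
    log_k = (-dx**2 / (4.0 * nu * dt_safe)
             - 0.5 * np.log(4.0 * np.pi * nu * dt_safe + eps))
    return np.where(causal, log_k, -np.inf)

def physics_aware_attention(q, k, v, bias):
    """Scaled dot-product attention with an additive physics bias."""
    d = q.shape[-1]
    logits = q @ k.T / np.sqrt(d) + bias            # (Q, C)
    logits -= logits.max(axis=-1, keepdims=True)    # stable softmax
    w = np.exp(logits)
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v
```

Because the bias is additive in log-space, attention weights are effectively multiplied by the heat kernel: spatially near, temporally recent context tokens dominate, matching diffusive propagation.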
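The decoder is described as a FiLM-modulated sinusoidal implicit network, i.e. a SIREN-style coordinate MLP whose hidden activations receive a feature-wise affine modulation (scale and shift) computed from the attention output, letting the conditioning signal adapt the spectral response. The sketch below shows one way to wire that up; layer sizes, the `w0` frequency scale, and the linear modulation heads are assumptions, not the paper's specification.

```python
import numpy as np

class FiLMSirenDecoder:
    """SIREN-style implicit decoder with FiLM conditioning (illustrative).

    Each hidden layer computes h = sin(w0 * (x @ W)); a conditioning
    feature z then applies a feature-wise affine map h <- gamma * h + beta,
    with gamma and beta linear in z (FiLM). The frequency scale w0 sets
    the bandwidth of the sinusoidal features, and the modulation lets the
    context shift/rescale them per sample.
    """

    def __init__(self, in_dim=2, z_dim=8, hidden=32, out_dim=1, w0=30.0, seed=0):
        rng = np.random.default_rng(seed)
        lin = lambda m, n, s: rng.uniform(-s, s, size=(m, n))
        self.w0 = w0
        self.W1 = lin(in_dim, hidden, 1.0 / in_dim)              # first sine layer
        self.W2 = lin(hidden, hidden, np.sqrt(6.0 / hidden) / w0)
        self.Wg = lin(z_dim, hidden, 1.0 / z_dim)                # FiLM scale head
        self.Wb = lin(z_dim, hidden, 1.0 / z_dim)                # FiLM shift head
        self.Wo = lin(hidden, out_dim, np.sqrt(6.0 / hidden) / w0)

    def __call__(self, x, z):
        gamma = 1.0 + z @ self.Wg          # near-identity modulation at init
        beta = z @ self.Wb
        h = np.sin(self.w0 * (x @ self.W1))
        h = gamma * h + beta               # FiLM: feature-wise affine transform
        h = np.sin(self.w0 * (h @ self.W2))
        return h @ self.Wo                 # continuous field value at coords x
```

Usage is `decoder(coords, context_features)`: the same query coordinates decode to different field values under different physics-conditioned features, which is what lets one network represent many sparse-observation instances.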