[2508.08935] LNN-PINN: A Unified Physics-Only Training Framework with Liquid Residual Blocks
Computer Science > Machine Learning
arXiv:2508.08935 (cs)
[Submitted on 12 Aug 2025 (v1), last revised 5 Apr 2026 (this version, v3)]

Title: LNN-PINN: A Unified Physics-Only Training Framework with Liquid Residual Blocks
Authors: Ze Tao, Hanxuan Wang, Fujun Liu

Abstract: Physics-informed neural networks (PINNs) have attracted considerable attention for their ability to integrate partial differential equation priors into deep learning frameworks; however, they often exhibit limited predictive accuracy when applied to complex problems. To address this issue, we propose LNN-PINN, a physics-informed neural network framework that incorporates a liquid residual gating architecture while preserving the original physics modeling and optimization pipeline to improve predictive accuracy. The method introduces a lightweight gating mechanism solely within the hidden-layer mapping, keeping the sampling strategy, loss composition, and hyperparameter settings unchanged to ensure that improvements arise purely from architectural refinement. Across four benchmark problems, LNN-PINN consistently reduced RMSE and MAE under identical training conditions, with absolute error plots further confirming its accuracy gains. Moreover, the framework demonstrates strong adaptability and stability across varying dimensions, boundary conditions, and operator...
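The abstract describes a gated residual mapping confined to the hidden layers. The paper's exact equations are not given here, so the following is only a minimal sketch of what such a "liquid residual gating" block could look like: a learned sigmoid gate blends a nonlinear candidate update with the identity (residual) path, unit by unit. All names (`liquid_residual_block`, `W_h`, `W_g`, the tanh/sigmoid choices) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def liquid_residual_block(x, W_h, b_h, W_g, b_g):
    """Hypothetical gated hidden-layer mapping (sketch, not the paper's code):
    a per-unit gate g in (0, 1) decides how much of the nonlinear candidate
    update replaces the identity path:  y = g * tanh(x W_h + b_h) + (1 - g) * x.
    """
    h = np.tanh(x @ W_h + b_h)      # candidate nonlinear update
    g = sigmoid(x @ W_g + b_g)      # gate values, elementwise in (0, 1)
    return g * h + (1.0 - g) * x    # gated residual mix, same shape as x

# Toy forward pass: batch of 4 collocation points, hidden width 8.
rng = np.random.default_rng(0)
d = 8
x = rng.standard_normal((4, d))
W_h = 0.1 * rng.standard_normal((d, d))
W_g = 0.1 * rng.standard_normal((d, d))
b_h = np.zeros(d)
b_g = np.zeros(d)

y = liquid_residual_block(x, W_h, b_h, W_g, b_g)
print(y.shape)  # (4, 8): the block preserves the hidden dimension
```

Because the block maps a width-`d` input to a width-`d` output, it can replace a plain hidden layer without touching the sampling strategy, loss composition, or hyperparameters, which is the constraint the abstract emphasizes.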