[2510.13772] Tensor Gaussian Processes: Efficient Solvers for Nonlinear PDEs
Computer Science > Machine Learning
arXiv:2510.13772 (cs)
[Submitted on 15 Oct 2025 (v1), last revised 26 Mar 2026 (this version, v2)]

Title: Tensor Gaussian Processes: Efficient Solvers for Nonlinear PDEs
Authors: Qiwei Yuan, Zhitong Xu, Yinghao Chen, Yiming Xu, Houman Owhadi, Shandian Zhe

Abstract: Machine learning solvers for partial differential equations (PDEs) have attracted growing interest. However, most existing approaches, such as neural network solvers, rely on stochastic training, which is inefficient and typically requires many training epochs. Gaussian process (GP)/kernel-based solvers, while mathematically principled, suffer from scalability issues when handling the large numbers of collocation points often needed for challenging or higher-dimensional PDEs. To overcome these limitations, we propose TGPS, a tensor-GP-based solver that introduces factor functions along each input dimension using one-dimensional GPs and combines them via tensor decomposition to approximate the full solution. This design reduces the task to learning a collection of one-dimensional GPs, substantially lowering the computational complexity and enabling scalability to massive collocation sets. For efficient nonlinear PDE solving, we use a partial freezing strategy and Newton's method to linearize the nonlinear terms. We then develo...
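To make the tensor-decomposition idea concrete, here is a minimal sketch of the kind of model the abstract describes: a rank-R CP approximation u(x, y) ≈ Σ_r f_r(x) g_r(y), where each factor function carries a one-dimensional GP prior, so the full solution on a tensor-product grid is assembled from small per-dimension objects rather than one large kernel matrix. This is not the paper's implementation; the rank R, grid sizes, RBF kernel, toy target, and the alternating least-squares update standing in for the partial-freezing/Newton step are all assumptions for illustration.

```python
import numpy as np

def rbf_kernel(x, y, lengthscale=0.2):
    """1-D squared-exponential kernel matrix (assumed kernel choice)."""
    d = x[:, None] - y[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

# Hypothetical setup: approximate u(x, y) on [0, 1]^2 with a rank-R CP model
# u(x, y) ~= sum_r f_r(x) * g_r(y), each factor a one-dimensional GP.
R = 3                                # CP rank (assumed)
nx, ny = 50, 50                      # 1-D collocation grids per dimension
xs = np.linspace(0.0, 1.0, nx)
ys = np.linspace(0.0, 1.0, ny)

# One small 1-D kernel matrix per dimension, never an (nx*ny) x (nx*ny) one.
Kx = rbf_kernel(xs, xs) + 1e-6 * np.eye(nx)
Ky = rbf_kernel(ys, ys) + 1e-6 * np.eye(ny)

rng = np.random.default_rng(0)
# Factor matrices: column r holds factor function r sampled on its 1-D grid,
# initialized as draws from the corresponding 1-D GP prior.
F = np.linalg.cholesky(Kx) @ rng.standard_normal((nx, R))
G = np.linalg.cholesky(Ky) @ rng.standard_normal((ny, R))

# Full solution on the nx*ny tensor-product grid via factor outer products.
U = F @ G.T                          # U[i, j] = sum_r F[i, r] * G[j, r]

# Partial-freezing flavour: with G held fixed, U is *linear* in F, so fitting
# a (linearized) residual reduces to a small least-squares problem for F.
target = np.sin(np.pi * xs)[:, None] * np.sin(np.pi * ys)[None, :]  # toy data
X, *_ = np.linalg.lstsq(G, target.T, rcond=None)  # solve G @ F.T ~= target.T
F = X.T                              # updated factors; alternate over dims
```

The point of the sketch is the cost structure: all dense linear algebra involves nx-by-nx, ny-by-ny, or rank-R objects, which is what lets the approach scale to collocation sets whose tensor-product size would be prohibitive for a standard GP solver.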