[2604.00689] Performance of Neural and Polynomial Operator Surrogates
Computer Science > Machine Learning
arXiv:2604.00689 (cs)
[Submitted on 1 Apr 2026]

Title: Performance of Neural and Polynomial Operator Surrogates
Authors: Josephine Westermann, Benno Huber, Thomas O'Leary-Roseberry, Jakob Zech

Abstract: We consider the problem of constructing surrogate operators for parameter-to-solution maps arising from parametric partial differential equations, where repeated forward model evaluations are computationally expensive. We present a systematic empirical comparison of neural operator surrogates, including a reduced-basis neural operator trained with $L^2_\mu$ and $H^1_\mu$ objectives and the Fourier neural operator, against polynomial surrogate methods, specifically a reduced-basis sparse-grid surrogate and a reduced-basis tensor-train surrogate. All methods are evaluated on a linear parametric diffusion problem and a nonlinear parametric hyperelasticity problem, using input fields with algebraically decaying spectral coefficients at varying rates of decay $s$. To enable fair comparisons, we analyze ensembles of surrogate models generated by varying hyperparameters and compare the resulting Pareto frontiers of cost versus approximation accuracy, decomposing cost into contributions from data generation, setup, and evaluation. Our results show that no single method is universally superior...
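The comparison methodology described above hinges on extracting a Pareto frontier from an ensemble of surrogate models, each characterized by a total cost and an approximation error. As a minimal sketch (the function name, data, and cost/error values are hypothetical, not taken from the paper), a frontier of models that no other model dominates in both cost and accuracy can be computed as follows:

```python
import numpy as np

def pareto_frontier(cost, error):
    """Return indices of Pareto-optimal surrogates: models for which
    no other model is simultaneously cheaper and more accurate."""
    cost = np.asarray(cost, dtype=float)
    error = np.asarray(error, dtype=float)
    order = np.argsort(cost)   # scan from cheapest to most expensive
    best_error = np.inf
    frontier = []
    for i in order:
        if error[i] < best_error:   # strictly improves on all cheaper models
            best_error = error[i]
            frontier.append(int(i))
    return frontier

# hypothetical ensemble: (total cost, approximation error) per hyperparameter setting
cost = [1.0, 2.0, 1.5, 3.0, 2.5]
error = [0.9, 0.3, 0.5, 0.1, 0.4]
print(pareto_frontier(cost, error))  # -> [0, 2, 1, 3]
```

In the paper's setting, the cost coordinate would aggregate data generation, setup, and evaluation cost, so the same sweep can be repeated for each cost decomposition to see which contribution drives a method onto or off the frontier.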