[2602.20974] MAST: A Multi-fidelity Augmented Surrogate model via Spatial Trust-weighting
Summary
The paper introduces MAST, a Multi-fidelity Augmented Surrogate model that improves predictive accuracy in engineering design by effectively combining low-fidelity and high-fidelity data through spatial trust-weighting.
Why It Matters
MAST addresses the critical challenge of balancing computational cost and predictive accuracy in engineering and scientific computing. By improving on existing multi-fidelity surrogate models, it offers practitioners a more efficient option under tight simulation budgets and varying fidelity requirements.
Key Takeaways
- MAST blends low-fidelity observations with high-fidelity predictions for improved accuracy.
- The model utilizes explicit discrepancy modeling and distance-based weighting.
- MAST demonstrates superior performance across various budget and fidelity conditions compared to existing methods.
- The approach maintains stability and robustness, addressing common pitfalls in multi-fidelity modeling.
- This innovation can significantly reduce computational costs in engineering design processes.
Computer Science > Machine Learning
arXiv:2602.20974 (cs) [Submitted on 24 Feb 2026]
Title: MAST: A Multi-fidelity Augmented Surrogate model via Spatial Trust-weighting
Authors: Ahmed Mohamed Eisa Nasr, Haris Moazam Sheikh
Abstract: In engineering design and scientific computing, computational cost and predictive accuracy are intrinsically coupled. High-fidelity simulations provide accurate predictions but at substantial computational cost, while lower-fidelity approximations offer efficiency at the expense of accuracy. Multi-fidelity surrogate modelling addresses this trade-off by combining abundant low-fidelity data with sparse high-fidelity observations. However, existing methods either incur expensive training costs or rely on global correlation assumptions that often fail in practice to capture how fidelity relationships vary across the input space, leading to poor performance, particularly under tight budget constraints. We introduce MAST, a method that blends corrected low-fidelity observations with high-fidelity predictions, trusting high-fidelity near observed samples and relying on corrected low-fidelity elsewhere. MAST achieves this through explicit discrepancy modelling and distance-based weighting with closed-form variance propagation, producing a single heteroscedastic Gaussian ...
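The mechanism the abstract describes — an explicit discrepancy model plus distance-based trust weighting — can be sketched in a few lines. The following is a minimal illustration only, not the paper's implementation: the toy functions, the RBF surrogate standing in for the paper's models, the length scales, and the `0.05` trust decay are all hypothetical stand-ins, and the closed-form variance propagation the paper derives is omitted here.

```python
import numpy as np

def rbf_fit_predict(x_train, y_train, x_query, length_scale, reg=1e-6):
    """Ridge-regularized Gaussian-RBF regressor; a stand-in for any surrogate."""
    K = np.exp(-0.5 * ((x_train[:, None] - x_train[None, :]) / length_scale) ** 2)
    alpha = np.linalg.solve(K + reg * np.eye(len(x_train)), y_train)
    Kq = np.exp(-0.5 * ((x_query[:, None] - x_train[None, :]) / length_scale) ** 2)
    return Kq @ alpha

# Hypothetical 1-D toy problem: the low-fidelity model carries a smooth bias.
f_hi = lambda x: np.sin(8 * x) * x            # "expensive" ground truth
f_lo = lambda x: f_hi(x) + 0.3 * (x - 0.5)    # "cheap", systematically biased

x_lo = np.linspace(0.0, 1.0, 40)              # abundant low-fidelity samples
x_hi = np.array([0.1, 0.45, 0.85])            # sparse high-fidelity samples
xq = np.linspace(0.0, 1.0, 200)               # query grid

# 1. Surrogate of the low-fidelity data over the whole domain.
lo_pred = rbf_fit_predict(x_lo, f_lo(x_lo), xq, length_scale=0.1)

# 2. Explicit discrepancy model: learn HF - LF at the HF sites, correct LF.
disc = f_hi(x_hi) - rbf_fit_predict(x_lo, f_lo(x_lo), x_hi, length_scale=0.1)
corrected_lo = lo_pred + rbf_fit_predict(x_hi, disc, xq, length_scale=0.5)

# 3. Surrogate of the sparse HF data (trustworthy only near its samples).
hi_pred = rbf_fit_predict(x_hi, f_hi(x_hi), xq, length_scale=0.2)

# 4. Distance-based trust weight: w -> 1 near an HF sample, w -> 0 far away.
dist = np.abs(xq[:, None] - x_hi[None, :]).min(axis=1)
w = np.exp(-((dist / 0.05) ** 2))

blended = w * hi_pred + (1.0 - w) * corrected_lo
```

The blend trusts the high-fidelity surrogate only in the neighborhood of observed high-fidelity samples and falls back to the discrepancy-corrected low-fidelity surrogate elsewhere, which is the spatial trust-weighting idea in miniature.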