[2512.05556] Beyond Linear Surrogates: High-Fidelity Local Explanations for Black-Box Models


Summary

The paper presents a novel method for generating high-fidelity local explanations for black-box machine learning models using multivariate adaptive regression splines (MARS) and N-ball sampling strategies.

Why It Matters

As black-box models become prevalent in critical applications, understanding their predictions is essential. This research enhances explainability by improving the fidelity of local explanations, which can help practitioners trust and effectively utilize these models in high-stakes environments.

Key Takeaways

  • Proposes a new method for local explanations using MARS and N-ball sampling.
  • Achieves a 32% reduction in root mean square error compared to existing methods.
  • Enhances the faithfulness of local approximations for black-box models.
  • Evaluated on five benchmark datasets, showing statistically significant improvements.
  • Advances the field of explainable AI, benefiting researchers and practitioners.
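The RMSE comparison in the takeaways measures how closely a local surrogate tracks the black-box model on perturbed samples around the instance being explained. A minimal sketch of that fidelity metric is below; the function names and the simple quadratic models are illustrative assumptions, not the paper's code.

```python
import numpy as np

def local_fidelity_rmse(black_box, surrogate, samples):
    """RMSE between black-box and surrogate predictions on local samples.

    Lower values mean the surrogate is more faithful to the reference
    model in the sampled neighborhood.
    """
    residuals = black_box(samples) - surrogate(samples)
    return float(np.sqrt(np.mean(residuals ** 2)))

# Toy check: a surrogate offset by a constant 1.0 from the black box
# yields an RMSE of exactly 1.0 on any sample set.
samples = np.array([[0.0], [1.0], [2.0]])
bb = lambda X: X[:, 0] ** 2
sg = lambda X: X[:, 0] ** 2 + 1.0
print(local_fidelity_rmse(bb, sg, samples))
```

A "32% reduction in RMSE" in this setting means the proposed surrogate's residuals against the black box are, on average, about a third smaller than those of the baseline surrogates.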

Computer Science > Machine Learning
arXiv:2512.05556 (cs)
[Submitted on 5 Dec 2025 (v1), last revised 19 Feb 2026 (this version, v2)]

Title: Beyond Linear Surrogates: High-Fidelity Local Explanations for Black-Box Models
Authors: Sanjeev Shrestha, Rahul Dubey, Hui Liu

Abstract: With the increasing complexity of black-box machine learning models and their adoption in high-stakes domains, it is critical to explain their predictions. Existing local explanation methods fall short in generating high-fidelity explanations. This paper proposes a novel local, model-agnostic explanation method that generates high-fidelity explanations using multivariate adaptive regression splines (MARS) and an N-ball sampling strategy. MARS models non-linear local boundaries that effectively capture the underlying behavior of the reference model, thereby enhancing local fidelity. The N-ball technique draws perturbed samples directly from the desired distribution instead of reweighting them, further improving faithfulness. The performance of the proposed method was measured in terms of root mean squared error (RMSE) and evaluated on five benchmark datasets with different kernel widths. Experimental results show that the proposed method achieves higher local surrogate fidel...
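The N-ball sampling the abstract describes draws perturbations directly from a neighborhood around the explained instance rather than sampling broadly and reweighting. A minimal sketch of the standard construction for uniform sampling inside an n-dimensional ball follows (normalized Gaussian directions plus radii scaled by U^(1/d)); the paper's actual sampling distribution and parameters may differ, and `sample_n_ball` is a hypothetical name.

```python
import numpy as np

def sample_n_ball(center, radius, n_samples, rng=None):
    """Draw points uniformly at random from the n-ball around `center`.

    Directions come from a normalized Gaussian (rotation-invariant);
    radii are scaled by U**(1/d) so density is uniform over volume,
    not over radius.
    """
    rng = np.random.default_rng() if rng is None else rng
    center = np.asarray(center, dtype=float)
    d = center.shape[0]
    directions = rng.normal(size=(n_samples, d))
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)
    radii = radius * rng.uniform(size=(n_samples, 1)) ** (1.0 / d)
    return center + radii * directions

# Example: 500 perturbations within distance 2.0 of a 3-d instance.
points = sample_n_ball(np.zeros(3), radius=2.0, n_samples=500,
                       rng=np.random.default_rng(0))
```

A MARS surrogate would then be fit on these perturbations against the black-box predictions, with the ball radius playing a role analogous to the kernel width swept in the paper's experiments.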
