[2509.23437] Better Hessians Matter: Studying the Impact of Curvature Approximations in Influence Functions

arXiv - Machine Learning · 4 min read

Summary

This paper investigates the impact of Hessian approximations on influence functions in deep learning, demonstrating that better approximations improve data attribution performance.

Why It Matters

Understanding the effectiveness of Hessian approximations is crucial for enhancing the accuracy of influence functions, which are vital for interpreting model predictions and ensuring transparency in machine learning. This research provides insights that can guide future developments in computational efficiency and model interpretability.

Key Takeaways

  • Better Hessian approximations lead to improved influence score quality.
  • The study identifies critical approximation steps affecting attribution accuracy.
  • A mismatch between the eigenvalues of K-FAC and those of the GGN is a major source of error.
  • Findings support ongoing research into Hessian approximation methods.
  • The paper provides a framework for balancing computational tractability with accuracy.
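The eigenvalue-mismatch takeaway can be made concrete with a small sketch. For a linear layer, the per-example weight gradient is a Kronecker product of the layer input and the backpropagated output gradient, so the exact GGN block is a second moment of such products, while K-FAC replaces it with a Kronecker product of the two factor second moments. This is a minimal NumPy illustration (not the paper's experimental setup); the dimensions, the random data, and the deliberate correlation between activations and gradients are all assumptions chosen to expose the mismatch:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d_in, d_out = 500, 4, 3

A_act = rng.normal(size=(n, d_in))    # layer inputs a_i (hypothetical)
G_back = rng.normal(size=(n, d_out))  # backprop gradients g_i (hypothetical)
G_back += 0.5 * A_act[:, :d_out]      # correlate a and g: K-FAC assumes independence

# Exact GGN block: per-example weight gradient is kron(a_i, g_i)
V = np.stack([np.kron(a, g) for a, g in zip(A_act, G_back)])
exact = V.T @ V / n

# K-FAC approximation: Kronecker product of the two second-moment factors
kfac = np.kron(A_act.T @ A_act / n, G_back.T @ G_back / n)

# Compare spectra: the gap between these eigenvalues is the kind of
# mismatch the paper identifies as a major error source.
ev_exact = np.linalg.eigvalsh(exact)
ev_kfac = np.linalg.eigvalsh(kfac)
print(ev_exact[-3:])
print(ev_kfac[-3:])
```

With independent activations and gradients the two spectra agree in expectation; the injected correlation above is what opens the gap.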

Computer Science > Machine Learning · arXiv:2509.23437 (cs)
[Submitted on 27 Sep 2025 (v1), last revised 15 Feb 2026 (this version, v2)]

Title: Better Hessians Matter: Studying the Impact of Curvature Approximations in Influence Functions
Authors: Steve Hong, Runa Eschenhagen, Bruno Mlodozeniec, Richard Turner

Abstract: Influence functions offer a principled way to trace model predictions back to training data, but their use in deep learning is hampered by the need to invert a large, ill-conditioned Hessian matrix. Approximations such as the Generalised Gauss-Newton (GGN) and Kronecker-Factored Approximate Curvature (K-FAC) have been proposed to make influence computation tractable, yet it remains unclear how the departure from exactness impacts data attribution performance. Critically, given the restricted regime in which influence functions are derived, it is not necessarily clear that better Hessian approximations should even lead to better data attribution performance. In this paper, we investigate the effect of Hessian approximation quality on influence-function attributions in a controlled classification setting. Our experiments show that better Hessian approximations consistently yield better influence score quality, offering justification for recent research efforts towards that end. We furth...
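The core computation the abstract describes — scoring a training point by pairing its gradient with the inverse (approximate) Hessian and a test gradient — can be sketched end to end on a toy problem. The following is a minimal NumPy example, not the paper's method or code: the data, the damping value, and the ridge penalty are illustrative assumptions. Logistic regression is a convenient sanity check because its GGN coincides with the exact Hessian:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, lam = 50, 5, 1e-2  # toy sizes and ridge penalty (assumptions)

# Synthetic binary classification data.
X = rng.normal(size=(n, d))
y = (X @ rng.normal(size=d) + 0.1 * rng.normal(size=n) > 0).astype(float)

# Fit ridge-regularised logistic regression with Newton steps.
w = np.zeros(d)
for _ in range(20):
    p = 1 / (1 + np.exp(-X @ w))
    grad = X.T @ (p - y) / n + lam * w
    H = (X * (p * (1 - p))[:, None]).T @ X / n + lam * np.eye(d)
    w -= np.linalg.solve(H, grad)

# Per-example training gradients at the fitted parameters.
p = 1 / (1 + np.exp(-X @ w))
grads = (p - y)[:, None] * X  # shape (n, d)

# Damped GGN (equals the exact loss Hessian for logistic regression).
ggn = (X * (p * (1 - p))[:, None]).T @ X / n + 1e-3 * np.eye(d)

# Influence of training point i on a test point's loss:
#   IF(z_i, z_test) = -grad_test^T (G + damping I)^{-1} grad_i
x_test, y_test = rng.normal(size=d), 1.0
p_test = 1 / (1 + np.exp(-x_test @ w))
grad_test = (p_test - y_test) * x_test

influences = -grads @ np.linalg.solve(ggn, grad_test)
print(influences[:5])
```

Swapping `ggn` for a coarser curvature estimate (e.g. a K-FAC-style factorisation in a deep network) is exactly the approximation step whose quality the paper studies.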

