[2602.22015] Function-Space Empirical Bayes Regularisation with Student's t Priors


arXiv - Machine Learning

Summary

This paper presents a novel function-space empirical Bayes regularisation framework that uses heavy-tailed Student's t priors to improve the uncertainty estimates of Bayesian deep learning models.

Why It Matters

The study addresses a limitation of traditional Gaussian priors in Bayesian deep learning: they often fail to capture the heavy-tailed statistical behaviour of neural network outputs. By introducing Student's t priors, the authors improve the robustness of predictions and out-of-distribution detection, both of which are crucial for reliable AI applications.
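To see why heavy tails matter, the sketch below (not taken from the paper; a generic illustration using only standard log-density formulas) compares a standard Gaussian and a standard Student's t log-density at an outlying point. The Gaussian penalty grows quadratically with distance, while the Student's t penalty grows only logarithmically, so a t prior punishes atypical function values far less severely:

```python
import math

def student_t_logpdf(x, nu):
    """Log-density of a standard Student's t with nu degrees of freedom."""
    return (math.lgamma((nu + 1) / 2) - math.lgamma(nu / 2)
            - 0.5 * math.log(nu * math.pi)
            - (nu + 1) / 2 * math.log(1 + x * x / nu))

def gaussian_logpdf(x):
    """Log-density of a standard Gaussian."""
    return -0.5 * math.log(2 * math.pi) - 0.5 * x * x

# At an outlying value the Gaussian log-density drops quadratically in x,
# while the Student's t decays only logarithmically, keeping more mass
# in the tails -- the property the paper exploits for robustness.
x = 6.0
print(gaussian_logpdf(x))           # quadratic penalty: very negative
print(student_t_logpdf(x, nu=3.0))  # much milder penalty
```

With `x = 6` the Gaussian log-density is around -19 while the t(3) log-density is around -6, so under the heavy-tailed prior an outlier is orders of magnitude more plausible.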

Key Takeaways

  • Introduces ST-FS-EB framework for Bayesian deep learning.
  • Utilizes heavy-tailed Student's t priors for better uncertainty estimates.
  • Demonstrates improved performance in in-distribution and out-of-distribution scenarios.
  • Employs variational inference for posterior distribution approximation.
  • Challenges existing methods that rely solely on Gaussian priors.

Computer Science > Machine Learning — arXiv:2602.22015 (cs) [Submitted on 25 Feb 2026]

Title: Function-Space Empirical Bayes Regularisation with Student's t Priors
Authors: Pengcheng Hao, Ercan Engin Kuruoglu

Abstract: Bayesian deep learning (BDL) has emerged as a principled approach to producing reliable uncertainty estimates by integrating deep neural networks with Bayesian inference, yet the selection of informative prior distributions remains a significant challenge. Various function-space variational inference (FSVI) regularisation methods have been presented, assigning meaningful priors over model predictions. However, these methods typically rely on a Gaussian prior, which fails to capture the heavy-tailed statistical characteristics inherent in neural network outputs. By contrast, this work proposes a novel function-space empirical Bayes regularisation framework, termed ST-FS-EB, which employs heavy-tailed Student's $t$ priors in both parameter and function spaces. We approximate the posterior distribution through variational inference (VI), inducing an evidence lower bound (ELBO) objective based on Monte Carlo (MC) dropout. The proposed method is evaluated against various VI-based BDL baselines, and the results demonstrate its robust performance in in-distribution prediction, out-of-...
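The abstract's VI objective is built on Monte Carlo dropout: keeping dropout active at test time and averaging many stochastic forward passes yields a predictive mean, with the spread across passes serving as an uncertainty estimate. The toy sketch below illustrates that mechanism on a hypothetical one-layer model (the weights, dropout rate, and sample count are illustrative assumptions, not the paper's setup):

```python
import random
import statistics

def mc_dropout_predict(x, weights, p_drop=0.5, n_samples=200, seed=0):
    """MC-dropout prediction for a toy linear model: each forward pass
    independently drops units, and the mean/stdev over passes give the
    predictive estimate and its uncertainty."""
    rng = random.Random(seed)
    preds = []
    for _ in range(n_samples):
        total = 0.0
        for w in weights:
            if rng.random() >= p_drop:           # unit survives this pass
                total += w * x / (1 - p_drop)    # inverted-dropout rescaling
        preds.append(total)
    return statistics.mean(preds), statistics.stdev(preds)

mean, std = mc_dropout_predict(1.0, weights=[0.2, -0.5, 0.9, 0.3])
print(mean, std)  # mean near the full-model output 0.9, non-zero spread
```

The inverted-dropout rescaling keeps the expected prediction equal to the deterministic output (here 0.9 for x = 1), while the non-zero standard deviation is exactly the uncertainty signal that MC dropout contributes to the ELBO-based framework described in the abstract.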

