[2602.22015] Function-Space Empirical Bayes Regularisation with Student's t Priors
Summary
This paper presents ST-FS-EB, a function-space empirical Bayes regularisation framework that uses heavy-tailed Student's t priors to improve uncertainty estimates in Bayesian deep learning.
Why It Matters
The study addresses a limitation of the Gaussian priors standard in Bayesian deep learning: they fail to capture the heavy-tailed statistical behaviour of neural network outputs. By introducing Student's t priors, the authors improve the robustness of predictions and out-of-distribution detection, which is crucial for reliable AI applications.
Key Takeaways
- Introduces ST-FS-EB framework for Bayesian deep learning.
- Utilizes heavy-tailed Student's t priors for better uncertainty estimates.
- Demonstrates improved performance in both in-distribution prediction and out-of-distribution detection.
- Employs variational inference for posterior distribution approximation.
- Challenges existing methods that rely solely on Gaussian priors.
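The core intuition behind the heavy-tailed prior can be seen by comparing log-densities. The short Python sketch below (illustrative only, not code from the paper) shows that a standard Student's t distribution assigns far higher log-probability than a Gaussian to large values, so as a prior over function outputs it penalises extreme predictions much more gently:

```python
import math

def log_normal_pdf(x, sigma=1.0):
    # Log-density of a zero-mean Gaussian: tails decay quadratically in x.
    return -0.5 * math.log(2 * math.pi * sigma**2) - x**2 / (2 * sigma**2)

def log_student_t_pdf(x, nu=3.0):
    # Log-density of a standard Student's t with nu degrees of freedom:
    # tails decay only logarithmically in x**2 (polynomial density tails).
    return (math.lgamma((nu + 1) / 2) - math.lgamma(nu / 2)
            - 0.5 * math.log(nu * math.pi)
            - (nu + 1) / 2 * math.log1p(x**2 / nu))

# Near zero the Gaussian is more concentrated, but in the tails the
# Student's t dominates, i.e. it tolerates heavy-tailed outputs.
for x in [0.0, 2.0, 5.0, 10.0]:
    print(f"x={x:5.1f}  normal={log_normal_pdf(x):8.2f}"
          f"  student_t={log_student_t_pdf(x):8.2f}")
```

With `nu=3`, the Gaussian log-density at `x=5` is roughly -13.4 while the Student's t is roughly -5.5, which is why a t prior regularises large function values far less aggressively.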
Computer Science > Machine Learning
arXiv:2602.22015 (cs)
[Submitted on 25 Feb 2026]
Title: Function-Space Empirical Bayes Regularisation with Student's t Priors
Authors: Pengcheng Hao, Ercan Engin Kuruoglu
Abstract: Bayesian deep learning (BDL) has emerged as a principled approach to producing reliable uncertainty estimates by integrating deep neural networks with Bayesian inference, yet the selection of informative prior distributions remains a significant challenge. Various function-space variational inference (FSVI) regularisation methods have been proposed, assigning meaningful priors over model predictions. However, these methods typically rely on a Gaussian prior, which fails to capture the heavy-tailed statistical characteristics inherent in neural network outputs. By contrast, this work proposes a novel function-space empirical Bayes regularisation framework, termed ST-FS-EB, which employs heavy-tailed Student's $t$ priors in both parameter and function spaces. We approximate the posterior distribution through variational inference (VI), inducing an evidence lower bound (ELBO) objective based on Monte Carlo (MC) dropout. The proposed method is evaluated against various VI-based BDL baselines, and the results demonstrate its robust performance in in-distribution prediction, out-of-...
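The MC dropout machinery mentioned in the abstract can be sketched in a few lines. This is a minimal, hypothetical illustration (a toy one-layer linear model, not the paper's architecture or objective): dropout is kept active at prediction time, and the spread of the stochastic forward passes serves as an uncertainty estimate.

```python
import random

def mc_dropout_predict(weights, x, p_drop=0.5, n_samples=200, seed=0):
    """MC dropout on a toy linear model: sample a fresh dropout mask per
    forward pass, then report the mean and variance of the outputs.
    The variance is the model's predictive uncertainty estimate."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n_samples):
        # Drop each weight independently with probability p_drop and
        # rescale the survivors (inverted dropout keeps the mean output
        # comparable to the deterministic forward pass).
        y = sum(w * xi / (1 - p_drop)
                for w, xi in zip(weights, x)
                if rng.random() >= p_drop)
        outputs.append(y)
    mean = sum(outputs) / n_samples
    var = sum((y - mean) ** 2 for y in outputs) / n_samples
    return mean, var

mean, var = mc_dropout_predict([0.5, -1.2, 0.8], [1.0, 2.0, 3.0])
print(f"predictive mean={mean:.3f}, predictive variance={var:.3f}")
```

In the paper's setting, the same MC dropout samples that provide this predictive distribution are reused inside the ELBO objective, with the Student's t prior regularising the sampled function values.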