[2604.04673] Minimaxity and Admissibility of Bayesian Neural Networks
Mathematics > Statistics Theory

arXiv:2604.04673 (math) [Submitted on 6 Apr 2026]

Title: Minimaxity and Admissibility of Bayesian Neural Networks
Authors: Daniel Andrew Coulson, Martin T. Wells

Abstract: Bayesian neural networks (BNNs) offer a natural probabilistic formulation for inference in deep learning models. Despite their popularity, their optimality has received limited attention through the lens of statistical decision theory. In this paper, we study decision rules induced by deep, fully connected feedforward ReLU BNNs in the normal location model under quadratic loss. We show that, for fixed prior scales, the induced Bayes decision rule is not minimax. We then propose a hyperprior on the effective output variance of the BNN prior that yields a superharmonic square-root marginal density, establishing that the resulting decision rule is simultaneously admissible and minimax. We further extend these results from the quadratic loss setting to the predictive density estimation problem with Kullback--Leibler loss. Finally, we validate our theoretical findings numerically through simulation.

Subjects: Statistics Theory (math.ST); Machine Learning (cs.LG); Machine Learning (stat.ML)
MSC classes: Primary 62C10, 62C15, 62C20, 62M45; secondary 62A15, 62F15
Cite as: arXiv:2604.04673 [math.ST] (or arXiv:2604.04673v1 [math.ST] ...)
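To make the decision-theoretic yardstick in the abstract concrete, the following sketch runs a small Monte Carlo in the normal location model under quadratic loss. It compares the constant-risk MLE with the classical James--Stein shrinkage estimator, the textbook example of a rule that dominates a non-admissible one for dimension d >= 3. This is an illustrative assumption on our part, not the paper's BNN-induced rule or its superharmonic hyperprior construction; the dimension, replication count, and choice of theta = 0 are arbitrary.

```python
import numpy as np

# Illustrative Monte Carlo (NOT the paper's BNN construction): in the
# d-dimensional normal location model X ~ N(theta, I_d) under quadratic
# loss, the MLE delta(x) = x has constant risk d, while the classical
# James-Stein estimator has strictly smaller risk for d >= 3.
rng = np.random.default_rng(0)
d, n_rep = 10, 20_000
theta = np.zeros(d)  # the risk gap is largest at theta = 0

x = theta + rng.standard_normal((n_rep, d))  # draws of X ~ N(theta, I_d)

# MLE risk: E ||X - theta||^2 = d exactly.
risk_mle = np.mean(np.sum((x - theta) ** 2, axis=1))

# James-Stein estimator: (1 - (d - 2) / ||x||^2) * x.
norm2 = np.sum(x ** 2, axis=1)
js = (1.0 - (d - 2) / norm2)[:, None] * x
risk_js = np.mean(np.sum((js - theta) ** 2, axis=1))

print(f"MLE risk         ~= {risk_mle:.2f} (theory: {d})")
print(f"James-Stein risk ~= {risk_js:.2f} (strictly below {d})")
```

The estimated James--Stein risk lands well below d, the minimax value, which is the kind of risk comparison the paper's simulations carry out for the BNN-induced rules.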