[2302.08724] Piecewise Deterministic Markov Processes for Bayesian Neural Networks
Statistics > Machine Learning
arXiv:2302.08724 (stat)
[Submitted on 17 Feb 2023 (v1), last revised 6 Apr 2026 (this version, v4)]

Title: Piecewise Deterministic Markov Processes for Bayesian Neural Networks
Authors: Ethan Goan, Dimitri Perrin, Kerrie Mengersen, Clinton Fookes

Abstract: Inference in modern Bayesian Neural Networks (BNNs) often relies on a variational treatment, which imposes frequently violated assumptions about the independence structure and functional form of the posterior. Traditional MCMC approaches avoid these assumptions at the cost of increased computation, owing to their incompatibility with subsampling of the likelihood. Newer Piecewise Deterministic Markov Process (PDMP) samplers do permit subsampling, but they introduce model-specific inhomogeneous Poisson processes (IPPs) that are difficult to sample from. This work introduces a generic, adaptive thinning scheme for sampling from these IPPs and demonstrates how it accelerates the application of PDMPs to inference in BNNs. Experiments illustrate that inference with these methods is computationally feasible, can improve predictive accuracy and MCMC mixing performance, and can provide informative uncertainty measurements when compared against other approximate inference schemes.

Subjects: Machine Learning (stat.ML); Machine Learning (cs.LG); Other Statis...
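The thinning idea the abstract refers to can be illustrated in isolation. The sketch below is not the paper's adaptive scheme; it is classical Poisson thinning (Lewis and Shedler), which PDMP samplers use to draw event times from an IPP with rate `rate(t)` when a constant upper bound `rate_bound` is available. The rate function and bound here are hypothetical placeholders chosen for the example.

```python
import math
import random

def thin_ipp(rate, rate_bound, t_max, seed=0):
    """Sample event times of an inhomogeneous Poisson process on (0, t_max]
    by thinning: propose arrivals from a homogeneous Poisson process with
    intensity rate_bound, then accept a proposal at time t with
    probability rate(t) / rate_bound. Requires rate(t) <= rate_bound."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        # Exponential inter-arrival time of the dominating homogeneous process.
        t += rng.expovariate(rate_bound)
        if t > t_max:
            return events
        # Accept with probability proportional to the true rate at time t.
        if rng.random() < rate(t) / rate_bound:
            events.append(t)

# Example: a sinusoidal rate 1 + sin(t), bounded above by 2.
events = thin_ipp(lambda t: 1.0 + math.sin(t), rate_bound=2.0, t_max=10.0)
```

A loose bound makes the algorithm correct but wasteful (many rejected proposals); the paper's contribution is an adaptive way to keep such bounds tight without model-specific derivations.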