[2604.01021] Transfer learning for nonparametric Bayesian networks
Computer Science > Machine Learning
arXiv:2604.01021 (cs) [Submitted on 1 Apr 2026]
Title: Transfer learning for nonparametric Bayesian networks
Authors: Rafael Sojo, Pedro Larrañaga, Concha Bielza

Abstract: This paper introduces two transfer learning methodologies for estimating nonparametric Bayesian networks under scarce data. We propose two structure learning algorithms: a constraint-based method, PC-stable-transfer learning (PCS-TL), and a score-based method, hill climbing transfer learning (HC-TL). For each, we define dedicated metrics to tackle negative transfer, the situation in which transfer learning degrades the model's performance. For the parameters, we propose a log-linear pooling approach. For the evaluation, we learn kernel density estimation Bayesian networks, a type of nonparametric Bayesian network, and compare their performance under transfer learning with that of models trained on the target data alone. To do so, we sample data from small, medium, and large synthetic networks and from datasets in the UCI Machine Learning Repository. We then add noise and modifications to these datasets to test the methods' ability to avoid negative transfer. Finally, we perform a Friedman test with a Bergmann-Hommel post-hoc analysis to provide statistical evidence of the enhanced experimental behavior of our ...
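The log-linear pooling mentioned for the parameters combines several probability estimates as a weighted geometric mean, pooled(x) ∝ ∏_i p_i(x)^{w_i}, renormalized so the result is again a distribution. A minimal sketch over discrete distributions (the paper applies the idea to nonparametric density estimates; the numbers and variable names below are purely illustrative, not taken from the paper):

```python
import math

def log_linear_pool(dists, weights):
    """Log-linear pooling of discrete distributions sharing one support:
    pooled(x) is proportional to prod_i dists[i][x] ** weights[i],
    then renormalized to sum to 1."""
    support = dists[0].keys()
    unnorm = {
        x: math.exp(sum(w * math.log(d[x]) for d, w in zip(dists, weights)))
        for x in support
    }
    z = sum(unnorm.values())
    return {x: v / z for x, v in unnorm.items()}

# Hypothetical source-domain and target-domain estimates of P(X)
source = {"a": 0.7, "b": 0.3}
target = {"a": 0.4, "b": 0.6}

# Equal weights give the normalized geometric mean of the two estimates
pooled = log_linear_pool([source, target], weights=[0.5, 0.5])
```

With equal weights the pool sits between the two estimates; shifting weight toward the target distribution downweights the transferred source knowledge, which is one natural knob for mitigating negative transfer.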