[2402.02644] Permutation-based Inference for Variational Learning of Directed Acyclic Graphs
Summary
This paper presents PIVID, a method that jointly infers distributions over permutations and directed acyclic graphs (DAGs) using variational inference, addressing two key challenges in Bayesian network structure estimation: representing distributions over DAGs and estimating a posterior in the underlying combinatorial space.
Why It Matters
Understanding the structure of Bayesian networks is central to causal discovery in machine learning. PIVID improves both the accuracy and the efficiency of DAG estimation while quantifying uncertainty, which matters for downstream causal analyses across data science and AI applications.
Key Takeaways
- PIVID improves the inference of distributions over DAGs using variational methods.
- The method outperforms existing deterministic and Bayesian approaches in accuracy-uncertainty trade-offs.
- PIVID scales efficiently with the number of variables, making it suitable for problems with many variables.
- The paper addresses key challenges in causal discovery and Bayesian network structure estimation.
- Experiments on both synthetic and real-world datasets validate the effectiveness of PIVID.
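The core idea behind permutation-based DAG inference is that any topological ordering of the variables, combined with a strictly upper-triangular edge pattern in that ordering, yields a graph that is acyclic by construction. A minimal NumPy sketch of this representation (illustrative only; `dag_from_permutation` is a hypothetical helper, not the paper's API):

```python
import numpy as np

def dag_from_permutation(perm, edge_probs, threshold=0.5):
    """Build a DAG adjacency matrix from a variable ordering.

    Any ordering combined with a strictly upper-triangular edge
    pattern is acyclic by construction: edges only point from
    earlier to later variables in the order.
    """
    d = len(perm)
    P = np.eye(d)[perm]                     # permutation matrix for the ordering
    U = np.triu(edge_probs > threshold, 1)  # strictly upper-triangular edge pattern
    # Conjugating by P maps the ordered pattern back onto the
    # original variable labels; the result remains acyclic.
    return (P.T @ U @ P).astype(int)

rng = np.random.default_rng(0)
A = dag_from_permutation(np.array([2, 0, 1]), rng.uniform(size=(3, 3)))
```

Because the adjacency matrix is a conjugated strictly triangular matrix, it is nilpotent, so no directed cycles can exist regardless of which edges the threshold keeps.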
Computer Science > Machine Learning
arXiv:2402.02644 (cs)
[Submitted on 4 Feb 2024 (v1), last revised 16 Feb 2026 (this version, v4)]
Title: Permutation-based Inference for Variational Learning of Directed Acyclic Graphs
Authors: Edwin V. Bonilla, Pantelis Elinas, He Zhao, Maurizio Filippone, Vassili Kitsios, Terry O'Kane
Abstract: Estimating the structure of Bayesian networks as directed acyclic graphs (DAGs) from observational data is a fundamental challenge, particularly in causal discovery. Bayesian approaches excel by quantifying uncertainty and addressing identifiability, but key obstacles remain: (i) representing distributions over DAGs and (ii) estimating a posterior in the underlying combinatorial space. We introduce PIVID, a method that jointly infers a distribution over permutations and DAGs using variational inference and continuous relaxations of discrete distributions. Through experiments on synthetic and real-world datasets, we show that PIVID can outperform deterministic and Bayesian approaches, achieving superior accuracy-uncertainty trade-offs while scaling efficiently with the number of variables.
Subjects: Machine Learning (cs.LG); Machine Learning (stat.ML)
Cite as: arXiv:2402.02644 [cs.LG] (or arXiv:2402.02644v4 [cs.LG] for this version)
https://doi.org/10.48550/arXiv.2402...
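The abstract mentions continuous relaxations of discrete distributions over permutations. One standard such relaxation (assumed here for illustration; the paper's exact parameterization may differ) is Gumbel-Sinkhorn: perturb permutation logits with Gumbel noise, divide by a temperature, then alternately normalize rows and columns toward a doubly stochastic matrix, which makes sampling differentiable. A minimal NumPy sketch:

```python
import numpy as np

def sinkhorn(log_alpha, n_iters=100):
    """Alternately normalize rows and columns in log-space;
    the exponential of the result approaches a doubly stochastic matrix."""
    for _ in range(n_iters):
        log_alpha = log_alpha - np.logaddexp.reduce(log_alpha, axis=1, keepdims=True)
        log_alpha = log_alpha - np.logaddexp.reduce(log_alpha, axis=0, keepdims=True)
    return np.exp(log_alpha)

def gumbel_sinkhorn(log_alpha, tau=0.5, rng=None):
    """Sample a relaxed permutation matrix: perturb logits with Gumbel
    noise, scale by temperature tau, then Sinkhorn-normalize.
    As tau -> 0, samples concentrate on hard permutation matrices."""
    rng = np.random.default_rng() if rng is None else rng
    g = -np.log(-np.log(rng.uniform(size=log_alpha.shape)))  # Gumbel(0, 1) noise
    return sinkhorn((log_alpha + g) / tau)

# Draw one relaxed 4x4 permutation from uniform logits.
S = gumbel_sinkhorn(np.zeros((4, 4)), tau=0.5, rng=np.random.default_rng(1))
```

Because the relaxed sample is a differentiable function of the logits, gradients of a variational objective can flow through it, which is what makes posterior inference over orderings tractable with stochastic gradient methods.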