[2307.09366] Sparse Gaussian Graphical Models with Discrete Optimization: Computational and Statistical Perspectives
Computer Science > Machine Learning

arXiv:2307.09366 (cs)

[Submitted on 18 Jul 2023 (v1), last revised 4 Apr 2026 (this version, v2)]

Title: Sparse Gaussian Graphical Models with Discrete Optimization: Computational and Statistical Perspectives

Authors: Kayhan Behdin, Wenyu Chen, Rahul Mazumder

Abstract: We consider the problem of learning a sparse graph underlying an undirected Gaussian graphical model, a key problem in statistical machine learning. Given $n$ samples from a multivariate Gaussian distribution with $p$ variables, the goal is to estimate the $p \times p$ inverse covariance matrix (a.k.a. the precision matrix), assuming it is sparse (i.e., has few nonzero entries). We propose GraphL0BnB, a new estimator based on an $\ell_0$-penalized version of the pseudo-likelihood function, whereas most earlier approaches are based on the $\ell_1$-relaxation. Our estimator can be formulated as a convex mixed integer program (MIP), which can be difficult to compute beyond $p \approx 100$ using off-the-shelf commercial solvers. To solve the MIP, we propose a custom nonlinear branch-and-bound (BnB) framework that solves node relaxations with tailored first-order methods. As a key component of our BnB framework, we propose large-scale solvers for obtaining good primal solutions that are of i...
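The abstract does not spell out the optimization problem, but a schematic form of an $\ell_0$-penalized pseudo-likelihood estimator with the standard big-M MIP reformulation can be sketched as follows (the precise objective, constraints, and constants used by GraphL0BnB may differ; $\mathcal{L}_{\mathrm{PL}}$, $\lambda_0$, and $M$ below are generic placeholders, not the paper's notation):

```latex
% Penalized estimator: pseudo-likelihood loss plus an l0 penalty on
% off-diagonal entries of the symmetric precision matrix Theta.
\min_{\Theta = \Theta^{\top}} \;
  \mathcal{L}_{\mathrm{PL}}(\Theta; X)
  \;+\; \lambda_0 \sum_{i < j} \mathbf{1}\!\left[\theta_{ij} \neq 0\right]

% Equivalent mixed integer program via binary indicators z_{ij} and a
% big-M bound on the magnitude of each off-diagonal entry:
\min_{\Theta,\, z} \;\;
  \mathcal{L}_{\mathrm{PL}}(\Theta; X) \;+\; \lambda_0 \sum_{i < j} z_{ij}
\quad \text{s.t.} \quad
  |\theta_{ij}| \le M z_{ij}, \;\;
  z_{ij} \in \{0, 1\}, \;\; i < j.
```

When $\mathcal{L}_{\mathrm{PL}}$ is convex in $\Theta$, the continuous relaxation of this program (replacing $z_{ij} \in \{0,1\}$ with $z_{ij} \in [0,1]$) is convex, which is what the BnB framework exploits when solving node relaxations with first-order methods.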