The dual PC algorithm and the role of Gaussianity for structure learning of Bayesian networks
From MaRDI portal
Publication: Q6137867
DOI: 10.1016/J.IJAR.2023.108975
arXiv: 2112.09036
MaRDI QID: Q6137867
FDO: Q6137867
Authors: Enrico Giudice, Jack Kuipers, Giusi Moffa
Publication date: 4 September 2023
Published in: International Journal of Approximate Reasoning
Abstract: Learning the graphical structure of Bayesian networks is key to describing data-generating mechanisms in many complex applications but poses considerable computational challenges. Observational data can only identify the equivalence class of the directed acyclic graph underlying a Bayesian network model, and a variety of methods exist to tackle the problem. Under certain assumptions, the popular PC algorithm can consistently recover the correct equivalence class by reverse-engineering the conditional independence (CI) relationships holding in the variable distribution. The dual PC algorithm is a novel scheme to carry out the CI tests within the PC algorithm by leveraging the inverse relationship between covariance and precision matrices. By exploiting block matrix inversions we can also perform tests on partial correlations of complementary (or dual) conditioning sets. The multiple CI tests of the dual PC algorithm proceed by first considering marginal and full-order CI relationships and progressively moving to central-order ones. Simulation studies show that the dual PC algorithm outperforms the classic PC algorithm both in terms of run time and in recovering the underlying network structure, even in the presence of deviations from Gaussianity. Additionally, we show that the dual PC algorithm applies to Gaussian copula models, and demonstrate its performance in that setting.
Full work available at URL: https://arxiv.org/abs/2112.09036
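The covariance-precision duality central to the abstract can be illustrated with a short sketch: in the Gaussian setting, the full-order partial correlation between two variables given all the others is read off the precision matrix as -Θ_ij / sqrt(Θ_ii Θ_jj). This is a minimal NumPy illustration of that identity, not the authors' implementation; the function name and the use of the sample covariance are assumptions for the example.

```python
import numpy as np

def partial_correlations_from_precision(data):
    """Full-order partial correlations via the precision matrix.

    For Gaussian data, X_i and X_j are conditionally independent
    given all remaining variables exactly when this entry is zero.
    """
    # Sample covariance (rows are observations) and its inverse,
    # the precision matrix.
    cov = np.cov(data, rowvar=False)
    prec = np.linalg.inv(cov)
    d = np.sqrt(np.diag(prec))
    # Partial correlation of X_i, X_j given the rest:
    # -prec_ij / sqrt(prec_ii * prec_jj).
    pcor = -prec / np.outer(d, d)
    np.fill_diagonal(pcor, 1.0)
    return pcor
```

Tests on these partial correlations correspond to the "full-order" end of the dual PC algorithm's schedule; the classic PC algorithm instead starts from marginal correlations and grows the conditioning sets.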
Cites Work
- Order-independent constraint-based causal structure learning
- Causation, prediction, and search
- High-dimensional semiparametric Gaussian copula graphical models
- The nonparanormal: semiparametric estimation of high dimensional undirected graphs
- A characterization of Markov equivalence classes for acyclic digraphs
- Large-sample learning of Bayesian networks is NP-hard
- Estimating high-dimensional directed acyclic graphs with the PC-algorithm
- Title not available
- The max-min hill-climbing Bayesian network structure learning algorithm
- Causality. Models, reasoning, and inference
- Probabilistic graphical models.
- Estimating high-dimensional intervention effects from observational data
- Testing multivariate normality
- Partial correlation and conditional correlation as measures of conditional independence
- DOI: 10.1162/153244303321897717
- Title not available
- Characterizations of multivariate normality: I. Through independence of some statistics
- Bayesian network based extreme learning machine for subjectivity detection
- A PC algorithm variation for ordinal variables
- Efficient Sampling and Structure Learning of Bayesian Networks
- The reduced PC-algorithm: improved causal structure learning in large random networks