The dual PC algorithm and the role of Gaussianity for structure learning of Bayesian networks
From MaRDI portal
Publication:6137867
Abstract: Learning the graphical structure of Bayesian networks is key to describing data-generating mechanisms in many complex applications but poses considerable computational challenges. Observational data can only identify the equivalence class of the directed acyclic graph underlying a Bayesian network model, and a variety of methods exist to tackle the problem. Under certain assumptions, the popular PC algorithm can consistently recover the correct equivalence class by reverse-engineering the conditional independence (CI) relationships holding in the variable distribution. The dual PC algorithm is a novel scheme to carry out the CI tests within the PC algorithm by leveraging the inverse relationship between covariance and precision matrices. By exploiting block matrix inversions, we can also perform tests on partial correlations of complementary (or dual) conditioning sets. The multiple CI tests of the dual PC algorithm proceed by first considering marginal and full-order CI relationships and progressively moving to central-order ones. Simulation studies show that the dual PC algorithm outperforms the classic PC algorithm both in run time and in recovering the underlying network structure, even in the presence of deviations from Gaussianity. Additionally, we show that the dual PC algorithm applies to Gaussian copula models, and we demonstrate its performance in that setting.
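The covariance/precision duality described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: for a Gaussian vector with covariance Σ and precision Ω = Σ⁻¹, the partial correlation ρ(i, j | C) can be obtained either by inverting the covariance submatrix over {i, j} ∪ C (the standard route, cheap for small conditioning sets) or by inverting the precision submatrix over {i, j} ∪ D, where D is the complementary ("dual") conditioning set (cheap for large conditioning sets). The function names below are hypothetical.

```python
import numpy as np

def pcorr_from_cov(S, i, j, cond):
    """rho(i, j | cond) via the covariance submatrix over {i, j} + cond."""
    idx = [i, j] + list(cond)
    M = np.linalg.inv(S[np.ix_(idx, idx)])  # local precision matrix
    return -M[0, 1] / np.sqrt(M[0, 0] * M[1, 1])

def pcorr_from_prec(P, i, j, cond):
    """Same quantity via the precision submatrix over the dual set.

    By the Schur-complement identity, the inverse of the precision
    submatrix over T = {i, j} + dual equals Cov(X_T | X_cond), so its
    (i, j) correlation is exactly rho(i, j | cond).
    """
    dual = [k for k in range(P.shape[0]) if k not in {i, j, *cond}]
    idx = [i, j] + dual
    K = np.linalg.inv(P[np.ix_(idx, idx)])  # conditional covariance given cond
    return K[0, 1] / np.sqrt(K[0, 0] * K[1, 1])

# Random positive-definite covariance on 5 variables (toy example)
rng = np.random.default_rng(1)
A = rng.normal(size=(5, 5))
S = A @ A.T + 5 * np.eye(5)
P = np.linalg.inv(S)

r_cov = pcorr_from_cov(S, 0, 3, cond=[1])
r_prec = pcorr_from_prec(P, 0, 3, cond=[1])
assert np.isclose(r_cov, r_prec)  # the two routes agree
```

Because every conditioning set can be handled through whichever of the two submatrices is smaller, tests of all orders, from marginal through central to full-order, remain inexpensive, which is the computational advantage the abstract reports.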
Cites work
- Scientific article (zbMATH DE number 3585466; no title available)
- Scientific article (zbMATH DE number 1134987; no title available)
- DOI: 10.1162/153244303321897717
- A PC algorithm variation for ordinal variables
- A characterization of Markov equivalence classes for acyclic digraphs
- Bayesian network based extreme learning machine for subjectivity detection
- Causality. Models, reasoning, and inference
- Causation, prediction, and search
- Characterizations of multivariate normality: I. Through independence of some statistics
- Efficient sampling and structure learning of Bayesian networks
- Estimating high-dimensional directed acyclic graphs with the PC-algorithm
- Estimating high-dimensional intervention effects from observational data
- High-dimensional semiparametric Gaussian copula graphical models
- Large-sample learning of Bayesian networks is NP-hard
- Order-independent constraint-based causal structure learning
- Partial correlation and conditional correlation as measures of conditional independence
- Probabilistic graphical models
- Testing multivariate normality
- The max-min hill-climbing Bayesian network structure learning algorithm
- The nonparanormal: semiparametric estimation of high dimensional undirected graphs
- The reduced PC-algorithm: improved causal structure learning in large random networks