Complexity analysis of Bayesian learning of high-dimensional DAG models and their equivalence classes
From MaRDI portal
Publication: 6136582
DOI: 10.1214/23-aos2280
arXiv: 2101.04084
OpenAlex: W4386035702
MaRDI QID: Q6136582
Publication date: 31 August 2023
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/2101.04084
Keywords: Poincaré-type inequality; finite Markov chains; random walk Metropolis-Hastings; rapid mixing; strong selection consistency; greedy equivalence search (GES); locally informed proposals
Cites Work
- Parameter priors for directed acyclic graphical models and the characterization of several probability distributions
- Improving the structure MCMC sampler for Bayesian networks by introducing a new edge reversal move
- Global identifiability of linear structural equation models
- Bayesian variable selection with shrinking and diffusing priors
- Empirical Bayes posterior concentration in sparse high-dimensional linear models
- Geometry of the faithfulness assumption in causal inference
- \(\ell_{0}\)-penalized maximum likelihood for sparse directed acyclic graphs
- Reversible MCMC on Markov equivalence classes of sparse directed acyclic graphs
- On the computational complexity of high-dimensional Bayesian variable selection
- Sparsistency and rates of convergence in large covariance matrix estimation
- Causation, prediction, and search
- A characterization of Markov equivalence classes for acyclic digraphs
- Being Bayesian about network structure. A Bayesian approach to structure discovery in Bayesian networks
- Improving Markov chain Monte Carlo model search for data mining
- Bayesian model averaging: A tutorial. (with comments and a rejoinder).
- Learning Markov equivalence classes of directed acyclic graphs: an objective Bayes approach
- Posterior graph selection and estimation consistency for high-dimensional Bayesian DAG models
- High-dimensional consistency in score-based and hybrid structure learning
- Support consistency of direct sparse-change learning in Markov networks
- Bayesian structure learning in graphical models
- A general framework for Bayes structured linear models
- Compatible priors for model selection of high-dimensional Gaussian DAGs
- Minimax posterior convergence rates and model selection consistency in high-dimensional DAG models based on sparse Cholesky factors
- Who learns better Bayesian network structures: accuracy and speed of structure learning algorithms
- Minimax-optimal nonparametric regression in high dimensions
- Regularized estimation of large covariance matrices
- High-dimensional graphs and variable selection with the Lasso
- Posterior contraction in sparse Bayesian factor models for massive covariance matrices
- Sparse Matrix Inversion with Scaled Lasso
- Bayesian model averaging and model selection for Markov equivalence classes of acyclic digraphs
- Learning Causal Bayesian Network Structures From Experimental Data
- Propagation of Probabilities, Means, and Variances in Mixed Graphical Association Models
- Improved Bounds for Mixing Rates of Markov Chains and Multicommodity Flow
- DOI: 10.1162/153244302760200696
- DOI: 10.1162/153244303321897717
- DOI: 10.1162/153244304773936045
- Bayesian Graphical Models for Discrete Data
- Bayesian Model Selection in High-Dimensional Settings
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Posterior contraction in sparse generalized linear models
- Consistency guarantees for greedy permutation-based causal inference algorithms
- Informed Proposals for Local MCMC in Discrete Spaces
- Equivalence and Synthesis of Causal Models
- Dimension-Free Mixing for High-Dimensional Bayesian Variable Selection
- Complexity analysis of Bayesian learning of high-dimensional DAG models and their equivalence classes