Complexity analysis of Bayesian learning of high-dimensional DAG models and their equivalence classes


DOI: 10.1214/23-AOS2280
arXiv: 2101.04084
OpenAlex: W4386035702
MaRDI QID: Q6136582

Hyunwoong Chang, Quan Zhou

Publication date: 31 August 2023

Published in: The Annals of Statistics

Abstract: Structure learning via MCMC sampling is known to be very challenging because of the enormous search space and the existence of Markov equivalent DAGs. Theoretical results on the mixing behavior are lacking. In this work, we prove the rapid mixing of a random walk Metropolis-Hastings algorithm, which reveals that the complexity of Bayesian learning of sparse equivalence classes grows only polynomially in n and p, under some high-dimensional assumptions. A series of high-dimensional consistency results is obtained, including the strong selection consistency of an empirical Bayes model for structure learning. Our proof is based on two new results. First, we derive a general mixing time bound on finite state spaces, which can be applied to local MCMC schemes for other model selection problems. Second, we construct high-probability search paths on the space of equivalence classes with node degree constraints by proving a combinatorial property of DAG comparisons. Simulation studies on the proposed MCMC sampler are conducted to illustrate the main theoretical findings.


Full work available at URL: https://arxiv.org/abs/2101.04084
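
To make the abstract's description concrete, here is a minimal sketch of a random walk Metropolis-Hastings sampler over sparse DAGs with a node degree constraint. This is not the authors' implementation and it operates on DAG adjacency matrices rather than on equivalence classes; the Gaussian BIC-style score, the `max_parents` cap, and all function names are illustrative assumptions.

```python
# Illustrative sketch only: a random walk Metropolis-Hastings sampler on
# the space of sparse DAGs. The score, degree cap, and names are assumptions.

import numpy as np


def is_dag(adj):
    """Kahn-style check: a directed graph is acyclic iff nodes with
    in-degree zero can be peeled off until none remain."""
    active = np.ones(adj.shape[0], dtype=bool)
    while active.any():
        sub = adj[np.ix_(active, active)]
        roots = np.where(sub.sum(axis=0) == 0)[0]
        if roots.size == 0:
            return False  # every remaining node has a parent -> cycle
        active[np.where(active)[0][roots]] = False
    return True


def node_bic(X, j, parents):
    """Gaussian BIC-style contribution of node j regressed on its parents."""
    n = X.shape[0]
    Z = np.column_stack([np.ones(n)] + [X[:, k] for k in parents])
    beta, *_ = np.linalg.lstsq(Z, X[:, j], rcond=None)
    rss = np.sum((X[:, j] - Z @ beta) ** 2)
    return -0.5 * n * np.log(rss / n) - 0.5 * Z.shape[1] * np.log(n)


def log_score(X, adj):
    """Decomposable score: sum of per-node contributions given parent sets."""
    return sum(node_bic(X, j, list(np.where(adj[:, j])[0]))
               for j in range(adj.shape[1]))


def rw_mh_dag(X, n_iter=5000, max_parents=3, seed=0):
    """Random walk MH on DAGs with an in-degree cap. Each proposal toggles
    one directed edge; the toggle is symmetric, so the acceptance
    probability is min(1, exp(score difference))."""
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    adj = np.zeros((p, p), dtype=int)  # adj[i, j] = 1 means edge i -> j
    current = log_score(X, adj)
    for _ in range(n_iter):
        i, j = rng.choice(p, size=2, replace=False)
        prop = adj.copy()
        prop[i, j] = 1 - prop[i, j]
        # Reject proposals leaving the constrained space (degree cap, acyclicity).
        if prop[:, j].sum() > max_parents or not is_dag(prop):
            continue
        proposed = log_score(X, prop)
        if np.log(rng.random()) < proposed - current:
            adj, current = prop, proposed
    return adj, current
```

Usage (synthetic, illustrative): with `X = np.random.default_rng(1).normal(size=(200, 10))`, calling `rw_mh_dag(X)` returns a sparse adjacency matrix and its score, which can be compared against the empty graph. The single-edge toggle keeps the proposal symmetric so the acceptance ratio reduces to a score difference; the paper's analysis concerns a related local sampler on equivalence classes under high-dimensional assumptions.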










