High-dimensional Gaussian graphical model selection: walk summability and local separation criterion
From MaRDI portal
Abstract: We consider the problem of high-dimensional Gaussian graphical model selection. We identify a set of graphs for which an efficient estimation algorithm exists, based on thresholding of empirical conditional covariances. Under a set of transparent conditions, we establish structural consistency (or sparsistency) for the proposed algorithm when the number of samples scales as \(n=\Omega(J_{\min}^{-2}\log p)\), where \(p\) is the number of variables and \(J_{\min}\) is the minimum (absolute) edge potential of the graphical model. The sufficient conditions for sparsistency are based on the notion of walk-summability of the model and the presence of sparse local vertex separators in the underlying graph. We also derive novel non-asymptotic necessary conditions on the number of samples required for sparsistency.
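The thresholding procedure described in the abstract can be sketched at the population level: for each variable pair, minimize the absolute conditional covariance over all small conditioning sets, and declare an edge when the minimum exceeds a threshold. The sketch below is an illustrative assumption about the procedure's shape, not the authors' implementation; the function names (`conditional_covariance`, `select_edges`), the conditioning-set bound `eta`, and the threshold `xi` are all hypothetical. A finite-sample version would substitute the empirical covariance matrix, which is where the \(n=\Omega(J_{\min}^{-2}\log p)\) sample requirement enters.

```python
import itertools
import numpy as np

def conditional_covariance(sigma, i, j, S):
    """Sigma_{ij|S} = Sigma_ij - Sigma_iS Sigma_SS^{-1} Sigma_Sj."""
    if not S:
        return sigma[i, j]
    S = list(S)
    return sigma[i, j] - sigma[i, S] @ np.linalg.solve(
        sigma[np.ix_(S, S)], sigma[S, j]
    )

def select_edges(sigma, eta=1, xi=1e-8):
    """Declare (i, j) an edge iff the minimum, over conditioning
    sets S with |S| <= eta, of |Sigma_{ij|S}| exceeds xi."""
    p = sigma.shape[0]
    edges = set()
    for i, j in itertools.combinations(range(p), 2):
        rest = [k for k in range(p) if k not in (i, j)]
        stat = min(
            abs(conditional_covariance(sigma, i, j, S))
            for r in range(eta + 1)
            for S in itertools.combinations(rest, r)
        )
        if stat > xi:
            edges.add((i, j))
    return edges

# Sanity check on a 4-node chain (an assumed toy example): the exact
# covariance is the inverse of a walk-summable precision matrix, and
# each non-adjacent pair is separated by a single vertex, so eta=1
# conditioning zeroes out exactly the non-edges.
J = np.array([[1.0, 0.3, 0.0, 0.0],
              [0.3, 1.0, 0.3, 0.0],
              [0.0, 0.3, 1.0, 0.3],
              [0.0, 0.0, 0.3, 1.0]])
Sigma = np.linalg.inv(J)
print(sorted(select_edges(Sigma)))  # → [(0, 1), (1, 2), (2, 3)]
```

The chain example illustrates the local-separation idea: the conditioning-set size `eta` only needs to match the size of local vertex separators, keeping the search over sets `S` tractable.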
Recommendations
- High-dimensional structure estimation in Ising models: local separation criterion
- Estimation of Gaussian graphs by model selection
- Asymptotic normality and optimalities in estimation of large Gaussian graphical models
- High-dimensional graphs and variable selection with the Lasso
- High-dimensional covariance estimation based on Gaussian graphical models
Cited in (18 documents)
- Total positivity in exponential families with application to binary variables
- High-dimensional consistency in score-based and hybrid structure learning
- Region selection in Markov random fields: Gaussian case
- Estimating high dimensional faithful Gaussian graphical models by low-order conditioning
- Discovering the topology of complex networks via adaptive estimators
- Generalized chordality, vertex separators and hyperbolicity on graphs
- Causal Structural Learning via Local Graphs
- The reduced PC-algorithm: improved causal structure learning in large random networks
- Estimation of positive definite M-matrices and structure learning for attractive Gaussian Markov random fields
- The shape of partial correlation matrices
- Maximum likelihood estimation in Gaussian models under total positivity
- Integrative analysis with a system of semiparametric projection non-linear regression models
- Learning unfaithful \(K\)-separable Gaussian graphical models
- Learning loopy graphical models with latent variables: efficient methods and guarantees
- Untitled scientific article (zbMATH DE number 5957391)
- Large deviations of convex polyominoes
- High-dimensional structure estimation in Ising models: local separation criterion
- A unified framework for structured graph learning via spectral constraints
This page was built for publication: High-dimensional Gaussian graphical model selection: walk summability and local separation criterion (MaRDI item Q5405191)