Block-Diagonal Covariance Selection for High-Dimensional Gaussian Graphical Models
From MaRDI portal
Publication:4690959
Abstract: Gaussian graphical models are widely utilized to infer and visualize networks of dependencies between continuous variables. However, inferring the graph is difficult when the sample size is small compared to the number of variables. To reduce the number of parameters to estimate in the model, we propose a non-asymptotic model selection procedure supported by strong theoretical guarantees based on an oracle inequality and a minimax lower bound. The covariance matrix of the model is approximated by a block-diagonal matrix. The structure of this matrix is detected by thresholding the sample covariance matrix, where the threshold is selected using the slope heuristic. Based on the block-diagonal structure of the covariance matrix, the estimation problem is divided into several independent problems: subsequently, the network of dependencies between variables is inferred using the graphical lasso algorithm in each block. The performance of the procedure is illustrated on simulated data. An application to a real gene expression dataset with a limited sample size is also presented: the dimension reduction allows attention to be objectively focused on interactions among smaller subsets of genes, leading to a more parsimonious and interpretable modular network.
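The pipeline described in the abstract can be sketched in a few lines: threshold the sample correlation matrix, read off the blocks as connected components of the resulting adjacency graph, and then estimate a sparse precision matrix within each block. This is only an illustrative sketch: the threshold `tau` is fixed by hand rather than selected by the slope heuristic as in the paper, and a plain per-block matrix inverse stands in for the graphical lasso step; all variable names are ours.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

rng = np.random.default_rng(0)

# Simulated data with two dependent pairs of variables: {0,1} and {2,3};
# variables 4 and 5 are independent of everything else.
n, p = 400, 6
X = rng.standard_normal((n, p))
X[:, 1] += X[:, 0]
X[:, 3] += X[:, 2]

# Sample correlation matrix.
S = np.corrcoef(X, rowvar=False)

# Threshold the off-diagonal entries to build an adjacency matrix
# (the paper chooses this threshold via the slope heuristic).
tau = 0.3
A = (np.abs(S) > tau).astype(int)
np.fill_diagonal(A, 0)

# Blocks of the block-diagonal approximation = connected components.
n_blocks, labels = connected_components(csr_matrix(A), directed=False)
blocks = [np.flatnonzero(labels == b) for b in range(n_blocks)]

# Per-block precision estimation. The paper runs the graphical lasso
# inside each block; a plain inverse is used here as a stand-in.
precisions = {tuple(idx): np.linalg.inv(S[np.ix_(idx, idx)]) for idx in blocks}
```

Because the estimation problem decouples across blocks, each (much smaller) precision matrix can be estimated independently, which is exactly the dimension-reduction benefit the abstract describes.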
Recommendations
- Assessment of Covariance Selection Methods in High-Dimensional Gaussian Graphical Models
- High-dimensional covariance estimation based on Gaussian graphical models
- High-Dimensional Gaussian Graphical Regression Models with Covariates
- Testing block‐diagonal covariance structure for high‐dimensional data
- Flexible covariance estimation in graphical Gaussian models
- Covariance Estimation in Decomposable Gaussian Graphical Models
- Bayesian block-diagonal variable selection and model averaging
- Covariance decomposition in undirected Gaussian graphical models
- Block-diagonal test for high-dimensional covariance matrices
- On a model selection problem from high-dimensional sample covariance matrices
Cites work
- scientific article; zbMATH DE number 4211299
- A New Lower Bound for Multiple Hypothesis Testing
- A non asymptotic penalized criterion for Gaussian mixture model selection
- A sparse conditional Gaussian graphical model for analysis of genetical genomics data
- Computation of multivariate normal and \(t\) probabilities
- Concentration inequalities and model selection. Ecole d'Eté de Probabilités de Saint-Flour XXXIII -- 2003.
- Covariance structure approximation via gLasso in high-dimensional supervised classification
- Detecting multiple change-points in the mean of Gaussian process by model selection
- Estimation of Gaussian graphs by model selection
- Exact covariance thresholding into connected components for large-scale graphical lasso
- Gaussian model selection
- Gaussian model selection with an unknown variance
- Graph selection with GGMselect
- High-dimensional graphs and variable selection with the Lasso
- Inferring sparse Gaussian graphical models with latent structure
- Joint estimation of multiple graphical models
- Minimal penalties for Gaussian model selection
- Minimax risks for sparse regressions: ultra-high dimensional phenomenons
- Model selection and estimation in the Gaussian graphical model
- Model selection through sparse maximum likelihood estimation for multivariate Gaussian or binary data
- Optimal rates of convergence for covariance matrix estimation
- QUIC: quadratic approximation for sparse inverse covariance estimation
- Rates of convergence for the Gaussian mixture sieve.
- Regularized estimation of large covariance matrices
- Slope heuristics: overview and implementation
- Sparse inverse covariance estimation with the graphical lasso
- The Joint Graphical Lasso for Inverse Covariance Estimation Across Multiple Classes
- The cluster graphical Lasso for improved estimation of Gaussian graphical models
- The discriminative functional mixture model for a comparative analysis of bike sharing systems
- The huge Package for High-dimensional Undirected Graph Estimation in R
Cited in (11)
- Joint rank and variable selection for parsimonious estimation in a high-dimensional finite mixture regression model
- Discrepancy between structured matrices in the power analysis of a separability test
- Testing for independence of large dimensional vectors
- Incorporating grouping information into Bayesian Gaussian graphical model selection
- Structured learning of time-varying networks with application to \(\mathrm{PM}_{2.5}\) data
- Block-diagonal test for high-dimensional covariance matrices
- Block-diagonal covariance estimation and application to the Shapley effects in sensitivity analysis
- Robust Bayesian model selection for variable clustering with the Gaussian graphical model
- Spectral statistics of sample block correlation matrices
- A non-asymptotic approach for model selection via penalization in high-dimensional mixture of experts models
- Approximation with a Kronecker product structure with one component as compound symmetry or autoregression via entropy loss function
This page was built for publication: Block-Diagonal Covariance Selection for High-Dimensional Gaussian Graphical Models