Asymptotic normality and optimalities in estimation of large Gaussian graphical models
From MaRDI portal
Abstract: The Gaussian graphical model, a popular paradigm for studying relationships among variables in a wide range of applications, has attracted great attention in recent years. This paper considers a fundamental question: when is it possible to estimate low-dimensional parameters at the parametric square-root rate in a large Gaussian graphical model? A novel regression approach is proposed to obtain an asymptotically efficient estimate of each entry of a precision matrix under a sparsity condition relative to the sample size. When the precision matrix is not sufficiently sparse, or equivalently the sample size is not sufficiently large, a lower bound is established to show that it is no longer possible to achieve the parametric rate in the estimation of each entry. This lower bound result, which answers the delicate sample-size question, is established with a novel construction of a subset of sparse precision matrices in an application of Le Cam's lemma. Moreover, the proposed estimator is shown to attain the optimal convergence rate when the parametric rate cannot be achieved, under a minimal sample requirement. The proposed estimator is applied to test the presence of an edge in the Gaussian graphical model, to recover the support of the entire model, to obtain adaptive rate-optimal estimation of the entire precision matrix as measured by the matrix operator norm, and to make inference on latent variables in the graphical model. All of this is achieved under a sparsity condition on the precision matrix and a side condition on the range of its spectrum, which significantly relaxes the commonly imposed uniform signal-strength condition on the precision matrix, the irrepresentability condition on the Hessian tensor operator of the covariance matrix, and the \(\ell_1\) constraint on the precision matrix. Numerical results confirm our theoretical findings.
For support recovery, the ROC curve of the proposed algorithm, asymptotic normal thresholding (ANT), significantly outperforms that of the popular GLasso algorithm.
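The core of the regression approach can be illustrated with a minimal sketch: for a target entry of the precision matrix \(\Omega\), regress the pair \((X_i, X_j)\) on the remaining variables and invert the \(2\times 2\) residual covariance of the pair, which estimates the block \(\Omega_{\{i,j\},\{i,j\}}\). The paper carries out these regressions with the scaled lasso in the high-dimensional regime; the sketch below substitutes ordinary least squares, which is only valid in this low-dimensional illustration, and the function name `pairwise_precision` is ours, not from the paper.

```python
import numpy as np

def pairwise_precision(X, i, j):
    """Estimate the 2x2 block of the precision matrix over A = {i, j}.

    Regress the pair of columns (X_i, X_j) on the remaining columns
    (OLS here, standing in for the scaled lasso used in the paper's
    high-dimensional setting), then invert the residual covariance:
    its inverse estimates Omega restricted to {i, j}.
    """
    n, p = X.shape
    A = [i, j]
    rest = [k for k in range(p) if k not in A]
    # Least-squares coefficients for both target columns at once.
    coef, *_ = np.linalg.lstsq(X[:, rest], X[:, A], rcond=None)
    R = X[:, A] - X[:, rest] @ coef          # residuals of the pair
    return np.linalg.inv(R.T @ R / n)        # estimates Omega_{A,A}
```

For example, sampling from a Gaussian whose precision matrix has \(\Omega_{01} = 0.5\) and applying `pairwise_precision(X, 0, 1)` recovers an off-diagonal estimate near 0.5 for moderate sample sizes.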
Recommendations
- Quasi-Bayesian estimation of large Gaussian graphical models
- High-dimensional covariance estimation based on Gaussian graphical models
- Model selection and estimation in the Gaussian graphical model
- Gaussian graphical model estimation with false discovery rate control
- Innovated scalable efficient estimation in ultra-large Gaussian graphical models
Cites work
- scientific article; zbMATH DE number 6378086
- scientific article; zbMATH DE number 47926
- scientific article; zbMATH DE number 1064667
- scientific article; zbMATH DE number 1134987
- scientific article; zbMATH DE number 3052523
- A constrained \(\ell _{1}\) minimization approach to sparse precision matrix estimation
- A general theory of concave regularization for high-dimensional sparse estimation problems
- Asymptotic normality and optimalities in estimation of large Gaussian graphical models
- Comments on: \(\ell_{1}\)-penalization for mixture regression models
- Comments on: \(\ell_{1}\)-penalization for mixture regression models
- Convergence of estimates under dimensionality restrictions
- Covariance regularization by thresholding
- Discussion: Latent variable graphical model selection via convex optimization
- Estimating sparse precision matrix: optimal rates of convergence and adaptive estimation
- Exact matrix completion via convex optimization
- First-Order Methods for Sparse Covariance Selection
- Gaussian graphical model estimation with false discovery rate control
- High dimensional inverse covariance matrix estimation via linear programming
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- High-dimensional graphs and variable selection with the Lasso
- Hypothesis Testing in High-Dimensional Regression Under the Gaussian Random Design Model: Asymptotic Theory
- Latent variable graphical model selection via convex optimization
- Model selection and estimation in the Gaussian graphical model
- On asymptotically optimal confidence regions and tests for high-dimensional models
- On the conditions used to prove oracle results for the Lasso
- Operator norm consistent estimation of large-dimensional sparse covariance matrices
- Optimal rates of convergence for covariance matrix estimation
- Optimal rates of convergence for sparse covariance matrix estimation
- Quantile regression for competing risks data with missing cause of failure
- Rate minimaxity of the Lasso and Dantzig selector for the \(l_{q}\) loss in \(l_{r}\) balls
- Regularized estimation of large covariance matrices
- Restricted eigenvalue properties for correlated Gaussian designs
- Scaled sparse linear regression
- Simultaneous analysis of Lasso and Dantzig selector
- Some sharp performance bounds for least squares regression with L₁ regularization
- Sparse inverse covariance estimation with the graphical lasso
- Sparse matrix inversion with scaled Lasso
- Sparse permutation invariant covariance estimation
- Sparsistency and rates of convergence in large covariance matrix estimation
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Statistical significance in high-dimensional linear models
- The Dantzig selector and sparsity oracle inequalities
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- \(\ell_{1}\)-penalization for mixture regression models
Cited in (80)
- scientific article; zbMATH DE number 7559724
- Robust regression via multivariate regression depth
- Joint Gaussian graphical model estimation: a survey
- Approximate Bayesian estimation in large coloured graphical Gaussian models
- Data science, big data and statistics
- High-dimensional Gaussian graphical model selection: walk summability and local separation criterion
- Combinatorial inference for graphical models
- A unified theory of confidence regions and testing for high-dimensional estimating equations
- Large-Scale Two-Sample Comparison of Support Sets
- Berry-Esseen bounds for estimating undirected graphs
- Fixed effects testing in high-dimensional linear mixed models
- High-dimensional robust precision matrix estimation: cellwise corruption under \(\epsilon \)-contamination
- Tuning-free heterogeneous inference in massive networks
- On the asymptotic variance of the debiased Lasso
- Simultaneous inference for pairwise graphical models with generalized score matching
- Quasi-Bayesian estimation of large Gaussian graphical models
- Inference of large modified Poisson-type graphical models: application to RNA-seq data in childhood atopic asthma studies
- Inference on Multi-level Partial Correlations Based on Multi-subject Time Series Data
- High-dimensional inference for cluster-based graphical models
- Asymptotic analysis of a matrix latent decomposition model
- A unified precision matrix estimation framework via sparse column-wise inverse operator under weak sparsity
- Minimax estimation of large precision matrices with bandable Cholesky factor
- Concentration of measure bounds for matrix-variate data with missing values
- Bayesian joint inference for multiple directed acyclic graphs
- High-dimensional Gaussian graphical models on network-linked data
- Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation
- SILGGM
- Blessing of massive scale: spatial graphical model estimation with a total cardinality constraint approach
- Bayesian inference for high-dimensional decomposable graphs
- StarTrek: combinatorial variable selection with false discovery rate control
- Inference for Nonparanormal Partial Correlation via Regularized Rank-Based Nodewise Regression
- One-step regularized estimator for high-dimensional regression models
- Gaussian graphical models with applications to omics analyses
- Efficient estimation of linear functionals of principal components
- Graphical Model Inference with Erosely Measured Data
- Confidence regions for entries of a large precision matrix
- A projection-based conditional dependence measure with applications to high-dimensional undirected graphical models
- Minimax posterior convergence rates and model selection consistency in high-dimensional DAG models based on sparse Cholesky factors
- Frequentist Model Averaging for Undirected Gaussian Graphical Models
- Asymptotic normality in the maximum entropy models on graphs with an increasing number of parameters
- Causal discoveries for high dimensional mixed data
- Online Structural Change-Point Detection of High-dimensional Streaming Data via Dynamic Sparse Subspace Learning
- On semiparametric exponential family graphical models
- Confidence intervals for high-dimensional inverse covariance estimation
- The Hardness of Conditional Independence Testing and the Generalised Covariance Measure
- Asymptotic normality and optimalities in estimation of large Gaussian graphical models
- Gaussian graphical model estimation with false discovery rate control
- Honest confidence regions and optimality in high-dimensional precision matrix estimation
- Test for high dimensional covariance matrices
- The benefit of group sparsity in group inference with de-biased scaled group Lasso
- Goodness-of-Fit Tests for High Dimensional Linear Models
- Hypothesis testing for high-dimensional multivariate regression with false discovery rate control
- Learning unfaithful \(K\)-separable Gaussian graphical models
- Scalable inference for high-dimensional precision matrix
- High-dimensional inference in misspecified linear models
- Ultrahigh dimensional precision matrix estimation via refitted cross validation
- Post-regularization inference for time-varying nonparanormal graphical models
- Double-estimation-friendly inference for high-dimensional misspecified models
- Inference for high-dimensional varying-coefficient quantile regression
- Pivotal estimation via square-root lasso in nonparametric regression
- An efficient GPU-parallel coordinate descent algorithm for sparse precision matrix estimation via scaled Lasso
- Joint estimation of heterogeneous exponential Markov random fields through an approximate likelihood inference
- Scalable and efficient inference via CPE
- Testing and signal identification for two-sample high-dimensional covariances via multi-level thresholding
- Reproducible learning in large-scale graphical models
- Network differential connectivity analysis
- Innovated scalable efficient estimation in ultra-large Gaussian graphical models
- Innovated scalable dynamic learning for time-varying graphical models
- Uniform joint screening for ultra-high dimensional graphical models
- Bayesian bandwidth test and selection for high-dimensional banded precision matrices
- Information‐incorporated Gaussian graphical model for gene expression data
- Bayesian estimation of large precision matrix based on Cholesky decomposition
- Generalized M-estimators for high-dimensional Tobit I models
- A Bayesian approach for partial Gaussian graphical models with sparsity
- Loss function, unbiasedness, and optimality of Gaussian graphical model selection
- Semiparametric efficiency bounds for high-dimensional models
- The Lasso with general Gaussian designs with applications to hypothesis testing
- Inference for heteroskedastic PCA with missing data
- ROCKET: robust confidence intervals via Kendall's tau for transelliptical graphical models
- Transfer Learning in Large-Scale Gaussian Graphical Models with False Discovery Rate Control
This page was built for publication: Asymptotic normality and optimalities in estimation of large Gaussian graphical models