Asymptotic normality and optimalities in estimation of large Gaussian graphical models
DOI: 10.1214/14-aos1286 · zbMath: 1328.62342 · arXiv: 1309.6024 · MaRDI QID: Q152845
Zhao Ren, Tingni Sun, Cun-Hui Zhang, Harrison H. Zhou
Publication date: 1 June 2015
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1309.6024
Keywords: asymptotic efficiency, covariance matrix, minimax lower bound, inference, optimal rate of convergence, sparsity, spectral norm, graphical model, precision matrix, latent graphical model, scaled lasso
MSC classifications: Asymptotic properties of parametric estimators (62F12); Estimation in multivariate analysis (62H12); Nonparametric statistical resampling methods (62G09)
Related Items
- Hypothesis testing for high-dimensional multivariate regression with false discovery rate control
- Scalable inference for high-dimensional precision matrix
- A unified theory of confidence regions and testing for high-dimensional estimating equations
- High-Dimensional Inference for Cluster-Based Graphical Models
- The benefit of group sparsity in group inference with de-biased scaled group Lasso
- Bayesian joint inference for multiple directed acyclic graphs
- Asymptotic Analysis of a Matrix Latent Decomposition Model
- Confidence regions for entries of a large precision matrix
- Efficient estimation of linear functionals of principal components
- Inference for Nonparanormal Partial Correlation via Regularized Rank-Based Nodewise Regression
- Joint estimation of heterogeneous exponential Markov random fields through an approximate likelihood inference
- Transfer Learning in Large-Scale Gaussian Graphical Models with False Discovery Rate Control
- Information-incorporated Gaussian graphical model for gene expression data
- Frequentist Model Averaging for Undirected Gaussian Graphical Models
- An efficient GPU-parallel coordinate descent algorithm for sparse precision matrix estimation via scaled Lasso
- Scalable and efficient inference via CPE
- Testing and signal identification for two-sample high-dimensional covariances via multi-level thresholding
- A Bayesian approach for partial Gaussian graphical models with sparsity
- Blessing of massive scale: spatial graphical model estimation with a total cardinality constraint approach
- Double-estimation-friendly inference for high-dimensional misspecified models
- Uniform joint screening for ultra-high dimensional graphical models
- Innovated scalable dynamic learning for time-varying graphical models
- A unified precision matrix estimation framework via sparse column-wise inverse operator under weak sparsity
- Concentration of measure bounds for matrix-variate data with missing values
- The Lasso with general Gaussian designs with applications to hypothesis testing
- StarTrek: combinatorial variable selection with false discovery rate control
- Goodness-of-Fit Tests for High Dimensional Linear Models
- Inference on Multi-level Partial Correlations Based on Multi-subject Time Series Data
- Minimax estimation of large precision matrices with bandable Cholesky factor
- Bayesian bandwidth test and selection for high-dimensional banded precision matrices
- Combinatorial inference for graphical models
- Fixed Effects Testing in High-Dimensional Linear Mixed Models
- High-dimensional inference in misspecified linear models
- Generalized M-estimators for high-dimensional Tobit I models
- ROCKET: robust confidence intervals via Kendall's tau for transelliptical graphical models
- Inference of large modified Poisson-type graphical models: application to RNA-seq data in childhood atopic asthma studies
- High-dimensional robust precision matrix estimation: cellwise corruption under \(\epsilon\)-contamination
- Pivotal estimation via square-root lasso in nonparametric regression
- Test for high dimensional covariance matrices
- Data science, big data and statistics
- Confidence intervals for high-dimensional inverse covariance estimation
- The Hardness of Conditional Independence Testing and the Generalised Covariance Measure
- Minimax posterior convergence rates and model selection consistency in high-dimensional DAG models based on sparse Cholesky factors
- Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation
- A Projection Based Conditional Dependence Measure with Applications to High-dimensional Undirected Graphical Models
- Semiparametric efficiency bounds for high-dimensional models
- Tuning-Free Heterogeneity Pursuit in Massive Networks
- Bayesian inference for high-dimensional decomposable graphs
- Robust regression via multivariate regression depth
- Ultrahigh dimensional precision matrix estimation via refitted cross validation
- Asymptotic normality and optimalities in estimation of large Gaussian graphical models
- Honest confidence regions and optimality in high-dimensional precision matrix estimation
- Gaussian graphical model estimation with false discovery rate control
- Bayesian estimation of large precision matrix based on Cholesky decomposition
- SILGGM
- Loss function, unbiasedness, and optimality of Gaussian graphical model selection
- Inference for high-dimensional varying-coefficient quantile regression
- Reproducible learning in large-scale graphical models
- On the asymptotic variance of the debiased Lasso
- Network differential connectivity analysis
Uses Software
Cites Work
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Sparse inverse covariance estimation with the graphical lasso
- Latent variable graphical model selection via convex optimization
- Asymptotic normality and optimalities in estimation of large Gaussian graphical models
- Gaussian graphical model estimation with false discovery rate control
- Estimating sparse precision matrix: optimal rates of convergence and adaptive estimation
- Statistical significance in high-dimensional linear models
- The Dantzig selector and sparsity oracle inequalities
- \(\ell_{1}\)-penalization for mixture regression models
- Comments on: \(\ell_{1}\)-penalization for mixture regression models
- Optimal rates of convergence for sparse covariance matrix estimation
- Some sharp performance bounds for least squares regression with \(L_1\) regularization
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Optimal rates of convergence for covariance matrix estimation
- Covariance regularization by thresholding
- Operator norm consistent estimation of large-dimensional sparse covariance matrices
- Sparsistency and rates of convergence in large covariance matrix estimation
- Sparse permutation invariant covariance estimation
- On the conditions used to prove oracle results for the Lasso
- High-dimensional covariance estimation by minimizing \(\ell_{1}\)-penalized log-determinant divergence
- Simultaneous analysis of Lasso and Dantzig selector
- Regularized estimation of large covariance matrices
- High-dimensional graphs and variable selection with the Lasso
- Convergence of estimates under dimensionality restrictions
- Exact matrix completion via convex optimization
- Sparse Matrix Inversion with Scaled Lasso
- Hypothesis Testing in High-Dimensional Regression Under the Gaussian Random Design Model: Asymptotic Theory
- A Constrained \(\ell_{1}\) Minimization Approach to Sparse Precision Matrix Estimation
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Scaled sparse linear regression
- Model selection and estimation in the Gaussian graphical model
- First-Order Methods for Sparse Covariance Selection
- Quantile regression for competing risks data with missing cause of failure
- A general theory of concave regularization for high-dimensional sparse estimation problems
- Comments on: \(\ell_{1}\)-penalization for mixture regression models
- Discussion: Latent variable graphical model selection via convex optimization