Asymptotic normality and optimalities in estimation of large Gaussian graphical models

From MaRDI portal
Publication: Q152845

DOI: 10.1214/14-AOS1286
zbMATH Open: 1328.62342
arXiv: 1309.6024
MaRDI QID: Q152845


Authors: Zhao Ren, Tingni Sun, Cun-Hui Zhang, Harrison H. Zhou


Publication date: 1 June 2015

Published in: The Annals of Statistics

Abstract: The Gaussian graphical model, a popular paradigm for studying relationships among variables in a wide range of applications, has attracted great attention in recent years. This paper considers a fundamental question: When is it possible to estimate low-dimensional parameters at the parametric square-root rate in a large Gaussian graphical model? A novel regression approach is proposed to obtain asymptotically efficient estimation of each entry of a precision matrix under a sparseness condition relative to the sample size. When the precision matrix is not sufficiently sparse, or equivalently the sample size is not sufficiently large, a lower bound is established to show that it is no longer possible to achieve the parametric rate in the estimation of each entry. This lower bound result, which provides an answer to the delicate sample size question, is established with a novel construction of a subset of sparse precision matrices in an application of Le Cam's lemma. Moreover, the proposed estimator is proven to have the optimal convergence rate when the parametric rate cannot be achieved, under a minimal sample requirement. The proposed estimator is applied to test the presence of an edge in the Gaussian graphical model or to recover the support of the entire model, to obtain adaptive rate-optimal estimation of the entire precision matrix as measured by the matrix ℓ_q operator norm, and to make inference about latent variables in the graphical model. All of this is achieved under a sparsity condition on the precision matrix and a side condition on the range of its spectrum. This significantly relaxes the commonly imposed uniform signal strength condition on the precision matrix, the irrepresentability condition on the Hessian tensor operator of the covariance matrix, and the ℓ_1 constraint on the precision matrix. Numerical results confirm our theoretical findings. The ROC curve of the proposed algorithm, Asymptotic Normal Thresholding (ANT), for support recovery significantly outperforms that of the popular GLasso algorithm.
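The regression idea behind the entrywise estimator can be sketched as follows. This is a minimal illustration under simplifying assumptions, not the paper's exact procedure: for an entry ω_{ij}, the pair of variables (X_i, X_j) is regressed on all remaining variables, and the 2×2 residual covariance estimates the inverse of the corresponding block of the precision matrix. The paper uses a scaled lasso for this regression in high dimensions; the sketch below uses ordinary least squares in a low-dimensional setting, and all function and variable names are hypothetical.

```python
import numpy as np

def precision_block(X, i, j):
    """Estimate the 2x2 block Omega[{i,j},{i,j}] of the precision matrix.

    Regress columns (i, j) on all remaining columns (OLS here; the paper
    uses a scaled lasso when p is large). For Gaussian data, the residual
    covariance of (X_i, X_j) given the rest equals the inverse of the
    Omega[{i,j},{i,j}] block, so inverting it recovers the block.
    """
    n, p = X.shape
    pair = [i, j]
    rest = [k for k in range(p) if k not in pair]
    XA, XR = X[:, pair], X[:, rest]
    beta, *_ = np.linalg.lstsq(XR, XA, rcond=None)  # OLS coefficients
    eps = XA - XR @ beta                            # regression residuals
    theta = eps.T @ eps / n                         # residual covariance
    return np.linalg.inv(theta)

# Usage: tridiagonal precision matrix with off-diagonal entries 0.5,
# so the true value of Omega[0, 1] is 0.5 and Omega[0, 0] is 1.
rng = np.random.default_rng(0)
p, n = 10, 20000
Omega = np.eye(p) + 0.5 * (np.eye(p, k=1) + np.eye(p, k=-1))
X = rng.multivariate_normal(np.zeros(p), np.linalg.inv(Omega), size=n)
block = precision_block(X, 0, 1)
```

Support recovery in the ANT algorithm then amounts to thresholding such entrywise estimates at a level of order sqrt(log p / n), keeping an edge (i, j) only when the estimate exceeds the threshold.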


Full work available at URL: https://arxiv.org/abs/1309.6024






Cited In (80)






This page was built for publication: Asymptotic normality and optimalities in estimation of large Gaussian graphical models
