Principal regression for high dimensional covariance matrices
From MaRDI portal
Publication: Q2233571
DOI: 10.1214/21-EJS1887
zbMATH Open: 1471.62433
arXiv: 2007.12740
OpenAlex: W3200776430
MaRDI QID: Q2233571
FDO: Q2233571
Authors: Yanyan Li
Publication date: 11 October 2021
Published in: Electronic Journal of Statistics
Abstract: This manuscript presents an approach to performing generalized linear regression with multiple high-dimensional covariance matrices as the outcome. Model parameters are estimated by maximizing a pseudo-likelihood. When the data are high dimensional, the normal likelihood function is ill-posed because the sample covariance matrix is rank-deficient. Thus, a well-conditioned linear shrinkage estimator of the covariance matrix is introduced. With multiple covariance matrices, the shrinkage coefficients are proposed to be common across matrices. Theoretical studies demonstrate that the proposed covariance matrix estimator is optimal, asymptotically achieving the uniformly minimum quadratic loss among all linear combinations of the identity matrix and the sample covariance matrix. Under regularity conditions, the proposed estimator of the model parameters is consistent. The superior performance of the proposed approach over existing methods is illustrated through simulation studies. Applied to a resting-state functional magnetic resonance imaging study from the Alzheimer's Disease Neuroimaging Initiative, the proposed approach identified a brain network within which functional connectivity is significantly associated with Apolipoprotein E ε4, a strong genetic marker for Alzheimer's disease.
Full work available at URL: https://arxiv.org/abs/2007.12740
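The well-conditioned linear shrinkage estimator described in the abstract is a convex combination of the (scaled) identity matrix and the sample covariance matrix. The sketch below is a minimal Ledoit-Wolf-style version of that idea, not the paper's exact estimator (the paper additionally shares shrinkage coefficients across multiple covariance matrices); all function names here are illustrative.

```python
import numpy as np

def linear_shrinkage_cov(X):
    """Linear shrinkage estimator: S* = rho * mu * I + (1 - rho) * S.

    A minimal sketch in the spirit of Ledoit-Wolf (2004); the paper's
    own estimator may differ in how the coefficients are chosen.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / n                    # sample covariance (rank-deficient if p > n)
    mu = np.trace(S) / p                 # target scale: average eigenvalue
    d2 = np.linalg.norm(S - mu * np.eye(p), "fro") ** 2
    b2 = 0.0
    for i in range(n):                   # estimate the error of S itself
        xi = Xc[i][:, None]
        b2 += np.linalg.norm(xi @ xi.T - S, "fro") ** 2
    b2 = min(b2 / n**2, d2)
    rho = b2 / d2                        # shrinkage weight toward mu * I
    return rho * mu * np.eye(p) + (1 - rho) * S

# Example with p > n: the sample covariance is singular,
# but the shrinkage estimate is full rank and well conditioned.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 50))
S_shrunk = linear_shrinkage_cov(X)
```

Because the identity component contributes a strictly positive multiple of every eigenvalue, the shrunk estimate is invertible even when the number of variables exceeds the sample size, which is what makes the normal pseudo-likelihood well defined in the high-dimensional regime.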
Recommendations
- Covariance-regularized regression and classification for high dimensional problems
- High-dimensional covariance estimation
- Semiparametric partial common principal component analysis for covariance matrices
- Condition-number-regularized covariance estimation
- Bayesian generalized low rank regression models for neuroimaging phenotypes and genetic markers
MSC Classification
- Multivariate analysis (62H99)
- Linear inference, regression (62J99)
- Applications of statistics to biology and medical sciences; meta analysis (62P10)
Cites Work
- Sparsity and Smoothness Via the Fused Lasso
- A distribution-free M-estimator of multivariate scatter
- A well-conditioned estimator for large-dimensional covariance matrices
- Nonlinear shrinkage estimation of large-dimensional covariance matrices
- Spectral models for covariance matrices
- The Matrix-Logarithmic Covariance Model
- On the limit of the largest eigenvalue of the large dimensional sample covariance matrix
- Limit of the smallest eigenvalue of a large dimensional sample covariance matrix
- The statistical analysis of fMRI data
- Asymptotic Theory for Principal Component Analysis
- Shrinkage Estimators for Covariance Matrices
- A Hierarchical Eigenmodel for Pooled Covariance Estimation
- Robust Shrinkage Estimation of High-Dimensional Covariance Matrices
- Rejoinder
- Asymptotically efficient estimation of covariance matrices with linear structure
- Pseudo maximum likelihood estimation: Theory and applications
- A covariance regression model
- Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation
- Simultaneous modelling of the Cholesky decomposition of several covariance matrices
- Generalized Robust Shrinkage Estimator and Its Application to STAP Detection Problem
- Bayesian nonparametric covariance regression
- Sparse principal component based high-dimensional mediation analysis
- Shared subspace models for multi-group covariance estimation
Cited In (8)
- Reducing subspace models for large‐scale covariance regression
- Covariance-regularized regression and classification for high dimensional problems
- Principal varying coefficient estimator for high-dimensional models
- A low rank-based estimation-testing procedure for matrix-covariate regression
- An efficient randomized QLP algorithm for approximating the singular value decomposition
- Calibrated multivariate regression with application to neural semantic basis discovery
- Convergence and prediction of principal component scores in high-dimensional settings