Efficient estimation of linear functionals of principal components
DOI: 10.1214/19-AOS1816 · zbMATH Open: 1440.62232 · arXiv: 1708.07642 · MaRDI QID: Q2176629 · FDO: Q2176629
Authors: Vladimir Koltchinskii, Matthias Löffler, Richard Nickl
Publication date: 5 May 2020
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1708.07642
Classification (MSC)
- Factor analysis and principal components; correspondence analysis (62H25)
- Asymptotic distribution theory in statistics (62E20)
- Central limit and other weak theorems (60F05)
- Approximations to statistical distributions (nonasymptotic) (62E17)
Cites Work
- Asymptotic normality and optimalities in estimation of large Gaussian graphical models
- Weak convergence and empirical processes. With applications to statistics
- Functional data analysis
- Asymptotic Statistics
- Confidence intervals for high-dimensional linear regression: minimax rates and adaptivity
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Title not available
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
- On asymptotically optimal confidence regions and tests for high-dimensional models
- On the distribution of the largest eigenvalue in principal components analysis
- Mathematical Foundations of Infinite-Dimensional Statistical Models
- Asymptotics of sample eigenstructure for a large dimensional spiked covariance model
- Asymptotic theory for the principal component analysis of a vector random function: Some applications to statistical inference
- Finite sample approximation results for principal component analysis: A matrix perturbation approach
- Sparse PCA: optimal rates and adaptive estimation
- Optimal detection of sparse principal components in high dimension
- A general theory of hypothesis tests and confidence regions for sparse high dimensional models
- Asymptotics of empirical eigenstructure for high dimensional spiked covariance
- Applications of the van Trees inequality: A Bayesian Cramér-Rao bound
- Asymptotic Theory for Principal Component Analysis
- The eigenvalues and eigenvectors of finite, low rank perturbations of large random matrices
- Rate-optimal posterior contraction for sparse PCA
- Bernstein-von Mises theorems for functionals of the covariance matrix
- Minimax sparse principal subspace estimation in high dimensions
- Semiparametric efficiency bounds for high-dimensional models
- Statistical and computational trade-offs in estimation of sparse principal components
- Learning Theory
- New asymptotic results in principal component analysis
- Concentration inequalities and moment bounds for sample covariance operators
- Asymptotics and concentration bounds for bilinear forms of spectral projectors of sample covariance
- Smooth principal component analysis over two-dimensional manifolds with an application to neuroimaging
- On the principal components of sample covariance matrices
- Normal approximation and concentration of spectral projectors of sample covariance
- Estimation of functionals of sparse covariance matrices
- Nonasymptotic upper bounds for the reconstruction error of PCA
- Confidence sets for spectral projectors of covariance matrices
- Asymptotically efficient estimation of smooth functionals of covariance operators
- Efficient estimation of linear functionals of principal components
Cited In (22)
- Bootstrapping the operator norm in high dimensions: error estimation for covariance matrices and sketching
- Efficient Multidimensional Diracs Estimation With Linear Sample Complexity
- Estimating covariance and precision matrices along subspaces
- Bootstrapping max statistics in high dimensions: near-parametric rates under weak variance decay and application to functional and multinomial data
- Wald statistics in high-dimensional PCA
- Efficient estimation of smooth functionals in Gaussian shift models
- Quantitative limit theorems and bootstrap approximations for empirical spectral projectors
- Estimation of smooth functionals of location parameter in Gaussian and Poincaré random shift models
- Sampled forms of functional PCA in reproducing kernel Hilbert spaces
- Inference for low-rank models
- Lower bounds for invariant statistical models with applications to principal component analysis
- Asymptotically efficient estimation of smooth functionals of covariance operators
- Rates of Bootstrap Approximation for Eigenvalues in High-Dimensional PCA
- Efficient estimation of linear functionals of principal components
- Distribution function estimation with calibration on principal components
- Estimation of smooth functionals in normal models: bias reduction and asymptotic efficiency
- Title not available
- PCA-kernel estimation
- Title not available
- Singular vector and singular subspace distribution for the matrix denoising model
- Inference for heteroskedastic PCA with missing data
- Estimation of smooth functionals in high-dimensional models: bootstrap chains and Gaussian approximation