Efficient estimation of linear functionals of principal components
Publication:2176629
Abstract: We study principal component analysis (PCA) for mean-zero i.i.d. Gaussian observations in a separable Hilbert space with unknown covariance operator Σ. The complexity of the problem is characterized by its effective rank r(Σ) := tr(Σ)/‖Σ‖, where tr(Σ) denotes the trace of Σ and ‖Σ‖ denotes its operator norm. We develop a method of bias reduction in the problem of estimation of linear functionals of eigenvectors of Σ. Under the assumption that r(Σ) = o(n), we establish the asymptotic normality and asymptotic properties of the risk of the resulting estimators and prove matching minimax lower bounds, showing their semi-parametric optimality.
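The effective rank tr(Σ)/‖Σ‖ that drives the complexity in the abstract is easy to compute in a finite-dimensional setting. The following is a minimal numerical sketch (assuming NumPy; the spiked covariance and all parameter values are illustrative choices, not from the paper) of the population quantity and its sample-covariance counterpart:

```python
import numpy as np

def effective_rank(sigma):
    """Effective rank r(Sigma) = tr(Sigma) / ||Sigma||_op of a
    symmetric positive semi-definite matrix, used here as a
    finite-dimensional proxy for the covariance operator."""
    eigvals = np.linalg.eigvalsh(sigma)
    return eigvals.sum() / eigvals.max()

# Illustrative spiked covariance: one dominant direction in 50 dimensions.
rng = np.random.default_rng(0)
p, n = 50, 1000
sigma = np.eye(p) * 0.1
sigma[0, 0] = 5.0                       # single spike
X = rng.multivariate_normal(np.zeros(p), sigma, size=n)
sigma_hat = X.T @ X / n                 # sample covariance (mean zero)

print(effective_rank(sigma))            # population: 9.9 / 5 = 1.98
print(effective_rank(sigma_hat))        # empirical counterpart
```

The population value here is (5.0 + 49 × 0.1)/5.0 = 1.98, far smaller than the ambient dimension 50, which is the low-effective-rank regime r(Σ) = o(n) assumed in the abstract.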
Cites Work
- scientific article, zbMATH DE number 3942782 (title not available)
- A general theory of hypothesis tests and confidence regions for sparse high dimensional models
- Applications of the van Trees inequality: A Bayesian Cramér-Rao bound
- Asymptotic Statistics
- Asymptotic Theory for Principal Component Analysis
- Asymptotic normality and optimalities in estimation of large Gaussian graphical models
- Asymptotic theory for the principal component analysis of a vector random function: Some applications to statistical inference
- Asymptotically efficient estimation of smooth functionals of covariance operators
- Asymptotics and concentration bounds for bilinear forms of spectral projectors of sample covariance
- Asymptotics of empirical eigenstructure for high dimensional spiked covariance
- Asymptotics of sample eigenstructure for a large dimensional spiked covariance model
- Bernstein-von Mises theorems for functionals of the covariance matrix
- Concentration inequalities and moment bounds for sample covariance operators
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Confidence intervals for high-dimensional linear regression: minimax rates and adaptivity
- Confidence intervals for low dimensional parameters in high dimensional linear models
- Confidence sets for spectral projectors of covariance matrices
- Efficient estimation of linear functionals of principal components
- Estimation of functionals of sparse covariance matrices
- Finite sample approximation results for principal component analysis: A matrix perturbation approach
- Functional data analysis.
- Learning Theory
- Mathematical foundations of infinite-dimensional statistical models
- Minimax sparse principal subspace estimation in high dimensions
- New asymptotic results in principal component analysis
- Nonasymptotic upper bounds for the reconstruction error of PCA
- Normal approximation and concentration of spectral projectors of sample covariance
- On asymptotically optimal confidence regions and tests for high-dimensional models
- On the distribution of the largest eigenvalue in principal components analysis
- On the principal components of sample covariance matrices
- Optimal detection of sparse principal components in high dimension
- Rate-optimal posterior contraction for sparse PCA
- Semiparametric efficiency bounds for high-dimensional models
- Smooth principal component analysis over two-dimensional manifolds with an application to neuroimaging
- Sparse PCA: optimal rates and adaptive estimation
- Statistical and computational trade-offs in estimation of sparse principal components
- The eigenvalues and eigenvectors of finite, low rank perturbations of large random matrices
- Weak convergence and empirical processes. With applications to statistics
Cited In (27)
- Principal components in linear mixed models with general bulk
- Bootstrapping the operator norm in high dimensions: error estimation for covariance matrices and sketching
- Efficient Multidimensional Diracs Estimation With Linear Sample Complexity
- Estimating covariance and precision matrices along subspaces
- Bootstrapping max statistics in high dimensions: near-parametric rates under weak variance decay and application to functional and multinomial data
- Wald Statistics in high-dimensional PCA
- Efficient estimation of smooth functionals in Gaussian shift models
- Quantitative limit theorems and bootstrap approximations for empirical spectral projectors
- On the asymptotic normality and efficiency of Kronecker envelope principal component analysis
- Estimation of smooth functionals of location parameter in Gaussian and Poincaré random shift models.
- Sampled forms of functional PCA in reproducing kernel Hilbert spaces
- Inference for low-rank models
- Lower bounds for invariant statistical models with applications to principal component analysis
- Asymptotically efficient estimation of smooth functionals of covariance operators
- Rates of Bootstrap Approximation for Eigenvalues in High-Dimensional PCA
- On the sample covariance matrix estimator of reduced effective rank population matrices, with applications to fPCA
- Efficient estimation of linear functionals of principal components
- Convergence rate of Krasulina estimator
- Asymptotic efficiency in high-dimensional covariance estimation
- Distribution function estimation with calibration on principal components
- Estimation of smooth functionals in normal models: bias reduction and asymptotic efficiency
- Title not available
- PCA-kernel estimation
- Title not available
- Singular vector and singular subspace distribution for the matrix denoising model
- Inference for heteroskedastic PCA with missing data
- Estimation of smooth functionals in high-dimensional models: bootstrap chains and Gaussian approximation