Lower bounds for invariant statistical models with applications to principal component analysis
From MaRDI portal
Publication:2157446
Abstract: This paper develops nonasymptotic information inequalities for the estimation of the eigenspaces of a covariance operator. These results generalize previous lower bounds for the spiked covariance model, and they show that recent upper bounds for models with decaying eigenvalues are sharp. The proof relies on lower bound techniques based on group invariance arguments, which also apply to a variety of other statistical models.
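As an illustrative sketch (not part of the publication itself), the setting of the abstract can be simulated: draw samples from a spiked covariance model, estimate the leading eigenspace by PCA on the sample covariance, and measure the sin-theta distance to the true eigenspace. The dimension, sample size, and spike strength below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not from the paper):
d, n, spike = 50, 2000, 5.0  # dimension, sample size, spike strength

# Spiked covariance model: Sigma = I + spike * u u^T for a fixed unit vector u.
u = np.zeros(d)
u[0] = 1.0
Sigma = np.eye(d) + spike * np.outer(u, u)

# Draw n i.i.d. centered Gaussian observations with covariance Sigma.
X = rng.multivariate_normal(np.zeros(d), Sigma, size=n)

# PCA: leading eigenvector of the sample covariance matrix.
S = X.T @ X / n
eigvals, eigvecs = np.linalg.eigh(S)  # eigenvalues in ascending order
u_hat = eigvecs[:, -1]

# Sin-theta distance between the true and estimated one-dimensional eigenspaces.
sin_theta = np.sqrt(1.0 - (u @ u_hat) ** 2)
print(sin_theta)
```

With these parameters the estimation error is small; the lower bounds in the paper quantify how small it can possibly be for any estimator.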
Recommendations
- The statistics and mathematics of high dimension low sample size asymptotics
- Minimax sparse principal subspace estimation in high dimensions
- Efficient estimation of linear functionals of principal components
- Minimax bounds for sparse PCA with noisy high-dimensional data
- High-dimensional principal projections
Cites Work
- scientific article; zbMATH DE number 1713116
- scientific article; zbMATH DE number 3886886
- scientific article; zbMATH DE number 5546942
- scientific article; zbMATH DE number 3930109
- scientific article; zbMATH DE number 3733065
- scientific article; zbMATH DE number 51418
- scientific article; zbMATH DE number 1261669
- scientific article; zbMATH DE number 1302661
- scientific article; zbMATH DE number 1465044
- scientific article; zbMATH DE number 3441440
- scientific article; zbMATH DE number 841537
- scientific article; zbMATH DE number 7370563
- scientific article; zbMATH DE number 5204610
- A large deviation theorem for the empirical eigenvalue distribution of random unitary matrices
- A new concentration result for regularized risk minimizers
- A van Trees inequality for estimators on manifolds
- An introduction to random matrices
- Applications of the van Trees inequality: A Bayesian Cramér-Rao bound
- Asymptotic Statistics
- Asymptotic theory for the principal component analysis of a vector random function: Some applications to statistical inference
- Asymptotically efficient estimation of smooth functionals of covariance operators
- Asymptotics of sample eigenstructure for a large dimensional spiked covariance model
- Efficient estimation of linear functionals of principal components
- Estimation of a covariance matrix using the reference prior
- Exact Minimax Estimation for Phase Synchronization
- Finite sample approximation results for principal component analysis: A matrix perturbation approach
- High-dimensional principal projections
- Information geometry
- Introduction to nonparametric estimation
- Large deviations for functions of two random projection matrices
- Large deviations techniques and applications
- Mathematical theory of statistics. Statistical experiments and asymptotic decision theory
- Mathematics, Models, and Magz, Part I: Patterns in Pascal's Triangle and Tetrahedron
- Message-passing algorithms for synchronization problems over compact groups
- Methodology and convergence rates for functional linear regression
- Minimax sparse principal subspace estimation in high dimensions
- Multivariate calculation. Use of the continuous groups
- Nonasymptotic upper bounds for the reconstruction error of PCA
- Optimal estimation and rank detection for sparse spiked covariance matrices
- Optimal rates for regularization of statistical inverse learning problems
- Optimality and sub-optimality of PCA. I: Spiked random matrix models
- Perturbation bounds for eigenspaces under a relative gap condition
- Random perturbation of low rank matrices: improving classical bounds
- Rate-optimal perturbation bounds for singular subspaces with applications to high-dimensional statistics
- Real Analysis and Probability
- Sparse PCA: optimal rates and adaptive estimation
- Stochastic Equations in Infinite Dimensions
- Support Vector Machines
- The random matrix theory of the classical compact groups
- Theoretical foundations of functional data analysis, with an introduction to linear operators
Cited In (6)
- On lower bounds for the bias-variance trade-off
- Functional estimation in log-concave location families
- Transposition invariant principal component analysis in \(L_{1}\) for long tailed data
- Tight query complexity lower bounds for PCA via finite sample deformed Wigner law
- Re-thinking high-dimensional mathematical statistics. Abstracts from the workshop held May 15--21, 2022
- Van Trees inequality, group equivariance, and estimation of principal subspaces