Optimal detection of sparse principal components in high dimension

From MaRDI portal

Publication: Q385763

DOI: 10.1214/13-AOS1127
zbMath: 1277.62155
arXiv: 1202.5070
Wikidata: Q59409960 (Scholia: Q59409960)
MaRDI QID: Q385763

Quentin Berthet, Philippe Rigollet

Publication date: 11 December 2013

Published in: The Annals of Statistics

Full work available at URL: https://arxiv.org/abs/1202.5070


62H25: Factor analysis and principal components; correspondence analysis

90C22: Semidefinite programming


Related Items

An $\ell_{\infty}$ Eigenvector Perturbation Bound and Its Application to Robust Covariance Estimation, Community Detection and Stochastic Block Models, ECA: High-Dimensional Elliptical Component Analysis in Non-Gaussian Distributions, Sequential subspace change point detection, Scale-Invariant Sparse PCA on High-Dimensional Meta-Elliptical Data, Gaussian determinantal processes: A new model for directionality in data, Using ℓ1-Relaxation and Integer Programming to Obtain Dual Bounds for Sparse PCA, Finding a planted clique by adaptive probing, Proximal Distance Algorithms: Theory and Examples, Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation, Wald Statistics in high-dimensional PCA, Exploring dimension learning via a penalized probabilistic principal component analysis, Optimal multiple change-point detection for high-dimensional data, Solving sparse principal component analysis with global support, Free Energy Wells and Overlap Gap Property in Sparse PCA, Sparse principal component analysis for high‐dimensional stationary time series, Public-key encryption from homogeneous CLWE, Inference for low-rank models, On lower bounds for the bias-variance trade-off, Two-sample Hypothesis Testing for Inhomogeneous Random Graphs, Sharp minimax tests for large covariance matrices and adaptation, Asymptotic power of sphericity tests for high-dimensional data, Approximation bounds for sparse principal component analysis, Comment on ``Hypothesis testing by convex optimization, A robust test for sphericity of high-dimensional covariance matrices, Algorithmic thresholds for tensor PCA, Minimax estimation in sparse canonical correlation analysis, Estimation of functionals of sparse covariance matrices, Community detection in sparse random networks, Guaranteed recovery of planted cliques and dense subgraphs by convex relaxation, Finding hidden cliques of size \(\sqrt{N/e}\) in nearly linear time, Hypothesis testing for high-dimensional multinomials: a selective review, Multidimensional two-component Gaussian mixtures detection, Recent developments in high dimensional covariance estimation and its related issues, a review, Bayesian inference for spectral projectors of the covariance matrix, Testing the order of a population spectral distribution for high-dimensional data, Optimal rates of statistical seriation, The spectral norm of random inner-product kernel matrices, Notes on computational-to-statistical gaps: predictions using statistical physics, Combinatorial inference for graphical models, High-dimensional covariance matrices in elliptical distributions with application to spherical test, Robust covariance estimation for approximate factor models, Detecting Markov random fields hidden in white noise, Projection tests for high-dimensional spiked covariance matrices, Optimality and sub-optimality of PCA. I: Spiked random matrix models, Slope meets Lasso: improved oracle bounds and optimality, Finding a large submatrix of a Gaussian random matrix, Sparse power factorization: balancing peakiness and sample complexity, Minimax rates in sparse, high-dimensional change point detection, Notes on computational hardness of hypothesis testing: predictions using the low-degree likelihood ratio, Isotonic regression with unknown permutations: statistics, computation and adaptation, Tensor clustering with planted structures: statistical optimality and computational limits, Testing for principal component directions under weak identifiability, Efficient estimation of linear functionals of principal components, Sparse signal reconstruction via the approximations of \(\ell_0\) quasinorm, Fundamental limits of detection in the spiked Wigner model, Statistical and computational limits for sparse matrix detection, Phase transitions for detecting latent geometry in random graphs, The limits of the sample spiked eigenvalues for a high-dimensional generalized Fisher matrix and its applications, Tests for covariance matrices in high dimension with less sample size, Optimal testing for planted satisfiability problems, Sparse equisigned PCA: algorithms and performance bounds in the noisy rank-1 setting, Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach, Hypothesis testing for densities and high-dimensional multinomials: sharp local minimax rates, Sparse principal component analysis with missing observations, Sparsistency and agnostic inference in sparse PCA, Optimal estimation and rank detection for sparse spiked covariance matrices, Detecting positive correlations in a multivariate sample, Computational barriers in minimax submatrix detection, Do semidefinite relaxations solve sparse PCA up to the information limit?, High-dimensional change-point estimation: combining filtering with convex optimization, Large covariance estimation through elliptical factor models, Sparse PCA: optimal rates and adaptive estimation, Community detection in dense random networks, On the optimality of sliced inverse regression in high dimensions, Sharp detection boundaries on testing dense subhypergraph, Matrix means and a novel high-dimensional shrinkage phenomenon, Estimation of Wasserstein distances in the spiked transport model
