Concentration inequalities and moment bounds for sample covariance operators
From MaRDI portal
DOI: 10.3150/15-BEJ730
zbMATH Open: 1366.60057
arXiv: 1405.2468
OpenAlex: W2524946712
MaRDI QID: Q502859
Authors: Vladimir Koltchinskii, Karim Lounici
Publication date: 11 January 2017
Published in: Bernoulli
Abstract: Let $X, X_1, \dots, X_n$ be i.i.d. centered Gaussian random variables in a separable Banach space $E$ with covariance operator $\Sigma : E^{\ast} \mapsto E$, $\Sigma u = \mathbb{E}\langle X, u\rangle X$, $u \in E^{\ast}$. The sample covariance operator is defined as $\hat{\Sigma} u := n^{-1}\sum_{j=1}^{n} \langle X_j, u\rangle X_j$, $u \in E^{\ast}$. The goal of the paper is to obtain concentration inequalities and expectation bounds for the operator norm of the deviation of the sample covariance operator from the true covariance operator. In particular, it is shown that
$$\mathbb{E}\|\hat{\Sigma} - \Sigma\| \asymp \|\Sigma\| \biggl(\sqrt{\frac{\mathbf{r}(\Sigma)}{n}} \bigvee \frac{\mathbf{r}(\Sigma)}{n}\biggr),$$
where $\mathbf{r}(\Sigma) := \frac{\bigl(\mathbb{E}\|X\|\bigr)^2}{\|\Sigma\|}$. Moreover, under the assumption that $\mathbf{r}(\Sigma) \lesssim n$, it is proved that, for all $t \geq 1$, with probability at least $1 - e^{-t}$,
$$\Bigl|\, \|\hat{\Sigma} - \Sigma\| - \mathbb{E}\|\hat{\Sigma} - \Sigma\| \,\Bigr| \lesssim \|\Sigma\| \biggl(\sqrt{\frac{t}{n}} \bigvee \frac{t}{n}\biggr).$$
Full work available at URL: https://arxiv.org/abs/1405.2468
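The main expectation bound can be checked numerically. The sketch below (not part of the portal record) specializes the Banach-space setting to $E = \mathbb{R}^p$ with the Euclidean norm, so the operator norm is the largest singular value; the spectrum, dimensions, and Monte Carlo sample sizes are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative finite-dimensional setting: E = R^p with a diagonal
# covariance whose eigenvalues decay slowly (assumed for illustration).
p, n = 50, 1000
Sigma = np.diag(1.0 / (1.0 + np.arange(p)))

# n i.i.d. centered Gaussian observations with covariance Sigma.
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

# Sample covariance operator: Sigma_hat u = n^{-1} sum_j <X_j, u> X_j.
Sigma_hat = X.T @ X / n

# Effective rank r(Sigma) = (E||X||)^2 / ||Sigma||; E||X|| is estimated
# here by an independent Monte Carlo draw.
op_norm = np.linalg.norm(Sigma, 2)
mc = rng.multivariate_normal(np.zeros(p), Sigma, size=20000)
E_norm_X = np.linalg.norm(mc, axis=1).mean()
r_eff = E_norm_X**2 / op_norm

# The paper's rate: E||Sigma_hat - Sigma|| is of order
# ||Sigma|| * (sqrt(r/n) v r/n), up to absolute constants.
deviation = np.linalg.norm(Sigma_hat - Sigma, 2)
rate = op_norm * max(np.sqrt(r_eff / n), r_eff / n)
print(f"r(Sigma) ~= {r_eff:.2f}")
print(f"||Sigma_hat - Sigma|| = {deviation:.4f}, rate = {rate:.4f}")
```

Since the result holds only up to universal constants, a single realization of `deviation` should land within a modest constant factor of `rate`, not match it exactly.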
Recommendations
- Normal approximation and concentration of spectral projectors of sample covariance
- New concentration inequalities for suprema of empirical processes
- Relative perturbation bounds with applications to empirical covariance operators
- Asymptotics and concentration bounds for bilinear forms of spectral projectors of sample covariance
- Concentration inequalities for random tensors
MSC classifications:
- Gaussian processes (60G15)
- Probability theory on linear topological spaces (60B11)
- Inequalities; stochastic orderings (60E15)
Cited In (77)
- A shrinkage principle for heavy-tailed data: high-dimensional robust low-rank matrix recovery
- High-resolution signal recovery via generalized sampling and functional principal component analysis
- Robust modifications of U-statistics and applications to covariance estimation problems
- New challenges in covariance estimation: multiple structures and coarse quantization
- Covariance estimation under one-bit quantization
- Finite-sample analysis of \(M\)-estimators using self-concordance
- A note on the prediction error of principal component regression in high dimensions
- Bootstrapping the operator norm in high dimensions: error estimation for covariance matrices and sketching
- Time-uniform, nonparametric, nonasymptotic confidence sequences
- Uniform Hanson-Wright type concentration inequalities for unbounded entries via the entropy method
- Relative perturbation bounds with applications to empirical covariance operators
- The Sup-norm Perturbation of HOSVD and Low Rank Tensor Denoising
- Robust covariance estimation under \(L_4\)-\(L_2\) norm equivalence
- Estimating covariance and precision matrices along subspaces
- Sub-Gaussian estimators of the mean of a random matrix with heavy-tailed entries
- On consistency and sparsity for high-dimensional functional time series with application to autoregressions
- All-in-one robust estimator of the Gaussian mean
- Cross-validation with confidence
- On the non-asymptotic concentration of heteroskedastic Wishart-type matrix
- Bootstrapping max statistics in high dimensions: near-parametric rates under weak variance decay and application to functional and multinomial data
- Construction and Monte Carlo estimation of wavelet frames generated by a reproducing kernel
- CDPA: common and distinctive pattern analysis between high-dimensional datasets
- On cumulative slicing estimation for high dimensional data
- The two-to-infinity norm and singular subspace geometry with applications to high-dimensional statistics
- SONIC: social network analysis with influencers and communities
- Hanson-Wright inequality in Banach spaces
- Efficient estimation of smooth functionals in Gaussian shift models
- An elementary analysis of ridge regression with random design
- Robust high-dimensional factor models with applications to statistical machine learning
- Approximating \(L_p\) unit balls via random sampling
- Asymptotically efficient estimation of smooth functionals of covariance operators
- Minimax rates in sparse, high-dimensional change point detection
- Finite impulse response models: a non-asymptotic analysis of the least squares estimator
- New asymptotic results in principal component analysis
- Efficient estimation of linear functionals of principal components
- Time-uniform Chernoff bounds via nonnegative supermartingales
- Convergence rate of Krasulina estimator
- Bayesian inference for spectral projectors of the covariance matrix
- Distributed estimation of principal eigenspaces
- Estimation of smooth functionals in normal models: bias reduction and asymptotic efficiency
- On convergence rates of adaptive ensemble Kalman inversion for linear ill-posed problems
- Bootstrap confidence sets for spectral projectors of sample covariance
- Perturbation bounds for eigenspaces under a relative gap condition
- Model assisted variable clustering: minimax-optimal recovery and algorithms
- Deep learning: a statistical viewpoint
- Nonasymptotic upper bounds for the reconstruction error of PCA
- Dimensionality Reduction, Regularization, and Generalization in Overparameterized Regressions
- Exact minimax risk for linear least squares, and the lower tail of sample covariance matrices
- Moment bounds for large autocovariance matrices under dependence
- Matrix concentration inequalities and free probability
- Confidence sets for spectral projectors of covariance matrices
- Optimal multiple change-point detection for high-dimensional data
- MSE bounds for estimators of matrix functions
- An \({\ell_p}\) theory of PCA and spectral clustering
- Normal approximation and concentration of spectral projectors of sample covariance
- Contiguity under high-dimensional Gaussianity with applications to covariance testing
- Noise covariance estimation in multi-task high-dimensional linear models
- Robustifying Markowitz
- Fast randomized numerical rank estimation for numerically low-rank matrices
- Composite Index Construction with Expert Opinion
- Large-dimensional central limit theorem with fourth-moment error bounds on convex sets and balls
- Tangent space and dimension estimation with the Wasserstein distance
- Covariance estimation under missing observations and \(L_4 - L_2\) moment equivalence
- Model Reduction for Nonlinear Systems by Balanced Truncation of State and Gradient Covariance
- Convexification with bounded gap for randomly projected quadratic optimization
- Wald Statistics in high-dimensional PCA
- Quantitative limit theorems and bootstrap approximations for empirical spectral projectors
- Dimension-free bounds for sums of independent matrices and simple tensors via the variational principle
- Optimal estimation of Schatten norms of a rectangular matrix
- Mean estimation in high dimension
- Dimension-free bounds for sums of dependent matrices and operators with heavy-tailed distributions
- Non-zero constraints in elliptic PDE with random boundary values and applications to hybrid inverse problems
- Improved covariance estimation: optimal robustness and sub-Gaussian guarantees under heavy tails
- Title not available
- Detecting approximate replicate components of a high-dimensional random vector with latent structure
- Learning Gaussian graphical models with latent confounders
- Nonasymptotic one- and two-sample tests in high dimension with unknown covariance structure