Concentration inequalities and moment bounds for sample covariance operators
Abstract: Let \(X, X_1,\dots,X_n\) be i.i.d. centered Gaussian random variables in a separable Banach space \(E\) with covariance operator \(\Sigma: E^{\ast}\mapsto E\), \(\Sigma u = \mathbb{E}\langle X,u\rangle X\), \(u\in E^{\ast}\). The sample covariance operator is defined as \(\hat\Sigma u := n^{-1}\sum_{j=1}^n \langle X_j,u\rangle X_j\), \(u\in E^{\ast}\). The goal of the paper is to obtain concentration inequalities and expectation bounds for the operator norm of the deviation of the sample covariance operator from the true covariance operator. In particular, it is shown that \[\mathbb{E}\|\hat\Sigma-\Sigma\| \asymp \|\Sigma\|\biggl(\sqrt{\frac{\mathbf{r}(\Sigma)}{n}} \bigvee \frac{\mathbf{r}(\Sigma)}{n}\biggr),\] where \(\mathbf{r}(\Sigma):=\frac{\bigl(\mathbb{E}\|X\|\bigr)^2}{\|\Sigma\|}\). Moreover, under the assumption that \(\mathbf{r}(\Sigma)\lesssim n\), it is proved that, for all \(t\geq 1\), with probability at least \(1-e^{-t}\), \[\Bigl|\|\hat\Sigma - \Sigma\|-\mathbb{E}\|\hat\Sigma - \Sigma\|\Bigr| \lesssim \|\Sigma\|\biggl(\sqrt{\frac{t}{n}}\bigvee \frac{t}{n}\biggr).\]
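The expectation bound above can be checked numerically. The following is a minimal sketch (not part of the paper), restricted to the finite-dimensional Euclidean case, where \((\mathbb{E}\|X\|)^2 \asymp \mathbb{E}\|X\|^2 = \mathrm{tr}(\Sigma)\), so the effective rank \(\mathbf{r}(\Sigma)\) is computed as \(\mathrm{tr}(\Sigma)/\|\Sigma\|\) (equivalent to the paper's definition up to constants); the choice of dimension, sample size, and eigenvalue decay is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth covariance: diagonal with polynomially decaying eigenvalues,
# a simple finite-dimensional stand-in for the Banach-space setting.
p, n = 200, 1000
eigvals = 1.0 / (1.0 + np.arange(p)) ** 2
Sigma = np.diag(eigvals)

# Effective rank r(Sigma) = tr(Sigma) / ||Sigma||; in Euclidean space this
# matches the paper's (E||X||)^2 / ||Sigma|| up to absolute constants.
op_norm = eigvals.max()
r = eigvals.sum() / op_norm

# Sample covariance operator from n i.i.d. centered Gaussian vectors.
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
Sigma_hat = X.T @ X / n

# Operator-norm deviation compared with the rate ||Sigma|| (sqrt(r/n) v r/n).
deviation = np.linalg.norm(Sigma_hat - Sigma, ord=2)
rate = op_norm * max(np.sqrt(r / n), r / n)
print(f"effective rank r(Sigma)          = {r:.2f}")
print(f"||Sigma_hat - Sigma||            = {deviation:.4f}")
print(f"rate ||Sigma||(sqrt(r/n) v r/n)  = {rate:.4f}")
```

Since \(\mathbf{r}(\Sigma)\) stays bounded as \(p\) grows for this eigenvalue decay, the observed deviation should remain within a constant multiple of the rate even when \(p\) is comparable to \(n\).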
Recommendations
- Normal approximation and concentration of spectral projectors of sample covariance
- New concentration inequalities for suprema of empirical processes
- Relative perturbation bounds with applications to empirical covariance operators
- Asymptotics and concentration bounds for bilinear forms of spectral projectors of sample covariance
- Concentration inequalities for random tensors
Cited in (77)
- A shrinkage principle for heavy-tailed data: high-dimensional robust low-rank matrix recovery
- High-resolution signal recovery via generalized sampling and functional principal component analysis
- Robust modifications of U-statistics and applications to covariance estimation problems
- New challenges in covariance estimation: multiple structures and coarse quantization
- Covariance estimation under one-bit quantization
- Finite-sample analysis of \(M\)-estimators using self-concordance
- Contiguity under high-dimensional Gaussianity with applications to covariance testing
- Bootstrapping the operator norm in high dimensions: error estimation for covariance matrices and sketching
- Noise covariance estimation in multi-task high-dimensional linear models
- Time-uniform, nonparametric, nonasymptotic confidence sequences
- Uniform Hanson-Wright type concentration inequalities for unbounded entries via the entropy method
- A note on the prediction error of principal component regression in high dimensions
- Relative perturbation bounds with applications to empirical covariance operators
- The Sup-norm Perturbation of HOSVD and Low Rank Tensor Denoising
- Robustifying Markowitz
- Robust covariance estimation under \(L_4\)-\(L_2\) norm equivalence
- Estimating covariance and precision matrices along subspaces
- Fast randomized numerical rank estimation for numerically low-rank matrices
- Sub-Gaussian estimators of the mean of a random matrix with heavy-tailed entries
- On consistency and sparsity for high-dimensional functional time series with application to autoregressions
- All-in-one robust estimator of the Gaussian mean
- Composite Index Construction with Expert Opinion
- Large-dimensional central limit theorem with fourth-moment error bounds on convex sets and balls
- On the non-asymptotic concentration of heteroskedastic Wishart-type matrix
- Construction and Monte Carlo estimation of wavelet frames generated by a reproducing kernel
- Bootstrapping max statistics in high dimensions: near-parametric rates under weak variance decay and application to functional and multinomial data
- Cross-validation with confidence
- CDPA: common and distinctive pattern analysis between high-dimensional datasets
- Tangent space and dimension estimation with the Wasserstein distance
- Covariance estimation under missing observations and \(L_4 - L_2\) moment equivalence
- On cumulative slicing estimation for high dimensional data
- The two-to-infinity norm and singular subspace geometry with applications to high-dimensional statistics
- Model Reduction for Nonlinear Systems by Balanced Truncation of State and Gradient Covariance
- Convexification with bounded gap for randomly projected quadratic optimization
- Wald Statistics in high-dimensional PCA
- SONIC: social network analysis with influencers and communities
- Hanson-Wright inequality in Banach spaces
- Efficient estimation of smooth functionals in Gaussian shift models
- An elementary analysis of ridge regression with random design
- Quantitative limit theorems and bootstrap approximations for empirical spectral projectors
- Robust high-dimensional factor models with applications to statistical machine learning
- Approximating \(L_p\) unit balls via random sampling
- Dimension-free bounds for sums of independent matrices and simple tensors via the variational principle
- Asymptotically efficient estimation of smooth functionals of covariance operators
- Minimax rates in sparse, high-dimensional change point detection
- Finite impulse response models: a non-asymptotic analysis of the least squares estimator
- Optimal estimation of Schatten norms of a rectangular matrix
- New asymptotic results in principal component analysis
- Efficient estimation of linear functionals of principal components
- Bayesian inference for spectral projectors of the covariance matrix
- Time-uniform Chernoff bounds via nonnegative supermartingales
- Convergence rate of Krasulina estimator
- Distributed estimation of principal eigenspaces
- Mean estimation in high dimension
- Estimation of smooth functionals in normal models: bias reduction and asymptotic efficiency
- Dimension-free bounds for sums of dependent matrices and operators with heavy-tailed distributions
- Non-zero constraints in elliptic PDE with random boundary values and applications to hybrid inverse problems
- On convergence rates of adaptive ensemble Kalman inversion for linear ill-posed problems
- Bootstrap confidence sets for spectral projectors of sample covariance
- Perturbation bounds for eigenspaces under a relative gap condition
- Model assisted variable clustering: minimax-optimal recovery and algorithms
- Improved covariance estimation: optimal robustness and sub-Gaussian guarantees under heavy tails
- Deep learning: a statistical viewpoint
- scientific article; zbMATH DE number 7625199
- Nonasymptotic upper bounds for the reconstruction error of PCA
- Dimensionality Reduction, Regularization, and Generalization in Overparameterized Regressions
- Detecting approximate replicate components of a high-dimensional random vector with latent structure
- Moment bounds for large autocovariance matrices under dependence
- Exact minimax risk for linear least squares, and the lower tail of sample covariance matrices
- Matrix concentration inequalities and free probability
- Confidence sets for spectral projectors of covariance matrices
- Learning Gaussian graphical models with latent confounders
- MSE bounds for estimators of matrix functions
- Nonasymptotic one- and two-sample tests in high dimension with unknown covariance structure
- Optimal multiple change-point detection for high-dimensional data
- An \({\ell_p}\) theory of PCA and spectral clustering
- Normal approximation and concentration of spectral projectors of sample covariance