Concentration inequalities and moment bounds for sample covariance operators

From MaRDI portal
Publication: 502859

DOI: 10.3150/15-BEJ730
zbMath: 1366.60057
arXiv: 1405.2468
OpenAlex: W2524946712
MaRDI QID: Q502859

Karim Lounici, Vladimir I. Koltchinskii

Publication date: 11 January 2017

Published in: Bernoulli

Full work available at URL: https://arxiv.org/abs/1405.2468
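The paper's main result (for Gaussian observations) shows that the operator-norm error of the sample covariance is governed by the effective rank r(Σ) = tr(Σ)/‖Σ‖, with E‖Σ̂ − Σ‖ of order ‖Σ‖(√(r(Σ)/n) + r(Σ)/n). The following minimal simulation sketch (not part of the record; covariance choice and constants are illustrative assumptions) compares the empirical error to that rate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Diagonal covariance with a rapidly decaying spectrum -> small effective rank.
p = 200
eigvals = 1.0 / (1 + np.arange(p)) ** 2
Sigma = np.diag(eigvals)

op_norm = eigvals.max()                 # ||Sigma|| (operator norm)
eff_rank = eigvals.sum() / op_norm      # r(Sigma) = tr(Sigma) / ||Sigma||

# Draw n i.i.d. rows ~ N(0, Sigma) and form the sample covariance.
n = 1000
X = rng.standard_normal((n, p)) * np.sqrt(eigvals)
Sigma_hat = X.T @ X / n

err = np.linalg.norm(Sigma_hat - Sigma, 2)          # operator-norm error
rate = op_norm * (np.sqrt(eff_rank / n) + eff_rank / n)

print(f"effective rank r(Sigma)      = {eff_rank:.2f}")
print(f"||Sigma_hat - Sigma||        = {err:.4f}")
print(f"||Sigma|| (sqrt(r/n) + r/n)  = {rate:.4f}")
```

Here the effective rank is about 1.6 regardless of the ambient dimension p = 200, so the error stays small even though p/n is not; this dimension-free behavior is the point of the bound.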



Related Items

On the non-asymptotic concentration of heteroskedastic Wishart-type matrix
Wald Statistics in high-dimensional PCA
A shrinkage principle for heavy-tailed data: high-dimensional robust low-rank matrix recovery
High-resolution signal recovery via generalized sampling and functional principal component analysis
Deep learning: a statistical viewpoint
Construction and Monte Carlo estimation of wavelet frames generated by a reproducing kernel
All-in-one robust estimator of the Gaussian mean
Non-zero constraints in elliptic PDE with random boundary values and applications to hybrid inverse problems
CDPA: common and distinctive pattern analysis between high-dimensional datasets
Optimal multiple change-point detection for high-dimensional data
Dimensionality Reduction, Regularization, and Generalization in Overparameterized Regressions
Bayesian inference for spectral projectors of the covariance matrix
Convexification with Bounded Gap for Randomly Projected Quadratic Optimization
Asymptotically efficient estimation of smooth functionals of covariance operators
Model assisted variable clustering: minimax-optimal recovery and algorithms
Efficient estimation of linear functionals of principal components
SONIC: social network analysis with influencers and communities
Uniform Hanson-Wright type concentration inequalities for unbounded entries via the entropy method
On convergence rates of adaptive ensemble Kalman inversion for linear ill-posed problems
A note on the prediction error of principal component regression in high dimensions
Learning Gaussian graphical models with latent confounders
Model Reduction for Nonlinear Systems by Balanced Truncation of State and Gradient Covariance
Time-uniform Chernoff bounds via nonnegative supermartingales
New asymptotic results in principal component analysis
Matrix concentration inequalities and free probability
Nonasymptotic one- and two-sample tests in high dimension with unknown covariance structure
Contiguity under high-dimensional Gaussianity with applications to covariance testing
Nonasymptotic upper bounds for the reconstruction error of PCA
Bootstrapping max statistics in high dimensions: near-parametric rates under weak variance decay and application to functional and multinomial data
Robust covariance estimation under \(L_4\)-\(L_2\) norm equivalence
Robustifying Markowitz
Fast randomized numerical rank estimation for numerically low-rank matrices
Dimension-free bounds for sums of independent matrices and simple tensors via the variational principle
Mean estimation in high dimension
Dimension-free bounds for sums of dependent matrices and operators with heavy-tailed distributions
Detecting approximate replicate components of a high-dimensional random vector with latent structure
Confidence sets for spectral projectors of covariance matrices
Finite-sample analysis of \(M\)-estimators using self-concordance
Estimating covariance and precision matrices along subspaces
MSE bounds for estimators of matrix functions
Cross-Validation With Confidence
Sub-Gaussian estimators of the mean of a random matrix with heavy-tailed entries
Convergence rate of Krasulina estimator
Robust modifications of U-statistics and applications to covariance estimation problems
Hanson-Wright inequality in Banach spaces
The Sup-norm Perturbation of HOSVD and Low Rank Tensor Denoising
Distributed estimation of principal eigenspaces
Robust high-dimensional factor models with applications to statistical machine learning
Approximating \(L_p\) unit balls via random sampling
Time-uniform, nonparametric, nonasymptotic confidence sequences
Minimax rates in sparse, high-dimensional change point detection
Finite impulse response models: a non-asymptotic analysis of the least squares estimator
Efficient estimation of smooth functionals in Gaussian shift models
Perturbation bounds for eigenspaces under a relative gap condition
Estimation of smooth functionals in normal models: bias reduction and asymptotic efficiency
Bootstrap confidence sets for spectral projectors of sample covariance
An elementary analysis of ridge regression with random design
The two-to-infinity norm and singular subspace geometry with applications to high-dimensional statistics
Moment bounds for large autocovariance matrices under dependence
Exact minimax risk for linear least squares, and the lower tail of sample covariance matrices
An \({\ell_p}\) theory of PCA and spectral clustering
New challenges in covariance estimation: multiple structures and coarse quantization
Bootstrapping the operator norm in high dimensions: error estimation for covariance matrices and sketching
On consistency and sparsity for high-dimensional functional time series with application to autoregressions
Relative perturbation bounds with applications to empirical covariance operators
Covariance estimation under one-bit quantization