Noisy matrix decomposition via convex relaxation: optimal rates in high dimensions
Publication: 447861
DOI: 10.1214/12-AOS1000
zbMath: 1274.62219
arXiv: 1102.4807
OpenAlex: W3106324661
MaRDI QID: Q447861
Alekh Agarwal, Sahand N. Negahban, Martin J. Wainwright
Publication date: 29 August 2012
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1102.4807
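For orientation, here is a minimal illustrative sketch (not code from the paper) of the kind of convex relaxation the title refers to: a least-squares fit with a nuclear-norm penalty on the low-rank component and an \(\ell_1\) penalty on the sparse component, solved by plain proximal gradient descent. The function names, penalty levels, step size, and toy data below are arbitrary choices made for the example, not values taken from the publication.

```python
# Sketch: proximal gradient descent for
#     min_{L,S}  0.5 * ||Y - L - S||_F^2 + lam * ||L||_* + mu * ||S||_1,
# a nuclear-norm-plus-l1 relaxation for noisy low-rank-plus-sparse decomposition.
import numpy as np

def svt(M, tau):
    """Singular value soft-thresholding: prox of tau * (nuclear norm)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(M, tau):
    """Entrywise soft-thresholding: prox of tau * (l1 norm)."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def decompose(Y, lam, mu, n_iter=500, step=0.5):
    """Estimate a low-rank L and a sparse S with Y ≈ L + S + noise."""
    L = np.zeros_like(Y)
    S = np.zeros_like(Y)
    for _ in range(n_iter):
        R = L + S - Y                      # gradient of the squared-error term
        L = svt(L - step * R, step * lam)  # prox step on the low-rank part
        S = soft(S - step * R, step * mu)  # prox step on the sparse part
    return L, S

# Toy usage: rank-2 matrix plus sparse corruptions plus Gaussian noise.
rng = np.random.default_rng(0)
n = 50
L0 = rng.standard_normal((n, 2)) @ rng.standard_normal((2, n))
S0 = np.where(rng.random((n, n)) < 0.05, 5.0, 0.0)
Y = L0 + S0 + 0.1 * rng.standard_normal((n, n))
L_hat, S_hat = decompose(Y, lam=1.0, mu=0.2)
print("rel. error L:", np.linalg.norm(L_hat - L0) / np.linalg.norm(L0))
print("rel. error S:", np.linalg.norm(S_hat - S0) / max(np.linalg.norm(S0), 1.0))
```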
Related Items
- Adaptive estimation of the copula correlation matrix for semiparametric elliptical copulas
- Unbiased risk estimates for matrix estimation in the elliptical case
- Robust low-rank data matrix approximations
- Two-stage convex relaxation approach to least squares loss constrained low-rank plus sparsity optimization problems
- An empirical study into finding optima in stochastic optimization of neural networks
- Low-rank diffusion matrix estimation for high-dimensional time-changed Lévy processes
- Regularized high dimension low tubal-rank tensor regression
- Robust inference of risks of large portfolios
- Inertial Proximal ADMM for Linearly Constrained Separable Convex Optimization
- Sharp MSE bounds for proximal denoising
- Large covariance estimation through elliptical factor models
- Convex relaxation algorithm for a structured simultaneous low-rank and sparse recovery problem
- Learning Gaussian graphical models with latent confounders
- Majorized iPADMM for Nonseparable Convex Minimization Models with Quadratic Coupling Terms
- Efficient learning rate adaptation based on hierarchical optimization approach
- Transfer Learning in Large-Scale Gaussian Graphical Models with False Discovery Rate Control
- Sparse and Low-Rank Matrix Quantile Estimation With Application to Quadratic Regression
- Compressed sensing and matrix completion with constant proportion of corruptions
- Robust matrix estimations meet Frank-Wolfe algorithm
- Low-rank matrix estimation via nonconvex optimization methods in multi-response errors-in-variables regression
- Large factor model estimation by nuclear norm plus \(\ell_1\) norm penalization
- Bridging factor and sparse models
- Multiple Change Point Detection in Reduced Rank High Dimensional Vector Autoregressive Models
- Detecting approximate replicate components of a high-dimensional random vector with latent structure
- Discriminant analysis in small and large dimensions
- Pivotal variable detection of the covariance matrix and its application to high-dimensional factor models
- Noisy matrix decomposition via convex relaxation: optimal rates in high dimensions
- Extended ADMM and BCD for nonseparable convex minimization models with quadratic coupling terms: convergence analysis and insights
- Statistical inference based on robust low-rank data matrix approximation
- Cross: efficient low-rank tensor completion
- Multi-stage convex relaxation method for low-rank and sparse matrix separation problem
- Robust covariance estimation for approximate factor models
- Outlier detection in networks with missing links
- Robust bilinear factorization with missing and grossly corrupted observations
- Robust matrix completion
- Two-stage convex relaxation approach to low-rank and sparsity regularized least squares loss
- Detection of block-exchangeable structure in large-scale correlation matrices
- Large Covariance Estimation by Thresholding Principal Orthogonal Complements
- Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach
- Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation
- A large covariance matrix estimator under intermediate spikiness regimes
- Exact and asymptotic tests on a factor model in low and large dimensions with applications
- Matrix optimization based Euclidean embedding with outliers
- Linear Models Based on Noisy Data and the Frisch Scheme
- Fast global convergence of gradient methods for high-dimensional statistical recovery
- An alternating minimization algorithm for Factor Analysis
- Bridging convex and nonconvex optimization in robust PCA: noise, outliers and missing data
- Low-Rank Approximation and Completion of Positive Tensors
- A proximal alternating direction method for multi-block coupled convex optimization
- Scalable Robust Matrix Recovery: Frank--Wolfe Meets Proximal Methods
- Spectral thresholding for the estimation of Markov chain transition operators
- Rank regularized estimation of approximate factor models
Cites Work
- Noisy matrix decomposition via convex relaxation: optimal rates in high dimensions
- Estimation of high-dimensional low-rank matrices
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- On the distribution of the largest eigenvalue in principal components analysis
- Two proposals for robust PCA using semidefinite programming
- Robust principal component analysis?
- Rank-Sparsity Incoherence for Matrix Decomposition
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Robust PCA via Outlier Pursuit
- Robust Matrix Decomposition With Sparse Corruptions
- Minimax Rates of Estimation for High-Dimensional Linear Regression Over $\ell_q$-Balls
- Restricted strong convexity and weighted matrix completion: Optimal bounds with noise
- Convex Analysis
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers