Bridging convex and nonconvex optimization in robust PCA: noise, outliers and missing data
From MaRDI portal
Publication: Q2054540
DOI: 10.1214/21-AOS2066
zbMath: 1486.62181
arXiv: 2001.05484
MaRDI QID: Q2054540
Yuxin Chen, Yuling Yan, Cong Ma, Jianqing Fan
Publication date: 3 December 2021
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/2001.05484
Keywords: convex relaxation; robust principal component analysis; leave-one-out analysis; \( \ell_\infty \) guarantees
Mathematics Subject Classification:
- Factor analysis and principal components; correspondence analysis (62H25)
- Estimation in multivariate analysis (62H12)
- Convex programming (90C25)
- Nonconvex programming, global optimization (90C26)
- Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
Related Items
- GNMR: A Provable One-Line Algorithm for Low Rank Matrix Recovery
- Generalized Low-Rank Plus Sparse Tensor Estimation by Fast Riemannian Optimization
- Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence
Uses Software
Cites Work
- Latent variable graphical model selection via convex optimization
- High dimensional covariance matrix estimation using a factor model
- Fast alternating linearization methods for minimizing the sum of two convex functions
- Noisy matrix decomposition via convex relaxation: optimal rates in high dimensions
- Minimax risk of matrix denoising by singular value thresholding
- Angular synchronization by eigenvectors and semidefinite programming
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Robust matrix completion
- Robust covariance estimation for approximate factor models
- Compressed sensing and matrix completion with constant proportion of corruptions
- Implicit regularization in nonconvex statistical estimation: gradient descent converges linearly for phase retrieval, matrix completion, and blind deconvolution
- Nonsmooth optimization using Taylor-like models: error bounds, convergence, and termination criteria
- Gradient descent with random initialization: fast global convergence for nonconvex phase retrieval
- Sparse PCA: optimal rates and adaptive estimation
- Noisy low-rank matrix completion with general sampling distribution
- Exact matrix completion via convex optimization
- Guarantees of Riemannian Optimization for Low Rank Matrix Recovery
- Guaranteed Matrix Completion via Non-Convex Factorization
- Incoherence-Optimal Matrix Completion
- Phase Retrieval via Wirtinger Flow: Theory and Algorithms
- Solving Random Quadratic Systems of Equations Is Nearly as Easy as Solving Linear Systems
- Robust Spectral Compressed Sensing via Structured Matrix Completion
- Recursive Robust PCA or Recursive Sparse Recovery in Large but Structured Noise
- Blind Deconvolution Using Convex Programming
- Recovering Low-Rank and Sparse Components of Matrices from Incomplete and Noisy Observations
- Robust principal component analysis?
- Rank-Sparsity Incoherence for Matrix Decomposition
- An $\ell_{\infty}$ Eigenvector Perturbation Bound and Its Application to Robust Covariance Estimation
- Solving Systems of Random Quadratic Equations via Truncated Amplitude Flow
- An Online Algorithm for Separating Sparse and Low-Dimensional Signal Sequences From Their Sum
- Phase Retrieval Using Alternating Minimization
- The Projected Power Method: An Efficient Algorithm for Joint Alignment from Pairwise Differences
- Accelerated Alternating Projections for Robust Principal Component Analysis
- On the Convex Geometry of Blind Deconvolution and Matrix Completion
- Nonconvex Rectangular Matrix Completion via Gradient Descent Without ℓ₂,∞ Regularization
- Noisy Matrix Completion: Understanding Statistical Guarantees for Convex Relaxation via Nonconvex Optimization
- Composite optimization for robust rank one bilinear sensing
- Inference and uncertainty quantification for noisy matrix completion
- Nonconvex Optimization Meets Low-Rank Matrix Factorization: An Overview
- Robust Matrix Decomposition With Sparse Corruptions
- Recovering Low-Rank Matrices From Few Coefficients in Any Basis
- Matrix Completion From a Few Entries
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- Restricted strong convexity and weighted matrix completion: Optimal bounds with noise
- Learning Theory
- Low-rank matrix completion using alternating minimization
- Large Covariance Estimation by Thresholding Principal Orthogonal Complements
- Augmented Lagrangian alternating direction method for matrix separation based on low-rank factorization