Bridging convex and nonconvex optimization in robust PCA: noise, outliers and missing data
From MaRDI portal
Publication:2054540
Abstract: This paper delivers improved theoretical guarantees for the convex programming approach to low-rank matrix estimation in the presence of (1) random noise, (2) gross sparse outliers, and (3) missing data. This problem, often dubbed robust principal component analysis (robust PCA), finds applications in various domains. Despite the wide applicability of convex relaxation, the available statistical support (particularly the stability analysis vis-à-vis random noise) remains highly suboptimal, which we strengthen in this paper. When the unknown matrix is well-conditioned, incoherent, and of constant rank, we demonstrate that a principled convex program achieves near-optimal statistical accuracy, in terms of both the Euclidean loss and the \(\ell_{\infty}\) loss. All of this happens even when nearly a constant fraction of observations are corrupted by outliers with arbitrary magnitudes. The key analysis idea lies in bridging the convex program in use and an auxiliary nonconvex optimization algorithm, hence the title of this paper.
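The convex program studied here is of the principal component pursuit type: decompose an observation matrix into a low-rank part plus a sparse outlier part by minimizing a nuclear norm plus a weighted \(\ell_1\) norm. Below is a minimal sketch of that relaxation solved with a basic ADMM scheme (singular value thresholding for the low-rank prox, entrywise soft thresholding for the sparse prox). The function names, the fixed penalty parameter `mu`, and the iteration count are illustrative choices, not the paper's exact algorithm or tuning.

```python
import numpy as np

def svt(M, tau):
    # Singular value thresholding: proximal operator of tau * nuclear norm.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(M, tau):
    # Entrywise soft thresholding: proximal operator of tau * l1 norm.
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def robust_pca(Y, lam=None, mu=None, n_iter=500):
    """Principal component pursuit sketch via ADMM (scaled dual form):
        minimize ||L||_* + lam * ||S||_1   subject to   L + S = Y.
    lam = 1/sqrt(max(m, n)) is the standard theoretical choice;
    mu is a heuristic penalty parameter (an assumption of this sketch)."""
    m, n = Y.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))
    if mu is None:
        mu = 0.25 * m * n / (np.abs(Y).sum() + 1e-12)
    L = np.zeros_like(Y)
    S = np.zeros_like(Y)
    Z = np.zeros_like(Y)  # scaled dual variable for the constraint L + S = Y
    for _ in range(n_iter):
        L = svt(Y - S + Z / mu, 1.0 / mu)       # low-rank update
        S = soft(Y - L + Z / mu, lam / mu)      # sparse-outlier update
        Z = Z + mu * (Y - L - S)                # dual ascent step
    return L, S
```

On a synthetic rank-2 matrix with a small fraction of large-magnitude outliers, this sketch typically recovers the low-rank component to small relative error, consistent with the kind of guarantee the abstract describes (though the paper's analysis also covers noise and missing entries, which this sketch omits).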
Recommendations
- Noisy matrix completion: understanding statistical guarantees for convex relaxation via nonconvex optimization
- Robust principal component analysis?
- Strongly convex programming for exact matrix completion and robust principal component analysis
- Robust PCA by manifold optimization
- Exact guarantees on the absence of spurious local minima for non-negative rank-1 robust principal component analysis
Cites work
- Accelerated Alternating Projections for Robust Principal Component Analysis
- An Online Algorithm for Separating Sparse and Low-Dimensional Signal Sequences From Their Sum
- An \(\ell_{\infty}\) eigenvector perturbation bound and its application
- Angular synchronization by eigenvectors and semidefinite programming
- Augmented Lagrangian alternating direction method for matrix separation based on low-rank factorization
- Blind Deconvolution Using Convex Programming
- Clustering partially observed graphs via convex optimization
- Composite optimization for robust rank one bilinear sensing
- Compressed sensing and matrix completion with constant proportion of corruptions
- Exact matrix completion via convex optimization
- Fast alternating linearization methods for minimizing the sum of two convex functions
- Gradient descent with random initialization: fast global convergence for nonconvex phase retrieval
- Guaranteed Matrix Completion via Non-Convex Factorization
- Guarantees of Riemannian optimization for low rank matrix recovery
- High dimensional covariance matrix estimation using a factor model
- Implicit regularization in nonconvex statistical estimation: gradient descent converges linearly for phase retrieval, matrix completion, and blind deconvolution
- Incoherence-Optimal Matrix Completion
- Inference and uncertainty quantification for noisy matrix completion
- Large covariance estimation by thresholding principal orthogonal complements. With discussion and authors' reply
- Latent variable graphical model selection via convex optimization
- Learning Theory
- Low-rank matrix completion using alternating minimization
- Matrix Completion From a Few Entries
- Matrix completion with noisy entries and outliers
- Minimax risk of matrix denoising by singular value thresholding
- Noisy low-rank matrix completion with general sampling distribution
- Noisy matrix completion: understanding statistical guarantees for convex relaxation via nonconvex optimization
- Noisy matrix decomposition via convex relaxation: optimal rates in high dimensions
- Nonconvex Optimization Meets Low-Rank Matrix Factorization: An Overview
- Nonconvex Rectangular Matrix Completion via Gradient Descent Without ℓ₂,∞ Regularization
- Nonsmooth optimization using Taylor-like models: error bounds, convergence, and termination criteria
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- On the convex geometry of blind deconvolution and matrix completion
- Phase Retrieval Using Alternating Minimization
- Phase retrieval via Wirtinger flow: theory and algorithms
- Rank-Sparsity Incoherence for Matrix Decomposition
- Recovering Low-Rank Matrices From Few Coefficients in Any Basis
- Recovering Low-Rank and Sparse Components of Matrices from Incomplete and Noisy Observations
- Recursive Robust PCA or Recursive Sparse Recovery in Large but Structured Noise
- Restricted strong convexity and weighted matrix completion: optimal bounds with noise
- Robust Matrix Decomposition With Sparse Corruptions
- Robust Spectral Compressed Sensing via Structured Matrix Completion
- Robust covariance estimation for approximate factor models
- Robust matrix completion
- Robust principal component analysis?
- Solving Random Quadratic Systems of Equations Is Nearly as Easy as Solving Linear Systems
- Solving Systems of Random Quadratic Equations via Truncated Amplitude Flow
- Sparse PCA: optimal rates and adaptive estimation
- Spectral regularization algorithms for learning large incomplete matrices
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- The projected power method: an efficient algorithm for joint alignment from pairwise differences
Cited in (15)
- A short note on strongly convex programming for exact matrix completion and robust principal component analysis
- Generalized Low-Rank Plus Sparse Tensor Estimation by Fast Riemannian Optimization
- Robust PCA by manifold optimization
- Inference for heteroskedastic PCA with missing data
- A new complexity metric for nonconvex rank-one generalized matrix completion
- Matrix rigidity and the ill-posedness of robust PCA and matrix completion
- Spectral analysis of Gram matrices with missing at random observations: convergence, central limit theorems, and applications in statistical inference
- Analysis of a nonsmooth optimization approach to robust estimation
- Non-convex matrix completion and related problems via strong duality
- Noisy matrix completion: understanding statistical guarantees for convex relaxation via nonconvex optimization
- Comment: Feature Screening and Variable Selection via Iterative Ridge Regression
- GNMR: a provable one-line algorithm for low rank matrix recovery
- Strongly convex programming for exact matrix completion and robust principal component analysis
- Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence
- Stable local-smooth principal component pursuit