Relations among some low-rank subspace recovery models
Publication: 5380321
DOI: 10.1162/NECO_A_00762
zbMATH Open: 1418.62250
arXiv: 1412.2196
Wikidata: Q40751270 (Scholia: Q40751270)
MaRDI QID: Q5380321
FDO: Q5380321
Chao Zhang, Hongyang Zhang, Junbin Gao, Zhouchen Lin
Publication date: 4 June 2019
Published in: Neural Computation
Abstract: Recovering intrinsic low-dimensional subspaces from data distributed on them is a key preprocessing step in many applications. In recent years, much work has modeled subspace recovery as low-rank minimization problems. We find that some representative models, such as Robust Principal Component Analysis (R-PCA), Robust Low Rank Representation (R-LRR), and Robust Latent Low Rank Representation (R-LatLRR), are actually deeply connected. More specifically, we discover that once a solution to one of these models is obtained, the solutions to the other models can be obtained in closed form. Since R-PCA is the simplest, our discovery makes it the center of low-rank subspace recovery models. Our work has two important implications. First, R-PCA has a solid theoretical foundation. Under certain conditions, we can find better solutions to these low-rank models with overwhelming probability, although these models are non-convex. Second, we can obtain significantly faster algorithms for these models by solving R-PCA first. The computation cost can be further cut by applying low-complexity randomized algorithms, e.g., our novel filtering algorithm, to R-PCA. Experiments verify the advantages of our algorithms over other state-of-the-art ones that are based on the alternating direction method.
Full work available at URL: https://arxiv.org/abs/1412.2196
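The abstract's central algorithmic claim is that one can solve R-PCA first and then map its solution to the other low-rank models in closed form. The sketch below is an illustration of that pipeline under common assumptions, not the paper's implementation: it solves R-PCA with the standard inexact ALM (singular value thresholding plus entrywise soft thresholding) and then builds the shape-interaction matrix V Vᵀ from the recovered low-rank term, which is the usual closed-form construction in LRR-type models; the parameter defaults (lambda = 1/sqrt(max(m, n)), the mu schedule, the tolerances) are conventional choices rather than the paper's, and the function names are ours.

```python
# Illustrative sketch only: R-PCA via the standard inexact ALM, followed by a
# shape-interaction matrix V V^T built from the recovered low-rank part, in the
# spirit of the paper's "solve R-PCA first" strategy. Not the authors' code.
import numpy as np

def soft_threshold(M, tau):
    """Entrywise soft thresholding (proximal operator of the l1 norm)."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def svd_threshold(M, tau):
    """Singular value thresholding (proximal operator of the nuclear norm)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def rpca_ialm(X, lam=None, mu=None, tol=1e-7, max_iter=500):
    """Inexact ALM for min ||A||_* + lam*||E||_1  s.t.  X = A + E."""
    m, n = X.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))       # common default
    if mu is None:
        mu = 1.25 / np.linalg.norm(X, 2)      # common default
    rho = 1.5                                 # penalty growth factor
    norm_X = np.linalg.norm(X, "fro")
    # Dual variable scaled as in the usual IALM initialization.
    Y = X / max(np.linalg.norm(X, 2), np.abs(X).max() / lam)
    A = np.zeros_like(X)
    E = np.zeros_like(X)
    for _ in range(max_iter):
        A = svd_threshold(X - E + Y / mu, 1.0 / mu)   # low-rank update
        E = soft_threshold(X - A + Y / mu, lam / mu)  # sparse update
        R = X - A - E                                 # constraint residual
        Y = Y + mu * R
        mu *= rho
        if np.linalg.norm(R, "fro") / norm_X < tol:
            break
    return A, E

def shape_interaction(A, rank_tol=1e-8):
    """V V^T from the skinny SVD of the recovered low-rank part A."""
    _, s, Vt = np.linalg.svd(A, full_matrices=False)
    V = Vt[s > rank_tol * s.max()].T
    return V @ V.T

# Usage: columns of X are noisy samples drawn from a union of subspaces.
# A, E = rpca_ialm(X)
# Z = shape_interaction(A)   # affinity-like matrix for subspace clustering
```

The dominant cost here is the full SVD in each IALM iteration; the paper reports further speedups from low-complexity randomized algorithms (its filtering algorithm) for the R-PCA step, and that SVD is the natural place where such a method would be substituted. See the paper for the exact closed-form correspondences it proves between the R-PCA, R-LRR, and R-LatLRR solutions.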
Recommendations
- Random consensus robust PCA
- A novel M-estimator for robust PCA
- Robust principal component analysis: a factorization-based approach with linear complexity
- Robust alternating low-rank representation by joint \(L_p\)- and \(L_{2,p}\)-norm minimization
- Robust PCA and subspace tracking from incomplete observations using \(\ell _0\)-surrogates
Mathematics Subject Classification
- Nonparametric robustness (62G35)
- Factor analysis and principal components; correspondence analysis (62H25)
Cites Work
- Title not available
- Robust principal component analysis?
- A Novel M-Estimator for Robust PCA
- Rank-Sparsity Incoherence for Matrix Decomposition
- Robust Matrix Decomposition With Sparse Corruptions
- Randomized Algorithms for Matrices and Data
- Identification of hybrid systems. A tutorial
- A framework for robust subspace learning
- Robust PCA via Outlier Pursuit
- TILT: transform invariant low-rank textures
- Two proposals for robust PCA using semidefinite programming
- Robust computation of linear models by convex relaxation
- Blendenpik: Supercharging LAPACK's Least-Squares Solver
- A geometric analysis of subspace clustering with outliers
- Subspace system identification for training-based MIMO channel estimation
Cited In (7)
- Fast Estimation of Approximate Matrix Ranks Using Spectral Densities
- Robust Subspace Discovery via Relaxed Rank Minimization
- A nonconvex formulation for low rank subspace clustering: algorithms and convergence analysis
- Low-rank signal subspace: parameterization, projection and signal estimation
- Subspace Recovery From Structured Union of Subspaces
- Decomposition into low-rank plus additive matrices for background/foreground separation: a review for a comparative evaluation with a large-scale dataset
- Title not available