Low rank matrix recovery with adversarial sparse noise
From MaRDI portal
Publication: 5030160
DOI: 10.1088/1361-6420/ac44dc
zbMATH Open: 1485.90135
OpenAlex: W4200591508
MaRDI QID: Q5030160
FDO: Q5030160
Authors: Hang Xu, Song Li, Junhong Lin
Publication date: 16 February 2022
Published in: Inverse Problems
Full work available at URL: https://doi.org/10.1088/1361-6420/ac44dc
Recommendations
- Low rank matrix recovery with impulsive noise
- Exact low-rank matrix completion from sparsely corrupted entries via adaptive outlier pursuit
- An analysis of noise folding for low-rank matrix recovery
- Minimization of the difference of nuclear and Frobenius norms for noisy low rank matrix recovery
- Perturbation analysis of low-rank matrix stable recovery
Keywords: robustness; least absolute deviation; iterative hard thresholding; low-rank matrix recovery; matrix decomposition; mixed noise; subgradient descent
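To make the keyword list concrete, the following is a purely illustrative sketch of iterative hard thresholding for low-rank recovery — it is not the authors' algorithm, and the function names and measurement setup are hypothetical. The scheme alternates a gradient step on a least-squares loss with a truncated-SVD projection onto the rank-r matrices:

```python
import numpy as np

def rank_r_projection(X, r):
    """Best rank-r approximation of X via truncated SVD (Eckart-Young)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s[r:] = 0.0                      # keep only the r largest singular values
    return (U * s) @ Vt

def iht_low_rank(A, y, shape, r, step=0.5, iters=300):
    """Iterative hard thresholding: X <- P_r(X - step * grad).

    A acts on the vectorized matrix (y = A @ vec(X) + noise);
    the step size is problem-dependent.
    """
    n1, n2 = shape
    x = np.zeros(n1 * n2)
    for _ in range(iters):
        grad = A.T @ (A @ x - y)               # gradient of 0.5*||Ax - y||^2
        X = (x - step * grad).reshape(n1, n2)  # gradient step
        x = rank_r_projection(X, r).ravel()    # hard-threshold the rank
    return x.reshape(n1, n2)
```

On an easy, well-conditioned toy instance this recovers a low-rank matrix to high accuracy; the setting of the paper additionally involves adversarial sparse (impulsive) noise, for which the keywords suggest robust losses such as least absolute deviation combined with subgradient descent.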
Cites Work
- A nonlinear programming algorithm for solving semidefinite programs via low-rank factorization
- The Gaussian hare and the Laplacian tortoise: computability of squared-error versus absolute-error estimators. With comments by Ronald A. Thisted and M. R. Osborne and a rejoinder by the authors
- Robust principal component analysis?
- A Singular Value Thresholding Algorithm for Matrix Completion
- High-dimensional statistics. A non-asymptotic viewpoint
- Exact matrix completion via convex optimization
- Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization
- Asymptotic Theory of Least Absolute Error Regression
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- Stable signal recovery from incomplete and inaccurate measurements
- A simple proof of the restricted isometry property for random matrices
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- The \(L_1\) penalized LAD estimator for high dimensional linear regression
- For most large underdetermined systems of linear equations the minimal 𝓁1‐norm solution is also the sparsest solution
- Compressed sensing
- Iterative hard thresholding for compressed sensing
- Strong and Weak Convexity of Sets and Functions
- A framework for robust subspace learning
- A semismooth Newton method for nonlinear parameter identification problems with impulsive noise
- Restricted isometry properties and nonconvex compressive sensing
- Weak Sharp Minima in Mathematical Programming
- Interior-point method for nuclear norm approximation with application to system identification
- Matrix recipes for hard thresholding methods
- Restricted $p$-Isometry Properties of Nonconvex Matrix Recovery
- Sharp RIP bound for sparse signal and low-rank matrix recovery
- Phase retrieval: stability and recovery guarantees
- Title not available
- Exact and Stable Covariance Estimation From Quadratic Sampling via Convex Programming
- Low-Rank Positive Semidefinite Matrix Recovery From Corrupted Rank-One Measurements
- Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence
- Title not available
- Nonconvex Robust Low-Rank Matrix Recovery
- Nonconvex Optimization Meets Low-Rank Matrix Factorization: An Overview
- A Proof of Conjecture on Restricted Isometry Property Constants $\delta _{tk}\ \left(0<t<\frac {4}{3}\right)$
- Subgradient methods for sharp weakly convex functions
- Global Optimality in Low-Rank Matrix Optimization
- Solving (most) of a set of quadratic equalities: composite optimization for robust phase retrieval
- Restricted Isometry Property for General p-Norms
- \(\ell_1-\alpha\ell_2\) minimization methods for signal and image reconstruction with impulsive noise removal
- The nonsmooth landscape of phase retrieval
- Iterative hard thresholding for low-rank recovery from rank-one projections
- Non-convex low-rank matrix recovery with arbitrary outliers via median-truncated gradient descent
- Convergence of projected Landweber iteration for matrix rank minimization
- Low-Rank Matrix Recovery With Scaled Subgradient Methods: Fast and Robust Convergence Without the Condition Number
Cited In (8)
- Matrix recovery from nonconvex regularized least absolute deviations
- Low-rank matrix completion and denoising under Poisson noise
- An analysis of noise folding for low-rank matrix recovery
- Rate Optimal Denoising of Simultaneously Sparse and Low Rank Matrices
- Low rank matrix recovery with impulsive noise
- Recovering Low-Rank and Sparse Components of Matrices from Incomplete and Noisy Observations
- LOW-RANK AND SPARSE MATRIX RECOVERY FROM NOISY OBSERVATIONS VIA 3-BLOCK ADMM ALGORITHM
- Proof methods for robust low-rank matrix recovery
Uses Software