Proof methods for robust low-rank matrix recovery
DOI: 10.1007/978-3-031-09745-4_2 · zbMATH Open: 1504.94038 · arXiv: 2106.04382 · OpenAlex: W3168624210 · MaRDI QID: Q2106469 · FDO: Q2106469
Peter Jung, David J. Gross, Felix Krahmer, Richard Kueng, Dominik Stöger, Tim Fuchs
Publication date: 14 December 2022
Full work available at URL: https://arxiv.org/abs/2106.04382
Recommendations
- On the convex geometry of blind deconvolution and matrix completion
- An analysis of noise folding for low-rank matrix recovery
- Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence
- Low rank matrix recovery with adversarial sparse noise
- Uniqueness conditions for low-rank matrix recovery
Mathematics Subject Classification
- Convex programming (90C25)
- Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
- Semidefinite programming (90C22)
- Norms of matrices, numerical range, applications of functional analysis to matrix theory (15A60)
Cites Work
- A nonlinear programming algorithm for solving semidefinite programs via low-rank factorization
- Matrix completion from noisy entries
- PhaseLift: exact and stable signal recovery from magnitude measurements via convex programming
- Phase Retrieval via Wirtinger Flow: Theory and Algorithms
- Title not available
- Noisy low-rank matrix completion with general sampling distribution
- Exact matrix completion via convex optimization
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Decoding by Linear Programming
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- Stable signal recovery from incomplete and inaccurate measurements
- Characterization of the subdifferential of some matrix norms
- User-friendly tail bounds for sums of random matrices
- A mathematical introduction to compressive sensing
- A Probabilistic and RIPless Theory of Compressed Sensing
- Matrix Completion From a Few Entries
- Sparse Approximate Solutions to Linear Systems
- The convex geometry of linear inverse problems
- Learning without concentration
- Bounding the Smallest Singular Value of a Random Matrix Without Concentration
- Title not available
- Living on the edge: phase transitions in convex programs with random data
- Recovering Low-Rank Matrices From Few Coefficients in Any Basis
- A Simpler Approach to Matrix Completion
- Low-rank matrix completion using alternating minimization
- Convex multi-task feature learning
- Suprema of chaos processes and the restricted isometry property
- Painless reconstruction from magnitudes of frame coefficients
- Phase retrieval: stability and recovery guarantees
- An algebraic characterization of injectivity in phase retrieval
- Low-rank Matrix Recovery via Iteratively Reweighted Least Squares Minimization
- Solving quadratic equations via PhaseLift when there are about as many equations as unknowns
- Stable low-rank matrix recovery via null space properties
- A partial derandomization of PhaseLift using spherical designs
- Simultaneously Structured Models With Application to Sparse and Low-Rank Matrices
- Phase retrieval from coded diffraction patterns
- Optimal rates of convergence for noisy sparse phase retrieval via thresholded Wirtinger flow
- Convex Recovery of a Structured Signal from Independent Random Linear Measurements
- Incoherence-Optimal Matrix Completion
- Blind Deconvolution Using Convex Programming
- Improved recovery guarantees for phase retrieval from coded diffraction patterns
- Low rank matrix recovery from rank one measurements
- Localization from incomplete noisy distance measurements
- Guaranteed Matrix Completion via Non-Convex Factorization
- Phase retrieval via matrix completion
- Explicit frames for deterministic phase retrieval via PhaseLift
- Noisy Matrix Completion: Understanding Statistical Guarantees for Convex Relaxation via Nonconvex Optimization
- Robust Nonnegative Sparse Recovery and the Nullspace Property of 0/1 Measurements
- Solving Random Quadratic Systems of Equations Is Nearly as Easy as Solving Linear Systems
- Optimal Injectivity Conditions for Bilinear Inverse Problems with Applications to Identifiability of Deconvolution Problems
- Blind Deconvolution Meets Blind Demixing: Algorithms and Performance Bounds
- Blind Recovery of Sparse Signals From Subsampled Convolution
- Blind Demixing and Deconvolution at Near-Optimal Rate
- Phase Retrieval Without Small-Ball Probability Assumptions
- Sparse power factorization: balancing peakiness and sample complexity
- Implicit regularization in nonconvex statistical estimation: gradient descent converges linearly for phase retrieval, matrix completion, and blind deconvolution
- Practical Sketching Algorithms for Low-Rank Matrix Approximation
- Title not available
- On the Convex Geometry of Blind Deconvolution and Matrix Completion
- Complex phase retrieval from subgaussian measurements
- Identifiability in Bilinear Inverse Problems With Applications to Subspace or Sparsity-Constrained Blind Gain and Phase Calibration
- Non-Bayesian Activity Detection, Large-Scale Fading Coefficient Estimation, and Unsourced Random Access With a Massive MIMO Receiver
Cited In (4)
Uses Software