Stable recovery of low-dimensional cones in Hilbert spaces: one RIP to rule them all
DOI: 10.1016/J.ACHA.2016.08.004
zbMATH Open: 1391.94421
arXiv: 1510.00504
OpenAlex: W2962684767
MaRDI QID: Q1748256
FDO: Q1748256
Authors: Yann Traonmilin, Rémi Gribonval
Publication date: 9 May 2018
Published in: Applied and Computational Harmonic Analysis
Full work available at URL: https://arxiv.org/abs/1510.00504
Recommendations
- Stability of low-rank matrix recovery and its connections to Banach space geometry
- Concave Mirsky inequality and low-rank recovery
- Guarantees of Riemannian optimization for low rank matrix recovery
- Stable recovery of low-rank matrix via nonconvex Schatten \(p\)-minimization
- Title not available (zbMATH DE number 1542511)
- Uniform recovery of high-dimensional \(C^r\)-functions
- Optimal recovery of linear functionals on sets of finite dimension
- Uniform RIP Conditions for Recovery of Sparse Signals by $\ell _p\,(0< p\leq 1)$ Minimization
- Sharp RIP bound for sparse signal and low-rank matrix recovery
- Near-optimal recovery of linear and \(N\)-convex functions on unions of convex sets
Classification
- Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
- Sampling theory in information and communication theory (94A20)
- Geometric methods (including applications of algebraic geometry) applied to coding theory (94B27)
Cites Work
- Title not available
- Model Selection and Estimation in Regression with Grouped Variables
- Extensions of Lipschitz mappings into a Hilbert space
- Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- For most large underdetermined systems of linear equations the minimal \(\ell^1\)-norm solution is also the sparsest solution
- Title not available
- A mathematical introduction to compressive sensing
- Infinite dimensional analysis. A hitchhiker's guide.
- Block-Sparse Signals: Uncertainty Relations and Efficient Recovery
- Restricted Isometry Constants Where $\ell ^{p}$ Sparse Recovery Can Fail for $0 < p \leq 1$
- Sparse Representation of a Polytope and Recovery of Sparse Signals and Low-Rank Matrices
- The restricted isometry property and its implications for compressed sensing
- The convex geometry of linear inverse problems
- Living on the edge: phase transitions in convex programs with random data
- Model-Based Compressive Sensing
- Compressed sensing with coherent and redundant dictionaries
- Robust Recovery of Signals From a Structured Union of Subspaces
- A generalized sampling theorem for stable reconstructions in arbitrary bases
- Title not available
- Sampling Theorems for Signals From the Union of Finite-Dimensional Linear Subspaces
- Sampling and Reconstructing Signals From a Union of Linear Subspaces
- Beyond sparsity: recovering structured representations by \({\ell}^1\) minimization and greedy algorithms
- Beyond consistent reconstructions: optimality and sharp bounds for generalized sampling, and application to the uniform resampling problem
- Breaking the coherence barrier: a new theory for compressed sensing
- New analysis of manifold embeddings and signal recovery from compressive measurements
- Fundamental Performance Limits for Ideal Decoders in High-Dimensional Linear Inverse Problems
- Robust multi-image processing with optimal sparse regularization
- A GENERAL ATOMIC DECOMPOSITION THEOREM AND BANACH'S CLOSED RANGE THEOREM
- Stable restoration and separation of approximately sparse signals
Cited In (17)
- Compressed sensing with local structure: uniform recovery guarantees for the sparsity in levels class
- Optimal approximation of infinite-dimensional holomorphic functions
- Uniform recovery in infinite-dimensional compressed sensing and applications to structured binary sampling
- Recovering Wavelet Coefficients from Binary Samples Using Fast Transforms
- Generalized notions of sparsity and restricted isometry property. II: Applications
- Structure and Optimisation in Computational Harmonic Analysis: On Key Aspects in Sparse Regularisation
- Sparse recovery from extreme eigenvalues deviation inequalities
- Statistical learning guarantees for compressive clustering and compressive mixture modeling
- Convergence bounds for empirical nonlinear least-squares
- Sample complexity bounds for the local convergence of least squares approximation
- WARPd: a linearly convergent first-order primal-dual algorithm for inverse problems with approximate sharpness conditions
- Structured iterative hard thresholding with on- and off-grid applications
- A theory of optimal convex regularization for low-dimensional recovery
- Do log factors matter? On optimal wavelet approximation and the foundations of compressed sensing
- On the Absence of Uniform Recovery in Many Real-World Applications of Compressed Sensing and the Restricted Isometry Property and Nullspace Property in Levels
- Hierarchical isometry properties of hierarchical measurements
- Breaking the coherence barrier: a new theory for compressed sensing