A Probabilistic and RIPless Theory of Compressed Sensing

From MaRDI portal
Publication:5272230

DOI: 10.1109/TIT.2011.2161794 | zbMath: 1365.94174 | arXiv: 1011.3854 | OpenAlex: W2137198385 | Wikidata: Q105584786 | Scholia: Q105584786 | MaRDI QID: Q5272230

Emmanuel J. Candès, Yaniv Plan

Publication date: 12 July 2017

Published in: IEEE Transactions on Information Theory

Full work available at URL: https://arxiv.org/abs/1011.3854
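The paper shows that an s-sparse signal in dimension n can be recovered exactly by ℓ1 minimization from on the order of s·log n measurements drawn from an incoherent sampling distribution, without invoking the restricted isometry property. The sketch below is not part of the portal record; it is a minimal, hedged illustration of that recovery problem, assuming NumPy and SciPy are available and using Gaussian measurement rows as one admissible incoherent sampling distribution.

# Minimal sketch of the sparse-recovery setting analyzed in the paper:
# recover an s-sparse x from m ~ s*log(n) random measurements b = A x
# by solving basis pursuit, min ||x||_1 subject to A x = b.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, s = 256, 5
m = int(np.ceil(4 * s * np.log(n)))   # about 111 measurements, on the order of s*log(n)

# s-sparse ground truth and Gaussian sensing matrix (one incoherent choice)
x_true = np.zeros(n)
x_true[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)
A = rng.standard_normal((m, n))
b = A @ x_true

# Basis pursuit as a linear program: split x = u - v with u, v >= 0
# and minimize sum(u) + sum(v) subject to [A, -A] [u; v] = b.
c = np.ones(2 * n)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=b, bounds=(0, None), method="highs")
x_hat = res.x[:n] - res.x[n:]

print("relative recovery error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))

With these parameters the relative error is numerically zero with high probability, consistent with the paper's guarantee that ℓ1 minimization succeeds from roughly s·log n incoherent measurements.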



Related Items

Structure dependent sampling in compressed sensing: theoretical guarantees for tight frames
On the Fourier transform of a quantitative trait: implications for compressive sensing
One condition for solution uniqueness and robustness of both \(\ell_1\)-synthesis and \(\ell_1\)-analysis minimizations
On polynomial chaos expansion via gradient-enhanced \(\ell_1\)-minimization
Compressive Sensing with Redundant Dictionaries and Structured Measurements
Self-calibration and biconvex compressive sensing
A survey on compressive sensing: classical results and recent advancements
A Survey of Compressed Sensing
The Quest for Optimal Sampling: Computationally Efficient, Structure-Exploiting Measurements for Compressed Sensing
Infinite-dimensional compressed sensing and function interpolation
Improved recovery guarantees for phase retrieval from coded diffraction patterns
Extracting Sparse High-Dimensional Dynamics from Limited Data
Compressive sampling of polynomial chaos expansions: convergence analysis and sampling strategies
Infinite dimensional compressed sensing from anisotropic measurements and applications to inverse problems in PDE
Relaxed sparse eigenvalue conditions for sparse estimation via non-convex regularized regression
Compressed sensing with local structure: uniform recovery guarantees for the sparsity in levels class
Short Communication: Localized Adversarial Artifacts for Compressed Sensing MRI
Implicit regularization in nonconvex statistical estimation: gradient descent converges linearly for phase retrieval, matrix completion, and blind deconvolution
Remote sensing via \(\ell_1\)-minimization
Solution uniqueness of convex piecewise affine functions based optimization with applications to constrained \(\ell_1\) minimization
Blind three dimensional deconvolution via convex optimization
Compressed sensing and matrix completion with constant proportion of corruptions
A gradient enhanced \(\ell_{1}\)-minimization for sparse approximation of polynomial chaos expansions
Universal Features for High-Dimensional Learning and Inference
Basis adaptive sample efficient polynomial chaos (BASE-PC)
The geometry of off-the-grid compressed sensing
A Variable Density Sampling Scheme for Compressive Fourier Transform Interferometry
Structured model selection via ℓ1−ℓ2 optimization
Geological facies recovery based on weighted \(\ell_1\)-regularization
Time Series Source Separation Using Dynamic Mode Decomposition
High-dimensional estimation with geometric constraints
Bias reduction in variational regularization
Do log factors matter? On optimal wavelet approximation and the foundations of compressed sensing
Breaking the coherence barrier: a new theory for compressed sensing
RIPless compressed sensing from anisotropic measurements
Sparsity and incoherence in orthogonal matching pursuit
Compressed sensing with structured sparsity and structured acquisition
Sparse learning of partial differential equations with structured dictionary matrix
Extracting Structured Dynamical Systems Using Sparse Optimization With Very Few Samples
On Reconstructing Functions from Binary Measurements
Reconstruction Methods in THz Single-Pixel Imaging
Coherence motivated sampling and convergence analysis of least squares polynomial chaos regression
Phase retrieval from Fourier measurements with masks
A Gradient-Enhanced L1 Approach for the Recovery of Sparse Trigonometric Polynomials
Generalized sampling and infinite-dimensional compressed sensing
Variance-based adaptive sequential sampling for polynomial chaos expansion
A non-convex regularization approach for compressive sensing
Sparse polynomial interpolation: sparse recovery, super-resolution, or Prony?
Necessary and sufficient conditions of solution uniqueness in 1-norm minimization
Low-rank matrix completion in a general non-orthogonal basis
Robust group lasso: model and recoverability
New regularization method and iteratively reweighted algorithm for sparse vector recovery
Low Complexity Regularization of Linear Inverse Problems
Foveated compressed sensing
DFT spectrum-sparsity-based quasi-periodic signal identification and application
Hard thresholding pursuit algorithms: number of iterations
Analysis of sparse MIMO radar
Structure and Optimisation in Computational Harmonic Analysis: On Key Aspects in Sparse Regularisation
Prediction bounds for higher order total variation regularized least squares
Structured random measurements in signal processing
Randomized signal processing with continuous frames
Sparse Polynomial Chaos Expansions: Literature Survey and Benchmark
Learning "best" kernels from data in Gaussian process regression. With application to aerodynamics
On the Role of Total Variation in Compressed Sensing
Submatrices with NonUniformly Selected Random Supports and Insights into Sparse Approximation
Proof methods for robust low-rank matrix recovery
On the Generation of Sampling Schemes for Magnetic Resonance Imaging
Nonuniform recovery of fusion frame structured sparse signals
Locally sparse reconstruction using the \(\ell^{1,\infty}\)-norm