A simple proof of the restricted isometry property for random matrices


DOI: 10.1007/s00365-007-9003-x
zbMath: 1177.15015
OpenAlex: W2030449718
Wikidata: Q57254828 (Scholia: Q57254828)
MaRDI QID: Q1039884

Michael B. Wakin, Richard G. Baraniuk, Mark A. Davenport, Ronald A. DeVore

Publication date: 23 November 2009

Published in: Constructive Approximation

Full work available at URL: http://hdl.handle.net/1911/21683
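For context on the title (a standard statement of the property; the notation \(\Phi\), \(m\), \(N\), \(k\), \(\delta_k\) is chosen here for illustration rather than taken from the record): an \(m \times N\) matrix \(\Phi\) satisfies the restricted isometry property of order \(k\) with constant \(\delta_k \in (0,1)\) if
\[
(1 - \delta_k)\,\|x\|_2^2 \;\le\; \|\Phi x\|_2^2 \;\le\; (1 + \delta_k)\,\|x\|_2^2
\qquad \text{for every } k\text{-sparse } x \in \mathbb{R}^N .
\]
The paper gives a short proof, combining concentration of measure with a covering argument, that random matrices with independent subgaussian entries (e.g. Gaussian or Bernoulli) satisfy this property with high probability once \(m \ge C(\delta_k)\, k \log(N/k)\) for a constant depending only on \(\delta_k\).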



Related Items

Use of EM algorithm for data reduction under sparsity assumption, Derandomized compressed sensing with nonuniform guarantees for \(\ell_1\) recovery, Sparse high-dimensional linear regression. Estimating squared error and a phase transition, On polynomial chaos expansion via gradient-enhanced \(\ell_1\)-minimization, A Novel Compressed Sensing Scheme for Photoacoustic Tomography, Compressive Sensing with Redundant Dictionaries and Structured Measurements, On the optimization landscape of tensor decompositions, Fast Phase Retrieval from Local Correlation Measurements, Random fusion frames are nearly equiangular and tight, THE COEFFICIENT REGULARIZED REGRESSION WITH RANDOM PROJECTION, A Survey of Compressed Sensing, Quantization and Compressive Sensing, Sparse Model Uncertainties in Compressed Sensing with Application to Convolutions and Sporadic Communication, Explicit Matrices with the Restricted Isometry Property: Breaking the Square-Root Bottleneck, Sparse signal recovery via non-convex optimization and overcomplete dictionaries, Deterministic construction of compressed sensing matrices with characters over finite fields, Bayesian factor-adjusted sparse regression, A new sufficient condition for sparse recovery with multiple orthogonal least squares, Recovery error analysis of noisy measurement in compressed sensing, Unnamed Item, Phase retrieval by binary questions: which complementary subspace is closer?, Suprema of Chaos Processes and the Restricted Isometry Property, Error bounds for compressed sensing algorithms with group sparsity: A unified approach, A new bound on the block restricted isometry constant in compressed sensing, The recovery of complex sparse signals from few phaseless measurements, Improved RIP conditions for compressed sensing with coherent tight frames, Compressive statistical learning with random feature moments, Performance analysis of the compressed distributed least squares algorithm, Parametrized quasi-soft thresholding operator for compressed sensing and matrix completion, Iterative hard thresholding for compressed data separation, Stability of lq-analysis based dual frame with Weibull matrices for 0 < q ≤ 1, Deterministic construction of compressed sensing matrices from constant dimension codes, Capturing ridge functions in high dimensions from point queries, Effective zero-norm minimization algorithms for noisy compressed sensing, Compressed sensing of low-rank plus sparse matrices, Unnamed Item, Representation and coding of signal geometry, Time for dithering: fast and quantized random embeddings via the restricted isometry property, Near-optimal estimation of simultaneously sparse and low-rank matrices from nested linear measurements, Three deterministic constructions of compressed sensing matrices with low coherence, Adaptive iterative hard thresholding for low-rank matrix recovery and rank-one measurements, Compressive Sensing, Bipolar measurement matrix using chaotic sequence, Greedy-like algorithms for the cosparse analysis model, Bounds of restricted isometry constants in extreme asymptotics: formulae for Gaussian matrices, On deterministic sketching and streaming for sparse recovery and norm estimation, On the sparsity of Lasso minimizers in sparse data recovery, Stability of 1-bit compressed sensing in sparse data reconstruction, Signal separation under coherent dictionaries and \(\ell_p\)-bounded noise, An Introduction to Compressed Sensing, DASSO: Connections Between the Dantzig Selector and Lasso, Statistical mechanics of complex neural systems 
and high dimensional data, Compressed Sensing with Nonlinear Fourier Atoms, Deterministic constructions of compressed sensing matrices, Frames as Codes, A Tight Bound of Hard Thresholding, Stable restoration and separation of approximately sparse signals, Convergence rates of learning algorithms by random projection, Sparse recovery with coherent tight frames via analysis Dantzig selector and analysis LASSO, Sparse signals recovery from noisy measurements by orthogonal matching pursuit, Stability and instance optimality for Gaussian measurements in compressed sensing, Iterative re-weighted least squares algorithm for \(l_p\)-minimization with tight frame and \(0 < p \leq 1\), Construction of highly redundant incoherent unit norm tight frames as a union of orthonormal bases, Restricted isometry property for matrices whose entries are random variables belonging to some Orlicz spaces $L_U(\Omega )$, Polynomial Homotopy Method for the Sparse Interpolation Problem Part I: Equally Spaced Sampling, On rank awareness, thresholding, and MUSIC for joint sparse recovery, Fusion frames and distributed sparsity, Least Sparsity of $p$-Norm Based Optimization Problems with $p>1$, Packings in Real Projective Spaces, On Collaborative Compressive Sensing Systems: The Framework, Design, and Algorithm, Learning in compressed space, An effective algorithm for the spark of sparse binary measurement matrices, Fast Parallel Estimation of High Dimensional Information Theoretical Quantities with Low Dimensional Random Projection Ensembles, Fast and RIP-optimal transforms, Sparse Approximation of Overdetermined Systems for Image Retrieval Application, Construction of Sparse Binary Sensing Matrices Using Set Systems, Fourth-Order Derivative-Free Optimal Families of King’s and Ostrowski’s Methods, Analysis of compressed distributed adaptive filters, Lower Bounds for Sparse Coding, Compressed sensing and best 𝑘-term approximation, Erasure recovery matrices for encoder protection, Simple Classification using Binary Data, Learning directed acyclic graph SPNs in sub-quadratic time, Deterministic constructions of compressed sensing matrices based on codes, On the strong restricted isometry property of Bernoulli random matrices, Sparse reconstruction with multiple Walsh matrices, A simple Gaussian measurement bound for exact recovery of block-sparse signals, Binary sparse signal recovery with binary matching pursuit *, Sampling strategies for uncertainty reduction in categorical random fields: formulation, mathematical analysis and application to multiple-point simulations, Lattices from tight frames and vertex transitive graphs, Optimal RIP bounds for sparse signals recovery via \(\ell_p\) minimization, Sharp sufficient conditions for stable recovery of block sparse signals by block orthogonal matching pursuit, Weaker regularity conditions and sparse recovery in high-dimensional regression, Theory and applications of compressed sensing, On some aspects of approximation of ridge functions, An Overview of Computational Sparse Models and Their Applications in Artificial Intelligence, New analysis of manifold embeddings and signal recovery from compressive measurements, Optimal \(D\)-RIP bounds in compressed sensing, A novel measurement matrix based on regression model for block compressed sensing, Sparse PSD approximation of the PSD cone, On principal components regression, random projections, and column subsampling, Derandomizing restricted isometries via the Legendre symbol, Improved bounds for sparse recovery from 
subsampled random convolutions, A unified framework for linear dimensionality reduction in L1, Compressed blind signal reconstruction model and algorithm, A class of deterministic sensing matrices and their application in harmonic detection, Deterministic convolutional compressed sensing matrices, Robust sparse phase retrieval made easy, A novel probabilistic approach for vehicle position prediction in free, partial, and full GPS outages, The restricted isometry property for time-frequency structured random matrices, Approximation accuracy, gradient methods, and error bound for structured convex optimization, On exact recovery of sparse vectors from linear measurements, Theory of compressive sensing via \(\ell_1\)-minimization: a non-RIP analysis and extensions, The Gelfand widths of \(\ell_p\)-balls for \(0 < p \leq 1\), RBF-network based sparse signal recovery algorithm for compressed sensing reconstruction, Random sampling of bandlimited signals on graphs, Stability of the elastic net estimator, Restricted isometries for partial random circulant matrices, Compressed sensing and matrix completion with constant proportion of corruptions, Sampling in the analysis transform domain, Error estimates for orthogonal matching pursuit and random dictionaries, Sparse Legendre expansions via \(\ell_1\)-minimization, Two are better than one: fundamental parameters of frame coherence, Uniform estimates for order statistics and Orlicz functions, Null space conditions and thresholds for rank minimization, Sobolev duals for random frames and \(\varSigma \varDelta \) quantization of compressed sensing measurements, Interpolation via weighted \(\ell_{1}\) minimization, From compression to compressed sensing, A strong restricted isometry property, with an application to phaseless compressed sensing, Restricted isometry property of matrices with independent columns and neighborly polytopes by random sampling, Explicit constructions of RIP matrices and related problems, Robustness properties of dimensionality reduction with Gaussian random matrices, Learning functions of few arbitrary linear parameters in high dimensions, Random matrices and erasure robust frames, Minimax risks for sparse regressions: ultra-high dimensional phenomenons, Limiting laws of coherence of random matrices with applications to testing covariance structure and construction of compressed sensing matrices, Numerically erasure-robust frames, Convex feasibility modeling and projection methods for sparse signal recovery, Democracy in action: quantization, saturation, and compressive sensing, Linear regression with sparsely permuted data, Compressive sensing of analog signals using discrete prolate spheroidal sequences, Median filter based compressed sensing model with application to MR image reconstruction, Compressed sensing with preconditioning for sparse recovery with subsampled matrices of Slepian prolate functions, Compressed classification learning with Markov chain samples, On the conditioning of random subdictionaries, Compressed sensing with coherent tight frames via \(l_q\)-minimization for \(0 < q \leq 1\), Sparse dual frames and dual Gabor functions of minimal time and frequency supports, The road to deterministic matrices with the restricted isometry property, Near-optimal encoding for sigma-delta quantization of finite frame expansions, Steiner equiangular tight frames, Discrete uncertainty principles and sparse signal processing, An overview on the applications of matrix theory in wireless communications and signal 
processing, Compressive sensing using chaotic sequence based on Chebyshev map, Kernel conjugate gradient methods with random projections, Lattices from equiangular tight frames, Real-valued embeddings and sketches for fast distance and similarity estimation, On orthogonal projections for dimension reduction and applications in augmented target loss functions for learning problems, Sparse recovery in probability via \(l_q\)-minimization with Weibull random matrices for \(0 < q\leq 1\), Dimensionality reduction with subgaussian matrices: a unified theory, Sparse signal recovery using a new class of random matrices, Compressed sensing for quaternionic signals, Restricted \(p\)-isometry property and its application for nonconvex compressive sensing, Two-dimensional random projection, Sparsity in time-frequency representations, Learning general sparse additive models from point queries in high dimensions, Compressed sensing with coherent and redundant dictionaries, Deterministic constructions of compressed sensing matrices based on optimal codebooks and codes, Randomization of data acquisition and \(\ell_{1}\)-optimization (recognition with compression), 2D compressed learning: support matrix machine with bilinear random projections, Exponential screening and optimal rates of sparse estimation, A box constrained gradient projection algorithm for compressed sensing, Compressed data separation via dual frames based split-analysis with Weibull matrices, Greedy variance estimation for the LASSO, Computation of sparse low degree interpolating polynomials and their application to derivative-free optimization, Sparse recovery by non-convex optimization - instance optimality, Adaptive compressive learning for prediction of protein-protein interactions from primary sequence, On the sparseness of 1-norm support vector machines, A novel measurement matrix optimization approach for hyperspectral unmixing, Geometric component analysis and its applications to data analysis, Application of ESN prediction model based on compressed sensing in stock market, Instance-optimality in probability with an \(\ell _1\)-minimization decoder, Optimal non-linear models for sparsity and sampling, Atoms of all channels, unite! Average case analysis of multi-channel sparse recovery using greedy algorithms, Random projections for Bayesian regression, Hard thresholding pursuit algorithms: number of iterations, Matrix-free interior point method for compressed sensing problems, Sparsest solutions of underdetermined linear systems via \( \ell _q\)-minimization for \(0<q\leqslant 1\), Optimal fast Johnson-Lindenstrauss embeddings for large data sets, Image reconstruction based on improved block compressed sensing, Random sampling of sparse trigonometric polynomials. 
II: Orthogonal matching pursuit versus basis pursuit, Random projections of smooth manifolds, Compressive sensing for subsurface imaging using ground penetrating radar, Asymptotic analysis for extreme eigenvalues of principal minors of random matrices, Uniform uncertainty principle for Bernoulli and subgaussian ensembles, Rigorous restricted isometry property of low-dimensional subspaces, Perturbation analysis of \(L_{1-2}\) method for robust sparse recovery, Fast and memory-optimal dimension reduction using Kac's walk, Increasing the semantic storage density of sparse distributed memory, Gradient projection Newton algorithm for sparse collaborative learning using synthetic and real datasets of applications, Kolmogorov \(n\)-widths of function classes induced by a non-degenerate differential operator: a convex duality approach, Required Number of Iterations for Sparse Signal Recovery via Orthogonal Least Squares, Sufficient conditions on stable reconstruction of weighted problem, Compressed data separation via unconstrained l1-split analysis, Convergence rates for the joint solution of inverse problems with compressed sensing data, A survey on compressive sensing: classical results and recent advancements, Unnamed Item, Zeroth-Order Regularized Optimization (ZORO): Approximately Sparse Gradients and Adaptive Sampling, Performance analysis for unconstrained analysis based approaches*, Robust recovery of a kind of weighted l1-minimization without noise level, Lower bounds on the low-distortion embedding dimension of submanifolds of \(\mathbb{R}^n\), Improved RIP-based bounds for guaranteed performance of two compressed sensing algorithms, Recovery of low-rank matrices based on the rank null space properties, Compressive phase retrieval: Optimal sample complexity with deep generative priors, On fast Johnson-Lindenstrauss embeddings of compact submanifolds of \(\mathbb{R}^N\) with boundary, One-bit sensing, discrepancy and Stolarsky's principle, Recent Theoretical Advances in Non-Convex Optimization, Influences of preconditioning on the mutual coherence and the restricted isometry property of Gaussian/Bernoulli measurement matrices, Variance-stabilization-based compressive inversion under Poisson or Poisson–Gaussian noise with analytical bounds, Chaotic Binary Sensing Matrices, Unnamed Item, High-dimensional regression with unknown variance, Sparse estimation by exponential weighting, CoverBLIP: accelerated and scalable iterative matched-filtering for magnetic resonance fingerprint reconstruction*, Unnamed Item, Sparse recovery from extreme eigenvalues deviation inequalities, Rendition: Reclaiming What a Black Box Takes Away, Characterization of ℓ1 minimizer in one-bit compressed sensing, Sparse recovery with general frame via general-dual-based analysis Dantzig selector, On the Impossibility of Dimension Reduction for Doubling Subsets of $\ell_{p}$, A partial graphical model with a structural prior on the direct links between predictors and responses, An Accelerated Linearized Alternating Direction Method of Multipliers, Quasi-linear Compressed Sensing, Sampling, Metric Entropy, and Dimensionality Reduction, Euclidean arrangements in Banach spaces, Lower Memory Oblivious (Tensor) Subspace Embeddings with Fewer Random Bits: Modewise Methods for Least Squares, The Restricted Isometry Property of Subsampled Fourier Matrices, New Restricted Isometry Property Analysis for $\ell_1-\ell_2$ Minimization Methods, Robust Width: A Characterization of Uniformly Stable and Robust Compressed Sensing, Low 
rank matrix recovery with adversarial sparse noise*



Cites Work