Convex Recovery of a Structured Signal from Independent Random Linear Measurements

From MaRDI portal

DOI: 10.1007/978-3-319-19749-4_2
zbMath: 1358.94034
arXiv: 1405.1102
OpenAlex: W1684009677
Wikidata: Q98837548 (Scholia: Q98837548)
MaRDI QID: Q2799918

Joel A. Tropp

Publication date: 14 April 2016

Published in: Sampling Theory, a Renaissance

Full work available at URL: https://arxiv.org/abs/1405.1102



Related Items

Simultaneous Phase Retrieval and Blind Deconvolution via Convex Programming
A note on computing the smallest conic singular value
Bias versus non-convexity in compressed sensing
Low rank matrix recovery from rank one measurements
Compressed sensing for finite-valued signals
\(\ell^1\)-analysis minimization and generalized (co-)sparsity: when does recovery succeed?
Generic error bounds for the generalized Lasso with sub-exponential data
Linear convergence of Frank-Wolfe for rank-one matrix recovery without strong convexity
Robust sensing of low-rank matrices with non-orthogonal sparse decomposition
Implicit regularization in nonconvex statistical estimation: gradient descent converges linearly for phase retrieval, matrix completion, and blind deconvolution
A unified approach to uniform signal recovery from nonlinear observations
Sampling rates for \(\ell^1\)-synthesis
Noisy linear inverse problems under convex constraints: exact risk asymptotics in high dimensions
The Lasso with general Gaussian designs with applications to hypothesis testing
Performance bounds of the intensity-based estimators for noisy phase retrieval
Near-optimal estimation of simultaneously sparse and low-rank matrices from nested linear measurements
Stable low-rank matrix recovery via null space properties
Complex phase retrieval from subgaussian measurements
Low-rank matrix recovery via rank one tight frame measurements
Norm and Trace Estimation with Random Rank-one Vectors
Simplicial faces of the set of correlation matrices
Sparse space-time models: concentration inequalities and Lasso
Phase retrieval with PhaseLift algorithm
Estimation in High Dimensions: A Geometric Perspective
The distribution of the Lasso: uniform control over sparse balls and adaptive parameter tuning
Sharp oracle inequalities for low-complexity priors
Solving equations of random convex functions via anchored regression
Preserving injectivity under subgaussian mappings and its application to compressed sensing
Low-Rank Matrix Estimation from Rank-One Projections by Unlifted Convex Optimization
Proof methods for robust low-rank matrix recovery
Robust Width: A Characterization of Uniformly Stable and Robust Compressed Sensing
Fast Convex Pruning of Deep Neural Networks



Cites Work