Recovering Structured Signals in Noise: Least-Squares Meets Compressed Sensing
From MaRDI portal
Publication: 3460831
DOI: 10.1007/978-3-319-16042-9_4
zbMath: 1333.94024
OpenAlex: W2257268277
MaRDI QID: Q3460831
Samet Oymak, Christos Thrampoulidis, Babak Hassibi
Publication date: 8 January 2016
Published in: Compressed Sensing and its Applications
Full work available at URL: https://doi.org/10.1007/978-3-319-16042-9_4
Related Items
- Randomized numerical linear algebra: foundations and algorithms
- Book review: A mathematical introduction to compressive sensing
- A precise high-dimensional asymptotic theory for boosting and minimum-\(\ell_1\)-norm interpolated classifiers
- Noisy linear inverse problems under convex constraints: exact risk asymptotics in high dimensions
Cites Work
- Sharp MSE bounds for proximal denoising
- The geometry of least squares in the 21st century
- Low-rank tensor completion by Riemannian optimization
- From Steiner formulas for cones to concentration of intrinsic volumes
- Some inequalities for Gaussian processes and applications
- Gauss and the invention of least squares
- The convex geometry of linear inverse problems
- Simultaneous analysis of Lasso and Dantzig selector
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\) (with discussions and rejoinder)
- High-dimensional centrally symmetric polytopes with neighborliness proportional to dimension
- Exact matrix completion via convex optimization
- Simultaneously Structured Models With Application to Sparse and Low-Rank Matrices
- Accurate Prediction of Phase Transitions in Compressed Sensing via a Connection to Minimax Denoising
- Tensor completion and low-n-rank tensor recovery via convex optimization
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Stable recovery of sparse overcomplete representations in the presence of noise
- Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
- Observed universality of phase transitions in high-dimensional geometry, with implications for modern data analysis and signal processing
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- On the Reconstruction of Block-Sparse Signals With an Optimal Number of Measurements
- Block-Sparse Signals: Uncertainty Relations and Efficient Recovery
- De-noising by soft-thresholding
- Sharp thresholds for high-dimensional and noisy sparsity recovery using \(\ell_1\)-constrained quadratic programming (Lasso)
- Compressive principal component pursuit
- Computational and statistical tradeoffs via convex relaxation
- The phase transition of matrix recovery from Gaussian measurements matches the minimax MSE of matrix denoising
- Living on the edge: phase transitions in convex programs with random data
- The LASSO Risk for Gaussian Matrices
- The Noise-Sensitivity Phase Transition in Compressed Sensing
- The Dynamics of Message Passing on Dense Graphs, with Applications to Compressed Sensing
- Applied Multivariate Statistical Analysis
- Corrupted Sensing: Novel Guarantees for Separating Structured Signals
- Asymptotic Analysis of Complex LASSO via Complex Approximate Message Passing (CAMP)
- Sparse nonnegative solution of underdetermined linear equations by linear programming
- Neighborliness of randomly projected simplices in high dimensions
- Stable signal recovery from incomplete and inaccurate measurements
- Compressed sensing
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers