The Noise-Sensitivity Phase Transition in Compressed Sensing

From MaRDI portal
Publication:5272314

DOI: 10.1109/TIT.2011.2165823
zbMath: 1365.94094
arXiv: 1004.1218
OpenAlex: W2164595191
MaRDI QID: Q5272314

David L. Donoho, Arian Maleki, Andrea Montanari

Publication date: 12 July 2017

Published in: IEEE Transactions on Information Theory

Full work available at URL: https://arxiv.org/abs/1004.1218



Related Items

Cross validation in sparse linear regression with piecewise continuous nonconvex penalties and its acceleration
Recovering Structured Signals in Noise: Least-Squares Meets Compressed Sensing
Sharp MSE bounds for proximal denoising
High dimensional robust M-estimation: asymptotic variance via approximate message passing
Nearly optimal minimax estimator for high-dimensional sparse linear regression
Accuracy assessment for high-dimensional linear regression
Performance comparisons of greedy algorithms in compressed sensing
Sharp global convergence guarantees for iterative nonconvex optimization with random data
Fundamental limits of weak recovery with applications to phase retrieval
A simple homotopy proximal mapping algorithm for compressive sensing
Noisy linear inverse problems under convex constraints: exact risk asymptotics in high dimensions
The Lasso with general Gaussian designs with applications to hypothesis testing
Guarantees of total variation minimization for signal recovery
From compression to compressed sensing
Which bridge estimator is the best for variable selection?
Asymptotic risk and phase transition of \(l_1\)-penalized robust estimator
Evaluation of generalized degrees of freedom for sparse estimation by replica method
Perfect reconstruction of sparse signals with piecewise continuous nonconvex penalties and nonconvexity control
Foveated compressive imaging for low power vehicle fingerprinting and tracking in aerial imagery
Debiasing the Lasso: optimal sample size for Gaussian designs
Overcoming the limitations of phase transition by higher order analysis of regularization techniques
Consistent parameter estimation for Lasso and approximate message passing
Approximate message passing for nonconvex sparse regularization with stability and asymptotic analysis
The distribution of the Lasso: uniform control over sparse balls and adaptive parameter tuning
Precise statistical analysis of classification accuracies for adversarial training
Universality in polytope phase transitions and message passing algorithms
A phase transition for finding needles in nonlinear haystacks with LASSO artificial neural networks
LASSO risk and phase transition under dependence