Accurate Prediction of Phase Transitions in Compressed Sensing via a Connection to Minimax Denoising

From MaRDI portal (Publication: Q2989183)

- DOI: 10.1109/TIT.2013.2239356
- zbMATH: 1364.94092
- arXiv: 1111.1041
- OpenAlex: W2103539935
- MaRDI QID: Q2989183

David L. Donoho, Iain M. Johnstone, Andrea Montanari

Publication date: 8 June 2017

Published in: IEEE Transactions on Information Theory

Full work available at URL: https://arxiv.org/abs/1111.1041

Related Items

- A shrinkage principle for heavy-tailed data: high-dimensional robust low-rank matrix recovery
- Activation function design for deep networks: linearity and effective initialisation
- Recovering Structured Signals in Noise: Least-Squares Meets Compressed Sensing
- Sharp MSE bounds for proximal denoising
- Estimation of low-rank matrices via approximate message passing
- Performance comparisons of greedy algorithms in compressed sensing
- Parametrized quasi-soft thresholding operator for compressed sensing and matrix completion
- A simple homotopy proximal mapping algorithm for compressive sensing
- CGIHT: conjugate gradient iterative hard thresholding for compressed sensing and matrix completion
- Guarantees of total variation minimization for signal recovery
- Asymptotic risk and phase transition of \(l_1\)-penalized robust estimator
- Content-aware compressive sensing recovery using Laplacian scale mixture priors and side information
- An Introduction to Compressed Sensing
- Minimax risk of matrix denoising by singular value thresholding
- Typical reconstruction performance for distributed compressed sensing based on ℓ2,1-norm regularized least square and Bayesian optimal reconstruction: influences of noise
- Debiasing the Lasso: optimal sample size for Gaussian designs
- A Tight Bound of Hard Thresholding
- Universality of approximate message passing algorithms
- On convergence of the cavity and Bolthausen's TAP iterations to the local magnetization
- Efficient Threshold Selection for Multivariate Total Variation Denoising
- Plug in estimation in high dimensional linear inverse problems a rigorous analysis
- The committee machine: computational to statistical gaps in learning a two-layers neural network
- Universality in polytope phase transitions and message passing algorithms
- LASSO risk and phase transition under dependence