Abstract: Denoising has to do with estimating a signal \(x_0\) from its noisy observations \(y=x_0+z\). In this paper, we focus on the "structured denoising problem", where the signal \(x_0\) possesses a certain structure and \(z\) has independent normally distributed entries with mean zero and variance \(\sigma^2\). We employ a structure-inducing convex function \(f(\cdot)\) and solve \(\min_x\{\frac{1}{2}\|y-x\|_2^2+\sigma\lambda f(x)\}\) to estimate \(x_0\), for some \(\lambda>0\). Common choices for \(f(\cdot)\) include the \(\ell_1\) norm for sparse vectors, the \(\ell_{1,2}\) norm for block-sparse signals and the nuclear norm for low-rank matrices. The metric we use to evaluate the performance of an estimate \(x^*\) is the normalized mean-squared error \(\mathrm{NMSE}(\sigma)=\mathbb{E}\|x^*-x_0\|_2^2/\sigma^2\). We show that the NMSE is maximized as \(\sigma\to 0\) and we find the \emph{exact} worst-case NMSE, which has a simple geometric interpretation: the mean-squared distance of a standard normal vector to the \(\lambda\)-scaled subdifferential \(\lambda\partial f(x_0)\). When \(\lambda\) is optimally tuned to minimize the worst-case NMSE, our results can be related to the constrained denoising problem \(\min_{f(x)\le f(x_0)}\|y-x\|_2\). The paper also connects these results to the generalized LASSO problem, in which one solves \(\min_{f(x)\le f(x_0)}\|y-Ax\|_2\) to estimate \(x_0\) from noisy linear observations \(y=Ax_0+z\). We show that certain properties of the LASSO problem are closely related to those of the denoising problem. In particular, we characterize the normalized LASSO cost and show that it exhibits a "phase transition" as a function of the number of observations. Our results are significant in two ways: first, we find a simple formula for the performance of a general convex estimator; second, we establish a connection between the denoising and linear inverse problems.
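As an illustration of the proximal denoiser described above (not taken from the paper itself): for \(f=\ell_1\), the minimizer of \(\frac{1}{2}\|y-x\|_2^2+\sigma\lambda\|x\|_1\) has the closed-form entrywise soft-thresholding solution. A minimal NumPy sketch, with hypothetical parameter choices (`n`, `k`, `sigma`, `lam` are illustrative, not from the paper):

```python
import numpy as np

def soft_threshold(y, t):
    """Proximal operator of t*||.||_1: shrink each entry toward 0 by t."""
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

rng = np.random.default_rng(0)
n, k, sigma = 1000, 50, 0.1            # illustrative dimensions and noise level
lam = np.sqrt(2 * np.log(n / k))       # a common heuristic tuning, an assumption here

# k-sparse signal x0 observed through y = x0 + z, z ~ N(0, sigma^2 I)
x0 = np.zeros(n)
x0[:k] = 1.0
y = x0 + sigma * rng.standard_normal(n)

# Closed-form minimizer of 0.5*||y - x||^2 + sigma*lam*||x||_1
x_hat = soft_threshold(y, sigma * lam)
nmse = np.sum((x_hat - x0) ** 2) / sigma**2
print(f"empirical NMSE: {nmse:.1f}")
```

Averaging `nmse` over many noise draws and shrinking `sigma` would approximate the worst-case NMSE, which the paper shows equals the mean-squared distance of a standard normal vector to \(\lambda\partial f(x_0)\).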
Recommendations
- Minimax risk of matrix denoising by singular value thresholding
- Recovering structured signals in noise: least-squares meets compressed sensing
- The phase transition of matrix recovery from Gaussian measurements matches the minimax MSE of matrix denoising
- A new perspective on least squares under convex constraint
- The convex geometry of linear inverse problems
Cites work
- scientific article (zbMATH DE number 439380; no title available)
- scientific article (zbMATH DE number 4061904; no title available)
- scientific article (zbMATH DE number 49190; no title available)
- scientific article (zbMATH DE number 1266748; no title available)
- scientific article (zbMATH DE number 2107836; no title available)
- scientific article (zbMATH DE number 2121575; no title available)
- scientific article (zbMATH DE number 845714; no title available)
- scientific article (zbMATH DE number 3192366; no title available)
- A Singular Value Thresholding Algorithm for Matrix Completion
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Accurate Prediction of Phase Transitions in Compressed Sensing via a Connection to Minimax Denoising
- Atomic Norm Denoising With Applications to Line Spectral Estimation
- Block-Sparse Signals: Uncertainty Relations and Efficient Recovery
- Compressed sensing
- Compressive principal component pursuit
- Computational and statistical tradeoffs via convex relaxation
- Corrupted Sensing: Novel Guarantees for Separating Structured Signals
- De-noising by soft-thresholding
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- Exact matrix completion via convex optimization
- Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization
- High-dimensional centrally symmetric polytopes with neighborliness proportional to dimension
- Living on the edge: phase transitions in convex programs with random data
- Minimax risk of matrix denoising by singular value thresholding
- Model-Based Compressive Sensing
- Modified-CS: Modifying Compressive Sensing for Problems With Partially Known Support
- Monotone Operators and the Proximal Point Algorithm
- Neighborliness of randomly projected simplices in high dimensions
- Noisy matrix decomposition via convex relaxation: optimal rates in high dimensions
- Nonlinear total variation based noise removal algorithms
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Observed universality of phase transitions in high-dimensional geometry, with implications for modern data analysis and signal processing
- On the Convergence of the Proximal Point Algorithm for Convex Minimization
- On the Reconstruction of Block-Sparse Signals With an Optimal Number of Measurements
- On the small balls problem for equivalent Gaussian measures
- Proximal methods for hierarchical sparse coding
- Proximal splitting methods in signal processing
- Reconstruction of a low-rank matrix in the presence of Gaussian noise
- Recovering Low-Rank and Sparse Components of Matrices from Incomplete and Noisy Observations
- Robust principal component analysis?
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Signal Recovery by Proximal Forward-Backward Splitting
- Simple bounds for recovering low-complexity models
- Simultaneous analysis of Lasso and Dantzig selector
- Simultaneously Structured Models With Application to Sparse and Low-Rank Matrices
- Sparse Recovery of Nonnegative Signals With Minimal Expansion
- Sparsity oracle inequalities for the Lasso
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Stable image reconstruction using total variation minimization
- Stable signal recovery from incomplete and inaccurate measurements
- The Dynamics of Message Passing on Dense Graphs, with Applications to Compressed Sensing
- The LASSO Risk for Gaussian Matrices
- The Noise-Sensitivity Phase Transition in Compressed Sensing
- The concentration of measure phenomenon
- The convex geometry of linear inverse problems
- The phase transition of matrix recovery from Gaussian measurements matches the minimax MSE of matrix denoising
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- Universality in polytope phase transitions and message passing algorithms
Cited in (23)
- \(\ell^1\)-analysis minimization and generalized (co-)sparsity: when does recovery succeed?
- Nonparametric shape-restricted regression
- Noisy linear inverse problems under convex constraints: exact risk asymptotics in high dimensions
- On Bayesian estimation and proximity operators
- Sharp oracle inequalities for least squares estimators in shape restricted regression
- Sharp RIP bound for sparse signal and low-rank matrix recovery
- The DFS fused Lasso: linear-time denoising over general graphs
- Adaptive confidence sets in shape restricted regression
- Overcoming the limitations of phase transition by higher order analysis of regularization techniques
- Uniqueness conditions for low-rank matrix recovery
- The bounds of restricted isometry constants for low rank matrices recovery
- On risk bounds in isotonic and other shape restricted regression problems
- Which bridge estimator is the best for variable selection?
- Distribution-free properties of isotonic regression
- Recovering structured signals in noise: least-squares meets compressed sensing
- Fast and reliable parameter estimation from nonlinear observations
- On the risk of convex-constrained least squares estimators under misspecification
- Learning semidefinite regularizers
- Estimating piecewise monotone signals
- Generic error bounds for the generalized Lasso with sub-exponential data
- On the determination of Lagrange multipliers for a weighted Lasso problem using geometric and convex analysis techniques
- Adaptive risk bounds in univariate total variation denoising and trend filtering
- A new perspective on least squares under convex constraint
This page was built for publication: Sharp MSE bounds for proximal denoising
MaRDI item: Q330102