Sharp MSE bounds for proximal denoising
DOI: 10.1007/s10208-015-9278-4 · zbMATH Open: 1380.90221 · arXiv: 1305.2714 · OpenAlex: W1881709950 · MaRDI QID: Q330102 · FDO: Q330102
Publication date: 24 October 2016
Published in: Foundations of Computational Mathematics
Full work available at URL: https://arxiv.org/abs/1305.2714
Recommendations
- Minimax risk of matrix denoising by singular value thresholding
- Recovering structured signals in noise: least-squares meets compressed sensing
- The phase transition of matrix recovery from Gaussian measurements matches the minimax MSE of matrix denoising
- A new perspective on least squares under convex constraint
- The convex geometry of linear inverse problems
Keywords: convex optimization; stochastic noise; model fitting; proximity operator; statistical estimation; structured sparsity; generalized LASSO; linear inverse
MSC classifications: Convex programming (90C25); Gaussian processes (60G15); Approximation methods and heuristics in mathematical programming (90C59); Convex functions and convex programs in convex geometry (52A41)
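The publication bounds the mean-squared error of proximal denoising, i.e. estimating a structured signal from a noisy observation by applying the proximity operator of a convex regularizer. As a minimal illustration (not the paper's own code, and with an arbitrary threshold choice), the sketch below uses the \(\ell_1\) norm, whose proximity operator is soft-thresholding, to denoise a sparse signal corrupted by Gaussian noise:

```python
import numpy as np

def soft_threshold(y, tau):
    """Proximity operator of tau * ||.||_1: shrinks each entry toward zero by tau."""
    return np.sign(y) * np.maximum(np.abs(y) - tau, 0.0)

rng = np.random.default_rng(0)
n, k, sigma = 1000, 50, 0.5
x = np.zeros(n)
x[:k] = rng.normal(0.0, 3.0, k)        # k-sparse ground-truth signal
y = x + sigma * rng.normal(size=n)     # noisy observation

tau = sigma * np.sqrt(2 * np.log(n / k))  # illustrative threshold, not the paper's choice
x_hat = soft_threshold(y, tau)
mse = np.mean((x_hat - x) ** 2)
```

Because most coordinates of `x` are zero and are shrunk exactly to zero, the resulting `mse` is well below the noise level `sigma**2` that the identity estimator `x_hat = y` would incur; the paper's contribution is making such risk comparisons sharp for general convex regularizers.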
Cites Work
- Nonlinear total variation based noise removal algorithms
- Simultaneous analysis of Lasso and Dantzig selector
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- Sparsity oracle inequalities for the Lasso
- Robust principal component analysis?
- A Singular Value Thresholding Algorithm for Matrix Completion
- Exact matrix completion via convex optimization
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- Stable signal recovery from incomplete and inaccurate measurements
- The concentration of measure phenomenon
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Compressed sensing
- De-noising by soft-thresholding
- Proximal Splitting Methods in Signal Processing
- Signal Recovery by Proximal Forward-Backward Splitting
- Stable image reconstruction using total variation minimization
- Observed universality of phase transitions in high-dimensional geometry, with implications for modern data analysis and signal processing
- Monotone Operators and the Proximal Point Algorithm
- Neighborliness of randomly projected simplices in high dimensions
- On the Convergence of the Proximal Point Algorithm for Convex Minimization
- On the small balls problem for equivalent Gaussian measures
- Noisy matrix decomposition via convex relaxation: optimal rates in high dimensions
- High-dimensional centrally symmetric polytopes with neighborliness proportional to dimension
- Recovering Low-Rank and Sparse Components of Matrices from Incomplete and Noisy Observations
- Reconstruction of a low-rank matrix in the presence of Gaussian noise
- Block-Sparse Signals: Uncertainty Relations and Efficient Recovery
- Compressive principal component pursuit
- The convex geometry of linear inverse problems
- On the Reconstruction of Block-Sparse Signals With an Optimal Number of Measurements
- Living on the edge: phase transitions in convex programs with random data
- The LASSO Risk for Gaussian Matrices
- Atomic Norm Denoising With Applications to Line Spectral Estimation
- Minimax risk of matrix denoising by singular value thresholding
- Model-Based Compressive Sensing
- Universality in polytope phase transitions and message passing algorithms
- Simultaneously Structured Models With Application to Sparse and Low-Rank Matrices
- Accurate Prediction of Phase Transitions in Compressed Sensing via a Connection to Minimax Denoising
- Simple bounds for recovering low-complexity models
- Modified-CS: Modifying Compressive Sensing for Problems With Partially Known Support
- Sparse Recovery of Nonnegative Signals With Minimal Expansion
- Computational and statistical tradeoffs via convex relaxation
- The phase transition of matrix recovery from Gaussian measurements matches the minimax MSE of matrix denoising
- The Noise-Sensitivity Phase Transition in Compressed Sensing
- The Dynamics of Message Passing on Dense Graphs, with Applications to Compressed Sensing
- Corrupted Sensing: Novel Guarantees for Separating Structured Signals
Cited In (21)
- Overcoming the limitations of phase transition by higher order analysis of regularization techniques
- Nonparametric shape-restricted regression
- Learning semidefinite regularizers
- Adaptive risk bounds in univariate total variation denoising and trend filtering
- Distribution-free properties of isotonic regression
- Estimating piecewise monotone signals
- Uniqueness conditions for low-rank matrix recovery
- Adaptive confidence sets in shape restricted regression
- A new perspective on least squares under convex constraint
- Noisy linear inverse problems under convex constraints: exact risk asymptotics in high dimensions
- Which bridge estimator is the best for variable selection?
- On the determination of Lagrange multipliers for a weighted Lasso problem using geometric and convex analysis techniques
- Generic error bounds for the generalized Lasso with sub-exponential data
- \(\ell^1\)-analysis minimization and generalized (co-)sparsity: when does recovery succeed?
- Recovering Structured Signals in Noise: Least-Squares Meets Compressed Sensing
- Sharp RIP bound for sparse signal and low-rank matrix recovery
- The bounds of restricted isometry constants for low rank matrices recovery
- On the risk of convex-constrained least squares estimators under misspecification
- Sharp oracle inequalities for least squares estimators in shape restricted regression
- On risk bounds in isotonic and other shape restricted regression problems
- Fast and Reliable Parameter Estimation from Nonlinear Observations
This page was built for publication: Sharp MSE bounds for proximal denoising