Sharp MSE bounds for proximal denoising
DOI: 10.1007/s10208-015-9278-4
zbMath: 1380.90221
arXiv: 1305.2714
OpenAlex: W1881709950
MaRDI QID: Q330102
Publication date: 24 October 2016
Published in: Foundations of Computational Mathematics
Full work available at URL: https://arxiv.org/abs/1305.2714
Keywords: convex optimization; statistical estimation; stochastic noise; model fitting; proximity operator; structured sparsity; generalized LASSO; linear inverse
MSC classification: Gaussian processes (60G15); Convex programming (90C25); Approximation methods and heuristics in mathematical programming (90C59); Convex functions and convex programs in convex geometry (52A41)
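The entry carries no abstract, so as orientation on the topic named in the title and keywords: proximal denoising estimates a structured signal \(x_0\) from \(y = x_0 + z\), \(z \sim \mathcal{N}(0, \sigma^2 I)\), via the proximity operator \(\hat{x} = \operatorname{prox}_{\lambda\sigma f}(y)\), and the paper's sharp bounds tie the worst-case normalized MSE to the expected squared distance of a standard Gaussian vector to the scaled subdifferential \(\lambda\,\partial f(x_0)\) (see the arXiv version linked above). The sketch below is an illustration of that setup, not code from the paper; the \(\ell_1\) regularizer, the `soft_threshold` helper, and the values of `n`, `k`, `lam`, and `sigma` are all assumptions chosen for the demo.

```python
import numpy as np

# Illustrative sketch only (not from the paper): proximal denoising with
# f = l1-norm, whose proximity operator is soft-thresholding. We observe
# y = x0 + z with z ~ N(0, sigma^2 I) and compute
#     x_hat = argmin_x 0.5*||y - x||^2 + lam*sigma*||x||_1
#           = soft_threshold(y, lam*sigma),
# then compare the empirical normalized MSE, E||x_hat - x0||^2 / sigma^2,
# with a Monte-Carlo estimate of D(lam * subdiff ||x0||_1), the expected
# squared distance of a standard Gaussian to the scaled subdifferential,
# which governs the sharp small-sigma bound.

rng = np.random.default_rng(0)
n, k = 1000, 50          # ambient dimension and sparsity (assumed values)
lam, sigma = 2.0, 0.1    # regularization weight and (small) noise level
trials = 200

x0 = np.zeros(n)
x0[:k] = 1.0             # a k-sparse test signal

def soft_threshold(v, t):
    """Proximity operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# Empirical normalized MSE of the proximal denoiser.
sq_err = 0.0
for _ in range(trials):
    y = x0 + sigma * rng.standard_normal(n)
    x_hat = soft_threshold(y, lam * sigma)
    sq_err += np.sum((x_hat - x0) ** 2)
nmse = sq_err / (trials * sigma ** 2)

# D(lam * subdiff ||x0||_1) by Monte Carlo: for the l1-norm at a k-sparse
# point, dist^2(g, lam * subdifferential) splits coordinate-wise into
#   on the support:  (g_i - lam * sign(x0_i))^2
#   off the support: max(|g_i| - lam, 0)^2
g = rng.standard_normal((trials, n))
on_supp = (g[:, :k] - lam * np.sign(x0[:k])) ** 2
off_supp = np.maximum(np.abs(g[:, k:]) - lam, 0.0) ** 2
D = (on_supp.sum() + off_supp.sum()) / trials

print(f"empirical MSE / sigma^2       : {nmse:.1f}")
print(f"D(lam * subdiff), Monte Carlo : {D:.1f}")
# Both numbers land near k*(1 + lam^2) + (n-k)*E[(|g|-lam)_+^2], about 261
# for these assumed parameters.
```

Scaling the penalty with the noise level (the `lam * sigma` threshold) matches the small-\(\sigma\) regime in which the worst-case normalized MSE is attained, which is why the two printed quantities agree here.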
Related Items
- Recovering Structured Signals in Noise: Least-Squares Meets Compressed Sensing
- The bounds of restricted isometry constants for low rank matrices recovery
- \(\ell^1\)-analysis minimization and generalized (co-)sparsity: when does recovery succeed?
- Adaptive risk bounds in univariate total variation denoising and trend filtering
- Estimating piecewise monotone signals
- Generic error bounds for the generalized Lasso with sub-exponential data
- Fast and Reliable Parameter Estimation from Nonlinear Observations
- On the risk of convex-constrained least squares estimators under misspecification
- Noisy linear inverse problems under convex constraints: exact risk asymptotics in high dimensions
- On the determination of Lagrange multipliers for a weighted Lasso problem using geometric and convex analysis techniques
- Uniqueness conditions for low-rank matrix recovery
- Adaptive confidence sets in shape restricted regression
- Which bridge estimator is the best for variable selection?
- Nonparametric shape-restricted regression
- A new perspective on least squares under convex constraint
- Learning semidefinite regularizers
- Overcoming the limitations of phase transition by higher order analysis of regularization techniques
- Sharp oracle inequalities for least squares estimators in shape restricted regression
- Sharp RIP bound for sparse signal and low-rank matrix recovery
- On risk bounds in isotonic and other shape restricted regression problems
- Distribution-free properties of isotonic regression
Cites Work
- Nonlinear total variation based noise removal algorithms
- Simple bounds for recovering low-complexity models
- Reconstruction of a low-rank matrix in the presence of Gaussian noise
- Noisy matrix decomposition via convex relaxation: optimal rates in high dimensions
- Minimax risk of matrix denoising by singular value thresholding
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- The convex geometry of linear inverse problems
- Universality in polytope phase transitions and message passing algorithms
- Simultaneous analysis of Lasso and Dantzig selector
- Sparsity oracle inequalities for the Lasso
- High-dimensional centrally symmetric polytopes with neighborliness proportional to dimension
- Exact matrix completion via convex optimization
- Stable Image Reconstruction Using Total Variation Minimization
- Proximal Splitting Methods in Signal Processing
- Simultaneously Structured Models With Application to Sparse and Low-Rank Matrices
- Accurate Prediction of Phase Transitions in Compressed Sensing via a Connection to Minimax Denoising
- Recovering Low-Rank and Sparse Components of Matrices from Incomplete and Noisy Observations
- Robust principal component analysis?
- A Singular Value Thresholding Algorithm for Matrix Completion
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Observed universality of phase transitions in high-dimensional geometry, with implications for modern data analysis and signal processing
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- On the Convergence of the Proximal Point Algorithm for Convex Minimization
- Monotone Operators and the Proximal Point Algorithm
- On the small balls problem for equivalent Gaussian measures
- On the Reconstruction of Block-Sparse Signals With an Optimal Number of Measurements
- Block-Sparse Signals: Uncertainty Relations and Efficient Recovery
- Modified-CS: Modifying Compressive Sensing for Problems With Partially Known Support
- Sparse Recovery of Nonnegative Signals With Minimal Expansion
- Atomic Norm Denoising With Applications to Line Spectral Estimation
- De-noising by soft-thresholding
- Compressive principal component pursuit
- Computational and statistical tradeoffs via convex relaxation
- The phase transition of matrix recovery from Gaussian measurements matches the minimax MSE of matrix denoising
- Living on the edge: phase transitions in convex programs with random data
- The LASSO Risk for Gaussian Matrices
- The Noise-Sensitivity Phase Transition in Compressed Sensing
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- The Dynamics of Message Passing on Dense Graphs, with Applications to Compressed Sensing
- Model-Based Compressive Sensing
- Corrupted Sensing: Novel Guarantees for Separating Structured Signals
- Neighborliness of randomly projected simplices in high dimensions
- Signal Recovery by Proximal Forward-Backward Splitting
- Stable signal recovery from incomplete and inaccurate measurements
- Compressed sensing
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers