Risk estimators for choosing regularization parameters in ill-posed problems -- properties and limitations
DOI: 10.3934/IPI.2018047 · OpenAlex: W2580509819 · MaRDI QID: Q1785032
Authors: Felix Lucka, Katharina Proksch, Christoph Brune, Nicolai Bissantz, Martin Burger, Frank Wübbeling, Holger Dette
Publication date: 27 September 2018
Published in: Inverse Problems and Imaging
Full work available at URL: https://arxiv.org/abs/1701.04970
Recommendations
- A General Heuristic for Choosing the Regularization Parameter in Ill-Posed Problems
- An optimal parameter choice for regularized ill-posed problems
- scientific article; zbMATH DE number 3972088
- scientific article; zbMATH DE number 4094684
- Regularization of ill-posed problems: Optimal parameter choice in finite dimensions
- Comparing parameter choice methods for regularization of ill-posed problems
- On the choice of the regularization parameter in ill-posed problems with approximately given noise level of data
- On minimization strategies for choice of the regularization parameter in ill-posed problems
- The principle of penalized empirical risk in severely ill-posed problems
- scientific article; zbMATH DE number 3281217
MSC classifications:
- Asymptotic properties of parametric estimators (62F12)
- Ill-posedness and regularization problems in numerical linear algebra (65F22)
- Inverse problems in optimal control (49N45)
Cites Work
- Estimation of the mean of a multivariate normal distribution
- Distributed optimization and statistical learning via the alternating direction method of multipliers
- Local behavior of sparse analysis regularization: applications to risk estimation
- A SURE Approach for Digital Signal/Image Deconvolution Problems
- The Use of the L-Curve in the Regularization of Discrete Ill-Posed Problems
- Asymptotic optimality for \(C_p\), \(C_L\), cross-validation and generalized cross-validation: Discrete index set
- Title not available
- Discrepancy principle for statistical inverse problems with application to conjugate gradient iteration
- Generalized SURE for Exponential Families: Applications to Regularization
- Title not available
- Stein Unbiased GrAdient estimator of the Risk (SUGAR) for Multiple Parameter Selection
- Random Fields and Geometry
- Image Denoising in Mixed Poisson–Gaussian Noise
- Regularization Parameter Selection for Nonlinear Iterative Image Restoration and MRI Reconstruction Using GCV and SURE-Based Methods
- Analysis of Discrete Ill-Posed Problems by Means of the L-Curve
- Title not available
- SURE Estimates for a Heteroscedastic Hierarchical Model
- Modular solvers for image restoration problems using the discrepancy principle
- Parameter Estimation for Blind and Non-Blind Deblurring Using Residual Whiteness Measures
- Unbiased Risk Estimates for Singular Value Thresholding and Spectral Estimators
- On a Problem of Adaptive Estimation in Gaussian White Noise
- First order algorithms in variational image processing
- Ordered linear smoothers
- A Lepskij-type stopping rule for regularized Newton methods
- From Stein's unbiased risk estimates to the method of generalized cross-validation
- The degrees of freedom of the Lasso for general design matrix
- On SURE estimates in hierarchical models assuming heteroscedasticity for both levels of a two-level normal hierarchical model
- Iterative parameter choice by discrepancy principle
- The discrepancy principle for a class of regularization methods
- The projected GSURE for automatic parameter tuning in iterative shrinkage methods
- SURE guided Gaussian mixture image denoising
- Title not available
- The degrees of freedom of partly smooth regularizers
- Nonlocal Means With Dimensionality Reduction and SURE-Based Parameter Selection
- Spectral cut-off regularizations for ill-posed linear models
- The homotopy method revisited: computing solution paths of \(\ell_1\)-regularized problems
Cited in (15)
- Risk hull method and regularization by projections of ill-posed inverse problems
- Empirical risk minimization as parameter choice rule for general linear regularization methods
- Early stopping for statistical inverse problems via truncated SVD estimation
- Unbiased predictive risk estimation of the Tikhonov regularization parameter: convergence with increasing rank approximations of the singular value decomposition
- Noise Level Free Regularization of General Linear Inverse Problems under Unconstrained White Noise
- Lower Risk Bounds and Properties of Confidence Sets for Ill-Posed Estimation Problems with Applications to Spectral Density and Persistence Estimation, Unit Roots, and Estimation of Long Memory Parameters
- Semi-discrete Tikhonov regularization in RKHS with large randomly distributed noise
- Predictive risk estimation for the expectation maximization algorithm with Poisson data
- A probabilistic oracle inequality and quantification of uncertainty of a modified discrepancy principle for statistical inverse problems
- Maximum likelihood estimation of regularization parameters in high-dimensional inverse problems: an empirical Bayesian approach. I: Methodology and experiments
- Beyond the Bakushinskii veto: regularising linear inverse problems without knowing the noise distribution
- Convergence of regularization methods with filter functions for a regularization parameter chosen with GSURE and mildly ill-posed inverse problems
- Tomographic reconstruction from Poisson distributed data: a fast and convergent EM-TV dual approach
- A parameter choice rule for Tikhonov regularization based on predictive risk
- Risk Estimators for Choosing Regularization Parameters in Ill-Posed Problems - Properties and Limitations