Empirical risk minimization as parameter choice rule for general linear regularization methods
From MaRDI portal
Abstract: We consider the statistical inverse problem to recover \(f\) from noisy measurements \(Y = Tf + \sigma\xi\), where \(\xi\) is Gaussian white noise and \(T\) a compact operator between Hilbert spaces. Considering general reconstruction methods of the form \(\hat f_\alpha = q_\alpha(T^*T)T^*Y\) with an ordered filter \((q_\alpha)_{\alpha>0}\), we investigate the choice of the regularization parameter \(\alpha\) by minimizing an unbiased estimate of the predictive risk \(\mathbb E\big[\|Tf - T\hat f_\alpha\|^2\big]\). The corresponding parameter \(\alpha_{\mathrm{pred}}\) and its usage are well known in the literature, but oracle inequalities and optimality results in this general setting are unknown. We prove a (generalized) oracle inequality, which relates the direct risk \(\mathbb E\big[\|f - \hat f_{\alpha_{\mathrm{pred}}}\|^2\big]\) with the oracle prediction risk \(\inf_{\alpha>0}\mathbb E\big[\|Tf - T\hat f_\alpha\|^2\big]\). From this oracle inequality we are then able to conclude that the investigated parameter choice rule is of optimal order. Finally, we also present numerical simulations, which support the order optimality of the method and the quality of the parameter choice in finite-sample situations.
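The parameter choice rule the abstract describes can be sketched numerically: pick the regularization parameter by minimizing an unbiased estimate of the prediction risk (a Mallows-\(C_p\)/UPRE-type criterion). The sketch below is illustrative only, not the paper's setup: it uses a discretized Gaussian smoothing kernel as a stand-in compact operator, a Tikhonov filter \(q_\alpha(\lambda) = 1/(\lambda+\alpha)\), and a known noise level; all names and grid choices are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical discretized compact operator T (a smoothing kernel with
# rapidly decaying singular values) and an unknown signal f.
n = 200
x = np.linspace(0, 1, n)
T = np.exp(-50 * (x[:, None] - x[None, :]) ** 2) / n
f = np.sin(3 * np.pi * x)

sigma = 1e-3
Y = T @ f + sigma * rng.standard_normal(n)   # noisy data Y = T f + sigma * xi

# Tikhonov filter q_alpha(lambda) = 1/(lambda + alpha), applied via the SVD of T.
U, s, Vt = np.linalg.svd(T)
b = U.T @ Y                                  # data coefficients in the singular basis

def upre(alpha):
    """Unbiased estimate of the prediction risk E||T f_hat_alpha - T f||^2,
    up to an additive constant not depending on alpha:
    ||T f_hat_alpha - Y||^2 + 2 sigma^2 * trace(influence matrix)."""
    h = s**2 / (s**2 + alpha)                # spectral weights of T q_alpha(T*T) T*
    residual = np.sum(((1 - h) * b) ** 2)
    return residual + 2 * sigma**2 * np.sum(h)

# Minimize the risk estimate over a logarithmic grid of candidate parameters.
alphas = np.logspace(-12, -2, 60)
alpha_hat = alphas[np.argmin([upre(a) for a in alphas])]

# Reconstruction at the chosen parameter: f_hat = q_alpha(T*T) T* Y.
f_hat = Vt.T @ (s / (s**2 + alpha_hat) * b)
rel_err = np.linalg.norm(f_hat - f) / np.linalg.norm(f)
print(f"chosen alpha = {alpha_hat:.2e}, relative error = {rel_err:.3f}")
```

Note that the criterion estimates the *prediction* risk \(\|T\hat f_\alpha - Tf\|^2\); the paper's contribution is an oracle inequality showing that minimizing it nevertheless yields an order-optimal parameter for the *direct* risk \(\|\hat f_\alpha - f\|^2\).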
Recommendations
Cites work
- scientific article; zbMATH DE number 3919680
- scientific article; zbMATH DE number 936298
- scientific article; zbMATH DE number 3298300
- A Lepskij-type stopping rule for regularized Newton methods
- A statistical approach to some inverse problems for partial differential equations
- A statistical perspective on ill-posed inverse problems (with discussion)
- Adaptive Wavelet Galerkin Methods for Linear Inverse Problems
- Adaptive spectral regularizations of high dimensional linear models
- Adaptivity and oracle inequalities in linear statistical inverse problems: a (numerical) survey
- Asymptotic optimality for \(C_p\), \(C_L\), cross-validation and generalized cross-validation: Discrete index set
- Asymptotic optimality of generalized cross-validation for choosing the regularization parameter
- Asymptotically optimal difference-based estimation of variance in nonparametric regression
- Bandwidth choice for nonparametric regression
- Block Thresholding and Sharp Adaptive Estimation in Severely Ill-Posed Inverse Problems
- Bounds on the prediction error of penalized least squares estimators with convex penalty
- Characterizations of variational source conditions, converse results, and maxisets of spectral regularization methods
- Comparing parameter choice methods for regularization of ill-posed problems
- Computational Methods for Inverse Problems
- Convergence Rates of General Regularization Methods for Statistical Inverse Problems and Applications
- Convergence rates in expectation for Tikhonov-type regularization of inverse problems with Poisson data
- Discretization effects in statistical inverse problems
- Estimating the Variance In Nonparametric Regression—What is a Reasonable Choice?
- Estimation of the mean of a multivariate normal distribution
- Gaussian model selection
- Generalized Cross-Validation as a Method for Choosing a Good Ridge Parameter
- Geometry of linear ill-posed problems in variable Hilbert scales
- How general are general source conditions?
- Improved estimates of statistical regularization parameters in Fourier differentiation and smoothing
- Minimal penalties for Gaussian model selection
- Minimax rates for statistical inverse problems under general source conditions
- Minimax signal detection in ill-posed inverse problems
- Nonlinear solution of linear inverse problems by wavelet-vaguelette decomposition
- On a Problem of Adaptive Estimation in Gaussian White Noise
- On convergence rates for iteratively regularized Newton-type methods under a Lipschitz-type nonlinearity condition
- On pointwise adaptive nonparametric deconvolution
- On the best rate of adaptive estimation in some inverse problems
- On the discrepancy principle and generalised maximum likelihood for regularisation
- On universal oracle inequalities related to high-dimensional linear models
- Optimal Choice of a Truncation Level for the Truncated SVD Solution of Linear First Kind Integral Equations When Data are Noisy
- Optimal adaptation for early stopping in statistical inverse problems
- Optimal discretization of inverse problems in Hilbert scales. Regularization and self-regularization of projection methods
- Optimal filtering of square-integrable signals in Gaussian noise
- Oracle inequalities for inverse problems
- Ordered linear smoothers
- Practical Approximate Solutions to Linear Operator Equations When the Data are Noisy
- Regularization of some linear ill-posed problems with discretized random noisy data
- Risk estimators for choosing regularization parameters in ill-posed problems -- properties and limitations
- Risk hull method and regularization by projections of ill-posed inverse problems
- Runge–Kutta integrators yield optimal regularization schemes
- SURE guided Gaussian mixture image denoising
- Signal detection for inverse problems in a multidimensional framework
- Some Comments on \(C_p\)
- Spectral cut-off regularizations for ill-posed linear models
- Statistical Inverse Estimation in Hilbert Scales
- Stein Unbiased GrAdient estimator of the Risk (SUGAR) for Multiple Parameter Selection
- The Lepskii principle revisited
- The principle of penalized empirical risk in severely ill-posed problems
- Unbiased Risk Estimates for Singular Value Thresholding and Spectral Estimators
- Wavelet Deconvolution in a Periodic Setting
- Wavelet decomposition approaches to statistical inverse problems
Cited in (12)
- Penalized empirical risk minimization over Besov spaces
- Predictive risk estimation for the expectation maximization algorithm with Poisson data
- A modified discrepancy principle to attain optimal convergence rates under unknown noise
- On the asymptotical regularization for linear inverse problems in presence of white noise
- Oracle inequalities for inverse problems
- A probabilistic oracle inequality and quantification of uncertainty of a modified discrepancy principle for statistical inverse problems
- Adaptivity and oracle inequalities in linear statistical inverse problems: a (numerical) survey
- Adaptive minimax optimality in statistical inverse problems via SOLIT—Sharp Optimal Lepskiĭ-Inspired Tuning
- Empirical risk minimization in inverse problems
- Oracle inequality for a statistical Raus-Gfrerer-type rule
- Integrality constraints in minimizing the empirical loss function of linear decision rules
- Unbiased predictive risk estimation of the Tikhonov regularization parameter: convergence with increasing rank approximations of the singular value decomposition