Empirical risk minimization as parameter choice rule for general linear regularization methods
DOI: 10.1214/19-AIHP966
zbMATH Open: 1439.62096
arXiv: 1703.07809
OpenAlex: W3004886965
MaRDI QID: Q2179243
Authors: Housen Li, Frank Werner
Publication date: 12 May 2020
Published in: Annales de l'Institut Henri Poincaré. Probabilités et Statistiques
Full work available at URL: https://arxiv.org/abs/1703.07809
Keywords: oracle inequality; regularization method; exponential bounds; statistical inverse problem; a posteriori parameter choice rule; order optimality; filter-based inversion
MSC classification:
- Nonparametric estimation (62G05)
- Asymptotic properties of nonparametric inference (62G20)
- Numerical solutions of ill-posed problems in abstract spaces; regularization (65J20)
- Numerical solution to inverse problems in abstract spaces (65J22)
Cites Work
- Some Comments on \(C_P\)
- Estimation of the mean of a multivariate normal distribution
- Generalized Cross-Validation as a Method for Choosing a Good Ridge Parameter
- Nonlinear solution of linear inverse problems by wavelet-vaguelette decomposition
- Asymptotically optimal difference-based estimation of variance in nonparametric regression
- Gaussian model selection
- Asymptotic optimality for \(C_p\), \(C_L\), cross-validation and generalized cross-validation: Discrete index set
- Bandwidth choice for nonparametric regression
- Minimal penalties for Gaussian model selection
- Estimating the Variance In Nonparametric Regression—What is a Reasonable Choice?
- A statistical perspective on ill-posed inverse problems (with discussion)
- Regularization of some linear ill-posed problems with discretized random noisy data
- Geometry of linear ill-posed problems in variable Hilbert scales
- Computational Methods for Inverse Problems
- Stein Unbiased GrAdient estimator of the Risk (SUGAR) for Multiple Parameter Selection
- How general are general source conditions?
- Comparing parameter choice methods for regularization of ill-posed problems
- Optimal filtering of square-integrable signals in Gaussian noise
- Oracle inequalities for inverse problems
- Risk hull method and regularization by projections of ill-posed inverse problems
- Practical Approximate Solutions to Linear Operator Equations When the Data are Noisy
- Unbiased Risk Estimates for Singular Value Thresholding and Spectral Estimators
- Wavelet decomposition approaches to statistical inverse problems
- Adaptive Wavelet Galerkin Methods for Linear Inverse Problems
- On universal oracle inequalities related to high-dimensional linear models
- Convergence rates in expectation for Tikhonov-type regularization of inverse problems with Poisson data
- Statistical Inverse Estimation in Hilbert Scales
- Wavelet Deconvolution in a Periodic Setting
- Convergence Rates of General Regularization Methods for Statistical Inverse Problems and Applications
- The Lepskii principle revisited
- On a Problem of Adaptive Estimation in Gaussian White Noise
- Optimal discretization of inverse problems in Hilbert scales. Regularization and self-regularization of projection methods
- Ordered linear smoothers
- A Lepskij-type stopping rule for regularized Newton methods
- Risk estimators for choosing regularization parameters in ill-posed problems -- properties and limitations
- Block Thresholding and Sharp Adaptive Estimation in Severely Ill-Posed Inverse Problems
- The principle of penalized empirical risk in severely ill-posed problems
- Signal detection for inverse problems in a multidimensional framework
- Asymptotic optimality of generalized cross-validation for choosing the regularization parameter
- Improved estimates of statistical regularization parameters in Fourier differentiation and smoothing
- Optimal Choice of a Truncation Level for the Truncated SVD Solution of Linear First Kind Integral Equations When Data are Noisy
- On the best rate of adaptive estimation in some inverse problems
- On the discrepancy principle and generalised maximum likelihood for regularisation
- On pointwise adaptive nonparametric deconvolution
- Minimax signal detection in ill-posed inverse problems
- A statistical approach to some inverse problems for partial differential equations
- Discretization effects in statistical inverse problems
- On convergence rates for iteratively regularized Newton-type methods under a Lipschitz-type nonlinearity condition
- Adaptivity and oracle inequalities in linear statistical inverse problems: a (numerical) survey
- Optimal adaptation for early stopping in statistical inverse problems
- SURE guided Gaussian mixture image denoising
- Characterizations of variational source conditions, converse results, and maxisets of spectral regularization methods
- Bounds on the prediction error of penalized least squares estimators with convex penalty
- Spectral cut-off regularizations for ill-posed linear models
- Adaptive spectral regularizations of high dimensional linear models
- Runge–Kutta integrators yield optimal regularization schemes
- Minimax rates for statistical inverse problems under general source conditions
Cited in (12)
- Adaptive minimax optimality in statistical inverse problems via SOLIT—Sharp Optimal Lepskiĭ-Inspired Tuning
- Penalized empirical risk minimization over Besov spaces
- Empirical risk minimization in inverse problems
- Unbiased predictive risk estimation of the Tikhonov regularization parameter: convergence with increasing rank approximations of the singular value decomposition
- Oracle inequalities for inverse problems
- Oracle inequality for a statistical Raus-Gfrerer-type rule
- A modified discrepancy principle to attain optimal convergence rates under unknown noise
- Predictive risk estimation for the expectation maximization algorithm with Poisson data
- On the asymptotical regularization for linear inverse problems in presence of white noise
- A probabilistic oracle inequality and quantification of uncertainty of a modified discrepancy principle for statistical inverse problems
- Integrality constraints in minimizing the empirical loss function of linear decision rules
- Adaptivity and oracle inequalities in linear statistical inverse problems: a (numerical) survey