Adaptive minimax optimality in statistical inverse problems via SOLIT—Sharp Optimal Lepskiĭ-Inspired Tuning
From MaRDI portal
Publication: Q6136775
Abstract: We consider statistical linear inverse problems in separable Hilbert spaces and filter-based reconstruction methods of the form \(\hat f_\alpha = q_\alpha(T^*T)T^*Y\), where \(Y\) is the available data, \(T\) the forward operator, \((q_\alpha)_{\alpha>0}\) an ordered filter, and \(\alpha>0\) a regularization parameter. Whenever such a method is used in practice, \(\alpha\) has to be chosen appropriately. Typically, the aim is to find, or at least approximate, the best possible \(\alpha\) in the sense that the mean squared error (MSE) with respect to the true solution is minimized. In this paper, we introduce the Sharp Optimal Lepskiĭ-Inspired Tuning (SOLIT) method, which yields an a posteriori parameter choice rule ensuring adaptive minimax rates of convergence. It depends only on the data \(Y\) and the noise level as well as the operator \(T\) and the filter \((q_\alpha)_{\alpha>0}\), and does not require any problem-dependent tuning of further parameters. We prove an oracle inequality for the corresponding MSE in a general setting and derive the rates of convergence in different scenarios. By a careful analysis we show that no other a posteriori parameter choice rule can yield a better performance in terms of the convergence rate of the MSE. In particular, our results reveal that the typical understanding that Lepskiĭ-type methods in inverse problems necessarily lose a logarithmic factor is wrong. In addition, the empirical performance of SOLIT is examined in simulations.
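The SOLIT rule itself is not spelled out on this page, so the following is only a minimal sketch of the general setup the abstract describes: a filter-based reconstruction (here a Tikhonov filter \(q_\alpha(\lambda) = 1/(\lambda+\alpha)\)) in a diagonal (SVD) model, combined with a classical Lepskiĭ-type balancing principle as the a posteriori parameter choice. The operator, the smoothness of the true solution, the candidate grid, and the balancing constant `kappa` are all illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Diagonal forward operator T in its SVD basis (assumed, for illustration):
# mildly ill-posed with polynomially decaying singular values.
n = 200
s = np.arange(1, n + 1, dtype=float) ** -1.0   # singular values of T
f_true = np.arange(1, n + 1, dtype=float) ** -1.5  # coefficients of the true solution
sigma = 1e-3                                   # noise level
Y = s * f_true + sigma * rng.standard_normal(n)  # noisy data Y = T f + sigma * xi

def tikhonov_estimate(alpha):
    # Filter-based reconstruction q_alpha(T*T) T* Y with q_alpha(lam) = 1/(lam + alpha);
    # in the SVD basis, lam = s**2 and T* acts by multiplication with s.
    return s / (s ** 2 + alpha) * Y

def noise_bound(alpha):
    # Standard deviation of the propagated noise term:
    # sigma times the Hilbert-Schmidt norm of q_alpha(T*T) T*.
    return sigma * np.sqrt(np.sum((s / (s ** 2 + alpha)) ** 2))

alphas = np.geomspace(1e-8, 1e-1, 30)  # candidate parameters, least regularized first
estimates = [tikhonov_estimate(a) for a in alphas]
bounds = [noise_bound(a) for a in alphas]

# Lepskii-type balancing: accept the largest alpha whose estimate stays within
# kappa times the noise bound of every less-regularized candidate.
kappa = 4.0
chosen = 0
for j in range(len(alphas)):
    if all(np.linalg.norm(estimates[j] - estimates[i]) <= kappa * bounds[i]
           for i in range(j)):
        chosen = j
    else:
        break

alpha_hat = alphas[chosen]
f_hat = estimates[chosen]
```

The balancing loop implements the usual bias-variance trade-off heuristic: for small `alpha` the estimates differ only by noise, so the pairwise distances stay below the noise bounds; once the bias of the more regularized estimate dominates, the test fails and the last accepted `alpha` is returned.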
Recommendations
- Adaptivity and oracle inequalities in linear statistical inverse problems: a (numerical) survey
- Empirical risk minimization as parameter choice rule for general linear regularization methods
- Adaptive complexity regularization for linear inverse problems
- Adaptive hard-thresholding for linear inverse problems
- Oracle inequalities for inverse problems
Cites work
- Scientific article (zbMATH DE number 3911594)
- Scientific article (zbMATH DE number 781866)
- Scientific article (zbMATH DE number 936298)
- A Lepskij-type stopping rule for regularized Newton methods
- A new chi-square approximation to the distribution of non-negative definite quadratic forms in non-central normal variables
- A statistical approach to some inverse problems for partial differential equations
- Adaptivity and oracle inequalities in linear statistical inverse problems: a (numerical) survey
- Bootstrap tuning in Gaussian ordered model selection
- Comparing parameter choice methods for regularization of ill-posed problems
- Composite quantile regression and the oracle model selection theory
- Convergence Rates for Inverse Problems with Impulsive Noise
- Convergence Rates of General Regularization Methods for Statistical Inverse Problems and Applications
- Convergence rates for exponentially ill-posed inverse problems with impulsive noise
- Convergence rates in expectation for Tikhonov-type regularization of inverse problems with Poisson data
- Empirical risk minimization as parameter choice rule for general linear regularization methods
- Estimating nuisance parameters in inverse problems
- Geometry of linear ill-posed problems in variable Hilbert scales
- How general are general source conditions?
- Image deblurring with Poisson data: from cells to galaxies
- Introduction to nonparametric estimation
- Inverse problems for partial differential equations
- Inverse problems with Poisson data: statistical regularization theory, applications and algorithms
- Minimax estimation of the solution of an ill-posed convolution type problem
- Minimax theory of image reconstruction
- On Difference-Based Variance Estimation in Nonparametric Regression When the Covariate is High Dimensional
- On the best rate of adaptive estimation in some inverse problems
- Optimal filtering of square-integrable signals in Gaussian noise
- Randomized algorithms for estimating the trace of an implicit symmetric positive semi-definite matrix
- Regularization of exponentially ill-posed problems
- Regularization of some linear ill-posed problems with discretized random noisy data
- Regularization without preliminary knowledge of smoothness and error behaviour
- Statistical Inverse Estimation in Hilbert Scales
- The Le Cam distance between density estimation, Poisson processes and Gaussian white noise
- The Lepskii principle revisited
- The mathematics of computerized tomography
- Variance estimation for high-dimensional regression models