Adaptive minimax optimality in statistical inverse problems via SOLIT—Sharp Optimal Lepskiĭ-Inspired Tuning
Publication: 6136775
DOI: 10.1088/1361-6420/AD12E0
arXiv: 2304.10356
OpenAlex: W4389367969
MaRDI QID: Q6136775
FDO: Q6136775
Publication date: 17 January 2024
Published in: Inverse Problems
Abstract: We consider statistical linear inverse problems in separable Hilbert spaces and filter-based reconstruction methods of the form $\widehat{f}_{\alpha} = h_\alpha(T^*T)\,T^*Y$, where $Y$ is the available data, $T$ the forward operator, $(h_\alpha)_{\alpha>0}$ an ordered filter, and $\alpha > 0$ a regularization parameter. Whenever such a method is used in practice, $\alpha$ has to be chosen appropriately. Typically, the aim is to find, or at least approximate, the best possible $\alpha$ in the sense that the mean squared error (MSE) $\mathbb{E}\big[\|\widehat{f}_{\alpha} - f^\dagger\|^2\big]$ w.r.t. the true solution $f^\dagger$ is minimized. In this paper, we introduce the Sharp Optimal Lepskiĭ-Inspired Tuning (SOLIT) method, which yields an a posteriori parameter choice rule ensuring adaptive minimax rates of convergence. It depends only on $Y$ and the noise level, as well as the operator $T$ and the filter $(h_\alpha)_{\alpha>0}$, and does not require any problem-dependent tuning of further parameters. We prove an oracle inequality for the corresponding MSE in a general setting and derive the rates of convergence in different scenarios. By a careful analysis we show that no other a posteriori parameter choice rule can yield a better performance in terms of the convergence rate of the MSE. In particular, our results reveal that the common understanding that Lepskiĭ-type methods in inverse problems necessarily lose a logarithmic factor is wrong. In addition, the empirical performance of SOLIT is examined in simulations.
Full work available at URL: https://arxiv.org/abs/2304.10356
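The filter-based reconstruction and the Lepskiĭ-type balancing idea described in the abstract can be sketched as follows. This is a hypothetical Python illustration, not the authors' SOLIT implementation: the diagonal toy operator, the Tikhonov filter, the grid of parameters, and the constant `kappa` are all assumptions made for the example.

```python
# Hypothetical sketch (not the SOLIT method from the paper): a filter-based
# reconstruction f_hat = h_alpha(T*T) T* Y with the Tikhonov filter
# h_alpha(lambda) = 1/(lambda + alpha) for a diagonal toy operator, combined
# with a simple Lepskii-type balancing rule for choosing alpha.
import numpy as np

rng = np.random.default_rng(0)

n = 200
s = np.arange(1, n + 1, dtype=float) ** -1.0       # singular values of T (mildly ill-posed)
f_true = np.arange(1, n + 1, dtype=float) ** -1.5  # coefficients of the true solution
sigma = 1e-3                                       # noise level
Y = s * f_true + sigma * rng.standard_normal(n)    # observed data in the singular basis

def reconstruct(alpha):
    """Tikhonov-filtered estimate: components s_k * Y_k / (s_k^2 + alpha)."""
    return s * Y / (s**2 + alpha)

# Decreasing grid of candidate parameters and the corresponding estimates.
alphas = np.geomspace(1e-1, 1e-8, 30)
recs = [reconstruct(a) for a in alphas]
# Proxy for the size of the stochastic error of each estimate.
noise = [sigma * np.linalg.norm(s / (s**2 + a)) for a in alphas]

# Lepskii-type balancing: accept the largest alpha whose estimate stays within
# a noise-driven tolerance of every less-regularized estimate.
kappa = 4.0  # tuning constant, chosen ad hoc for this toy example
chosen = len(alphas) - 1
for i in range(len(alphas)):
    if all(np.linalg.norm(recs[i] - recs[j]) <= kappa * (noise[i] + noise[j])
           for j in range(i + 1, len(alphas))):
        chosen = i
        break

alpha_star = alphas[chosen]
mse = float(np.mean((recs[chosen] - f_true) ** 2))
print(f"chosen alpha = {alpha_star:.2e}, MSE = {mse:.2e}")
```

Note that, unlike this ad hoc `kappa`, the point of SOLIT as stated in the abstract is precisely that no such problem-dependent tuning constant is required.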
Cites Work
- Introduction to nonparametric estimation
- A new chi-square approximation to the distribution of non-negative definite quadratic forms in non-central normal variables
- Composite quantile regression and the oracle model selection theory
- Estimating nuisance parameters in inverse problems
- The mathematics of computerized tomography
- Regularization of some linear ill-posed problems with discretized random noisy data
- Image deblurring with Poisson data: from cells to galaxies
- Geometry of linear ill-posed problems in variable Hilbert scales
- Title not available
- How general are general source conditions?
- Comparing parameter choice methods for regularization of ill-posed problems
- Optimal filtering of square-integrable signals in Gaussian noise
- Randomized algorithms for estimating the trace of an implicit symmetric positive semi-definite matrix
- Minimax theory of image reconstruction
- Convergence Rates for Inverse Problems with Impulsive Noise
- Convergence rates in expectation for Tikhonov-type regularization of inverse problems with Poisson data
- On Difference-Based Variance Estimation in Nonparametric Regression When the Covariate is High Dimensional
- Statistical Inverse Estimation in Hilbert Scales
- Convergence Rates of General Regularization Methods for Statistical Inverse Problems and Applications
- The Lepskii principle revisited
- Regularization of exponentially ill-posed problems
- A Lepskij-type stopping rule for regularized Newton methods
- Regularization without preliminary knowledge of smoothness and error behaviour
- On the best rate of adaptive estimation in some inverse problems
- Title not available
- A statistical approach to some inverse problems for partial differential equations
- Variance estimation for high-dimensional regression models
- Inverse problems with Poisson data: statistical regularization theory, applications and algorithms
- Convergence rates for exponentially ill-posed inverse problems with impulsive noise
- Adaptivity and Oracle Inequalities in Linear Statistical Inverse Problems: A (Numerical) Survey
- Inverse problems for partial differential equations
- Empirical risk minimization as parameter choice rule for general linear regularization methods
- Title not available
- The Le Cam distance between density estimation, Poisson processes and Gaussian white noise
- Bootstrap tuning in Gaussian ordered model selection
- Minimax estimation of the solution of an ill-posed convolution type problem
Cited In (1)