Regularization independent of the noise level: an analysis of quasi-optimality
From MaRDI portal
Publication: Q3537465
DOI: 10.1088/0266-5611/24/5/055009
zbMATH Open: 1147.49024
arXiv: 0710.1045
OpenAlex: W3099383717
MaRDI QID: Q3537465
Authors: Frank Bauer, Markus Reiß
Publication date: 6 November 2008
Published in: Inverse Problems
Abstract: The quasi-optimality criterion chooses the regularization parameter in inverse problems without taking into account the noise level. This rule works remarkably well in practice, although Bakushinskii has shown that there are always counterexamples with very poor performance. We propose an average case analysis of quasi-optimality for spectral cut-off estimators and we prove that the quasi-optimality criterion determines estimators which are rate-optimal on average. Its practical performance is illustrated with a calibration problem from mathematical finance.
Full work available at URL: https://arxiv.org/abs/0710.1045
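The criterion described in the abstract can be illustrated with a short sketch. For a truncated SVD (spectral cut-off) estimator, the quasi-optimality rule picks the truncation level k minimizing the step size ||x_{k+1} - x_k||, which for cut-off reduces to the magnitude of the (k+1)-th SVD coefficient; no noise level enters. This is a minimal illustration of the general idea, not the estimators or analysis of the paper; the function name and test problem are made up for the example.

```python
import numpy as np

def quasi_optimality_cutoff(A, y):
    """Choose the spectral cut-off (TSVD) truncation level by the
    quasi-optimality rule: minimize the increment ||x_{k+1} - x_k||
    over candidate levels k. The noise level is never used."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    coeffs = (U.T @ y) / s          # SVD coefficients of the naive inverse
    # For spectral cut-off, ||x_{k+1} - x_k|| equals |coeffs[k]|,
    # since adding one more level changes the estimator by one term.
    steps = np.abs(coeffs[1:])
    k = int(np.argmin(steps)) + 1   # keep the first k SVD components
    x = Vt[:k].T @ coeffs[:k]       # truncated SVD estimator
    return x, k
```

On a toy diagonal problem with a rapidly decaying spectrum, the rule truncates where the coefficient sequence has its minimum, i.e. where the signal has died out but the amplified noise has not yet taken over.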
Recommendations
- The quasi-optimality criterion for classical inverse problems
- The quasi-optimality criterion in the linear functional strategy
- Some considerations concerning regularization and parameter choice algorithms
- On the quasi-optimal rules for the choice of the regularization parameter in case of a noisy operator
- Heuristic parameter choice in Tikhonov method from minimizers of the quasi-optimality function
Keywords: regularization parameter in inverse problems; spectral cut-off estimators; truncated singular value decomposition (TSVD)
Cited In (28)
- Comparing parameter choice methods for regularization of ill-posed problems
- Parameter choices for fast harmonic spline approximation
- Statistical Skorohod embedding problem: optimality and asymptotic normality
- Heuristic parameter choice in Tikhonov method from minimizers of the quasi-optimality function
- Estimation of the regularization parameter in linear discrete ill-posed problems using the Picard parameter
- Quasioptimal nonlinear filtering with regularization
- Parameter choice methods using minimization schemes
- Concentration inequalities for cross-validation in scattered data approximation
- Noise Level Free Regularization of General Linear Inverse Problems under Unconstrained White Noise
- Lasso Granger causal models: some strategies and their efficiency for gene expression regulatory networks
- Noise representation in residuals of LSQR, LSMR, and CRAIG regularization
- Statistical inference for time-changed Lévy processes via Mellin transform approach
- Statistical inference for time-changed Lévy processes via composite characteristic function estimation
- A modified discrepancy principle to attain optimal convergence rates under unknown noise
- Old and new parameter choice rules for discrete ill-posed problems
- A probabilistic oracle inequality and quantification of uncertainty of a modified discrepancy principle for statistical inverse problems
- Fast and scalable computation of shape-morphing nonlinear solutions with application to evolutional neural networks
- A mollifier approach to regularize a Cauchy problem for the inhomogeneous Helmholtz equation
- Estimation and Calibration of Lévy Models via Fourier Methods
- Beyond the Bakushinskii veto: regularising linear inverse problems without knowing the noise distribution
- Lévy density estimation via information projection onto wavelet subspaces
- The quasi-optimality criterion in the linear functional strategy
- The quasi-optimality criterion for classical inverse problems
- The Little Engine that Could: Regularization by Denoising (RED)
- Optimal adaptation for early stopping in statistical inverse problems
- Adaptivity and oracle inequalities in linear statistical inverse problems: a (numerical) survey
- Enhancing linear regularization to treat large noise