Convergence Rates of General Regularization Methods for Statistical Inverse Problems and Applications
Publication: 5302192
DOI: 10.1137/060651884 · zbMath: 1234.62062 · OpenAlex: W2091178361 · MaRDI QID: Q5302192
Authors: Nicolai Bissantz, Thorsten Hohage, Axel Munk, Frits Ruymgaart
Publication date: 6 January 2009
Published in: SIAM Journal on Numerical Analysis
Full work available at URL: https://doi.org/10.1137/060651884
Keywords: Tikhonov regularization; nonparametric regression; errors in variables; boosting; minimax convergence rates; Hilbert scales; iterative regularization methods; satellite gradiometry
MSC classifications: Nonparametric regression and quantile regression (62G08); Asymptotic properties of nonparametric inference (62G20); Nonparametric inference (62G99)
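For orientation, the sketch below illustrates the kind of spectral regularization method named in the keywords above: a general filter-based estimator \(\hat f_\alpha = q_\alpha(K^*K)K^*Y\), here with the Tikhonov filter \(q_\alpha(\lambda) = 1/(\lambda+\alpha)\). This is a minimal, self-contained example in Python/NumPy, not the publication's code; the Volterra-type operator, noise level, and parameter grid are illustrative assumptions.

```python
# Minimal sketch of spectral (filter-based) regularization for a statistical
# inverse problem. Illustrative only: the operator, noise level, and alpha
# grid below are invented assumptions, not taken from the publication.
import numpy as np

rng = np.random.default_rng(0)

# Discretized ill-posed forward operator: a Volterra-type integration matrix.
n = 200
t = np.linspace(0, 1, n)
K = (t[:, None] >= t[None, :]).astype(float) / n

f_true = np.sin(2 * np.pi * t)                    # unknown signal
sigma = 1e-3                                      # assumed noise level
y = K @ f_true + sigma * rng.standard_normal(n)   # noisy indirect data

# General regularization via a filter function q_alpha applied to K^T K:
#   f_alpha = q_alpha(K^T K) K^T y,
# with Tikhonov's filter q_alpha(lam) = 1 / (lam + alpha) as the example.
U, s, Vt = np.linalg.svd(K, full_matrices=False)
lam = s**2

def filtered_estimate(alpha):
    q = 1.0 / (lam + alpha)          # Tikhonov filter on the spectrum
    return Vt.T @ (q * s * (U.T @ y))

for alpha in [1e-2, 1e-4, 1e-6]:
    err = np.linalg.norm(filtered_estimate(alpha) - f_true) / np.sqrt(n)
    print(f"alpha={alpha:.0e}  RMSE={err:.4f}")
```

Other filters (truncated SVD, Landweber/boosting iterations) fit the same template by swapping the definition of `q`; the convergence-rate theory the publication develops covers such filter families under source conditions.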
Related Items (89)
On the lifting of deterministic convergence rates for inverse problems with stochastic noise
Early stopping for statistical inverse problems via truncated SVD estimation
Weyl eigenvalue asymptotics and sharp adaptation on vector bundles
Smooth backfitting in additive inverse regression
Adaptive estimation for an inverse regression model with unknown operator
Consistency of Bayesian inference with Gaussian process priors for a parabolic inverse problem
Regularized Posteriors in Linear Ill-Posed Inverse Problems
Exponential Inequalities in Stochastic Inverse Problems Using an Iterative Method
Regularization of linear ill-posed problems involving multiplication operators
Adaptivity and Oracle Inequalities in Linear Statistical Inverse Problems: A (Numerical) Survey
A probabilistic oracle inequality and quantification of uncertainty of a modified discrepancy principle for statistical inverse problems
Variational multiscale nonparametric regression: smooth functions
Operator-theoretic and regularization approaches to ill-posed problems
Pointwise convergence in probability of general smoothing splines
Optimal Convergence of the Discrepancy Principle for Polynomially and Exponentially Ill-Posed Operators under White Noise
Examples of \(L^2\)-complete and boundedly-complete distributions
On regularization algorithms in learning theory
Bayesian inverse problems with non-conjugate priors
On universal oracle inequalities related to high-dimensional linear models
Convergence of regularization methods with filter functions for a regularization parameter chosen with GSURE and mildly ill-posed inverse problems
Empirical risk minimization as parameter choice rule for general linear regularization methods
Minimax rates for statistical inverse problems under general source conditions
Characterizations of Variational Source Conditions, Converse Results, and Maxisets of Spectral Regularization Methods
On convergence rates of adaptive ensemble Kalman inversion for linear ill-posed problems
Variational regularization in inverse problems and machine learning
A note on confidence intervals for deblurred images
Primal and dual Bregman methods with application to optical nanoscopy
Adaptive minimax optimality in statistical inverse problems via SOLIT—Sharp Optimal Lepskiĭ-Inspired Tuning
Optimal regularized hypothesis testing in statistical inverse problems
Noise Level Free Regularization of General Linear Inverse Problems under Unconstrained White Noise
Fredholm integral equations of the first kind and topological information theory
Locally adaptive image denoising by a statistical multiresolution criterion
On Design of Polyhedral Estimates in Linear Inverse Problems
A unified treatment for non-asymptotic and asymptotic approaches to minimax signal detection
Consistency of Bayesian inference with Gaussian process priors in an elliptic inverse problem
Adaptive complexity regularization for linear inverse problems
Nonparametric estimation of the volatility function in a high-frequency model corrupted by noise
Adaptive spectral regularizations of high dimensional linear models
Spatially inhomogeneous linear inverse problems with possible singularities
Additive inverse regression models with convolution-type operators
Boosting algorithms: regularization, prediction and model fitting
Asymptotics for TAYLEX and SIMEX estimators in deconvolution of densities
Confidence regions for images observed under the Radon transform
Bayesian inverse problems with Gaussian priors
Bayesian linear inverse problems in regularity scales
Testing for Lack of Fit in Inverse Regression—with Applications to Biophotonic Imaging
Optimal rates for regularization of statistical inverse learning problems
On the Asymptotical Regularization for Linear Inverse Problems in Presence of White Noise
On the regularizing property of stochastic gradient descent
Multiscale scanning in inverse problems
ON RATE OPTIMALITY FOR ILL-POSED INVERSE PROBLEMS IN ECONOMETRICS
CONVERGENCE RATES FOR ILL-POSED INVERSE PROBLEMS WITH AN UNKNOWN OPERATOR
Asymptotics for spectral regularization estimators in statistical inverse problems
Goodness-of-fit testing strategies from indirect observations
General regularization schemes for signal detection in inverse problems
Signal detection for inverse problems in a multidimensional framework
The Stein hull
REGULARIZING PRIORS FOR LINEAR INVERSE PROBLEMS
Möbius deconvolution on the hyperbolic plane with application to impedance density estimation
Iterative Solution Methods
Backward problem for time-space fractional diffusion equations in Hilbert scales
Optimal Adaptation for Early Stopping in Statistical Inverse Problems
Convergence Rates for Penalized Least Squares Estimators in PDE Constrained Regression Problems
Convergence analysis of (statistical) inverse problems under conditional stability estimates
Optimal Convergence Rates for Tikhonov Regularization in Besov Spaces
Towards adaptivity via a new discrepancy principle for Poisson inverse problems
Consistency of the Tikhonov's regularization in an ill-posed problem with random data
Penalized estimators for non linear inverse problems
Risk hull method for spectral regularization in linear statistical inverse problems
Convergence Rates of Spectral Regularization Methods: A Comparison between Ill-Posed Inverse Problems and Statistical Kernel Learning
Adaptive discretization for signal detection in statistical inverse problems
Modern regularization methods for inverse problems
Solving inverse problems using data-driven models
Bayesian inverse problems with partial observations
Nonlinear Tikhonov regularization in Hilbert scales with balancing principle tuning parameter in statistical inverse problems
Identification and estimation of nonlinear models using two samples with nonclassical measurement errors
Adaptive estimation in the linear random coefficients model when regressors have limited variation
Kernel partial least squares for stationary data
Asymptotic normality and confidence intervals for inverse regression models with convolution-type operators
The empirical process of residuals from an inverse regression
Beyond the Bakushinskii veto: regularising linear inverse problems without knowing the noise distribution
Ill-Posed Problems: Operator Methodologies of Resolution and Regularization
A method for identifying a spacewise-dependent heat source under stochastic noise interference
A modified discrepancy principle to attain optimal convergence rates under unknown noise
Variational Bayes' Method for Functions with Applications to Some Inverse Problems
Optimal indirect estimation for linear inverse problems with discretely sampled functional data
Error estimates for variational regularization of inverse problems with general noise models for data and operator
Exponential Inequalities in Calibration Problems with Gaussian Errors
Confidence bands for multivariate and time dependent inverse regression models