Discrepancy principle for statistical inverse problems with application to conjugate gradient iteration

Publication: 3143610

DOI: 10.1088/0266-5611/28/11/115011 · zbMath: 1284.47051 · OpenAlex: W2002218738 · MaRDI QID: Q3143610

Peter Mathé, Gilles Blanchard

Publication date: 3 December 2012

Published in: Inverse Problems

Full work available at URL: https://semanticscholar.org/paper/ea9ca9a4930861d4f1d1ab07dd3ccb3da6222a53
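
For orientation only, the sketch below illustrates the classical deterministic discrepancy principle used as a stopping rule for conjugate gradient iteration on the normal equations (CGNE) applied to a synthetic noisy linear problem. It is not the modified, statistical variant developed in the paper recorded here; the operator, the noise level `delta`, the threshold `tau`, and all problem sizes are illustrative assumptions.

```python
# Minimal sketch: CGNE early-stopped by the discrepancy principle
# ||A x_k - y_delta|| <= tau * delta. Illustrative only; not the paper's rule.
import numpy as np

def cgne_discrepancy(A, y_delta, delta, tau=1.1, max_iter=200):
    """CG on the normal equations A^T A x = A^T y_delta, stopped by the discrepancy principle."""
    n = A.shape[1]
    x = np.zeros(n)
    r = y_delta - A @ x          # data-space residual
    s = A.T @ r                  # negative gradient of 0.5 * ||A x - y_delta||^2
    p = s.copy()
    gamma = s @ s
    for k in range(max_iter):
        if np.linalg.norm(r) <= tau * delta:   # stop once the residual reaches the noise level
            return x, k
        q = A @ p
        alpha = gamma / (q @ q)
        x = x + alpha * p
        r = r - alpha * q
        s = A.T @ r
        gamma_new = s @ s
        p = s + (gamma_new / gamma) * p
        gamma = gamma_new
    return x, max_iter

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 200
    # Synthetic ill-conditioned operator with polynomially decaying singular values (assumption).
    U, _ = np.linalg.qr(rng.standard_normal((n, n)))
    V, _ = np.linalg.qr(rng.standard_normal((n, n)))
    sigma = np.arange(1, n + 1, dtype=float) ** -1.0
    A = U @ np.diag(sigma) @ V.T
    x_true = V @ (np.arange(1, n + 1, dtype=float) ** -1.5)  # smooth ("source-like") truth
    noise = rng.standard_normal(n)
    delta = 1e-3                                              # assumed noise level
    y_delta = A @ x_true + delta * noise / np.linalg.norm(noise)
    x_hat, k_stop = cgne_discrepancy(A, y_delta, delta)
    rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
    print(f"stopped after {k_stop} CG steps, relative error {rel_err:.3f}")
```

Early stopping acts as the regularization here: iterating past the discrepancy threshold would start fitting the noise, which is the effect the stopping rule is designed to prevent.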



Related Items

On the lifting of deterministic convergence rates for inverse problems with stochastic noise
Early stopping for statistical inverse problems via truncated SVD estimation
Adaptivity and Oracle Inequalities in Linear Statistical Inverse Problems: A (Numerical) Survey
A probabilistic oracle inequality and quantification of uncertainty of a modified discrepancy principle for statistical inverse problems
Optimal Convergence of the Discrepancy Principle for Polynomially and Exponentially Ill-Posed Operators under White Noise
Multi-task learning via linear functional strategy
Tikhonov regularization with oversmoothing penalty for nonlinear statistical inverse problems
Inverse learning in Hilbert scales
Dual gradient method for ill-posed problems using multiple repeated measurement data
On the discrepancy principle for stochastic gradient descent
Smoothed residual stopping for statistical inverse problems via truncated SVD estimation
Discrepancy based model selection in statistical inverse problems
Regularization parameter selection in indirect regression by residual based bootstrap
General regularization schemes for signal detection in inverse problems
Balancing principle in supervised learning for a general regularization scheme
Risk estimators for choosing regularization parameters in ill-posed problems -- properties and limitations
Optimal Adaptation for Early Stopping in Statistical Inverse Problems
Convolution regularization method for backward problems of linear parabolic equations
Towards adaptivity via a new discrepancy principle for Poisson inverse problems
Bayesian inverse problems with non-commuting operators
Adaptive discretization for signal detection in statistical inverse problems
Beyond the Bakushinskii veto: regularising linear inverse problems without knowing the noise distribution
Parameter Choices for Fast Harmonic Spline Approximation
A modified discrepancy principle to attain optimal convergence rates under unknown noise
Oracle-type posterior contraction rates in Bayesian inverse problems