Pages that link to "Item:Q3143610"
From MaRDI portal
The following pages link to Discrepancy principle for statistical inverse problems with application to conjugate gradient iteration (Q3143610):
Displaying 29 items.
- Oracle-type posterior contraction rates in Bayesian inverse problems (Q256079) (← links)
- Convolution regularization method for backward problems of linear parabolic equations (Q739024) (← links)
- Beyond the Bakushinskii veto: regularising linear inverse problems without knowing the noise distribution (Q777510) (← links)
- Early stopping for statistical inverse problems via truncated SVD estimation (Q1616307) (← links)
- Risk estimators for choosing regularization parameters in ill-posed problems -- properties and limitations (Q1785032) (← links)
- Towards adaptivity via a new discrepancy principle for Poisson inverse problems (Q2044369) (← links)
- A probabilistic oracle inequality and quantification of uncertainty of a modified discrepancy principle for statistical inverse problems (Q2153955) (← links)
- Tikhonov regularization with oversmoothing penalty for nonlinear statistical inverse problems (Q2191842) (← links)
- Smoothed residual stopping for statistical inverse problems via truncated SVD estimation (Q2209816) (← links)
- General regularization schemes for signal detection in inverse problems (Q2261922) (← links)
- Balancing principle in supervised learning for a general regularization scheme (Q2278452) (← links)
- On the lifting of deterministic convergence rates for inverse problems with stochastic noise (Q2360781) (← links)
- Multi-task learning via linear functional strategy (Q2407408) (← links)
- Discrepancy based model selection in statistical inverse problems (Q2442862) (← links)
- Parameter Choices for Fast Harmonic Spline Approximation (Q3120070) (← links)
- Adaptivity and Oracle Inequalities in Linear Statistical Inverse Problems: A (Numerical) Survey (Q4554185) (← links)
- Optimal Adaptation for Early Stopping in Statistical Inverse Problems (Q4689165) (← links)
- Adaptive discretization for signal detection in statistical inverse problems (Q4982027) (← links)
- (Q4998979) (← links)
- A modified discrepancy principle to attain optimal convergence rates under unknown noise (Q5006372) (← links)
- Optimal Convergence of the Discrepancy Principle for Polynomially and Exponentially Ill-Posed Operators under White Noise (Q5073867) (← links)
- On the discrepancy principle for stochastic gradient descent (Q5123704) (← links)
- Regularization parameter selection in indirect regression by residual based bootstrap (Q5134476) (← links)
- Bayesian inverse problems with non-commuting operators (Q5226663) (← links)
- Inverse learning in Hilbert scales (Q6134325) (← links)
- Dual gradient method for ill-posed problems using multiple repeated measurement data (Q6165999) (← links)
- Spectral algorithms for functional linear regression (Q6592232) (← links)
- Weighted discrepancy principle and optimal adaptivity in Poisson inverse problems (Q6634800) (← links)
- Regularization of linear inverse problems with irregular noise using embedding operators (Q6666617) (← links)