Pages that link to "Item:Q2278452"
From MaRDI portal
The following pages link to Balancing principle in supervised learning for a general regularization scheme (Q2278452):
Displaying 18 items.
- Analysis of regularized least squares for functional linear regression model (Q1791683)
- The Goldenshluger-Lepski method for constrained least-squares estimators over RKHSs (Q1983602)
- A statistical learning assessment of Huber regression (Q2054280)
- On a regularization of unsupervised domain adaptation in RKHS (Q2075006)
- Tikhonov regularization with oversmoothing penalty for nonlinear statistical inverse problems (Q2191842)
- Convergence analysis of Tikhonov regularization for non-linear statistical inverse problems (Q2192321)
- Nonasymptotic analysis of robust regression with modified Huber's loss (Q2693696)
- Kernel regression, minimax rates and effective dimensionality: Beyond the regular case (Q3298576)
- Distributed spectral pairwise ranking algorithms (Q5060714)
- A machine learning approach to optimal Tikhonov regularization I: Affine manifolds (Q5077184)
- Error guarantees for least squares approximation with noisy samples in domain adaptation (Q6071903)
- Convex regularization in statistical inverse learning problems (Q6115632)
- Inverse learning in Hilbert scales (Q6134325)
- Nonlinear Tikhonov regularization in Hilbert scales for inverse learning (Q6193949)
- Spectral algorithms for functional linear regression (Q6592232)
- Least squares approximations in linear statistical inverse learning problems (Q6601195)
- Optimality of robust online learning (Q6645952)
- Adaptive parameter selection for kernel ridge regression (Q6657409)