Optimal learning rates for kernel partial least squares
Publication:1645280
DOI: 10.1007/s00041-017-9544-8
zbMath: 1395.68235
OpenAlex: W2604519558
MaRDI QID: Q1645280
Publication date: 28 June 2018
Published in: The Journal of Fourier Analysis and Applications
Full work available at URL: https://doi.org/10.1007/s00041-017-9544-8
Related Items (6)
- Toward Efficient Ensemble Learning with Structure Constraints: Convergent Algorithms and Applications
- Tikhonov regularization with oversmoothing penalty for nonlinear statistical inverse problems
- Convergence analysis of Tikhonov regularization for non-linear statistical inverse problems
- Kernel conjugate gradient methods with random projections
- Approximate kernel partial least squares
- Unnamed Item
Cites Work
- Unnamed Item
- Unnamed Item
- An empirical feature-based learning algorithm producing sparse approximations
- On regularization algorithms in learning theory
- Adaptive kernel methods using the balancing principle
- Regularization networks and support vector machines
- Optimal rates for the regularized least-squares algorithm
- Learning rates of least-square regularized regression
- Learning theory estimates via integral operators and their approximations
- On early stopping in gradient descent learning
- DOI: 10.1162/15324430260185556
- Spectral Algorithms for Supervised Learning
- Cross-validation based adaptation for regularization operators in learning theory
- Sparse Partial Least Squares Regression for Simultaneous Dimension Reduction and Variable Selection
- Regularization schemes for minimum error entropy principle