Convergence rates of Kernel Conjugate Gradient for random design regression
From MaRDI portal
Publication: 2835985
DOI: 10.1142/S0219530516400017 · zbMath: 1349.62125 · arXiv: 1607.02387 · OpenAlex: W2963613337 · MaRDI QID: Q2835985
Nicole Krämer, Gilles Blanchard
Publication date: 30 November 2016
Published in: Analysis and Applications
Full work available at URL: https://arxiv.org/abs/1607.02387
Keywords: partial least squares ⋮ nonparametric regression ⋮ reproducing kernel Hilbert space ⋮ conjugate gradient ⋮ minimax convergence rates
Nonparametric regression and quantile regression (62G08) Asymptotic properties of nonparametric inference (62G20) Optimal stopping in statistics (62L15)
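The method this publication analyzes, conjugate gradient applied to kernel regression with early stopping acting as the regularizer, can be illustrated with a minimal sketch. This is not the authors' code; the kernel choice, bandwidth, sample size, and iteration count below are illustrative assumptions only.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of X and Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kernel_cg(K, y, n_iter=8):
    """Plain conjugate gradient on K @ alpha = y, stopped early.

    Early stopping of the CG iteration plays the role of the
    regularization parameter; the paper studies how to choose the
    stopping index to attain minimax convergence rates.
    """
    alpha = np.zeros(len(y))
    r = y - K @ alpha          # initial residual
    p = r.copy()               # initial search direction
    rs = r @ r
    for _ in range(n_iter):
        Kp = K @ p
        step = rs / (p @ Kp)   # exact line search along p
        alpha += step * p
        r -= step * Kp
        rs_new = r @ r
        p = r + (rs_new / rs) * p  # K-conjugate update of direction
        rs = rs_new
    return alpha

# Toy random-design regression: noisy samples of a smooth target.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(80, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(80)
K = gaussian_kernel(X, X, sigma=0.5)
alpha = kernel_cg(K, y, n_iter=8)

# Predict on a test grid via the representer expansion.
X_test = np.linspace(-1.0, 1.0, 50)[:, None]
pred = gaussian_kernel(X_test, X, sigma=0.5) @ alpha
```

Running more CG iterations drives the training residual toward zero (overfitting the noise), while stopping early keeps the estimate smooth; the trade-off is the subject of the convergence-rate analysis.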
Related Items (16)
Toward Efficient Ensemble Learning with Structure Constraints: Convergent Algorithms and Applications ⋮ Faster Kernel Ridge Regression Using Sketching and Preconditioning ⋮ Capacity dependent analysis for functional online learning algorithms ⋮ Learning rates for the kernel regularized regression with a differentiable strongly convex loss ⋮ Asymptotic analysis for affine point processes with large initial intensity ⋮ Optimal rates for regularization of statistical inverse learning problems ⋮ Kernel conjugate gradient methods with random projections ⋮ Optimal learning rates for distribution regression ⋮ Accelerate stochastic subgradient method by leveraging local growth condition ⋮ Semi-supervised learning with summary statistics ⋮ Analysis of regularized Nyström subsampling for regression functions of low smoothness ⋮ Distributed learning with indefinite kernels ⋮ On a regularization of unsupervised domain adaptation in RKHS ⋮ Distributed least squares prediction for functional linear regression* ⋮ Error analysis of the kernel regularized regression based on refined convex losses and RKBSs ⋮ From inexact optimization to learning via gradient concentration
Cites Work
- Optimal rates for the regularized least-squares algorithm
- DOI: 10.1162/15324430260185556
- The Collinearity Problem in Linear Regression. The Partial Least Squares (PLS) Approach to Generalized Inverses
- CROSS-VALIDATION BASED ADAPTATION FOR REGULARIZATION OPERATORS IN LEARNING THEORY