Learning rates of multi-kernel regularized regression
From MaRDI portal
Recommendations
- Learning rates of multi-kernel regression by orthogonal greedy algorithm
- Regularized least square algorithm with two kernels
- Learning rates of least-square regularized regression
- Learning rates of regularized regression for functional data
- Learning rates for the kernel regularized regression with a differentiable strongly convex loss
Cites work
- scientific article; zbMATH DE number 1332320 (no title available)
- scientific article; zbMATH DE number 3329342 (no title available)
- doi:10.1162/153244303321897690
- Analysis of support vector machines regression
- Fast rates for support vector machines using Gaussian kernels
- Learnability of Gaussians with flexible variances
- Learning Bounds for Support Vector Machines with Learned Kernels
- Learning Theory
- Learning and approximation by Gaussians on Riemannian manifolds
- Learning rates of least-square regularized regression
- Learning the kernel function via regularization
- Learning the kernel matrix with semidefinite programming
- Model selection for regularized least-squares algorithm in learning theory
- Multi-kernel regularized classifiers
- Neural Network Learning
- On the mathematical foundations of learning
- Ranking and empirical minimization of \(U\)-statistics
- SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming
- Support vector machine soft margin classifiers: error analysis
- Theory of Reproducing Kernels
Cited in (20)
- Reproducing kernels and choices of associated feature spaces, in the form of \(L^2\)-spaces
- Randomized multi-scale kernels learning with sparsity constraint regularization for regression
- Optimal learning rates for kernel partial least squares
- Learning rates for the kernel regularized regression with a differentiable strongly convex loss
- Statistical analysis of the moving least-squares method with unbounded sampling
- Optimal convergence rates of high order Parzen windows with unbounded sampling
- Error bounds for \(l^p\)-norm multiple kernel learning with least square loss
- Improvement of multiple kernel learning using adaptively weighted regularization
- Multi-kernel regularized classifiers
- Approximation analysis of gradient descent algorithm for bipartite ranking
- Regularized least square algorithm with two kernels
- Learning rates of multi-kernel regression by orthogonal greedy algorithm
- Learning performance of regularized regression with multiscale kernels based on Markov observations
- Refined Rademacher chaos complexity bounds with applications to the multikernel learning problem
- Fast learning rate of multiple kernel learning: trade-off between sparsity and smoothness
- Boosted kernel ridge regression: optimal learning rates and early stopping
- Optimal learning rates of \(l^p\)-type multiple kernel learning under general conditions
- Learning rates for multi-kernel linear programming classifiers
- Learning rates of \(l^q\) coefficient regularization learning with Gaussian kernel
- Learning rates of multitask kernel methods
This page was built for publication: Learning rates of multi-kernel regularized regression
MaRDI item Q974504