Learning rates for \(l^1\)-regularized kernel classifiers
Recommendations
- Classification with polynomial kernels and \(l^1\)-coefficient regularization
- Learning rates for regularized classifiers using multivariate polynomial kernels
- Multi-kernel regularized classifiers
- An approximation theory approach to learning with \(\ell^1\) regularization
- Approximation with polynomial kernels and SVM classifiers
Cited in (17)
- Ideal regularization for learning kernels from labels
- Concentration estimates for learning with \(\ell ^{1}\)-regularizer and data dependent hypothesis spaces
- Learning rates of kernel-based robust classification
- Multi-kernel regularized classifiers
- Error analysis of classification learning algorithms based on LUMs loss
- Learning Rates of \(l^q\) Coefficient Regularization Learning with Gaussian Kernel
- The learning rate of \(l_2\)-coefficient regularized classification with strong loss
- scientific article; zbMATH DE number 7295804
- Approximate minimization of the regularized expected error over kernel models
- Learning rates for regularized classifiers using multivariate polynomial kernels
- Optimal learning rates for kernel partial least squares
- Learning sparse low-threshold linear classifiers
- Classification with polynomial kernels and \(l^1\)-coefficient regularization
- Learning rates for multi-kernel linear programming classifiers
- Learning by nonsymmetric kernels with data dependent spaces and \(\ell^1\)-regularizer
- Regularized ranking with convex losses and \(\ell^1\)-penalty
- An approximation theory approach to learning with \(\ell^1\) regularization
This page was built for publication: Learning rates for \(l^1\)-regularized kernel classifiers (MaRDI item Q1789959)