Generalization Analysis of Fredholm Kernel Regularized Classifiers
From MaRDI portal
DOI: 10.1162/NECO_a_00967
zbMath: 1456.68148
Wikidata: Q38836885 (Scholia: Q38836885)
MaRDI QID: Q5380820
No author found.
Publication date: 6 June 2019
Published in: Neural Computation
Classification:
- 62H30 Classification and discrimination; cluster analysis (statistical aspects)
- 68T05 Learning and adaptive systems in artificial intelligence
- 45B05 Fredholm integral equations
Related Items (2)
- Modal additive models with data-driven structure identification
- Regularized modal regression with data-dependent hypothesis spaces
Cites Work
- Convergence rate of the semi-supervised greedy algorithm
- Concentration estimates for learning with \(\ell ^{1}\)-regularizer and data dependent hypothesis spaces
- Multi-kernel regularized classifiers
- Best choices for regularization parameters in learning theory: on the bias-variance problem
- The covering number in learning theory
- Statistical behavior and consistency of classification methods based on convex risk minimization
- Optimal aggregation of classifiers in statistical learning
- Learning with sample dependent hypothesis spaces
- Approximation with polynomial kernels and SVM classifiers
- Learning theory estimates via integral operators and their approximations
- On the mathematical foundations of learning
- Capacity of reproducing kernel spaces in learning theory
- Efficient agnostic learning of neural networks with bounded fan-in
- SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming
- DOI: 10.1162/153244303321897690
- Error Analysis of Coefficient-Based Regularized Algorithm for Density-Level Detection
- Learning Theory
- Theory of Reproducing Kernels