A note on stability of error bounds in statistical learning theory
From MaRDI portal
Publication:3096969
Recommendations
- Stability results in learning theory
- 10.1162/153244302760200704
- Stability of unstable learning algorithms
- A survey on learning theory. I: Stability and generalization
- Learning theory: stability is sufficient for generalization and necessary and sufficient for consistency of empirical risk minimization
Cites work
- Consistency of Support Vector Machines and Other Regularized Kernel Classifiers
- Cross-validation based adaptation for regularization operators in learning theory
- Learning theory estimates via integral operators and their approximations
- Model selection for regularized least-squares algorithm in learning theory
- On regularization algorithms in learning theory
- On the mathematical foundations of learning
- Optimal rates for the regularized least-squares algorithm
- Regularization networks and support vector machines
Cited in (3)