On the generalization error of fixed combinations of classifiers
DOI: 10.1016/j.jcss.2006.10.017 · zbMath: 1115.68126 · OpenAlex: W2010463294 · MaRDI QID: Q881592
Publication date: 30 May 2007
Published in: Journal of Computer and System Sciences
Full work available at URL: https://doi.org/10.1016/j.jcss.2006.10.017
MSC classifications: Computational learning theory (68Q32); Learning and adaptive systems in artificial intelligence (68T05)
Related Items (1)
Cites Work
- Enlarging the margins in perceptron decision trees
- Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
- A learning rule for very simple universal approximators consisting of a single layer of perceptrons
- Learnability and the Vapnik-Chervonenkis dimension
- Uniform Central Limit Theorems
- The sample complexity of pattern classification with neural networks: the size of the weights is more important than the size of the network
- Function Learning from Interpolation
- Structural risk minimization over data-dependent hierarchies
- DOI: 10.1162/153244302760200713
- Neural Network Learning
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- Convergence of stochastic processes