Boosting simple learners
From MaRDI portal
Publication:6566593
Cites work
- scientific article; zbMATH DE number 2089371
- scientific article; zbMATH DE number 775021
- 10.1162/1532443041424319
- A theory of multiclass boosting
- A theory of the learnable
- AdaBoost is consistent
- Boosting with the \(L_2\) loss
- Boosting: foundations and algorithms
- Density and dimension
- Discrepancy and approximations for bounded VC-dimension
- Geometric methods in the study of irregularities of distribution
- Greedy function approximation: A gradient boosting machine.
- Learnability and the Vapnik-Chervonenkis dimension
- On the Bayes-risk consistency of regularized boosting methods.
- On the density of families of sets
- Process consistency for AdaBoost.
- Sphere packing numbers for subsets of the Boolean \(n\)-cube with bounded Vapnik-Chervonenkis dimension
- Stochastic gradient boosting.
- The VC dimension of \(k\)-fold union
- Tight lower bounds on the VC-dimension of geometric set systems
- Tight upper bounds for the discrepancy of half-spaces
- Understanding machine learning. From theory to algorithms
- Vapnik–Chervonenkis dimension of axis-parallel cuts
- Zur Theorie der Gesellschaftsspiele.
This page was built for publication: Boosting simple learners