On the equivalence of weak learnability and linear separability: new relaxations and efficient boosting algorithms
DOI: 10.1007/s10994-010-5173-z
zbMath: 1470.68173
OpenAlex: W2069936846
MaRDI QID: Q1959594
Shai Shalev-Shwartz, Yoram Singer
Publication date: 7 October 2010
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/s10994-010-5173-z
Related Items
- A precise high-dimensional asymptotic theory for boosting and minimum-\(\ell_1\)-norm interpolated classifiers
- No-regret learning for repeated non-cooperative games with lossy bandits
Cites Work
- Convex analysis and nonlinear optimization. Theory and examples.
- A primal-dual perspective of online learning algorithms
- Boosting the margin: a new explanation for the effectiveness of voting methods
- Analysis of boosting algorithms using the smooth margin function
- DOI: 10.1162/153244301753683726
- Entropy Regularized LPBoost
- Sequential greedy approximation for certain convex optimization problems
- DOI: 10.1162/153244304773936072
- An adaptive version of the boost by majority algorithm
- Logistic regression, AdaBoost and Bregman distances