Fully corrective boosting with arbitrary loss and regularization
From MaRDI portal
Recommendations
- Boosting as a regularized path to a maximum margin classifier
- Untitled scientific article (zbMATH DE number 2089370)
- On the Bayes-risk consistency of regularized boosting methods
- Boosting as a kernel-based method
- Statistical behavior and consistency of classification methods based on convex risk minimization
Cites work
- Untitled scientific article (zbMATH DE number 3844997)
- Untitled scientific article (zbMATH DE number 3551792)
- Untitled scientific article (zbMATH DE number 1266748)
- Untitled scientific article (zbMATH DE number 1950578)
- Untitled scientific article (zbMATH DE number 2107836)
- Untitled scientific article (zbMATH DE number 845714)
- Untitled scientific article (zbMATH DE number 1391397)
- A theory of the learnable
- Additive logistic regression: a statistical view of boosting (with discussion and a rejoinder by the authors)
- Algorithm 778: L-BFGS-B
- Boosting as a regularized path to a maximum margin classifier
- Boosting with early stopping: convergence and consistency
- Deformation of log-likelihood loss function for multiclass boosting
- Efficient margin maximizing with boosting
- Entropy Regularized LPBoost
- Greedy function approximation: a gradient boosting machine
- Improved boosting algorithms using confidence-rated predictions
- Linear programming boosting via column generation
- Prediction Games and Arcing Algorithms
- Soft margins for AdaBoost
- Statistical comparisons of classifiers over multiple data sets
- The \(F_{\infty}\)-norm support vector machine
Cited in (4)
This page was built for publication: Fully corrective boosting with arbitrary loss and regularization
MaRDI item: Q460675