Fully corrective boosting with arbitrary loss and regularization
Publication: 460675
DOI: 10.1016/j.neunet.2013.07.006
zbMath: 1297.68206
OpenAlex: W1991849741
Wikidata: Q45958806
Scholia: Q45958806
MaRDI QID: Q460675
Anton van den Hengel, Hanxi Li, Chunhua Shen
Publication date: 14 October 2014
Published in: Neural Networks
Full work available at URL: http://hdl.handle.net/2440/78929
Classification (MSC):
- 62H30 Classification and discrimination; cluster analysis (statistical aspects)
- 90C25 Convex programming
- 68T05 Learning and adaptive systems in artificial intelligence
- 68T10 Pattern recognition, speech recognition
Related Items (1)
Uses Software
Cites Work
- Greedy function approximation: A gradient boosting machine.
- Deformation of log-likelihood loss function for multiclass boosting
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Improved boosting algorithms using confidence-rated predictions
- Boosting with early stopping: convergence and consistency
- Entropy Regularized LPBoost
- A theory of the learnable
- Algorithm 778: L-BFGS-B
- Prediction Games and Arcing Algorithms
- Soft margins for AdaBoost
- Linear programming boosting via column generation