A new accelerated proximal boosting machine with convergence rate O(1/t^2)
From MaRDI portal
Publication:2103099
Cites work
- A decision-theoretic generalization of on-line learning and an application to boosting
- A primal-dual convergence analysis of boosting
- Accelerated gradient boosting
- AdaBoost is consistent
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Arcing classifiers. (With discussion)
- Boosting with the \(L_2\) loss
- Boosting a weak learning algorithm by majority
- Boosting with early stopping: convergence and consistency
- Greedy function approximation: A gradient boosting machine.
- Introductory lectures on convex optimization. A basic course.
- Logistic regression, AdaBoost and Bregman distances
- MLlib: machine learning in Apache Spark
- Population theory for boosting ensembles.
- Prediction Games and Arcing Algorithms
- Randomized Gradient Boosting Machine
- Scikit-learn: machine learning in Python
- Some theory for generalized boosting algorithms
- Stochastic gradient boosting.
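The \(O(1/t^2)\) rate in the title is the classical rate of Nesterov's accelerated gradient method for smooth convex objectives. As a generic illustration (not the paper's boosting algorithm), the following sketch applies the standard Nesterov momentum sequence to a smooth quadratic; the step size \(1/L\) and the test problem are illustrative assumptions.

```python
import numpy as np

def nesterov_agd(grad, x0, step, iters):
    """Nesterov's accelerated gradient descent (convex momentum sequence)."""
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(iters):
        x_new = y - step * grad(y)  # gradient step at the lookahead point y
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        # momentum extrapolation toward the direction of recent progress
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# Illustrative problem: minimize f(x) = 0.5 * ||A x - b||^2,
# which is L-smooth with L = ||A||_2^2.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
grad = lambda x: A.T @ (A @ x - b)
L = np.linalg.norm(A, 2) ** 2

x = nesterov_agd(grad, np.zeros(2), step=1.0 / L, iters=1000)
x_star = np.linalg.solve(A, b)  # exact minimizer for comparison
```

The guarantee is \(f(x_t) - f^* \le 2L\|x_0 - x^*\|^2 / (t+1)^2\), i.e. the \(O(1/t^2)\) rate, versus \(O(1/t)\) for plain gradient descent; accelerated and proximal boosting methods transfer this scheme to the function-space updates of gradient boosting.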