Randomized Gradient Boosting Machine
From MaRDI portal
Publication:4971024
DOI: 10.1137/18M1223277
zbMath: 1451.90122
arXiv: 1810.10158
OpenAlex: W3091997330
MaRDI QID: Q4971024
Publication date: 8 October 2020
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1810.10158
Keywords: convex optimization, ensemble methods, computational guarantees, coordinate descent, gradient boosting, first-order methods
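The entry carries no abstract, but the keywords (gradient boosting, coordinate descent, randomization) indicate the paper's core idea: a gradient boosting method that, at each iteration, fits a weak learner chosen from a random subset of candidates. Below is a minimal illustrative sketch of that idea for squared loss with decision stumps; the function names (`rgbm`, `fit_stump`) and all parameter choices are my own illustrations, not taken from the paper.

```python
import random

def fit_stump(X, r, features):
    """Find the best threshold stump on the given feature subset,
    minimizing squared error against the residuals r."""
    best = None
    for j in features:
        for t in sorted(set(row[j] for row in X)):
            left = [r[i] for i, row in enumerate(X) if row[j] <= t]
            right = [r[i] for i, row in enumerate(X) if row[j] > t]
            if not left or not right:
                continue  # degenerate split
            cl = sum(left) / len(left)
            cr = sum(right) / len(right)
            err = (sum((v - cl) ** 2 for v in left)
                   + sum((v - cr) ** 2 for v in right))
            if best is None or err < best[0]:
                best = (err, j, t, cl, cr)
    return best

def rgbm(X, y, n_rounds=50, subset_size=1, lr=0.5, seed=0):
    """Randomized gradient boosting sketch: each round fits a stump to the
    current residuals, searching only a random subset of the features."""
    rng = random.Random(seed)
    d = len(X[0])
    pred = [0.0] * len(y)
    stumps = []
    for _ in range(n_rounds):
        # Negative gradient of squared loss = residuals.
        r = [yi - pi for yi, pi in zip(y, pred)]
        # Randomized selection of candidate weak learners (here: features).
        feats = rng.sample(range(d), min(subset_size, d))
        fit = fit_stump(X, r, feats)
        if fit is None:
            continue
        _, j, t, cl, cr = fit
        stumps.append((j, t, lr * cl, lr * cr))
        pred = [p + (lr * cl if row[j] <= t else lr * cr)
                for p, row in zip(pred, X)]
    return stumps, pred
```

Restricting the search to a random subset per round links boosting to randomized coordinate descent, which is the analytical angle the keywords suggest.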
Related Items (3)
- Temporal mixture ensemble models for probabilistic forecasting of intraday cryptocurrency volume
- Driving detection based on the multifeature fusion
- A new accelerated proximal boosting machine with convergence rate \(O(1/t^2)\)
Cites Work
- Greedy function approximation: A gradient boosting machine
- A new perspective on boosting in linear regression via subgradient optimization and relatives
- SLOPE-adaptive variable selection via convex optimization
- Error bounds and convergence analysis of feasible descent methods: A general approach
- On the convergence of the coordinate descent method for convex differentiable minimization
- A decision-theoretic generalization of on-line learning and an application to boosting
- Condition number complexity of an elementary algorithm for computing a reliable solution of a conic linear system
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Worst-case complexity of cyclic coordinate descent: \(O(n^2)\) gap with randomized version
- Coordinate descent algorithms
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Boosting with early stopping: convergence and consistency
- Towards a deeper geometric, analytic and algorithmic understanding of margins
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- The Rate of Convergence of AdaBoost
- Fast Best Subset Selection: Coordinate Descent and Local Combinatorial Optimization Algorithms
- A Primal-Dual Convergence Analysis of Boosting
- On the Convergence of Block Coordinate Descent Type Methods
- A new condition number for linear programming
- Stochastic gradient boosting
- Logistic regression, AdaBoost and Bregman distances
This page was built for publication: Randomized Gradient Boosting Machine