Robust boosting with truncated loss functions
DOI: 10.1214/18-EJS1404 · zbMath: 1386.68143 · OpenAlex: W2791518061 · MaRDI QID: Q82723
Publication date: 1 January 2018
Published in: Electronic Journal of Statistics
Full work available at URL: https://projecteuclid.org/euclid.ejs/1519700496
MSC classification:
- Nonparametric robustness (62G35)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Nonconvex programming, global optimization (90C26)
- Learning and adaptive systems in artificial intelligence (68T05)
Related Items (2)
Uses Software
Cites Work
- Greedy function approximation: A gradient boosting machine.
- Boosting algorithms: regularization, prediction and model fitting
- Estimating the dimension of a model
- Convex analysis approach to d.c. programming: Theory, algorithms and applications
- Introductory lectures on convex optimization. A basic course.
- A note on margin-based loss functions in classification
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Robust penalized logistic regression with truncated loss functions
- Robust Truncated Hinge Loss Support Vector Machines
- Boosting in the Presence of Outliers: Adaptive Classification With Nonconvex Loss Functions
- All of Nonparametric Statistics
- Variable Selection for Support Vector Machines in Moderately High Dimensions
- Optimization
- An adaptive version of the boost by majority algorithm