Robust boosting for regression problems
DOI: 10.1016/j.csda.2020.107065
OpenAlex: W3060687736
MaRDI QID: Q133956
Authors: Matías Salibián-Barrera, Xiaomeng Ju
Publication date: January 2021
Published in: Computational Statistics &amp; Data Analysis
Full work available at URL: https://arxiv.org/abs/2002.02054
Related Items (1)
Uses Software
Cites Work
- All Models are Wrong, but Many are Useful: Learning a Variable's Importance by Studying an Entire Class of Prediction Models Simultaneously
- Greedy function approximation: A gradient boosting machine.
- High breakdown-point and high efficiency robust estimates for regression
- Robustified \(L_2\) boosting
- Robust nonparametric regression with simultaneous scale curve estimation
- Multivariate adaptive regression splines
- On the optimality of S-estimators
- Robust nonparametric regression estimation
- Robustness of random forests for regression
- The Role of Pseudo Data for Robust Smoothing with Application to Wavelet Regression
- Robust Locally Weighted Regression and Smoothing Scatterplots
- The \(L_1\) Method for Robust Nonparametric Regression
- Boosting With the \(L_2\) Loss
- Robust estimators for additive models using backfitting
- Boosting in the Presence of Outliers: Adaptive Classification With Nonconvex Loss Functions
- Learning Theory and Kernel Machines
- Regularization and Variable Selection Via the Elastic Net
- Random forests