Early stopping in \(L_{2}\)Boosting
Publication: 2445675
DOI: 10.1016/j.csda.2010.03.024
zbMath: 1284.62041
OpenAlex: W1528145197
MaRDI QID: Q2445675
Yu-Pai Huang, Yuan-chin Ivan Chang, Yufen Huang
Publication date: 14 April 2014
Published in: Computational Statistics and Data Analysis
Full work available at URL: https://doi.org/10.1016/j.csda.2010.03.024
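The record carries no abstract, so as context for the title only: the following is a minimal, hypothetical sketch of componentwise \(L_{2}\)Boosting with a validation-based early-stopping rule. It is not the authors' method — the cited works suggest the paper studies information-criterion-based stopping (AIC-type rules) — and the function name, parameters, and the patience-based stopping rule here are illustrative assumptions.

```python
import numpy as np

def l2boost_early_stop(X, y, X_val, y_val, nu=0.1, max_iter=200, patience=10):
    """Componentwise L2Boosting with early stopping on a held-out set.

    Sketch under assumptions: each step fits the current residuals by
    simple least squares on the single best predictor, shrinks the step
    by `nu`, and stops once validation MSE has not improved for
    `patience` consecutive steps (a proxy for the paper's
    criterion-based stopping).
    """
    n, p = X.shape
    coef = np.zeros(p)
    intercept = y.mean()
    resid = y - intercept
    best_val, best_coef, since_best = np.inf, coef.copy(), 0
    for _ in range(max_iter):
        # Componentwise least squares: per-predictor slope on residuals.
        betas = (X.T @ resid) / (X ** 2).sum(axis=0)
        # Pick the predictor whose univariate fit leaves the smallest SSE.
        sse = ((resid[:, None] - X * betas) ** 2).sum(axis=0)
        j = int(np.argmin(sse))
        # Shrunken update, then refresh residuals.
        coef[j] += nu * betas[j]
        resid -= nu * betas[j] * X[:, j]
        # Early stopping: track the best validation MSE seen so far.
        val_err = np.mean((y_val - intercept - X_val @ coef) ** 2)
        if val_err < best_val:
            best_val, best_coef, since_best = val_err, coef.copy(), 0
        else:
            since_best += 1
            if since_best >= patience:
                break
    return intercept, best_coef
```

The shrinkage factor `nu` and the stopping rule jointly control effective model complexity, which is the trade-off the paper's title refers to.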
Cites Work
- Greedy function approximation: A gradient boosting machine.
- A stochastic approximation view of boosting
- Model selection and prediction: Normal regression
- Statistical predictor identification
- Regression and time series model selection in small samples
- An optimal selection of regression variables
- Further analysis of the data by Akaike's information criterion and the finite corrections
- Smoothing Parameter Selection in Nonparametric Regression Using an Improved Akaike Information Criterion
- Model Selection and the Principle of Minimum Description Length
- Boosting with the \(L_{2}\) loss
- The elements of statistical learning. Data mining, inference, and prediction
- Logistic regression, AdaBoost and Bregman distances
- A new look at the statistical model identification