adabag
swMATH: 8024 · CRAN: adabag · MaRDI QID: Q20039
Applies Multiclass AdaBoost.M1, SAMME and Bagging
Last update: 31 May 2023
Copyright license: GNU General Public License, version 3.0, GNU General Public License, version 2.0
Software version identifier: 1.0, 1.1, 2.0, 2.1, 3.0, 3.1, 3.2, 4.0, 4.1, 4.2, 4.3, 5.0
Source code repository: https://github.com/cran/adabag
It implements Freund and Schapire's AdaBoost.M1 algorithm and Breiman's Bagging algorithm, using classification trees as individual classifiers. Once these classifiers have been trained, they can be used to predict on new data, and cross-validation estimates of the error can be computed. Since version 2.0 the function margins() is available to calculate the margins for these classifiers, and greater flexibility is achieved by giving access to the rpart.control() argument of 'rpart'. Four important new features were introduced in version 3.0: AdaBoost-SAMME (Zhu et al., 2009) is implemented; a new function errorevol() shows the error of the ensembles as a function of the number of iterations; the ensembles can be pruned using the option 'newmfinal' in the predict.bagging() and predict.boosting() functions; and the posterior probability of each class can be obtained for each observation. Version 3.1 modifies the relative importance measure to take into account the gain of the Gini index given by a variable in each tree and the weights of these trees. Version 4.0 includes the margin-based ordered aggregation for Bagging pruning (Guo and Boukir, 2013) and a function to automatically prune the 'rpart' tree. Moreover, three new plots are also available: importanceplot(), plot.errorevol() and plot.margins(). Version 4.1 allows prediction on unlabeled data. Version 4.2 includes a parallel computation option for some of the functions. Version 5.0 includes the Boosting and Bagging algorithms for label ranking (Albano, Sciandra and Plaia, 2023).
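The AdaBoost-SAMME scheme mentioned above (Zhu et al., 2009) extends binary AdaBoost to K classes by adding a log(K-1) term to each classifier's weight, and the margin of an observation is the normalized support for its true class minus the best support for any other class. The following is a minimal sketch of those ideas in Python with decision stumps as base classifiers — it is not the package's R code, and all names here (fit_stump, samme_fit, samme_predict, samme_margins) are hypothetical:

```python
import numpy as np

def fit_stump(X, y, w, n_classes):
    """Exhaustive search for the lowest weighted-error decision stump."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            # weighted majority class on each side of the split
            cl = np.argmax(np.bincount(y[left], weights=w[left], minlength=n_classes))
            cr = np.argmax(np.bincount(y[~left], weights=w[~left], minlength=n_classes))
            pred = np.where(left, cl, cr)
            err = w[pred != y].sum()
            if err < best_err:
                best_err, best = err, (j, t, cl, cr)
    return best, best_err

def stump_predict(stump, X):
    j, t, cl, cr = stump
    return np.where(X[:, j] <= t, cl, cr)

def samme_fit(X, y, n_rounds=10):
    """AdaBoost-SAMME: multiclass AdaBoost with the extra log(K-1) term."""
    n, K = len(y), len(np.unique(y))
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(n_rounds):
        stump, err = fit_stump(X, y, w, K)
        if err <= 0 or err >= 1 - 1.0 / K:  # no better than chance: stop
            break
        alpha = np.log((1 - err) / err) + np.log(K - 1)  # SAMME classifier weight
        ensemble.append((alpha, stump))
        w *= np.exp(alpha * (stump_predict(stump, X) != y))  # upweight mistakes
        w /= w.sum()
    return ensemble, K

def samme_predict(ensemble, K, X):
    """Weighted vote: each stump adds its alpha to the class it predicts."""
    scores = np.zeros((len(X), K))
    for alpha, stump in ensemble:
        scores[np.arange(len(X)), stump_predict(stump, X)] += alpha
    return scores, scores.argmax(axis=1)

def samme_margins(ensemble, K, X, y):
    """Margin = normalized support for the true class minus the best rival."""
    scores, _ = samme_predict(ensemble, K, X)
    scores /= sum(alpha for alpha, _ in ensemble)
    idx = np.arange(len(X))
    true_support = scores[idx, y]
    scores[idx, y] = -np.inf
    return true_support - scores.max(axis=1)
```

Under this margin definition a correctly classified observation has a positive margin, which is the quantity the package's margins() function is built around, and an errorevol()-style curve can be obtained by evaluating truncations of the ensemble list.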
Cited In (9)
- rminer
- Ensemble classification based on generalized additive models
- traineR
- Variable selection and updating in model-based discriminant analysis for high dimensional data with food authenticity applications
- Multi-class AdaBoost
- Comparing boosting and bagging for decision trees of rankings
- Foundations of statistical algorithms. With references to R packages
- pheble
- RHSBoost: improving classification performance in imbalance data
This page was built for software: adabag