Taxonomy for characterizing ensemble methods in classification tasks: a review and annotated bibliography
DOI: 10.1016/j.csda.2009.07.017
zbMATH Open: 1453.62185
OpenAlex: W2004014581
MaRDI QID: Q961895
Authors: Lior Rokach
Publication date: 1 April 2010
Published in: Computational Statistics and Data Analysis
Full work available at URL: https://doi.org/10.1016/j.csda.2009.07.017
Recommendations
- Pattern classification using ensemble methods.
- A new ensemble method with feature space partitioning for high-dimensional data classification
- Scientific article (zbMATH DE number 2018600)
- Cluster ensembles: a survey of approaches with recent extensions and applications
- Ensemble Classification Methods with Applications in R
Mathematics Subject Classification: Computational methods for problems pertaining to statistics (62-08); Classification and discrimination; cluster analysis (statistical aspects) (62H30)
Cites Work
- Java-ML: a machine learning library
- A stochastic approximation view of boosting
- Empirical characterization of random forest variable importance measures
- Random forests
- Bagging predictors
- Multivariate adaptive regression splines
- Title not available
- Title not available
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Quadratic boosting
- Arcing classifiers. (With discussion)
- Stochastic gradient boosting.
- Title not available
- Title not available
- Trimmed bagging
- Measures of diversity in classifier ensembles and their relationship with the ensemble accuracy
- Attribute bagging: Improving accuracy of classifier ensembles by using random feature subsets.
- Ensembling neural networks: Many could be better than all
- Bundling classifiers by bagging trees
- Boosting and instability for regression trees
- Title not available
- Data Mining and Knowledge Discovery Handbook
- Title not available
- Genetic algorithm-based feature set partitioning for classification problems
- Is combining classifiers with stacking better than selecting the best one?
- Bagging, boosting and the random subspace method for linear classifiers
- Looking for lumps: boosting and bagging for density estimation.
- Knowledge-based artificial neural networks
- Title not available
- Input decimated ensembles
- Title not available
- Data-driven decomposition for multi-class classification
- Classifier combination based on confidence transformation
- A local boosting algorithm for solving classification problems
- Adaptive fusion and co-operative training for classifier ensembles
- Title not available
- Experimental study for the comparison of classifier combination methods
- Title not available
- Title not available
- Title not available
- Bayesian partition modelling.
- Title not available
- Using boosting to prune double-bagging ensembles
- Using \(k\)-nearest-neighbor classification in the leaves of a tree
- Aggregating classifiers with mathematical programming
- Inverse boosting for monotone regression functions
- Parallelizing AdaBoost by weights dynamics
- Classification by ensembles from random partitions of high-dimensional data
- Robust learning from bites for data mining
- Logitboost with errors-in-variables
- Detection of unknown computer worms based on behavioral classification of the host
- Title not available
- The dynamics of adaboost: cyclic behavior and convergence of margins
- Title not available
- Title not available
- Title not available
- Title not available
- Title not available
- DOI: 10.1162/153244304322765630
- Title not available
- Title not available
- Effect of pruning and early stopping on performance of a boosting ensemble.
- Collective-agreement-based pruning of ensembles
- Improving malware detection by applying multi-inducer ensemble
- Decision trees using model ensemble-based nodes
- EROS: Ensemble rough subspaces
Cited In (21)
- Multivariate forests with missing mixed outcomes
- An asymptotically optimal kernel combined classifier
- A review of survival trees
- The relative performance of ensemble methods with deep convolutional neural networks for image classification
- A new ensemble method with feature space partitioning for high-dimensional data classification
- An empirical bias–variance analysis of DECORATE ensemble method at different training sample sizes
- Regression trees and forests for non-homogeneous Poisson processes
- Chapter 3: A comparative review of graph-based ensemble clustering as transformation methods for microarray data classification
- A nearest-neighbor-based ensemble classifier and its large-sample optimality
- \(L_1\) splitting rules in survival forests
- A weight-adjusted voting algorithm for ensembles of classifiers
- Robustness of random forests for regression
- Mixed-effects random forest for clustered data
- Automated versus do-it-yourself methods for causal inference: lessons learned from a data analysis competition
- PBoostGA: pseudo-boosting genetic algorithm for variable ranking and selection
- RandGA: injecting randomness into parallel genetic algorithm for variable selection
- Dynamic classifier aggregation using interaction-sensitive fuzzy measures
- Ensemble classification of paired data
- Pattern classification using ensemble methods.
- Aggregating classifiers via Rademacher-Walsh polynomials
- Out-of-Bag Estimation of the Optimal Hyperparameter in SubBag Ensemble Method