The following pages link to AdaBoost.MH (Q20526):
Displayed 50 items.
- Variable selection and updating in model-based discriminant analysis for high dimensional data with food authenticity applications (Q977645)
- Learning with continuous experts using drifting games (Q982637)
- Boosting GARCH and neural networks for the prediction of heteroskedastic time series (Q984159)
- Learn\(^{++}\).MF: A random subspace approach for the missing feature problem (Q991269)
- Information theoretic combination of pattern classifiers (Q991948)
- Generalized re-weighting local sampling mean discriminant analysis (Q991950)
- Cost-sensitive boosting for classification of imbalanced data (Q996413)
- EROS: Ensemble rough subspaces (Q996469)
- Prediction of Alzheimer's diagnosis using semi-supervised distance metric learning with label propagation (Q1004962)
- View independent face detection based on horizontal rectangular features and accuracy improvement using combination kernel of various sizes (Q1005651)
- Surrogate maximization/minimization algorithms and extensions (Q1009342)
- Modeling churn using customer lifetime value (Q1011323)
- A stochastic approximation view of boosting (Q1020818)
- Efficient exploration of unknown indoor environments using a team of mobile robots (Q1022458)
- A local boosting algorithm for solving classification problems (Q1023522)
- Negative correlation in incremental learning (Q1024030)
- A \(\mathbb R\)eal generalization of discrete AdaBoost (Q1028894)
- If multi-agent learning is the answer, what is the question? (Q1028919)
- On generalization performance and non-convex optimization of extended \(\nu \)-support vector machine (Q1031941)
- Some challenges for statistics (Q1039967)
- A reference model for customer-centric data mining with support vector machines (Q1042173)
- The composite absolute penalties family for grouped and hierarchical variable selection (Q1043749)
- Wrappers for feature subset selection (Q1127360)
- A game of prediction with expert advice (Q1271549)
- On the boosting ability of top-down decision tree learning algorithms (Q1305926)
- An efficient membership-query algorithm for learning DNF with respect to the uniform distribution (Q1384530)
- Measures of diversity in classifier ensembles and their relationship with the ensemble accuracy (Q1394785)
- On the difficulty of approximately maximizing agreements. (Q1401958)
- Growing support vector classifiers with controlled complexity. (Q1403745)
- On learning multicategory classification with sample queries. (Q1427857)
- A conversation with Leo Breiman. (Q1431203)
- Boosting using branching programs (Q1604220)
- Drifting games and Brownian motion (Q1604221)
- Ensembling neural networks: Many could be better than all (Q1605287)
- Estimator selection and combination in scalar-on-function regression (Q1615246)
- Two-step sparse boosting for high-dimensional longitudinal data with varying coefficients (Q1615281)
- Random average shifted histograms (Q1623662)
- Stable feature selection for biomarker discovery (Q1631176)
- Noise peeling methods to improve boosting algorithms (Q1660240)
- On minimaxity of follow the leader strategy in the stochastic setting (Q1663642)
- An update on statistical boosting in biomedicine (Q1664502)
- Online multikernel learning based on a triple-norm regularizer for semantic image classification (Q1665404)
- Automatic emergence detection in complex systems (Q1674830)
- Probability estimation for multi-class classification using adaboost (Q1677008)
- Optimizing area under the ROC curve using semi-supervised learning (Q1677038)
- Building up a robust risk mathematical platform to predict colorectal cancer (Q1688123)
- Learning rotations with little regret (Q1689555)
- Context-based unsupervised ensemble learning and feature ranking (Q1689607)
- Analysis of web visit histories. II: Predicting navigation by nested STUMP regression trees (Q1695100)
- High-dimensional time series prediction using kernel-based koopman mode regression (Q1696900)