Improved boosting algorithms using confidence-rated predictions

From MaRDI portal
Publication: 1969321

DOI: 10.1023/A:1007614523901
zbMath: 0945.68194
MaRDI QID: Q1969321

Robert E. Schapire, Yoram Singer

Publication date: 16 March 2000

Published in: Machine Learning
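The paper's central idea is to let each weak hypothesis output a real-valued, confidence-rated prediction rather than a hard ±1 label, folding the usual α weight into the prediction itself: with domain-partitioning weak learners, each block predicts c = ½ ln(W₊/W₋) (smoothed), and example weights update as D ← D·exp(−y·h(x))/Z. The sketch below is a minimal illustration of that scheme using threshold stumps on toy data; it is not the paper's exact algorithm, and the function name and toy dataset are invented for illustration.

```python
import numpy as np

def train_confidence_rated_adaboost(X, y, n_rounds=10, eps=1e-6):
    """Illustrative sketch of confidence-rated AdaBoost with threshold stumps.

    Each stump splits on one feature at a threshold and outputs, on each side b,
    the real-valued prediction c_b = 0.5 * ln(W_+^b / W_-^b) (smoothed by eps),
    so the round's "alpha" is absorbed into the prediction. The stump chosen
    each round is the one minimizing the normalizer Z, as in the paper's
    criterion; weights update as D <- D * exp(-y * h(x)) / Z.
    """
    n, d = X.shape
    D = np.full(n, 1.0 / n)  # initial example distribution D_1
    stumps = []
    for _ in range(n_rounds):
        best = None
        # exhaustive search over (feature, threshold) for the minimal Z
        for j in range(d):
            for thr in np.unique(X[:, j]):
                side = X[:, j] <= thr  # partition examples into two blocks
                c = np.empty(2)
                for b, mask in enumerate((side, ~side)):
                    wp = D[mask & (y == 1)].sum()   # weight of positives in block
                    wm = D[mask & (y == -1)].sum()  # weight of negatives in block
                    c[b] = 0.5 * np.log((wp + eps) / (wm + eps))
                h = np.where(side, c[0], c[1])      # confidence-rated prediction
                Z = (D * np.exp(-y * h)).sum()      # normalizer to minimize
                if best is None or Z < best[0]:
                    best = (Z, j, thr, c, h)
        Z, j, thr, c, h = best
        stumps.append((j, thr, c))
        D = D * np.exp(-y * h) / Z  # reweight examples and renormalize

    def predict(Xq):
        # final classifier: sign of the sum of confidence-rated predictions
        F = np.zeros(len(Xq))
        for j, thr, c in stumps:
            F += np.where(Xq[:, j] <= thr, c[0], c[1])
        return np.sign(F)

    return predict

# toy usage: a 1-D problem separable by a single threshold
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([-1, -1, 1, 1])
predict = train_confidence_rated_adaboost(X, y, n_rounds=3)
```

Because the weak hypotheses carry their own confidence, low-confidence blocks contribute little to the final sum, which is what drives the improved convergence bounds the paper proves.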




Related Items

AdaBoost Semiparametric Model Averaging Prediction for Multiple Categories
Modeling surrender risk in life insurance: theoretical and experimental insight
On the measurement complexity of differentially private query answering
Transforming examples for multiclass boosting
Evolution of the Viola-Jones Object Detection Method: A Survey
Fuzzy OWL-Boost: learning fuzzy concept inclusions via real-valued boosting
Binary multi-layer classifier
The improved AdaBoost algorithms for imbalanced data classification
Robust Boosting Algorithm Against Mislabeling in Multiclass Problems
Evaluating early pandemic response through length-of-stay analysis of case logs and epidemiological modeling: a case study of Singapore in early 2020
A Bayesian non-parametric modeling to estimate student response to ICT investment
The synergy between PAV and AdaBoost
Effect of pruning and early stopping on performance of a boosting ensemble.
Listwise approaches based on feature ranking discovery
Comment
A Hybrid Approach of Boosting Against Noisy Data
Theory and Algorithm for Learning with Dissimilarity Functions
Hybrid classification algorithms based on boosting and support vector machines
Online Adaptive Decision Trees: Pattern Classification and Function Approximation
The AdaBoost Flow
COST-SENSITIVE MULTI-CLASS ADABOOST FOR UNDERSTANDING DRIVING BEHAVIOR BASED ON TELEMATICS
Superlinear Integrality Gaps for the Minimum Majority Problem
On approximating weighted sums with exponentially many terms
Angle-based cost-sensitive multicategory classification
Cost-sensitive ensemble learning: a unifying framework
Population theory for boosting ensembles.
Statistical behavior and consistency of classification methods based on convex risk minimization.
Classification by evolutionary ensembles
Multi-class boosting with asymmetric binary weak-learners
Using social media for classifying actionable insights in disaster scenario
A hybrid filter/wrapper approach of feature selection using information theory
Boosting the partial least square algorithm for regression modelling
Multi-class learning by smoothed boosting
Feature mining and pattern classification for steganalysis of LSB matching steganography in grayscale images
Quadratic boosting
Cost-sensitive boosting algorithms: do we really need them?
Reduction from Cost-Sensitive Ordinal Ranking to Weighted Binary Classification
Extensions of stability selection using subsamples of observations and covariates
A robust AdaBoost.RT based ensemble extreme learning machine
Breast cancer prediction using the isotonic separation technique
\(L_{2}\) boosting in kernel regression
Probability estimation for multi-class classification using adaboost
ML-KNN: A lazy learning approach to multi-label learning
Self-improved gaps almost everywhere for the agnostic approximation of monomials
Boosting random subspace method
Discriminative Reranking for Natural Language Parsing
Hellinger distance decision trees are robust and skew-insensitive
Predicate logic based image grammars for complex pattern recognition
Goal scoring, coherent loss and applications to machine learning
Complexity in the case against accuracy estimation
A time-series modeling method based on the boosting gradient-descent theory
Using LogitBoost classifier to predict protein structural classes
Ternary Bradley-Terry model-based decoding for multi-class classification and its extensions
Boosting \(k\)-NN for categorization of natural scenes
Logical correctors in the problem of classification by precedents
First order random forests: Learning relational classifiers with complex aggregates
Multilabel classification with meta-level features in a learning-to-rank framework
Multiclass boosting with adaptive group-based \(k\)NN and its application in text categorization
Comment on: Boosting algorithms: regularization, prediction and model fitting
A boosting approach for supervised Mahalanobis distance metric learning
A new column generation algorithm for logical analysis of data
Random classification noise defeats all convex potential boosters
Hierarchical linear support vector machine
A noise-detection based AdaBoost algorithm for mislabeled data
Fully corrective boosting with arbitrary loss and regularization
A three-way selective ensemble model for multi-label classification
Boosting with missing predictors
Imbalanced classification in sparse and large behaviour datasets
A dynamic ensemble approach to robust classification in the presence of missing data
An unbiased method for constructing multilabel classification trees
Tune and mix: learning to rank using ensembles of calibrated multi-class classifiers
Calibrating AdaBoost for phoneme classification
Greedy function approximation: A gradient boosting machine.
Efficient HOG human detection
Simultaneous spotting of signs and fingerspellings based on hierarchical conditional random fields and boostmap embeddings
Using natural class hierarchies in multi-class visual classification
Supervised projection approach for boosting classifiers
Gene boosting for cancer classification based on gene expression profiles
A reconfigurable architecture for rotation invariant multi-view face detection based on a novel two-stage boosting method
Boosting GARCH and neural networks for the prediction of heteroskedastic time series
Cost-sensitive learning and decision making revisited
Local fractal and multifractal features for volumic texture characterization
A multi-objective optimisation approach for class imbalance learning
Boosted multi-class semi-supervised learning for human action recognition
Gender discriminating models from facial surface normals
Deformation of log-likelihood loss function for multiclass boosting
CORES: fusion of supervised and unsupervised training methods for a multi-class classification problem
BoostWofE: a new sequential weights of evidence model reducing the effect of conditional dependency
Cost-sensitive boosting for classification of imbalanced data
EROS: Ensemble rough subspaces
Multicategory large margin classification methods: hinge losses vs. coherence functions
New multicategory boosting algorithms based on multicategory Fisher-consistent losses
Learning a priori constrained weighted majority votes
An improved multiclass LogitBoost using adaptive-one-vs-one
Robust Algorithms via PAC-Bayes and Laplace Distributions
Learning \((k,l)\)-contextual tree languages for information extraction from web pages
Improved MCMC sampling methods for estimating weighted sums in Winnow with application to DNF learning
Multilabel classification via calibrated label ranking
Boosted Bayesian network classifiers
Surrogate maximization/minimization algorithms and extensions
Component-wisely sparse boosting
Boosting the margin: a new explanation for the effectiveness of voting methods
Soft-max boosting
An Improved Branch-and-Bound Method for Maximum Monomial Agreement
AN ASYMMETRIC ADAPTIVE CLASSIFICATION METHOD
SVM-boosting based on Markov resampling: theory and algorithm
A local boosting algorithm for solving classification problems
What Can We Learn Privately?
A \(\mathbb R\)eal generalization of discrete AdaBoost
Three Categories Customer Churn Prediction Based on the Adjusted Real Adaboost
On PAC-Bayesian bounds for random forests
EMD and GNN-adaboost fault diagnosis for urban rail train rolling bearings
Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
THEORETICAL FOUNDATIONS AND EXPERIMENTAL RESULTS FOR A HIERARCHICAL CLASSIFIER WITH OVERLAPPING CLUSTERS
AdaBoost and robust one-bit compressed sensing
Complexities of convex combinations and bounding the generalization error in classification
Boosting with early stopping: convergence and consistency
Boosting for quantum weak learners
A geometric approach to leveraging weak learners
Drifting games and Brownian motion
Multilabel classification through random graph ensembles
Top-down decision tree learning as information based boosting


Uses Software