Boosting the margin: a new explanation for the effectiveness of voting methods

From MaRDI portal

Publication: 1807156

DOI: 10.1214/aos/1024691352
zbMath: 0929.62069
OpenAlex: W1975846642
Wikidata: Q115720343
Scholia: Q115720343
MaRDI QID: Q1807156

Robert E. Schapire, Yoav Freund, Peter L. Bartlett, Wee Sun Lee

Publication date: 9 November 1999

Published in: The Annals of Statistics

Full work available at URL: https://doi.org/10.1214/aos/1024691352





Related Items (showing first 100 items)

- Bounding the generalization error of convex combinations of classifiers: Balancing the dimensionality and the margins.
- Generalization error of combined classifiers.
- Generalization bounds for averaged classifiers
- On approximating weighted sums with exponentially many terms
- Population theory for boosting ensembles.
- Process consistency for AdaBoost.
- On the Bayes-risk consistency of regularized boosting methods.
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Optimal aggregation of classifiers in statistical learning.
- Large margin classification with indefinite similarities
- Learning linear PCA with convex semi-definite programming
- Chagas parasite detection in blood images using AdaBoost
- A MapReduce-based distributed SVM ensemble for scalable image classification and annotation
- Identifying the interacting positions of a protein using Boolean learning and support vector machines
- Classification of gene-expression data: the manifold-based metric learning way
- A precise high-dimensional asymptotic theory for boosting and minimum-\(\ell_1\)-norm interpolated classifiers
- An algorithmic theory of learning: robust concepts and random projection
- An analysis of diversity measures
- Data-driven decomposition for multi-class classification
- Quadratic boosting
- Cost-sensitive boosting algorithms: do we really need them?
- Tutorial series on brain-inspired computing. VI: Geometrical structure of boosting algorithm
- Noise peeling methods to improve boosting algorithms
- Multi-stage classifier design
- \(L_{2}\) boosting in kernel regression
- Pairwise fusion matrix for combining classifiers
- Multi-label optimal margin distribution machine
- Boosting random subspace method
- Deep learning of support vector machines with class probability output networks
- Preference disaggregation and statistical learning for multicriteria decision support: A review
- On hybrid classification using model assisted posterior estimates
- A time-series modeling method based on the boosting gradient-descent theory
- Double-bagging: Combining classifiers by bootstrap aggregation
- On robust classification using projection depth
- A review of boosting methods for imbalanced data classification
- Recursive aggregation of estimators by the mirror descent algorithm with averaging
- Propositionalization and embeddings: two sides of the same coin
- Empirical risk minimization is optimal for the convex aggregation problem
- Risk bounds for CART classifiers under a margin condition
- Further results on the margin explanation of boosting: new algorithm and experiments
- Boosting algorithms: regularization, prediction and model fitting
- Multiclass classification with potential function rules: margin distribution and generalization
- Anytime classification for a pool of instances
- Hierarchical linear support vector machine
- On the equivalence of weak learnability and linear separability: new relaxations and efficient boosting algorithms
- A noise-detection based AdaBoost algorithm for mislabeled data
- Discussion of: "Nonparametric regression using deep neural networks with ReLU activation function"
- Vote counting measures for ensemble classifiers.
- Entropy and divergence associated with power function and the statistical application
- The value of agreement a new boosting algorithm
- Simultaneous adaptation to the margin and to complexity in classification
- Optimal third root asymptotic bounds in the statistical estimation of thresholds
- Optimal rates of aggregation in classification under low noise assumption
- An empirical study of using Rotation Forest to improve regressors
- Boosting conditional probability estimators
- Optimal convergence rate of the universal estimation error
- Analysis of boosting algorithms using the smooth margin function
- From dynamic classifier selection to dynamic ensemble selection
- Boosting and instability for regression trees
- Maximum patterns in datasets
- Using boosting to prune double-bagging ensembles
- An efficient modified boosting method for solving classification problems
- Regularization method for predicting an ordinal response using longitudinal high-dimensional genomic data
- GA-Ensemble: a genetic algorithm for robust ensembles
- Remembering Leo Breiman
- Navigating random forests and related advances in algorithmic modeling
- Supervised projection approach for boosting classifiers
- Feature selection based on loss-margin of nearest neighbor classification
- Computer science and decision theory
- Random survival forests
- Bayesian Weibull tree models for survival analysis of clinico-genomic data
- Boosting GARCH and neural networks for the prediction of heteroskedastic time series
- An algorithmic theory of learning: Robust concepts and random projection
- Cost-sensitive learning and decision making revisited
- An analysis on the relationship between uncertainty and misclassification rate of classifiers
- Bandwidth choice for nonparametric classification
- Deformation of log-likelihood loss function for multiclass boosting
- BoostWofE: a new sequential weights of evidence model reducing the effect of conditional dependency
- A novel margin-based measure for directed hill climbing ensemble pruning
- Iterative feature selection in least square regression estimation
- PAC-Bayesian bounds for randomized empirical risk minimizers
- Arcing classifiers. (With discussion)
- Parallelizing AdaBoost by weights dynamics
- A stochastic approximation view of boosting
- Multi-scale rois selection for classifying multi-spectral images
- Negative correlation in incremental learning
- A \(\mathbb R\)eal generalization of discrete AdaBoost
- On generalization performance and non-convex optimization of extended \(\nu\)-support vector machine
- Interpretable machine learning: fundamental principles and 10 grand challenges
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- On weak base hypotheses and their implications for boosting regression and classification
- AdaBoost and robust one-bit compressed sensing
- On the perceptron's compression
- A geometric approach to leveraging weak learners
- A re-weighting strategy for improving margins
- Random projections as regularizers: learning a linear discriminant from fewer observations than dimensions
- Local discriminative distance metrics ensemble learning
- Iterative Bayes
- Top-down decision tree learning as information based boosting
- Accurate tree-based missing data imputation and data fusion within the statistical learning paradigm

