A decision-theoretic generalization of on-line learning and an application to boosting
From MaRDI portal
Publication: 1370863
DOI: 10.1006/JCSS.1997.1504
zbMath: 0880.68103
DBLP: journals/jcss/FreundS97
OpenAlex: W1988790447
Wikidata: Q56386811 (Scholia: Q56386811)
MaRDI QID: Q1370863
Yoav Freund, Robert E. Schapire
Publication date: 16 February 1998
Published in: Journal of Computer and System Sciences
Full work available at URL: https://semanticscholar.org/paper/4ba566223e426677d12a9a18418c023a4deec77e
Related Items (first 100 items shown)
Evidential calibration of binary SVM classifiers ⋮ Mathematical optimization in classification and regression trees ⋮ Inducing wavelets into random fields via generative boosting ⋮ Angle-based cost-sensitive multicategory classification ⋮ Cost-sensitive ensemble learning: a unifying framework ⋮ Classification in general finite dimensional spaces with the \(k\)-nearest neighbor rule ⋮ Information-theoretic bounded rationality and \(\epsilon\)-optimality ⋮ Representation in the (artificial) immune system ⋮ Finding causative genes from high-dimensional data: an appraisal of statistical and machine learning approaches ⋮ Feature selection filter for classification of power system operating states ⋮ An empirical comparison of classification algorithms for mortgage default prediction: evidence from a distressed mortgage market ⋮ Identifying the interacting positions of a protein using Boolean learning and support vector machines ⋮ Cost-sensitive boosting algorithms: do we really need them? ⋮ Analysis of web visit histories. I: Distance-based visualization of sequence rules ⋮ PBoostGA: pseudo-boosting genetic algorithm for variable ranking and selection ⋮
Tutorial series on brain-inspired computing. VI: Geometrical structure of boosting algorithm ⋮ Supervised classification and mathematical optimization ⋮ Mean and quantile boosting for partially linear additive models ⋮ Empirical models based on features ranking techniques for corporate financial distress prediction ⋮ Boosting of granular models ⋮ Improved customer choice predictions using ensemble methods ⋮ Breast cancer prediction using the isotonic separation technique ⋮ Conditional validity of inductive conformal predictors ⋮ \(L_{2}\) boosting in kernel regression ⋮ Knee joint vibration signal analysis with matching pursuit decomposition and dynamic weighted classifier fusion ⋮ ML-KNN: A lazy learning approach to multi-label learning ⋮ Face detection with boosted Gaussian features ⋮ Knowledge acquisition and development of accurate rules for predicting protein stability changes ⋮ Self-improved gaps almost everywhere for the agnostic approximation of monomials ⋮ Combining initial segments of lists ⋮ A Fisher consistent multiclass loss function with variable margin on positive examples ⋮ Small area estimation of the homeless in Los Angeles: an application of cost-sensitive stochastic gradient boosting ⋮ Performance improvement of classifier fusion for batch samples based on upper integral ⋮ Probabilistic combination of classification rules and its application to medical diagnosis ⋮ Survey on speech emotion recognition: features, classification schemes, and databases ⋮ Multi-label classification and extracting predicted class hierarchies ⋮ Blasso for object categorization and retrieval: towards interpretable visual models ⋮ Predicate logic based image grammars for complex pattern recognition ⋮ A time-series modeling method based on the boosting gradient-descent theory ⋮ Ternary Bradley-Terry model-based decoding for multi-class classification and its extensions ⋮ A simple extension of boosting for asymmetric mislabeled data ⋮ Online variance minimization ⋮
Online learning from local features for video-based face recognition ⋮ Variable selection for nonparametric Gaussian process priors: Models and computational strategies ⋮ Sparse weighted voting classifier selection and its linear programming relaxations ⋮ Risk bounds for CART classifiers under a margin condition ⋮ Further results on the margin explanation of boosting: new algorithm and experiments ⋮ Survival ensembles by the sum of pairwise differences with application to lung cancer microarray studies ⋮ Comment on: ``Support vector machines with applications'' ⋮ Boosting algorithms: regularization, prediction and model fitting ⋮ Comment on: Boosting algorithms: regularization, prediction and model fitting ⋮ A boosting method for maximization of the area under the ROC curve ⋮ Representing and recognizing objects with massive local image patches ⋮ A boosting approach for supervised Mahalanobis distance metric learning ⋮ Functional gradient ascent for probit regression ⋮ A noise-detection based AdaBoost algorithm for mislabeled data ⋮ Practical speech emotion recognition based on online learning: from acted data to elicited data ⋮ Nonstochastic bandits: Countable decision set, unbounded costs and reactive environments ⋮ Boosting multi-features with prior knowledge for mini unmanned helicopter landmark detection ⋮ The value of agreement a new boosting algorithm ⋮ Method for quickly inferring the mechanisms of large-scale complex networks based on the census of subgraph concentrations ⋮ Weight-selected attribute bagging for credit scoring ⋮
Does modeling lead to more accurate classification? A study of relative efficiency in linear classification ⋮ Sample-weighted clustering methods ⋮ Unsupervised weight parameter estimation method for ensemble learning ⋮ A testing based extraction algorithm for identifying significant communities in networks ⋮ A lazy bagging approach to classification ⋮ Concept drift detection via competence models ⋮ Online aggregation of unbounded losses using shifting experts with confidence ⋮ Cox process functional learning ⋮ Boosting conditional probability estimators ⋮ Support vector machines based on convex risk functions and general norms ⋮ An extensive comparison of recent classification tools applied to microarray data ⋮ Boosting and instability for regression trees ⋮ Multiple kernel boosting framework based on information measure for classification ⋮ Boosting additive models using component-wise P-splines ⋮ Using boosting to prune double-bagging ensembles ⋮ Estimating classification error rate: repeated cross-validation, repeated hold-out and bootstrap ⋮ The Bayesian additive classification tree applied to credit risk modelling ⋮ BART: Bayesian additive regression trees ⋮ Non-crossing large-margin probability estimation and its application to robust SVM via preconditioning ⋮ A study on iris localization and recognition on mobile phones ⋮ Heterogeneous stacking for classification-driven watershed segmentation ⋮ Least angle and \(\ell_{1}\) penalized regression: a review ⋮ From cluster ensemble to structure ensemble ⋮ Tree models for difference and change detection in a complex environment ⋮ Subsemble: an ensemble method for combining subset-specific algorithm fits ⋮ Reducing forgeries in writer-independent off-line signature verification through ensemble of classifiers ⋮ Projective morphologies and their application in structural analysis of digital images ⋮ Machine learning approaches for discrimination of extracellular matrix proteins using hybrid feature space ⋮
Optimal learning for sequential sampling with non-parametric beliefs ⋮ Component-wisely sparse boosting ⋮ Machine learning feature selection methods for landslide susceptibility mapping ⋮ Fast pedestrian detection system with a two layer cascade of classifiers ⋮ Soft-max boosting ⋮ AdaBoost.MH ⋮ Hybrid cluster ensemble framework based on the random combination of data transformation operators ⋮ Variable selection using penalized empirical likelihood ⋮ Regularization of case-specific parameters for robustness and efficiency ⋮ Accurate tree-based missing data imputation and data fusion within the statistical learning paradigm
Cites Work
- An analog of the minimax theorem for vector payoffs
- Some special Vapnik-Chervonenkis classes
- Estimation of dependences based on empirical data. Transl. from the Russian by Samuel Kotz
- A game of prediction with expert advice
- The weighted majority algorithm
- Boosting a weak learning algorithm by majority
- Universal Portfolios
- How to use expert advice