swMATH: 8517 · MaRDI QID: Q20526 · FDO: Q20526
Author name not available
Official website: https://doi.org/10.1006/jcss.1997.1504
Cited In (showing first 100 items)
- Online learning in online auctions
- Hybrid classification algorithms based on boosting and support vector machines
- No regret learning in oligopolies: Cournot vs. Bertrand
- Statistical Learning of Multi-view Face Detection
- Learning to compete, coordinate, and cooperate in repeated games using reinforcement learning
- Ternary Bradley-Terry model-based decoding for multi-class classification and its extensions
- A simple extension of boosting for asymmetric mislabeled data
- Online variance minimization
- Forecasting financial and macroeconomic variables using data reduction methods: new empirical evidence
- An improved branch-and-bound method for maximum monomial agreement
- From cluster ensemble to structure ensemble
- Tree models for difference and change detection in a complex environment
- Estimating a sharp convergence bound for randomized ensembles
- Sparse weighted voting classifier selection and its linear programming relaxations
- Risk bounds for CART classifiers under a margin condition
- Negative correlation in incremental learning
- A support vector machine-based ensemble algorithm for breast cancer diagnosis
- A conversation with Leo Breiman.
- Further results on the margin explanation of boosting: new algorithm and experiments
- Functional gradient ascent for probit regression
- Hierarchical linear support vector machine
- Machine learning approaches for discrimination of extracellular matrix proteins using hybrid feature space
- Improved customer choice predictions using ensemble methods
- Information theoretic combination of pattern classifiers
- Practical speech emotion recognition based on online learning: from acted data to elicited data
- Stochastic boosting algorithms
- On a method for constructing ensembles of regression models
- Scheme of boosting in the problems of combinatorial optimization induced by the collective training algorithms
- Looking for lumps: boosting and bagging for density estimation.
- Boosted manifold principal angles for image set-based recognition
- An extensive comparison of recent classification tools applied to microarray data
- Reducing forgeries in writer-independent off-line signature verification through ensemble of classifiers
- Direct kernel perceptron (DKP): ultra-fast kernel ELM-based classification with non-iterative closed-form weight calculation
- Method for quickly inferring the mechanisms of large-scale complex networks based on the census of subgraph concentrations
- Multi-label classification and extracting predicted class hierarchies
- Weight-selected attribute bagging for credit scoring
- Greedy optimization classifiers ensemble based on diversity
- Online learning from local features for video-based face recognition
- Process consistency for AdaBoost.
- A testing based extraction algorithm for identifying significant communities in networks
- Estimating classification error rate: repeated cross-validation, repeated hold-out and bootstrap
- Survival ensembles by the sum of pairwise differences with application to lung cancer microarray studies
- Concept drift detection via competence models
- MILIS
- A boosting approach for supervised Mahalanobis distance metric learning
- Representing and recognizing objects with massive local image patches
- Deformation of log-likelihood loss function for multiclass boosting
- Variable selection and updating in model-based discriminant analysis for high dimensional data with food authenticity applications
- Cox process functional learning
- Supervised projection approach for boosting classifiers
- Analysis of boosting algorithms using the smooth margin function
- Unsupervised weight parameter estimation method for ensemble learning
- Boosting conditional probability estimators
- Estimating the algorithmic variance of randomized ensembles via the bootstrap
- Online aggregation of unbounded losses using shifting experts with confidence
- Hybrid cluster ensemble framework based on the random combination of data transformation operators
- Multiple kernel boosting framework based on information measure for classification
- Robust regression using biased objectives
- Learning to Rank for Information Retrieval
- EROS: Ensemble rough subspaces
- Mathematical optimization in classification and regression trees
- Cost-sensitive ensemble learning: a unifying framework
- Adaptive linear and normalized combination of radial basis function networks for function approximation and regression
- Identifying the interacting positions of a protein using Boolean learning and support vector machines
- The dynamics of adaboost: cyclic behavior and convergence of margins
- Learning with continuous experts using drifting games
- Boosting GARCH and neural networks for the prediction of heteroskedastic time series
- Tutorial series on brain-inspired computing. VI: Geometrical structure of boosting algorithm
- Terminated Ramp--Support Vector machines: A nonparametric data dependent kernel
- Deep neural networks, gradient-boosted trees, random forests: statistical arbitrage on the S&P 500
- Optimization of tree ensembles
- Regression trees and forests for non-homogeneous Poisson processes
- Classification by evolutionary ensembles
- Drifting games and Brownian motion
- Boosting of granular models
- A robust approach to model-based classification based on trimming and constraints. Semi-supervised learning in presence of outliers and label noise
- Multilabel classification through random graph ensembles
- Detection of differential item functioning in Rasch models by boosting techniques
- Breast cancer prediction using the isotonic separation technique
- Face detection with boosted Gaussian features
- Nonstochastic Multi-Armed Bandits with Graph-Structured Feedback
- Optimal learning for sequential sampling with non-parametric beliefs
- Knowledge acquisition and development of accurate rules for predicting protein stability changes
- Self-improved gaps almost everywhere for the agnostic approximation of monomials
- Relational networks of conditional preferences
- A Fisher consistent multiclass loss function with variable margin on positive examples
- Performance improvement of classifier fusion for batch samples based on upper integral
- Committee polyhedral separability: complexity and polynomial approximation
- Deep learning of support vector machines with class probability output networks
- Boosting in the presence of noise
- Machine learning feature selection methods for landslide susceptibility mapping
- Fast pedestrian detection system with a two layer cascade of classifiers
- Soft-max boosting
- A local boosting algorithm for solving classification problems
- Using boosting to prune double-bagging ensembles
- On generalization performance and non-convex optimization of extended ν-support vector machine
- Neural network ensembles: evaluation of aggregation algorithms
- Probabilistic combination of classification rules and its application to medical diagnosis
This page was built for software: AdaBoost.MH
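For readers unfamiliar with the software's namesake algorithm, the following is a minimal illustrative sketch of AdaBoost.MH, the multiclass/multilabel "Hamming" variant of AdaBoost. It maintains a weight distribution over (example, label) pairs and, each round, fits a weak learner minimizing the weighted Hamming error. The decision-stump weak learner and the toy dataset below are our own illustrative choices, not part of the AdaBoost.MH software itself.

```python
import numpy as np

def adaboost_mh_fit(X, y, n_rounds=10):
    """Fit AdaBoost.MH with single-threshold decision stumps.

    Labels are encoded as an n x K sign matrix Y (+1 for the true
    class, -1 otherwise); D is a weight distribution over all
    (example, label) pairs.
    """
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    classes = np.unique(y)
    n, d = X.shape
    K = len(classes)
    Y = np.where(y[:, None] == classes[None, :], 1.0, -1.0)
    D = np.full((n, K), 1.0 / (n * K))
    ensemble = []
    for _ in range(n_rounds):
        best, best_err = None, np.inf
        for j in range(d):
            for t in np.unique(X[:, j]):
                # Stump: +1 if feature j <= t, else -1; each label gets
                # the sign that minimizes its weighted error.
                pred = np.where(X[:, j] <= t, 1.0, -1.0)[:, None]
                err_plus = (D * (pred != Y)).sum(axis=0)
                col = D.sum(axis=0)
                signs = np.where(err_plus <= col - err_plus, 1.0, -1.0)
                err = np.minimum(err_plus, col - err_plus).sum()
                if err < best_err:
                    best_err, best = err, (j, t, signs)
        j, t, signs = best
        eps = max(best_err, 1e-12)
        alpha = 0.5 * np.log((1.0 - eps) / eps)
        # Reweight: down-weight correctly predicted pairs, up-weight mistakes.
        h = signs[None, :] * np.where(X[:, j] <= t, 1.0, -1.0)[:, None]
        D *= np.exp(-alpha * Y * h)
        D /= D.sum()
        ensemble.append((alpha, j, t, signs))
    return classes, ensemble

def adaboost_mh_predict(model, X):
    """Predict the class maximizing the combined score F(x, l)."""
    classes, ensemble = model
    X = np.asarray(X, dtype=float)
    F = np.zeros((X.shape[0], len(classes)))
    for alpha, j, t, signs in ensemble:
        F += alpha * signs[None, :] * np.where(X[:, j] <= t, 1.0, -1.0)[:, None]
    return classes[np.argmax(F, axis=1)]
```

On a separable one-dimensional three-class toy set such as `X = [[0],[1],[2],[3],[4],[5]]` with `y = [0, 0, 1, 1, 2, 2]`, a couple of boosting rounds already suffice for perfect training accuracy, since the stumps jointly carve out the three intervals.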