Soft margins for AdaBoost
From MaRDI portal
Publication: 5928994
DOI: 10.1023/A:1007618119488
zbMath: 0969.68128
Wikidata: Q57525590 (Scholia: Q57525590)
MaRDI QID: Q5928994
Takashi Onoda, Gunnar Rätsch, Klaus-Robert Müller
Publication date: 2001
Published in: Machine Learning
Related Items
- A New Discriminative Kernel from Probabilistic Models
- Subspace-based support vector machines for pattern classification
- Robust relevance vector machine for classification with variational inference
- Semi-supervised AUC optimization based on positive-unlabeled learning
- Prospects of quantum-classical optimization for digital design
- Optimally regularised kernel Fisher discriminant classification
- A precise high-dimensional asymptotic theory for boosting and minimum-\(\ell_1\)-norm interpolated classifiers
- Optimal dyadic decision trees
- Multi-class learning by smoothed boosting
- Structured large margin machines: sensitive to data distributions
- An analysis of diversity measures
- Bayesian Trigonometric Support Vector Classifier
- Optimizing resources in model selection for support vector machine
- A nested heuristic for parameter tuning in support vector machines
- Multiple Kernel Learning with Gaussianity Measures
- Robustifying AdaBoost by Adding the Naive Error Rate
- Different Paradigms for Choosing Sequential Reweighting Algorithms
- Noise peeling methods to improve boosting algorithms
- A Unified Classification Model Based on Robust Optimization
- Semi-supervised learning with density-ratio estimation
- Learning kernel logistic regression in the presence of class label noise
- Imperfect imaGANation: implications of GANs exacerbating biases on facial data augmentation and snapchat face lenses
- Fast cross-validation algorithms for least squares support vector machine and kernel ridge regression
- Validation-Based Sparse Gaussian Process Classifier Design
- Low rank updated LS-SVM classifiers for fast variable selection
- The role of optimization in some recent advances in data-driven decision-making
- FEATURE SELECTION VIA LEAST SQUARES SUPPORT FEATURE MACHINE
- Robust Boosting Algorithm Against Mislabeling in Multiclass Problems
- Blasso for object categorization and retrieval: towards interpretable visual models
- SpicyMKL: a fast algorithm for multiple kernel learning with thousands of kernels
- The C-loss function for pattern classification
- Using LogitBoost classifier to predict protein structural classes
- Statistical analysis of kernel-based least-squares density-ratio estimation
- Computational complexity of kernel-based density-ratio estimation: a condition number analysis
- Learning Kernel Perceptrons on Noisy Data Using Random Projections
- Multiclass Boosting Algorithms for Shrinkage Estimators of Class Probability
- Least-squares two-sample test
- Boosting algorithms: regularization, prediction and model fitting
- A novel SVM+NDA model for classification with an application to face recognition
- Semi-supervised local Fisher discriminant analysis for dimensionality reduction
- Random classification noise defeats all convex potential boosters
- A noise-detection based AdaBoost algorithm for mislabeled data
- Classification by nearness in complementary subspaces
- Optimized fixed-size kernel models for large data sets
- Active Learning Using Hint Information
- Regularized Multitask Learning for Multidimensional Log-Density Gradient Estimation
- Fully corrective boosting with arbitrary loss and regularization
- Active learning for noisy oracle via density power divergence
- Information estimators for weighted observations
- Vote counting measures for ensemble classifiers
- Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers
- An empirical comparison of ensemble methods based on classification trees
- The synergy between PAV and AdaBoost
- Efficient Learning and Feature Selection in High-Dimensional Regression
- A fast linear-in-the-parameters classifier construction algorithm using orthogonal forward selection to minimize leave-one-out misclassification rate
- Support vector machine via nonlinear rescaling method
- Analysis of boosting algorithms using the smooth margin function
- Classifier learning with a new locality regularization method
- Using boosting to prune double-bagging ensembles
- An efficient modified boosting method for solving classification problems
- A novel Bayesian logistic discriminant model: an application to face recognition
- Greedy function approximation: A gradient boosting machine
- KPCA-based training of a kernel fuzzy classifier with ellipsoidal regions
- A hybrid novelty score and its use in keystroke dynamics-based user authentication
- Supervised projection approach for boosting classifiers
- Natural Discriminant Analysis Using Interactive Potts Models
- Robust Loss Functions for Boosting
- A Conditional Entropy Minimization Criterion for Dimensionality Reduction and Multiple Kernel Learning
- Dimensionality reduction for density ratio estimation in high-dimensional spaces
- Deformation of log-likelihood loss function for multiclass boosting
- Bankruptcy Prediction: A Comparison of Some Statistical and Machine Learning Techniques
- A boosting method with asymmetric mislabeling probabilities which depend on covariates
- Direct importance estimation for covariate shift adaptation
- Boosting in the Presence of Outliers: Adaptive Classification With Nonconvex Loss Functions
- Feature selection via sensitivity analysis of SVM probabilistic outputs
- Variable selection in kernel Fisher discriminant analysis by means of recursive feature elimination
- A Hybrid Approach of Boosting Against Noisy Data
- Variable Selection for Kernel Classification
- Parallelizing AdaBoost by weights dynamics
- Relative Density-Ratio Estimation for Robust Distribution Comparison
- Kernel logistic PLS: a tool for supervised nonlinear dimensionality reduction and binary classification
- Support Vector Machines for Dyadic Data
- A stochastic approximation view of boosting
- A local boosting algorithm for solving classification problems
- Negative correlation in incremental learning
- Structural Online Learning
- Bayesian approach to feature selection and parameter tuning for support vector machine classifiers
- Kernel Least-Squares Models Using Updates of the Pseudoinverse
- Additive logistic regression: a statistical view of boosting (with discussion and a rejoinder by the authors)
- Kernel learning at the first level of inference
- AdaBoost and robust one-bit compressed sensing
- Backward elimination model construction for regression and classification using leave-one-out criteria
- A geometric approach to leveraging weak learners