SVM-boosting based on Markov resampling: theory and algorithm
DOI: 10.1016/j.neunet.2020.07.036
zbMATH Open: 1480.62024
OpenAlex: W3048868447
Wikidata: Q98658431 (Scholia: Q98658431)
MaRDI QID: Q2057733 (FDO: Q2057733)
Authors: Hongwei Jiang, Bin Zou, Chen Xu, Jie Xu, Yuan Yan Tang
Publication date: 7 December 2021
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2020.07.036
MSC Classification
- Applications of Markov chains and discrete-time Markov processes on general state spaces (social mobility, learning theory, industrial processes, etc.) (60J20)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
- Nonparametric statistical resampling methods (62G09)
Cites Work
- A decision-theoretic generalization of on-line learning and an application to boosting
- Title not available
- Weak convergence and empirical processes. With applications to statistics
- Title not available
- Bagging predictors
- Regularization networks and support vector machines
- The Generalization Ability of Online Algorithms for Dependent Data
- Theory of Reproducing Kernels
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images
- Boosting with early stopping: convergence and consistency
- Arcing classifiers. (With discussion)
- Title not available
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Improved boosting algorithms using confidence-rated predictions
- Convexity, Classification, and Risk Bounds
- Title not available
- Boosting a weak learning algorithm by majority
- Learning rates of least-square regularized regression
- Best choices for regularization parameters in learning theory: on the bias-variance problem.
- On the Bayes-risk consistency of regularized boosting methods.
- Learning and generalisation. With applications to neural networks.
- AdaBoost is consistent
- Process consistency for AdaBoost.
- On weak base hypotheses and their implications for boosting regression and classification
- Boosted kernel ridge regression: optimal learning rates and early stopping
- Multiclass boosting: margins, codewords, losses, and algorithms
- A theory of multiclass boosting
Cited In (3)