Learning from non-iid data: fast rates for the one-vs-all multiclass plug-in classifiers
Publication: 2948481
DOI: 10.1007/978-3-319-17142-5_32
zbMATH Open: 1459.68169
arXiv: 1408.2714
OpenAlex: W311675162
MaRDI QID: Q2948481
FDO: Q2948481
Publication date: 30 September 2015
Published in: Lecture Notes in Computer Science
Abstract: We prove new fast learning rates for the one-vs-all multiclass plug-in classifiers trained either from exponentially strongly mixing data or from data generated by a converging drifting distribution. These are two typical scenarios where training data are not iid. The learning rates are obtained under a multiclass version of Tsybakov's margin assumption, a type of low-noise assumption, and do not depend on the number of classes. Our results are general and include a previous result for binary-class plug-in classifiers with iid data as a special case. In contrast to previous works for least squares SVMs under the binary-class setting, our results retain the optimal learning rate in the iid case.
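The plug-in rule analyzed in the abstract first estimates the class posteriors and then predicts by taking the largest estimate, one binary problem per class. As a hedged illustration (not the paper's construction), the following sketch uses a simple k-nearest-neighbor posterior estimate; the function names and the choice of k-NN as the base regressor are assumptions for the example only:

```python
import numpy as np

def knn_posterior(X_train, y_train, x, k, num_classes):
    """Estimate eta_c(x) = P(Y = c | X = x) as the class frequencies
    among the k nearest training points (an illustrative estimator)."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    return np.bincount(y_train[nearest], minlength=num_classes) / k

def ova_plugin_predict(X_train, y_train, X_test, k=5):
    """One-vs-all plug-in classifier: for each test point, estimate the
    posterior of every class and predict the argmax."""
    num_classes = int(y_train.max()) + 1
    preds = []
    for x in X_test:
        eta = knn_posterior(X_train, y_train, x, k, num_classes)
        preds.append(int(np.argmax(eta)))
    return np.array(preds)
```

Under Tsybakov's margin assumption, the largest posterior stands out from the runner-up with high probability, which is what drives the fast rates for such argmax-of-estimates rules; the sketch itself is agnostic to how the training data were generated.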
Full work available at URL: https://arxiv.org/abs/1408.2714
Cites Work
- In defense of one-vs-all classification
- Prediction of time series by statistical learning: general losses and fast rates
- Analysis of comparative data with hierarchical autocorrelation
- Exponential inequalities for sums of random vectors
- Optimal aggregation of classifiers in statistical learning
- Fast learning rates for plug-in classifiers
- Fast rates for support vector machines using Gaussian kernels
- Minimum complexity regression estimation with weakly dependent observations
- New analysis and algorithm for learning with drifting distributions
- On the Rate of Convergence of Local Averaging Plug-In Classification Rules Under a Margin Condition
- Statistical analysis of some multi-category large margin classification methods
- Generalization error for multi-class margin classification
- Classification via local multi-resolution projections
- On the complexity of learning from drifting distributions
- The complexity of learning according to two models of a drifting environment
- Plug-in Approach to Active Learning
- Fast learning from \(\alpha\)-mixing observations
- Generalization and Robustness of Batched Weighted Average Algorithm with V-Geometrically Ergodic Markov Data
Cited In (7)
- Recovery guarantees for polynomial coefficients from weakly dependent data with outliers
- Adaptive group Lasso neural network models for functions of few variables and time-dependent data
- Fast learning rates for plug-in classifiers
- Posterior concentration and fast convergence rates for generalized Bayesian learning
- An adaptive multiclass nearest neighbor classifier
- A generalization bound of deep neural networks for dependent data
- Towards convergence rate analysis of random forests for classification