Learning from non-iid data: fast rates for the one-vs-all multiclass plug-in classifiers

From MaRDI portal
Publication:2948481

DOI: 10.1007/978-3-319-17142-5_32
zbMATH Open: 1459.68169
arXiv: 1408.2714
OpenAlex: W311675162
MaRDI QID: Q2948481
FDO: Q2948481


Authors:


Publication date: 30 September 2015

Published in: Lecture Notes in Computer Science

Abstract: We prove new fast learning rates for one-vs-all multiclass plug-in classifiers trained either on exponentially strongly mixing data or on data generated by a converging drifting distribution. These are two typical scenarios in which the training data are not iid. The learning rates are obtained under a multiclass version of Tsybakov's margin assumption, a type of low-noise assumption, and do not depend on the number of classes. Our results are general and include, as a special case, a previous result for binary plug-in classifiers with iid data. In contrast to previous work on least squares SVMs in the binary setting, our results retain the optimal learning rate in the iid case.
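For illustration, a plug-in classifier estimates the conditional class probabilities and then "plugs" those estimates into the Bayes rule. Below is a minimal sketch of the one-vs-all version: each class is reduced to a binary regression problem, and the predicted label is the argmax of the estimated class-probability functions. The k-NN probability estimator here is a generic nonparametric choice for illustration, not the specific estimator analyzed in the paper.

```python
import numpy as np

def knn_proba(X_train, y_bin, X_query, k=5):
    """Estimate eta(x) = P(Y = 1 | X = x) by averaging the binary labels
    of the k nearest training points (a simple nonparametric regressor)."""
    dists = np.linalg.norm(X_query[:, None, :] - X_train[None, :, :], axis=2)
    nearest = np.argsort(dists, axis=1)[:, :k]
    return y_bin[nearest].mean(axis=1)

def one_vs_all_plugin_predict(X_train, y_train, X_query, k=5):
    """One-vs-all plug-in rule: for each class c, estimate
    eta_c(x) = P(Y = c | X = x) from the binary problem {Y = c} vs. rest,
    then predict argmax_c eta_c(x)."""
    classes = np.unique(y_train)
    scores = np.stack(
        [knn_proba(X_train, (y_train == c).astype(float), X_query, k)
         for c in classes],
        axis=1,
    )
    return classes[np.argmax(scores, axis=1)]
```

Tsybakov's margin assumption controls how much mass the distribution places near the decision boundary, i.e. where the largest and second-largest eta_c(x) are close; the paper's fast rates quantify how quickly this argmax rule approaches the Bayes classifier under that low-noise condition.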


Full work available at URL: https://arxiv.org/abs/1408.2714




Cited In (7)





