Fast learning rates for plug-in classifiers under the margin condition

Publication: 6475752

arXiv: math/0507180 · MaRDI QID: Q6475752 · FDO: Q6475752


Authors: Jean-Yves Audibert, Alexandre B. Tsybakov


Publication date: 8 July 2005

Abstract: It has been recently shown that, under the margin (or low noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, i.e., rates faster than $n^{-1/2}$. The works on this subject suggested the following two conjectures: (i) the best achievable fast rate is of the order $n^{-1}$, and (ii) the plug-in classifiers generally converge slower than the classifiers based on empirical risk minimization. We show that both conjectures are not correct. In particular, we construct plug-in classifiers that can achieve not only the fast, but also the super-fast rates, i.e., rates faster than $n^{-1}$. We establish minimax lower bounds showing that the obtained rates cannot be improved.
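For context, here is a brief sketch of the standard definitions behind the abstract's terminology (the notation below is the usual one and is not quoted from the paper). Let $\eta(x) = P(Y = 1 \mid X = x)$ denote the regression function and $R(f) = P(Y \neq f(X))$ the misclassification risk. The margin (low noise) assumption with parameter $\alpha \ge 0$ states that
\[
P\bigl(0 < |\eta(X) - \tfrac{1}{2}| \le t\bigr) \le C\, t^{\alpha} \qquad \text{for all } t > 0,
\]
and a plug-in classifier thresholds an estimator $\hat\eta_n$ of $\eta$ at $1/2$,
\[
\hat f_n(x) = \mathbb{1}\{\hat\eta_n(x) \ge 1/2\},
\]
with excess Bayes risk $\mathbb{E}\,R(\hat f_n) - R(f^{*})$, where $f^{*}(x) = \mathbb{1}\{\eta(x) \ge 1/2\}$ is the Bayes classifier. The rates mentioned in the abstract refer to the decay of this excess risk as the sample size $n$ grows.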

This page was built for publication: Fast learning rates for plug-in classifiers under the margin condition
