Fast learning rates for plug-in classifiers under the margin condition
From MaRDI portal
Publication:6475752
arXiv: math/0507180 · MaRDI QID: Q6475752 · FDO: Q6475752
Authors: Jean-Yves Audibert, Alexandre B. Tsybakov
Publication date: 8 July 2005
Abstract: It has recently been shown that, under the margin (or low noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, i.e., rates faster than $n^{-1/2}$. Work on this subject suggested the following two conjectures: (i) the best achievable fast rate is of the order $n^{-1}$, and (ii) plug-in classifiers generally converge more slowly than classifiers based on empirical risk minimization. We show that both conjectures are not correct. In particular, we construct plug-in classifiers that can achieve not only fast, but also \emph{super-fast} rates, i.e., rates faster than $n^{-1}$. We establish minimax lower bounds showing that the obtained rates cannot be improved.
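For context, the margin (or low noise) assumption mentioned in the abstract is the standard Tsybakov condition; the following is a sketch, with the constant $C_0$ and exponent $\alpha$ used as illustrative names (not taken from this page):

```latex
% Margin (low noise) assumption: for some C_0 > 0 and \alpha \ge 0,
% the regression function \eta(x) = P(Y = 1 \mid X = x) satisfies
P_X\bigl( 0 < \lvert \eta(X) - \tfrac{1}{2} \rvert \le t \bigr)
  \le C_0\, t^{\alpha} \quad \text{for all } t > 0.
% A large \alpha means \eta rarely hovers near the decision
% boundary value 1/2, which is what permits fast rates: excess
% Bayes risk decaying faster than n^{-1/2} (and, in the regimes
% constructed in this paper, even faster than n^{-1}).
```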