A necessary and sufficient condition for convergence of error probability estimates in k-nn discrimination (Q1082016)

    1985
    Let \((X,\theta)\) be an \(\mathbb{R}^d\times\{1,\dots,s\}\)-valued random vector, let \((X_j,\theta_j)\), \(j=1,\dots,n\), be its observed values, let \(\theta_{nj}^{(k)}\) be the \(k\)-nearest neighbor estimate of \(\theta_j\), let \(R^{(k)}\) be the limiting error probability, and let \(\hat R_{nk}\triangleq (1/n)\sum_{j=1}^{n} I_{(\theta_j\neq \theta_{nj}^{(k)})}\) be the error probability estimate. It is shown that for every \(\epsilon>0\) there are constants \(a>0\) and \(c<\infty\) such that \[ P\bigl(|\hat R_{nk}-R^{(k)}|>\epsilon\bigr)<c e^{-an} \] if and only if \((X,\theta)\) has no unregular atom; moreover, in that case the various concepts of convergence of \(\hat R_{nk}\) to \(R^{(k)}\) are equivalent.
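    The estimate \(\hat R_{nk}\) above can be sketched in code. The following is an illustrative leave-one-out implementation in plain Python (the function names and toy data are our own, not from the paper, and the deleted-estimate convention is an assumption):

```python
import random
from collections import Counter

def knn_label(x, data, labels, k, exclude=None):
    """Predict a label for point x by majority vote among its k nearest
    neighbors in `data` (squared Euclidean distance); `exclude` skips one
    index so a point does not vote for itself (leave-one-out)."""
    dists = [(sum((a - b) ** 2 for a, b in zip(x, p)), labels[i])
             for i, p in enumerate(data) if i != exclude]
    dists.sort(key=lambda t: t[0])
    votes = Counter(lab for _, lab in dists[:k])
    return votes.most_common(1)[0][0]

def error_estimate(data, labels, k):
    """Hat R_{nk}: fraction of sample points whose leave-one-out k-NN
    prediction disagrees with their observed label."""
    n = len(data)
    wrong = sum(knn_label(data[j], data, labels, k, exclude=j) != labels[j]
                for j in range(n))
    return wrong / n

# Toy two-class sample: class 1 near the origin, class 2 shifted.
random.seed(0)
data = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(50)] + \
       [(random.gauss(3, 1), random.gauss(3, 1)) for _ in range(50)]
labels = [1] * 50 + [2] * 50
print(error_estimate(data, labels, 3))
```

    With well-separated classes the printed estimate is small; the theorem concerns how fast such an estimate concentrates around its limit \(R^{(k)}\) as \(n\) grows.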
    convergence of error probability estimates
    k-nearest neighbor estimate
    unregular atom