Asymptotic estimate of probability of misclassification for discriminant rules based on density estimates (Q1118294)
From MaRDI portal
Language | Label | Description | Also known as |
---|---|---|---|
English | Asymptotic estimate of probability of misclassification for discriminant rules based on density estimates | scientific article | |
Statements
Asymptotic estimate of probability of misclassification for discriminant rules based on density estimates (English)
1989
Let \(X_1,\dots,X_\ell\) and \(Y_1,\dots,Y_n\) be independent random samples from the distribution functions (d.f.) \(F\) and \(G\), respectively, and assume that \(F'=f\) and \(G'=g\). The discriminant rule that classifies an independently sampled observation \(Z\) to \(F\) if \(\hat f_\ell(Z)>\hat g_n(Z)\) and to \(G\) otherwise, where \(\hat f_\ell\) and \(\hat g_n\) are kernel estimates of \(f\) and \(g\), respectively, based on a common kernel function and the training \(X\)- and \(Y\)-samples, is considered optimal in some sense. Let \(P_f\) denote the probability measure under the assumption that \(Z\sim F\), and set \[ P_0=P_f\bigl(f(Z)>g(Z)\bigr)\quad\text{and}\quad P_N=P_f\bigl(\hat f_\ell(Z)>\hat g_n(Z)\bigr). \] In this article we derive the rate at which \(P_N\to P_0\) as \(N=\ell+n\to\infty\) for the situation where \(\ell=n\), \(F(x)=M(x-\theta_2)\) and \(G(x)=M(x-\theta_1)\) for some symmetric d.f. \(M\) and parameters \(\theta_1\), \(\theta_2\). We examine a few special cases of \(M\) and establish that the rate of convergence of \(P_N\) to \(P_0\) depends critically on the tail behavior of \(m=M'\).
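To make the rule concrete, here is a minimal Python sketch of the density-estimate classifier together with a Monte Carlo comparison of \(P_N\) and \(P_0\). The choice of \(M\) as the standard normal d.f., the parameter values, the sample sizes, and the use of a Gaussian kernel with scipy's default bandwidth are illustrative assumptions, not specifications taken from the paper.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Assumed setup for illustration: M is the standard normal d.f., so
# F(x) = M(x - theta_2) and G(x) = M(x - theta_1) are unit-variance
# normal location shifts; the values below are not from the paper.
theta_1, theta_2 = 0.0, 1.0
ell = n = 200                      # equal training-sample sizes (ell = n)
X = rng.normal(theta_2, 1.0, ell)  # training sample from F
Y = rng.normal(theta_1, 1.0, n)    # training sample from G

# Kernel density estimates f_hat and g_hat built from a common (Gaussian) kernel.
f_hat = gaussian_kde(X)
g_hat = gaussian_kde(Y)

def classify(z):
    """Discriminant rule: assign z to F if f_hat(z) > g_hat(z), else to G."""
    return "F" if f_hat(z)[0] > g_hat(z)[0] else "G"

# Monte Carlo estimate of P_N = P_f(f_hat(Z) > g_hat(Z)) for Z ~ F,
# compared with P_0 = P_f(f(Z) > g(Z)) for the true densities.
Z = rng.normal(theta_2, 1.0, 5000)
P_N = np.mean(f_hat(Z) > g_hat(Z))
# For equal-variance normal location shifts, f(z) > g(z) exactly when
# z lies closer to theta_2 than to theta_1.
P_0 = np.mean(np.abs(Z - theta_2) < np.abs(Z - theta_1))
print(f"P_N = {P_N:.3f}, P_0 = {P_0:.3f}")
```

Re-running the sketch with a heavier-tailed symmetric \(M\) (e.g. a Cauchy location family) illustrates the paper's point that how quickly \(P_N\) approaches \(P_0\) is governed by the tail behavior of \(m=M'\).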
optimal classification rule
probability of misclassification
kernel function density estimates
discriminant rule
rate of convergence
tail behavior