Asymptotic behaviour and statistical applications of divergence measures in multinomial populations: A unified study (Q1805530)

From MaRDI portal
scientific article

    Statements

    Asymptotic behaviour and statistical applications of divergence measures in multinomial populations: A unified study (English)
    3 October 1996
    Divergence measures play an important role in statistical theory, especially in the large-sample theory of estimation and testing. The underlying reason is that they are indices of statistical distance between probability distributions \(P\) and \(Q\); the smaller these indices are, the harder it is to discriminate between \(P\) and \(Q\). Many divergence measures have been proposed in the literature. In order to study their statistical properties in a unified way, we propose a generalized divergence, called the \((\underline h, \underline \varphi)\)-divergence, which includes as particular cases many divergence measures well known from the literature. Under different sets of assumptions, the asymptotic distributions of the \((\underline h, \underline \varphi)\)-divergence statistics are shown to be either normal or chi-square. The chi-square and the likelihood ratio test statistics are particular cases of the \((\underline h, \underline \varphi)\)-divergence test statistics considered. From these results, the asymptotic distributions of entropy statistics are derived as well. Applications to testing statistical hypotheses in multinomial populations are given. The Pitman and Bahadur efficiencies of tests of goodness of fit and independence based on these statistics are obtained. Finally, appendices with the asymptotic variances of many well-known divergence and entropy statistics are presented.
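    For orientation, the scalar \(\varphi\)-divergence setting that is standard in this literature can be sketched as follows; the paper's \((\underline h, \underline \varphi)\)-divergence is a more general construction, so the display below is an illustrative special case rather than the paper's exact definition. For a multinomial sample of size \(n\) with \(M\) cells, empirical frequencies \(\widehat p\) and hypothesized probabilities \(p_0\),
    \[
    D_\varphi(\widehat p, p_0) = \sum_{i=1}^{M} p_{0i}\,\varphi\!\left(\frac{\widehat p_i}{p_{0i}}\right),
    \qquad
    T_n = \frac{2n}{\varphi''(1)}\, D_\varphi(\widehat p, p_0) \;\xrightarrow{d}\; \chi^2_{M-1} \quad \text{under } H_0\colon p = p_0 .
    \]
    Taking \(\varphi(x) = (x-1)^2/2\) gives Pearson's statistic \(T_n = n\sum_i (\widehat p_i - p_{0i})^2/p_{0i}\), while \(\varphi(x) = x\log x - x + 1\) gives the likelihood ratio statistic \(T_n = 2n\sum_i \widehat p_i \log(\widehat p_i/p_{0i})\); this is one way to see how the chi-square and likelihood ratio tests appear as particular cases.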
    diversity
    test of homogeneity
    test of independence
    Pitman efficiency
    Bahadur efficiency
    generalized divergence
    divergence measures
    likelihood ratio test statistics
    entropy statistics
    multinomial populations
    tests of goodness of fit
    asymptotic variances

    Identifiers