Implications of the Cressie-Read family of additive divergences for information recovery (Q406232)

From MaRDI portal
scientific article

    Statements

    Implications of the Cressie-Read family of additive divergences for information recovery (English)
    8 September 2014
    Summary: To address the unknown nature of probability-sampling models, in this paper we use information theoretic concepts and the Cressie-Read (CR) family of information divergence measures to produce a flexible family of probability distributions, likelihood functions, estimators, and inference procedures. The usual case in statistical modeling is that the noisy indirect data are observed and known, while the sampling model, error distribution, and probability space consistent with the data are unknown. To address the unknown sampling process underlying the data, we consider a convex combination of two or more estimators derived from members of the flexible CR family of divergence measures and optimize that combination to select an estimator that minimizes expected quadratic loss. Sampling experiments are used to illustrate the finite sample properties of the resulting estimator and the nature of the recovered sampling distribution.
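    As a point of reference for the summary above, the sketch below evaluates one common parameterization of the Cressie-Read power divergence, I(p, q, gamma) = (1 / (gamma (gamma + 1))) * sum_i p_i [(p_i / q_i)^gamma - 1], for a few values of gamma. It is illustrative only and not taken from the paper; the function name, example probability vectors, and gamma values are assumptions chosen for the example.

import numpy as np

def cressie_read_divergence(p, q, gamma):
    """Cressie-Read power divergence between discrete probability vectors
    p and q (both strictly positive and summing to one):

        I(p, q, gamma) = (1 / (gamma * (gamma + 1))) * sum_i p_i * ((p_i / q_i)**gamma - 1)

    The limits gamma -> 0 and gamma -> -1 recover the Kullback-Leibler
    divergences KL(p || q) and KL(q || p), respectively.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    if np.isclose(gamma, 0.0):      # limiting member: KL(p || q)
        return float(np.sum(p * np.log(p / q)))
    if np.isclose(gamma, -1.0):     # limiting member: KL(q || p)
        return float(np.sum(q * np.log(q / p)))
    return float(np.sum(p * ((p / q) ** gamma - 1.0)) / (gamma * (gamma + 1.0)))

# Illustrative comparison of a few family members (gamma values chosen
# arbitrarily for the example, not prescribed by the paper).
p_hat = np.array([0.1, 0.2, 0.3, 0.4])   # hypothetical estimated probabilities
q_ref = np.full(4, 0.25)                  # uniform reference weights
for gamma in (-1.0, 0.0, 1.0):
    print(f"gamma = {gamma:+.1f}: divergence = {cressie_read_divergence(p_hat, q_ref, gamma):.4f}")

    Different values of gamma penalize discrepancies between p and q differently, which is what motivates the paper's convex combination of estimators derived from several members of the family.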
    Keywords:
    conditional moment equations
    Cressie-Read divergence
    information theoretic methods
    minimum power divergence
    information functionals