Adjustment by minimum discriminant information (Q1069230)

Language: English
Label: Adjustment by minimum discriminant information
Description: scientific article

    Statements

    Adjustment by minimum discriminant information (English)
    1984
    Let \(\mathcal{M}\) be the set of all probability measures on \((\mathbb{R}^k,\mathcal{B}^k)\) and let \(T:\mathbb{R}^k\to\mathbb{R}^{\ell}\) be measurable. Fix \(t\in\mathbb{R}^{\ell}\) and put \(C=\{A\in\mathcal{M}:\int T\,dA=t\}\). For \(P\in\mathcal{M}\), let \(Q\in C\) be the minimum discriminant information adjusted (MDIA) probability measure of \(P\), i.e. the measure in \(C\) closest to \(P\) in the sense of Kullback-Leibler discriminant information. Under mild conditions it is proved that, for \(X_1,X_2,\ldots\) i.i.d. according to \(P\), the measures \(Q_n\) (the MDIA probability measures of the empirical distributions \(P_n\)) converge weakly to \(Q\) almost surely, and that for measurable \(D:\mathbb{R}^k\to\mathbb{R}\) the integral \(\int D\,dQ_n\) is an asymptotically unbiased and asymptotically normal estimate of \(\int D\,dQ\).
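
    The adjustment has a concrete computational form when \(P\) is replaced by the empirical distribution \(P_n\): the Kullback-Leibler projection of \(P_n\) onto \(C\) is an exponential tilt of the sample weights, with the tilt parameter obtained from a convex dual problem (a standard I-projection fact). The Python sketch below is illustrative only and not taken from the paper; the names mdia_weights and dual, the use of scipy.optimize.minimize, and the identity choice of \(T\) in the example are assumptions.

import numpy as np
from scipy.optimize import minimize


def mdia_weights(x, T, t):
    # MDIA adjustment of the empirical distribution P_n of the sample x:
    # returns weights w_1, ..., w_n with sum(w) = 1 and sum_i w_i T(x_i) = t
    # that minimise the Kullback-Leibler discriminant information relative
    # to the uniform weights 1/n (an exponential tilt, by convex duality).
    Tx = np.asarray(T(x), dtype=float)          # (n, l) constraint statistics T(x_i)
    t = np.asarray(t, dtype=float)              # (l,)  target moments

    # Dual problem: minimise  log((1/n) sum_i exp(lam'T(x_i))) - lam't  over lam.
    # A minimiser exists when t lies in the interior of the convex hull of the T(x_i).
    def dual(lam):
        z = Tx @ lam
        m = z.max()                             # stabilise the log-sum-exp
        return m + np.log(np.mean(np.exp(z - m))) - lam @ t

    lam = minimize(dual, np.zeros(Tx.shape[1]), method="BFGS").x

    # Tilted weights w_i proportional to exp(lam'T(x_i)), normalised to sum to 1.
    z = Tx @ lam
    w = np.exp(z - z.max())
    return w / w.sum()


# Example: force the adjusted mean of a standard normal sample to equal 0.3.
rng = np.random.default_rng(0)
x = rng.normal(size=(500, 1))                   # i.i.d. sample from P
t = np.array([0.3])
w = mdia_weights(x, lambda z: z, t)             # T is the identity map here

print(w @ x[:, 0])                              # weighted mean, equals 0.3 (the constraint)
print(w @ x[:, 0] ** 2)                         # plug-in estimate of int D dQ_n for D(x) = x**2

    With growing \(n\), such weighted averages are, under the review's assumptions, asymptotically unbiased and normal estimates of \(\int D\,dQ\).
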
    minimum discriminant information adjustment
    maximum likelihood estimate
    weak convergence
    probability estimation
    minimal distance method
    consistency
    asymptotic normality
    weighting
    Kullback-Leibler discriminant information
    empirical distribution
    asymptotically unbiased
