Adjustment by minimum discriminant information (Q1069230)

From MaRDI portal
Reviewed by: Jan Ámos Víšek

Language: English
Label: Adjustment by minimum discriminant information
Description: scientific article

    Statements

    Adjustment by minimum discriminant information (English)
    1984
    Let \({\mathcal M}\) be the set of all probability measures (p.m.) on \((R_k, B_k)\) and let \(T: R_k \to R_{\ell}\) be measurable. Fix \(t \in R_{\ell}\) and put \(C = \{A \in {\mathcal M}: \int T\,dA = t\}\). For \(P \in {\mathcal M}\), let \(Q \in C\) be the minimum discriminant information adjusted (MDIA) p.m. of \(P\), i.e. the p.m. in \(C\) closest to \(P\) in the sense of Kullback-Leibler discriminant information. Under mild conditions it is proved that for \(X_1, X_2, \ldots\) i.i.d. according to \(P\), the MDIA p.m. \(Q_n\) of the empirical distribution \(P_n\) converges weakly to \(Q\) a.s., and that for \(D: R_k \to R_1\) the estimate \(\int D\,dQ_n\) of \(\int D\,dQ\) is asymptotically unbiased and asymptotically normal.
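    The following Python sketch illustrates how such an adjustment can be computed for an empirical distribution, assuming the discriminant information is \(\int \log(dQ/dP)\,dQ\), in which case the constrained minimizer is an exponential tilt of \(P_n\) with weights proportional to \(\exp(\lambda \cdot T(x_i))\), where \(\lambda\) solves a convex dual problem. The sample, the statistic \(T\), the target \(t\) and the functional \(D\) are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch (illustrative, not from the paper): MDIA of an empirical
# distribution via exponential tilting. The weights q_i on the sample points
# are proportional to exp(lambda * T(x_i)), with lambda chosen so that the
# moment constraint sum_i q_i T(x_i) = t holds.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = rng.normal(loc=0.3, scale=1.0, size=500)   # i.i.d. sample from P (assumed)

def T(x):                                      # constraint statistic T: R -> R (assumed)
    return x

t = 0.0                                        # target value of int T dQ (assumed)

def dual(lam):
    # Convex dual objective: log of the empirical mean of exp(lam * T(X_i)) minus lam * t.
    return np.log(np.mean(np.exp(lam * T(x)))) - lam * t

res = minimize(lambda v: dual(v[0]), x0=[0.0])
lam = res.x[0]

w = np.exp(lam * T(x))
q = w / w.sum()                                # MDIA weights q_i on the sample points

def D(x):                                      # functional of interest (assumed)
    return x ** 2

print("constraint check:", q @ T(x))           # should be close to t
print("estimate of int D dQ:", q @ D(x))
```

    The dual objective is convex in \(\lambda\), so a generic smooth optimizer suffices for this small one-dimensional example; for vector-valued \(T\) the same dual is minimized over \(\lambda \in R_{\ell}\).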
    minimum discriminant information adjustment
    maximum likelihood estimate
    weak convergence
    probability estimation
    minimal distance method
    consistency
    asymptotic normality
    weighting
    Kullback-Leibler discriminant information
    empirical distribution
    asymptotically unbiased
