Bounds for the asymptotic distribution of the likelihood ratio (Q2192735)

Language: English
Label: Bounds for the asymptotic distribution of the likelihood ratio
Description: scientific article

    Statements

    Bounds for the asymptotic distribution of the likelihood ratio (English)
    17 August 2020
    In the context of Wilks' theorem, the authors derive explicit error bounds for the chi-square approximation of the log-likelihood ratio statistic based on multivariate IID data \(\mathbf{X}_1,\dots,\mathbf{X}_n\). The result is proved under various existence, boundedness and differentiability conditions (on the underlying log-likelihood, for example), most of which are standard regularity conditions in this setting. The error bounds tend to zero as \(n\rightarrow\infty\) if the dimension of the parameter space is either fixed or grows slowly enough with \(n\). The main result is complemented by simulations and by applications to exponential data, Gaussian data, and logistic regression. The proof uses Stein's method. Several interesting directions for future research are noted, for example the extension of the present results to weakly dependent data.
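    To illustrate the chi-square approximation whose error the paper quantifies, the following minimal Python sketch (not taken from the paper) simulates the log-likelihood ratio statistic for IID exponential data with a single rate parameter and compares its finite-sample distribution with the \(\chi^2_1\) limit. The sample size, number of replications, true rate, and the use of a Kolmogorov-Smirnov distance as a rough error proxy are all illustrative choices.

    # Minimal simulation sketch (illustrative, not the paper's method):
    # Wilks' theorem for Exponential(rate) data with a scalar parameter,
    # so the limiting law of the LR statistic is chi-square with 1 df.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n, reps, rate0 = 50, 10_000, 2.0  # assumed sample size, replications, true rate

    # For Exponential(lambda): l(lambda) = n*log(lambda) - lambda*sum(x),
    # MLE lambda_hat = 1/mean(x), and algebra gives the LR statistic
    # W = 2*[l(lambda_hat) - l(lambda_0)] = 2n*(lambda_0*xbar - 1 - log(lambda_0*xbar)).
    xbar = rng.exponential(scale=1.0 / rate0, size=(reps, n)).mean(axis=1)
    W = 2 * n * (rate0 * xbar - 1 - np.log(rate0 * xbar))

    # Kolmogorov-Smirnov distance to the chi-square(1) limit as a crude
    # proxy for the finite-sample approximation error.
    ks = stats.kstest(W, stats.chi2(df=1).cdf)
    print(f"KS distance to chi2(1): {ks.statistic:.4f}")

    Rerunning the sketch with larger \(n\) should shrink the reported distance, consistent with error bounds that vanish as \(n\rightarrow\infty\) for a fixed parameter dimension.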
    log-likelihood ratio statistics
    Wilks' theorem
    Stein's method
