The likelihood ratio test in high-dimensional logistic regression is asymptotically a rescaled Chi-square (Q2273603)

From MaRDI portal
scientific article

    Statements

    24 September 2019
    The paper deals with a testing problem in binary regression models (e.g. the logistic or the probit model) and concentrates on the distribution of twice the log-likelihood ratio (2LLR) under the null hypothesis. The authors show that the commonly used Wilks' theorem, which states that the distribution of 2LLR tends to a chi-square as the sample size \(n\) increases, does not hold when the number of variables in the model is not negligible compared to \(n\). Instead, for a class of models, a rescaled chi-square has to be used to approximate the 2LLR distribution and the corresponding \(p\)-values adequately; \(p\)-values obtained from the chi-square approximation without the rescaling factor are too small. The authors also show that the required rescaling factor can be obtained by solving a nonlinear system of two equations in two unknowns. The paper is well written and a very nice example of rigorous mathematical statistics. A large part of it is devoted to proofs, so it may be of interest mainly to researchers in mathematical statistics; practitioners who use these models to analyze data may be satisfied with the take-home messages.
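    The effect described in the review can be sketched numerically: with a rescaling factor greater than one, the classical Wilks-style \(p\)-value is smaller (anti-conservative) than the rescaled one. The factor `alpha` below is a purely hypothetical illustrative value; in the paper it is obtained by solving the nonlinear two-equation system, which is not reproduced here.

```python
import math

def chi2_sf_1df(x):
    """Survival function of a chi-square with 1 degree of freedom,
    via P(X > x) = erfc(sqrt(x/2)) -- standard-library only."""
    return math.erfc(math.sqrt(x / 2.0))

llr2 = 5.0    # observed value of twice the log-likelihood ratio (illustrative)
alpha = 1.5   # HYPOTHETICAL rescaling factor; the paper computes it from
              # a nonlinear system depending on the dimensionality ratio

p_classical = chi2_sf_1df(llr2)          # chi-square p-value without rescaling
p_rescaled = chi2_sf_1df(llr2 / alpha)   # p-value under the rescaled chi-square

# The classical p-value is the smaller of the two, i.e. too small,
# matching the review's remark about unrescaled p-values.
print(p_classical, p_rescaled)
```

    With these illustrative numbers the classical \(p\)-value is roughly 0.025 while the rescaled one is roughly 0.068, so a test at level 0.05 would reject under the classical approximation but not under the rescaled one.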
    logistic regression
    likelihood-ratio tests
    Wilks' theorem
    high-dimensionality
    goodness of fit
    approximate message passing
    concentration inequalities
    convex geometry
    leave-one-out analysis