Model selection and Akaike's information criterion (AIC): The general theory and its analytical extensions (Q1092542)

    Statements

    Model selection and Akaike's information criterion (AIC): The general theory and its analytical extensions (English)
    1987
    During the last fifteen years, Akaike's entropy-based information criterion (AIC) has had a fundamental impact on statistical model evaluation problems. This paper studies the general theory of the AIC procedure and provides analytical extensions of it in two ways without violating Akaike's main principles. These extensions make AIC asymptotically consistent and penalize overparameterization more stringently, so that only the simplest of the "true" models is selected. The resulting criteria are called CAIC and CAICF. Asymptotic properties of AIC and its extensions are investigated, and the empirical performance of these criteria in choosing the correct degree of a polynomial model is studied in two Monte Carlo experiments under different conditions.
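    As an illustration of the polynomial-degree task described in the abstract, the sketch below (Python; not taken from the paper) compares AIC with the consistent variant CAIC on simulated data. It assumes Gaussian errors, the standard form AIC = -2 log L + 2k, and the penalty k(log n + 1) commonly quoted for Bozdogan's CAIC; CAICF, which additionally involves the determinant of the estimated Fisher information matrix, is omitted for brevity. The data-generating degree and all variable names are illustrative choices, not the paper's experimental settings.

import numpy as np

def neg2_loglik(y, y_hat):
    """-2 * maximized Gaussian log-likelihood, up to a constant shared by all candidate models."""
    n = len(y)
    rss = np.sum((y - y_hat) ** 2)
    return n * np.log(rss / n)

def select_degree(x, y, max_degree=6):
    """Return the polynomial degrees picked by AIC and by CAIC."""
    n = len(y)
    scores = []
    for d in range(max_degree + 1):
        coeffs = np.polyfit(x, y, d)          # least-squares fit of degree d
        y_hat = np.polyval(coeffs, x)
        k = d + 2                             # d + 1 coefficients plus the error variance
        m2ll = neg2_loglik(y, y_hat)
        aic = m2ll + 2 * k                    # AIC  = -2 log L + 2k
        caic = m2ll + k * (np.log(n) + 1)     # CAIC = -2 log L + k (log n + 1)
        scores.append((d, aic, caic))
    best_aic = min(scores, key=lambda s: s[1])[0]
    best_caic = min(scores, key=lambda s: s[2])[0]
    return best_aic, best_caic

# Toy Monte Carlo check: data simulated from a quadratic, so the "true" degree is 2.
rng = np.random.default_rng(0)
x = np.linspace(-2.0, 2.0, 100)
y = 1.0 - 0.5 * x + 2.0 * x ** 2 + rng.normal(scale=1.0, size=x.size)
print(select_degree(x, y))   # both criteria should typically recover degree 2

    Because the CAIC penalty grows with log n while the AIC penalty does not, CAIC rejects over-parameterized fits more stringently as the sample size increases, which is the consistency property emphasized in the paper.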
    model selection
    Akaike's entropy-based information criterion
    AIC procedure
    overparameterization
    CAIC
    CAICF
    asymptotic properties
    empirical performances
    polynomial model
    Monte Carlo experiments