Discrimination distance bounds and statistical applications (Q1826204)

scientific article

    Statements

    Discrimination distance bounds and statistical applications (English)
    1990
    Bounds are obtained for the Kullback-Leibler discrimination distance between two random vectors X and Y. If X is a sequence of independent random variables whose densities have similar tail behavior and \(Y=AX\), where A is an invertible matrix, then the bounds factor into a product of terms depending on A and on X separately. These bounds indicate the best possible rate of convergence for any estimator of the parameters of an autoregressive process whose innovations lie in the domain of attraction of a stable law. To complete the argument, a general theorem is provided establishing the link between total variation proximity of measures and the rate of convergence of statistical estimates.
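    For orientation, the following is a minimal LaTeX sketch recalling the standard definitions behind the statement above: the Kullback-Leibler discrimination distance, the density of Y = AX under an invertible linear change of variables, and Pinsker's inequality, which ties total variation proximity to the discrimination distance. The paper's specific bounds and its convergence-rate theorem are not reproduced here.

    \documentclass{article}
    \usepackage{amsmath}
    \begin{document}
    % Kullback-Leibler discrimination distance between the densities f_X and f_Y
    \[
      K(f_X, f_Y) = \int f_X(x)\,\log\frac{f_X(x)}{f_Y(x)}\,dx .
    \]
    % If Y = AX with A invertible, the density of Y follows by change of variables
    \[
      f_Y(y) = \lvert\det A\rvert^{-1}\, f_X(A^{-1}y).
    \]
    % Pinsker's inequality: the total variation distance between the laws P of X and
    % Q of Y is controlled by the Kullback-Leibler discrimination distance
    \[
      \sup_{B}\,\lvert P(B) - Q(B)\rvert \le \sqrt{\tfrac{1}{2}\,K(f_X, f_Y)} .
    \]
    \end{document}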
    Kullback-Leibler discrimination distance
    tail behavior
    rate of convergence
    domain of attraction
    stable law
