Discrimination distance bounds and statistical applications (Q1826204)

From MaRDI portal
Property / full work available at URL: https://doi.org/10.1007/bf01208258
Property / OpenAlex ID: W1990562626

Latest revision as of 08:27, 30 July 2024

scientific article
Language: English
Label: Discrimination distance bounds and statistical applications

    Statements

    Discrimination distance bounds and statistical applications (English)
    1990
    Bounds are obtained for the Kullback-Leibler discrimination distance between two random vectors X and Y. If X is a sequence of independent random variables whose densities have similar tail behavior, and \(Y=AX\) where A is an invertible matrix, then the bounds factor into a product of terms depending on A and on X separately. These bounds indicate the best possible rate of convergence for any estimator of the parameters of an autoregressive process whose innovations lie in the domain of attraction of a stable law. To complete the argument, a general theorem is given linking total variation proximity of measures to the rate of convergence of statistical estimates.
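For illustration only (not part of the paper under review): the Kullback-Leibler discrimination distance \(D(P\|Q)=\int p\log(p/q)\) that the bounds above concern admits a closed form when both densities are univariate normal. A minimal sketch, with the function name chosen here for illustration:

```python
import math

def kl_normal(mu0, s0, mu1, s1):
    """Kullback-Leibler discrimination distance
    D( N(mu0, s0^2) || N(mu1, s1^2) ) in closed form."""
    return (math.log(s1 / s0)
            + (s0**2 + (mu0 - mu1)**2) / (2 * s1**2)
            - 0.5)

# The distance vanishes exactly when the two densities coincide,
# and grows as the means or scales separate.
print(kl_normal(0.0, 1.0, 0.0, 1.0))  # 0.0
print(kl_normal(1.0, 1.0, 0.0, 1.0))  # 0.5
```

Note the asymmetry: \(D(P\|Q)\neq D(Q\|P)\) in general, which is why "discrimination distance" rather than "metric" is the usual term.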
    Kullback-Leibler discrimination distance
    tail behavior
    rate of convergence
    domain of attraction
    stable law

    Identifiers
