Relative entropy derivative bounds (Q280464)

scientific article

    Statements

    Relative entropy derivative bounds (English)
    10 May 2016
    Summary: We show that the derivative of the relative entropy with respect to its parameters is bounded below and above. We characterize the conditions under which this derivative can vanish. We use these results to explain when the minimum relative entropy and the maximum log-likelihood approaches are valid. We show that these approaches naturally arise in the presence of large data sets and that they are inherent properties of any density-estimation process involving a large number of random variables.
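    A minimal sketch of the standard identity behind this claim (notation ours, not taken from the paper; differentiation under the integral sign is assumed to be justified): for a true density p and a parametric family q_\theta, the relative entropy and its derivative are

        D(p \,\|\, q_\theta) = \int p(x) \log \frac{p(x)}{q_\theta(x)} \, dx,
        \qquad
        \frac{\partial}{\partial \theta} D(p \,\|\, q_\theta)
          = -\int p(x) \, \frac{\partial}{\partial \theta} \log q_\theta(x) \, dx
          = -\,\mathbb{E}_p\!\left[ \frac{\partial}{\partial \theta} \log q_\theta(X) \right].

    For an i.i.d. sample x_1, \dots, x_n from p, the law of large numbers gives
    -\tfrac{1}{n} \sum_{i=1}^n \tfrac{\partial}{\partial \theta} \log q_\theta(x_i) \to \tfrac{\partial}{\partial \theta} D(p \,\|\, q_\theta),
    so stationary points of the log likelihood approach stationary points of the relative entropy as n grows, which is the usual reason the minimum relative entropy and maximum log-likelihood approaches coincide for large data sets.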
    Keywords: relative entropy; Kullback-Leibler divergence; Shannon differential entropy; asymptotic equipartition principle; typical set; Fisher information; maximum log likelihood

    Identifiers