Abstract: Convergence properties of Shannon entropy are studied. In the differential setting, it is shown that weak convergence of probability measures (convergence in distribution) is not sufficient for convergence of the associated differential entropies. A general result for the desired differential entropy convergence is provided, covering both compactly and non-compactly supported densities. Convergence of differential entropy is also characterized in terms of the Kullback-Leibler divergence for densities with fairly general supports, and it is shown that convergence in variation of probability measures guarantees such convergence under an appropriate boundedness condition on the densities involved. Results for the discrete setting are also provided, allowing for infinitely supported probability measures, by exploiting the equivalence between weak convergence and convergence in variation in that setting.
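For reference, the quantities the abstract compares can be written in standard notation (this formulation is standard and not taken from the page itself):

```latex
% Differential entropy of a density f on R^d:
h(f) = -\int_{\mathbb{R}^d} f(x)\,\log f(x)\,\mathrm{d}x

% Kullback--Leibler divergence between densities f and g:
D(f \,\|\, g) = \int_{\mathbb{R}^d} f(x)\,\log\frac{f(x)}{g(x)}\,\mathrm{d}x

% Convergence in variation of f_n to f:
\|f_n - f\|_{\mathrm{TV}} = \tfrac{1}{2}\int_{\mathbb{R}^d} \bigl|f_n(x) - f(x)\bigr|\,\mathrm{d}x \;\longrightarrow\; 0
```

In these terms, the paper's questions are when $h(f_n) \to h(f)$, and how this relates to $D(f_n \| f) \to 0$ and to $\|f_n - f\|_{\mathrm{TV}} \to 0$.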
Recommendations
- On the convergence of Shannon differential entropy, and its connections with density and entropy estimation
- Convergence of Differential Entropies
- Continuity and characterization of Shannon-Wiener information measure for continuous probability distributions
- Entropy and convergence on compact groups
- Density-free convergence properties of various estimators of entropy
Cites work
- scientific article; zbMATH DE number 1713116
- scientific article; zbMATH DE number 3153543
- scientific article; zbMATH DE number 46153
- scientific article; zbMATH DE number 107482
- scientific article; zbMATH DE number 1354815
- scientific article; zbMATH DE number 4116450
- scientific article; zbMATH DE number 964178
- Convergence properties of functional estimates for discrete distributions
- Entropy and the central limit theorem
- Entropy estimators - improvements and comparisons
- On the estimation of entropy
- Probability with Martingales
- The strong ergodic theorem for densities: Generalized Shannon-McMillan-Breiman theorem
Cited in (13)
- Information dependency: strong consistency of Darbellay-Vajda partition estimators
- On the convergence of Shannon differential entropy, and its connections with density and entropy estimation
- On the Shannon entropy and related functionals on convex sets
- Information divergence estimation based on data-dependent partitions
- On the convergence of Shannon entropy of distribution functions in the max domain of attraction of max-stable laws
- A test for independence via Bayesian nonparametric estimation of mutual information
- Convergence of Differential Entropies
- scientific article; zbMATH DE number 3985109
- A note on utility based pricing and asymptotic risk diversification
- Entropy and density approximation from Laplace transforms
- Functional sufficient dimension reduction through information maximization with application to classification
- scientific article; zbMATH DE number 3731918
- Continuity and characterization of Shannon-Wiener information measure for continuous probability distributions
This page was built for publication: On convergence properties of Shannon entropy
MaRDI item Q734295