On convergence properties of Shannon entropy
From MaRDI portal
Publication:734295
DOI: 10.1134/S003294600902001X · zbMATH Open: 1173.94404 · arXiv: 0710.1275 · MaRDI QID: Q734295
Authors: Francisco J. Piera, Patricio Parada
Publication date: 20 October 2009
Published in: Problems of Information Transmission
Abstract: Convergence properties of Shannon entropy are studied. In the differential setting, it is shown that weak convergence of probability measures, or convergence in distribution, is not enough for convergence of the associated differential entropies. A general result for the desired differential entropy convergence is provided, taking into account both compactly and uncompactly supported densities. Convergence of differential entropy is also characterized in terms of the Kullback-Leibler discriminant for densities with fairly general supports, and it is shown that convergence in variation of probability measures guarantees such convergence under an appropriate boundedness condition on the densities involved. Results for the discrete setting are also provided, allowing for infinitely supported probability measures, by taking advantage of the equivalence between weak convergence and convergence in variation in this setting.
Full work available at URL: https://arxiv.org/abs/0710.1275
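The abstract's first claim, that weak convergence of probability measures need not carry over to the differential entropies, can be illustrated numerically. The density family below (a uniform density on [0, 1] plus a shrinking spike) is our own illustrative construction, not taken from the paper: f_n converges weakly to U[0, 1], whose differential entropy is 0, yet h(f_n) diverges to minus infinity.

```python
import math

def diff_entropy(a, delta):
    """Closed-form differential entropy of the two-level density
    f(x) = (1 - a) + a/delta on [0, delta],  f(x) = 1 - a on (delta, 1].
    (Check: delta*((1-a) + a/delta) + (1-delta)*(1-a) = 1, a valid density.)"""
    p1 = (1 - a) + a / delta   # height over the spike [0, delta]
    p2 = 1 - a                 # height over the remainder (delta, 1]
    return -delta * p1 * math.log(p1) - (1 - delta) * p2 * math.log(p2)

# With a_n = 1/n -> 0, f_n converges weakly to U[0,1], but choosing
# delta_n = exp(-n^2) makes the spike narrow far faster than its weight
# vanishes, so the differential entropy h(f_n) decreases without bound.
for n in range(2, 7):
    a, delta = 1 / n, math.exp(-n ** 2)
    print(n, diff_entropy(a, delta))
```

The printed entropies decrease roughly linearly in n, consistent with the paper's point that weak convergence alone gives no control over differential entropy.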
Recommendations
- On the convergence of Shannon differential entropy, and its connections with density and entropy estimation
- Convergence of Differential Entropies
- Continuity and characterization of Shannon-Wiener information measure for continuous probability distributions
- Entropy and convergence on compact groups
- Density-free convergence properties of various estimators of entropy
Cites Work
- Title not available
- Title not available
- Title not available
- On the estimation of entropy
- Title not available
- Probability with Martingales
- Title not available
- Entropy and the central limit theorem
- The strong ergodic theorem for densities: Generalized Shannon-McMillan-Breiman theorem
- Entropy estimators: improvements and comparisons
- Convergence properties of functional estimates for discrete distributions
- Title not available
- Title not available
Cited In (13)
- Information dependency: strong consistency of Darbellay-Vajda partition estimators
- On the convergence of Shannon differential entropy, and its connections with density and entropy estimation
- On the Shannon entropy and related functionals on convex sets
- Information divergence estimation based on data-dependent partitions
- On the convergence of Shannon entropy of distribution functions in the max domain of attraction of max-stable laws
- A test for independence via Bayesian nonparametric estimation of mutual information
- Convergence of Differential Entropies
- Title not available
- A note on utility based pricing and asymptotic risk diversification
- Entropy and density approximation from Laplace transforms
- Functional sufficient dimension reduction through information maximization with application to classification
- Title not available
- Continuity and characterization of Shannon-Wiener information measure for continuous probability distributions