Estimating Mutual Information Via Kolmogorov Distance
From MaRDI portal
Publication: Q3549038
DOI: 10.1109/TIT.2007.903122
zbMATH Open: 1326.94038
MaRDI QID: Q3549038
Author: Zhengmin Zhang
Publication date: 21 December 2008
Published in: IEEE Transactions on Information Theory
Recommendations
- Measuring Distances Between Variables by Mutual Information
- A computationally efficient estimator for mutual information
- Correlation distance and bounds for mutual information
- A new approach to mutual information
- Estimation of Entropy and Mutual Information
- Distribution of mutual information
- Jackknife approach to the estimation of mutual information
- Non-parametric estimation of mutual information through the entropy of the linkage
- A new approach to mutual information. II
Classification (MSC):
- Statistical aspects of information-theoretic topics (62B10)
- Information theory (general) (94A15)
- Measures of information, entropy (94A17)
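The "Kolmogorov distance" of the title is the variational (ℓ1) distance between probability distributions, which the paper relates to mutual information. A minimal sketch (the distribution and variable names are illustrative, not taken from the paper) computing both quantities for a discrete joint distribution:

```python
import numpy as np

# Hypothetical 2x2 joint distribution of discrete variables X and Y.
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])

px = joint.sum(axis=1)      # marginal distribution of X
py = joint.sum(axis=0)      # marginal distribution of Y
indep = np.outer(px, py)    # product of marginals (independence case)

# Mutual information I(X;Y) in nats, with 0·log 0 treated as 0.
mask = joint > 0
mi = float(np.sum(joint[mask] * np.log(joint[mask] / indep[mask])))

# Variational (l1) distance between the joint distribution and the
# product of marginals; some authors define it as half this sum.
var_dist = float(np.abs(joint - indep).sum())

print(mi, var_dist)
```

Both quantities vanish exactly when X and Y are independent, which is why bounds between them (as studied in this paper and several of the works citing it) are possible.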
Cited In (17)
- Optimal uniform continuity bound for conditional entropy of classical-quantum states
- Coupling of probability distributions and an extremal problem for the divergence
- Hadamard quantum broadcast channels
- On inequalities between mutual information and variation
- Correlation distance and bounds for mutual information
- Mutual information, variation, and Fano's inequality
- Mutual information of several random variables and its estimation via variation
- On coupling of probability distributions and estimating the divergence through variation
- On oblivious transfer capacity
- On one extremal problem for mutual information
- Continuity bounds on observational entropy and measured relative entropies
- On some extremal problems for mutual information and entropy
- Tight uniform continuity bounds for quantum entropies: conditional entropy, relative entropy distance and energy constraints
- Mutual information functions versus correlation functions
- Generalization of a Pinsker problem
- Coupling of several random variables
- On one extreme value problem for entropy and error probability