Estimating Mutual Information Via Kolmogorov Distance
Publication:3549038
Recommendations
- Measuring Distances Between Variables by Mutual Information
- A computationally efficient estimator for mutual information
- Correlation distance and bounds for mutual information
- A new approach to mutual information
- Estimation of Entropy and Mutual Information
- Distribution of mutual information
- Jackknife approach to the estimation of mutual information
- Non-parametric estimation of mutual information through the entropy of the linkage
- A new approach to mutual information. II
Cited in (17)
- Coupling of probability distributions and an extremal problem for the divergence
- Optimal uniform continuity bound for conditional entropy of classical-quantum states
- Hadamard quantum broadcast channels
- On inequalities between mutual information and variation
- Correlation distance and bounds for mutual information
- Mutual information, variation, and Fano's inequality
- Mutual information of several random variables and its estimation via variation
- On coupling of probability distributions and estimating the divergence through variation
- On oblivious transfer capacity
- On one extremal problem for mutual information
- On some extremal problems for mutual information and entropy
- Continuity bounds on observational entropy and measured relative entropies
- Tight uniform continuity bounds for quantum entropies: conditional entropy, relative entropy distance and energy constraints
- Mutual information functions versus correlation functions
- Generalization of a Pinsker problem
- Coupling of several random variables
- On one extreme value problem for entropy and error probability
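
Both the publication and several of the citing works above (e.g. "On inequalities between mutual information and variation" and "Mutual information, variation, and Fano's inequality") concern quantitative relations between mutual information and the variational (Kolmogorov) distance between a joint distribution and the product of its marginals. As a purely illustrative sketch, not the estimator from the paper, the snippet below computes both quantities for a discrete joint distribution and checks the classical Pinsker lower bound I(X;Y) ≥ V²/2 in nats, where V is the L1 variation; the helper name and the example distribution `p` are assumptions made here for illustration.

```python
import numpy as np

def mutual_information_and_variation(p_xy):
    """Illustrative helper (not the paper's estimator): for a discrete
    joint distribution p_xy, return the mutual information I(X;Y) in nats
    and the variational (L1) distance between p_xy and the product of
    its marginals."""
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X, shape (m, 1)
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y, shape (1, n)
    prod = p_x * p_y                        # product distribution P_X * P_Y
    mask = p_xy > 0                         # skip zero-probability cells
    mi = np.sum(p_xy[mask] * np.log(p_xy[mask] / prod[mask]))
    variation = np.abs(p_xy - prod).sum()
    return mi, variation

# Example: a slightly correlated binary pair.
p = np.array([[0.3, 0.2],
              [0.2, 0.3]])
mi, v = mutual_information_and_variation(p)
# Pinsker's inequality guarantees mi >= v**2 / 2 (in nats).
assert mi >= v**2 / 2
```

For this example the bound is nearly tight (I ≈ 0.0201 nats against V²/2 = 0.02); bounds in the opposite direction, controlling mutual information from above by the variation, are the subject of the publication and of several of the citing works listed here.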