On mutual information estimation for mixed-pair random variables
Abstract: We study mutual information estimation for mixed-pair random variables, where one random variable is discrete and the other is continuous. We develop a kernel method to estimate the mutual information between the two random variables. The estimates enjoy a central limit theorem under some regularity conditions on the distributions. The theoretical results are demonstrated by a simulation study.
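As a rough illustration of the kind of estimator the abstract describes (this is a generic plug-in kernel sketch, not the paper's specific estimator), one can estimate I(X; Y) for discrete X and continuous Y as the sample average of log f(Y | X) − log f(Y), with both densities fitted by a Gaussian kernel density estimate. The function names, bandwidth rule, and simulated data below are illustrative assumptions:

```python
import numpy as np

def kde_pdf(train, query):
    """1-D Gaussian KDE evaluated at `query`, with Silverman's
    rule-of-thumb bandwidth computed from `train` (an assumed choice)."""
    n = train.size
    h = 1.06 * train.std(ddof=1) * n ** (-1 / 5)
    z = (query[:, None] - train[None, :]) / h
    return np.exp(-0.5 * z**2).sum(axis=1) / (n * h * np.sqrt(2 * np.pi))

def mixed_pair_mi(x, y):
    """Plug-in kernel estimate (in nats) of I(X; Y) for a mixed pair:
    x discrete, y continuous.  Uses I = E[log f(Y | X) - log f(Y)]."""
    x, y = np.asarray(x), np.asarray(y)
    log_marginal = np.log(kde_pdf(y, y))            # log f(y_i)
    log_conditional = np.empty_like(log_marginal)
    for label in np.unique(x):
        mask = x == label                           # samples with x_i == label
        log_conditional[mask] = np.log(kde_pdf(y[mask], y[mask]))
    return float(np.mean(log_conditional - log_marginal))

# Simulated mixed pairs: X is Bernoulli, Y is Gaussian.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=2000)
y_dep = rng.normal(loc=2.0 * x, scale=1.0)  # Y's mean shifts with X
y_ind = rng.normal(size=2000)               # Y independent of X
print(mixed_pair_mi(x, y_dep))  # clearly positive
print(mixed_pair_mi(x, y_ind))  # near zero
```

Evaluating each KDE at its own training points gives a small positive bias, which is one reason the paper's asymptotic (central limit theorem) analysis matters for calibrated inference.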
Recommendations
- Mutual information of several random variables and its estimation via variation
- Statistical estimation of mutual information for mixed model
- scientific article; zbMATH DE number 5518678
- scientific article; zbMATH DE number 5644803
- Conditional Mutual Information Estimation for Mixed, Discrete and Continuous Data
- Mutual information for the multinomial distribution
- Multivariate mutual information
Cites work
- scientific article; zbMATH DE number 922427
- A class of Rényi information estimators for multidimensional densities
- A nonparametric estimation of the entropy for absolutely continuous distributions (Corresp.)
- Efficient estimation of integral functionals of a density
- Estimation of entropy and other functionals of a multivariate density
- Estimation of integral functionals of a density and its derivatives
- On Kullback-Leibler loss and density estimation
- On the estimation of entropy
- Sample estimate of the entropy of a random vector
Cited in (15)
- Some relations between mutual information and estimation error in Wiener space
- Individually Conditional Individual Mutual Information Bound on Generalization Error
- On the Kullback Leibler information for mixed systems
- Representation of Mutual Information Via Input Estimates
- Statistical estimation of mutual information for mixed model
- Mutual Information and Optimality of Approximate Message-Passing in Random Linear Estimation
- Jackknife approach to the estimation of mutual information
- Mutual Information Bounds via Adjacency Events
- Mutual Information, Relative Entropy and Estimation Error in Semi-Martingale Channels
- scientific article; zbMATH DE number 5518678
- The performance of mutual information for mixture of bivariate normal distributions based on robust kernel estimation
- A Finite-Sample, Distribution-Free, Probabilistic Lower Bound on Mutual Information
- A novel Chow-Liu algorithm and its application to gene differential analysis
- Rejoinder on: ``Hybrid semiparametric Bayesian networks''
- Information-theoretic characterizations of conditional mutual independence and Markov random fields
This page was built for publication: On mutual information estimation for mixed-pair random variables (MaRDI item Q1726905)