Mutual information of several random variables and its estimation via variation
Publication:2269360
DOI: 10.1134/S0032946009040012
zbMATH Open: 1190.94021
OpenAlex: W1966800591
MaRDI QID: Q2269360
FDO: Q2269360
Author: V. V. Prelov
Publication date: 16 March 2010
Published in: Problems of Information Transmission
Full work available at URL: https://doi.org/10.1134/s0032946009040012
Cited In (17)
- Some relations between mutual information and estimation error in Wiener space
- On mutual information estimation for mixed-pair random variables
- Representation of Mutual Information Via Input Estimates
- A statistical information theory approach to compare the homogeneity of several variances
- On inequalities between mutual information and variation
- On computation of information via variation and inequalities for the entropy function
- Information theoretic approach to statistical properties of multivariate Cauchy-Lorentz distributions
- Mutual information, variation, and Fano's inequality
- On one extremal problem for mutual information
- Mutual Information and Optimality of Approximate Message-Passing in Random Linear Estimation
- Mutual information and the \(F\)-theorem
- Title not available
- On triple mutual information
- Generalization of a Pinsker problem
- Maximizing multi-information
- Maximum independence and mutual information
- Title not available