Minimum mutual information and non-Gaussianity through the maximum entropy method: estimation from finite samples
Publication: 742670
DOI: 10.3390/e15030721 · zbMath: 1296.62016 · DBLP: journals/entropy/PiresP13 · OpenAlex: W2024993643 · Wikidata: Q64391353 · Scholia: Q64391353 · MaRDI QID: Q742670
Rui A. P. Perdigão, Carlos A. L. Pires
Publication date: 19 September 2014
Published in: Entropy
Full work available at URL: https://doi.org/10.3390/e15030721
Keywords: mutual information; morphism; non-Gaussianity; maximum entropy distributions; entropy bias; mutual information distribution
MSC classification: Measures of information, entropy (94A17); Statistical aspects of information-theoretic topics (62B10)
Cites Work
- A Mathematical Theory of Communication
- Mutual information is copula entropy
- Minimum mutual information and non-Gaussianity through the maximum entropy method: theory and properties
- Entropy and information approaches to genetic diversity and its expression: genomic geography
- Distribution of mutual information from complete and incomplete data
- An introduction to copulas. Properties and applications
- Entropy densities with an application to autoregressive conditional skewness and kurtosis
- Entropy estimates of small data sets
- Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy
- Estimation of Entropy and Mutual Information