Geometric k-nearest neighbor estimation of entropy and mutual information
Publication: 4565943
DOI: 10.1063/1.5011683
zbMath: 1391.94586
arXiv: 1711.00748
OpenAlex: W3101577262
Wikidata: Q88234224 (Scholia: Q88234224)
MaRDI QID: Q4565943
Warren M. Lord, Erik M. Bollt, Jie Sun
Publication date: 13 June 2018
Published in: Chaos: An Interdisciplinary Journal of Nonlinear Science
Full work available at URL: https://arxiv.org/abs/1711.00748
Mathematics Subject Classification: Nonparametric estimation (62G05); Measures of information, entropy (94A17); Statistical aspects of information-theoretic topics (62B10)
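As background for the record, the sketch below implements the classical Kozachenko–Leonenko k-nearest-neighbor entropy estimator, the standard baseline that geometric k-NN estimators of this kind build on. It is an illustrative sketch, not the paper's geometric method: the function name knn_entropy, the use of Euclidean balls, and the default choice of k are assumptions made here for the example.

```python
# Minimal sketch of the Kozachenko-Leonenko k-NN entropy estimator (in nats).
# This is the classical baseline, not the geometric correction proposed in the paper.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln


def knn_entropy(samples: np.ndarray, k: int = 3) -> float:
    """Estimate differential entropy from an (N, d) array of i.i.d. samples."""
    n, d = samples.shape
    tree = cKDTree(samples)
    # Query k+1 neighbors because each point is its own nearest neighbor.
    dists, _ = tree.query(samples, k=k + 1)
    r_k = dists[:, -1]  # Euclidean distance to the k-th nearest neighbor
    # log volume of the unit d-ball: pi^(d/2) / Gamma(d/2 + 1)
    log_vd = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
    return digamma(n) - digamma(k) + log_vd + (d / n) * np.sum(np.log(r_k))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal((5000, 2))
    # True entropy of a 2-D standard normal is 1 + log(2*pi) ~ 2.84 nats.
    print(knn_entropy(x, k=5))
```

Mutual information can then be estimated from such entropy estimates via I(X;Y) = H(X) + H(Y) - H(X,Y), which is the route the k-NN literature cited below takes before introducing geometric refinements.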
Related Items (4)
- Entropy-based test for generalised Gaussian distributions
- Entropy estimation via uniformization
- Causal network reconstruction from time series: From theoretical assumptions to practical estimation
- Dynamic gene regulatory network reconstruction and analysis based on clinical transcriptomic data of colorectal cancer
Cites Work
- Non-parametric estimation of mutual information through the entropy of the linkage
- Measuring synchronization in coupled model systems: a comparison of different approaches
- Estimation of entropy and other functionals of a multivariate density
- Estimation of the information by an adaptive partitioning of the observation space
- Chaos in Dynamical Systems
- Causal Network Inference by Optimal Causation Entropy