Bayesian and quasi-Bayesian estimators for mutual information from discrete data
From MaRDI portal
Publication:742724
DOI: 10.3390/e15051738 · zbMath: 1296.62054 · OpenAlex: W2122477496 · Wikidata: Q63987882 · Scholia: Q63987882 · MaRDI QID: Q742724
Evan Archer, Il Memming Park, Jonathan W. Pillow
Publication date: 19 September 2014
Published in: Entropy
Full work available at URL: https://doi.org/10.3390/e15051738
Bayesian inference (62F15) · Measures of information, entropy (94A17) · Statistical aspects of information-theoretic topics (62B10)
Related Items (4)
- Estimating functions of distributions defined over spaces of unknown size
- Information entropy, continuous improvement, and US energy performance: a novel stochastic-entropic analysis for ideal solutions (SEA-IS)
- Efficient feature selection using shrinkage estimators
- An operational information decomposition via synergistic disclosure
Cites Work
- Unnamed Item
- Distribution of mutual information from complete and incomplete data
- Estimating Entropy Rates with Bayesian Confidence Intervals
- On measures of dependence
- Information in the Nonstationary Case
- Estimation of Entropy and Mutual Information
- Dynamic Analyses of Information Encoding in Neural Ensembles
- Tight Data-Robust Bounds to Mutual Information Combining Shuffling and Model Selection Techniques
- Estimating Information Rates with Confidence Intervals in Neural Spike Trains
- Approximating discrete probability distributions with dependence trees