Estimating Entropy Rates with Bayesian Confidence Intervals
From MaRDI portal
Publication: 3025070
DOI: 10.1162/0899766053723050
zbMath: 1064.62005
Wikidata: Q46052003
Scholia: Q46052003
MaRDI QID: Q3025070
Henry D. I. Abarbanel, Matthew B. Kennel, E. J. Chichilnisky, Jonathon Shlens
Publication date: 4 July 2005
Published in: Neural Computation
Full work available at URL: https://escholarship.org/uc/item/9243v6dr
Mathematics Subject Classification:
- 62F25: Parametric tolerance and confidence regions
- 62F15: Bayesian inference
- 65C05: Monte Carlo methods
- 92C20: Neural biology
- 62B10: Statistical aspects of information-theoretic topics
Cites Work
- A Mathematical Theory of Communication
- Estimating the errors on measured entropy and mutual information
- Predictability, Complexity, and Learning
- Statistical Inference, Occam's Razor, and Statistical Mechanics on the Space of Probability Distributions
- Linear Time Universal Coding and Time Reversal of Tree Sources Via FSM Closure
- Universal Compression of Memoryless Sources Over Unknown Alphabets
- The performance of universal encoding
- On the Complexity of Finite Sequences
- A universal algorithm for sequential data compression
- Compression of individual sequences via variable-rate coding
- The context-tree weighting method: extensions
- Nonparametric entropy estimation for stationary processes and random fields, with applications to English text
- Entropy estimation of symbol sequences
- On the role of pattern matching in information theory
- Estimation of Entropy and Mutual Information
- Estimating the Entropy Rate of Spike Trains via Lempel-Ziv Complexity
- The context-tree weighting method: basic properties
- Geodesic Entropic Graphs for Dimension and Entropy Estimation in Manifold Learning
- Monte Carlo sampling methods using Markov chains and their applications
- A formal theory of inductive inference. Part I