Estimating Information Rates with Confidence Intervals in Neural Spike Trains
DOI: 10.1162/NECO.2007.19.7.1683
zbMATH Open: 1146.68435
DBLP: journals/neco/ShlensKAC07
OpenAlex: W2106012167
Wikidata: Q51913983 (Scholia: Q51913983)
MaRDI QID: Q5457581
FDO: Q5457581
Jonathon Shlens, Matthew B. Kennel, Henry D. I. Abarbanel, E. J. Chichilnisky
Publication date: 14 April 2008
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco.2007.19.7.1683
Recommendations
- Estimating Entropy Rates with Bayesian Confidence Intervals
- Tight Data-Robust Bounds to Mutual Information Combining Shuffling and Model Selection Techniques
- A continuous entropy rate estimator for spike trains using a K-means-based context tree
- Model-based decoding, information estimation, and change-point detection techniques for multineuron spike trains
- The effect of interspike interval statistics on the information gain under the rate coding hypothesis
Mathematics Subject Classification:
- Statistical aspects of information-theoretic topics (62B10)
- Learning and adaptive systems in artificial intelligence (68T05)
- Neural biology (92C20)
Cites Work
- Automatic Block-Length Selection for the Dependent Bootstrap
- A Mathematical Theory of Communication
- The Stationary Bootstrap
- Monte Carlo sampling methods using Markov chains and their applications
- On the Complexity of Finite Sequences
- Estimation of Entropy and Mutual Information
- Analyzing Neural Responses to Natural Signals: Maximally Informative Dimensions
- Maximum Likelihood Estimation of a Stochastic Integrate-and-Fire Neural Encoding Model
- Computation in a Single Neuron: Hodgkin and Huxley Revisited
- Estimating Entropy Rates with Bayesian Confidence Intervals
- Entropy estimation of symbol sequences
- Estimating the Entropy Rate of Spike Trains via Lempel-Ziv Complexity
- The context-tree weighting method: basic properties
- Distribution of mutual information
- Nonparametric entropy estimation for stationary processes and random fields, with applications to English text
- Title not available
- Measuring information spatial densities
Cited In (8)
- Bayesian and quasi-Bayesian estimators for mutual information from discrete data
- Estimating Entropy Rates with Bayesian Confidence Intervals
- A Finite-Sample, Distribution-Free, Probabilistic Lower Bound on Mutual Information
- Quantifying Neurotransmission Reliability Through Metrics-Based Information Analysis
- Synergy, redundancy, and multivariate information measures: an experimentalist's perspective
- Estimating the Temporal Interval Entropy of Neuronal Discharge
- Indices for Testing Neural Codes
- Data-Robust Tight Lower Bounds to the Information Carried by Spike Times of a Neuronal Population