Estimating Information Rates with Confidence Intervals in Neural Spike Trains
Publication: Q5457581
Recommendations
- Estimating Entropy Rates with Bayesian Confidence Intervals
- Tight Data-Robust Bounds to Mutual Information Combining Shuffling and Model Selection Techniques
- A continuous entropy rate estimator for spike trains using a K-means-based context tree
- Model-based decoding, information estimation, and change-point detection techniques for multineuron spike trains
- The effect of interspike interval statistics on the information gain under the rate coding hypothesis
Cites work
- Scientific article without an available title (zbMATH DE number 1505858)
- A Mathematical Theory of Communication
- Analyzing Neural Responses to Natural Signals: Maximally Informative Dimensions
- Automatic Block-Length Selection for the Dependent Bootstrap
- Computation in a Single Neuron: Hodgkin and Huxley Revisited
- Distribution of mutual information
- Entropy estimation of symbol sequences
- Estimating Entropy Rates with Bayesian Confidence Intervals
- Estimating the Entropy Rate of Spike Trains via Lempel-Ziv Complexity
- Estimation of Entropy and Mutual Information
- Maximum Likelihood Estimation of a Stochastic Integrate-and-Fire Neural Encoding Model
- Measuring information spatial densities
- Monte Carlo sampling methods using Markov chains and their applications
- Nonparametric entropy estimation for stationary processes and random fields, with applications to English text
- On the Complexity of Finite Sequences
- The Stationary Bootstrap
- The context-tree weighting method: basic properties
Cited in (12)
- Model-based decoding, information estimation, and change-point detection techniques for multineuron spike trains
- Tight Data-Robust Bounds to Mutual Information Combining Shuffling and Model Selection Techniques
- Bayesian and quasi-Bayesian estimators for mutual information from discrete data
- A continuous entropy rate estimator for spike trains using a K-means-based context tree
- The effect of interspike interval statistics on the information gain under the rate coding hypothesis
- Estimating Entropy Rates with Bayesian Confidence Intervals
- A Finite-Sample, Distribution-Free, Probabilistic Lower Bound on Mutual Information
- Quantifying Neurotransmission Reliability Through Metrics-Based Information Analysis
- Synergy, redundancy, and multivariate information measures: an experimentalist's perspective
- Estimating the Temporal Interval Entropy of Neuronal Discharge
- Indices for Testing Neural Codes
- Data-Robust Tight Lower Bounds to the Information Carried by Spike Times of a Neuronal Population