Estimating the Temporal Interval Entropy of Neuronal Discharge
From MaRDI portal
Publication:4832473
Recommendations
- A continuous entropy rate estimator for spike trains using a K-means-based context tree
- Estimating Entropy Rates with Bayesian Confidence Intervals
- Estimating Information Rates with Confidence Intervals in Neural Spike Trains
- Estimating the Entropy Rate of Spike Trains via Lempel-Ziv Complexity
- A simple method for estimating the entropy of neural activity
Cites work
- scientific article; zbMATH DE number 1164155 (no title recorded)
- A Mathematical Theory of Communication
- A Test of Goodness of Fit
- Detecting and Estimating Signals over Noisy and Unreliable Synapses: Information-Theoretic Analysis
- Network Amplification of Local Fluctuations Causes High Spike Rate Variability, Fractal Firing Patterns and Oscillatory Local Field Potentials
- Neural coding and decoding: communication channels and quantization
- Sequential interval histogram analysis of non-stationary neuronal spike trains
Cited in (7)
- Similarity of interspike interval distributions and information gain in a stationary neuronal firing
- A simple method for estimating the entropy of neural activity
- A continuous entropy rate estimator for spike trains using a K-means-based context tree
- Conditional entropies, phase synchronization and changes in the directionality of information flow in neural systems
- Estimating Entropy Rates with Bayesian Confidence Intervals
- Entropy factor for randomness quantification in neuronal data
- Estimating the Entropy Rate of Spike Trains via Lempel-Ziv Complexity
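The publication's subject, estimating entropy from the interspike intervals of a spike train, can be illustrated with a minimal plug-in (histogram) estimator. This is a generic sketch, not the paper's actual method; the function name `isi_entropy` and the choice of bin width are illustrative assumptions.

```python
import math

def isi_entropy(spike_times, bin_width):
    """Plug-in (histogram) estimate, in bits, of the entropy of the
    interspike-interval distribution of a spike train.

    Illustrative sketch only: bin width and binning scheme are assumptions,
    not taken from the publication.
    """
    # Interspike intervals from the (sorted) spike times.
    isis = [b - a for a, b in zip(spike_times, spike_times[1:])]
    # Count how many intervals fall into each fixed-width bin.
    counts = {}
    for isi in isis:
        k = int(isi // bin_width)
        counts[k] = counts.get(k, 0) + 1
    n = len(isis)
    # Shannon entropy of the empirical bin probabilities.
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Usage: spike times 0, 1, 3, 4, 6 give ISIs [1, 2, 1, 2] -- two equally
# likely bins, hence 1 bit of ISI entropy.
print(isi_entropy([0, 1, 3, 4, 6], bin_width=1))  # → 1.0
```

A perfectly regular train (all ISIs identical) lands in a single bin and yields zero entropy, which matches the intuition that a clock-like discharge carries no interval information.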