Entropy factor for randomness quantification in neuronal data
From MaRDI portal
Publication: 2179072
Recommendations
- Fano factor estimation
- Estimating the Temporal Interval Entropy of Neuronal Discharge
- A continuous entropy rate estimator for spike trains using a K-means-based context tree
- Entropy, mutual information, and systematic measures of structured spiking neural networks
- Estimating the Entropy Rate of Spike Trains via Lempel-Ziv Complexity
Cites work
- scientific article; zbMATH DE number 3168165
- scientific article; zbMATH DE number 3239078
- A characterization of the time-rescaled gamma process as a model for spike trains
- An introductory review of information theory in the context of computational neuroscience
- Bias analysis in entropy estimation
- Elements of Information Theory
- Estimating Instantaneous Irregularity of Neuronal Firing
- Fano factor estimation
- Firing Variability Is Higher than Deduced from the Empirical Coefficient of Variation
- Impact of spike train autostructure on probability distribution of joint spike events
- Measures of statistical dispersion based on Shannon and Fisher information concepts
- Optimizing time histograms for non-Poissonian spike trains
- Similarity of interspike interval distributions and information gain in a stationary neuronal firing
- Spiking Neuron Models
- Statistical structure of neural spiking under non-Poissonian or other non-white stimulation
- The Properties of Recurrent-Event Processes
- The effect of interspike interval statistics on the information gain under the rate coding hypothesis
- The gamma renewal process as an output of the diffusion leaky integrate-and-fire neuronal model
Cited in (3)