Entropy factor for randomness quantification in neuronal data
From MaRDI portal
Publication: 2179072
DOI: 10.1016/J.NEUNET.2017.07.016
zbMATH Open: 1439.92019
DBLP: journals/nn/RajdlLK17
OpenAlex: W2749209900
Wikidata: Q48009696
Scholia: Q48009696
MaRDI QID: Q2179072
FDO: Q2179072
Kamil Rajdl, Petr Lansky, Lubomir Kostal
Publication date: 12 May 2020
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2017.07.016
Measures of information, entropy (94A17); Neural networks for/in biological studies, artificial life and related topics (92B20)
Cites Work
- Title not available
- Title not available
- Title not available
- Elements of Information Theory
- Spiking Neuron Models
- Optimizing time histograms for non-Poissonian spike trains
- The gamma renewal process as an output of the diffusion leaky integrate-and-fire neuronal model
- The effect of interspike interval statistics on the information gain under the rate coding hypothesis
- Firing Variability Is Higher than Deduced from the Empirical Coefficient of Variation
- Statistical structure of neural spiking under non-Poissonian or other non-white stimulation
- Similarity of interspike interval distributions and information gain in a stationary neuronal firing
- An introductory review of information theory in the context of computational neuroscience
- Estimating Instantaneous Irregularity of Neuronal Firing
- Fano factor estimation
- The Properties of Recurrent-Event Processes
- Measures of statistical dispersion based on Shannon and Fisher information concepts
- Bias analysis in entropy estimation
- Impact of Spike Train Autostructure on Probability Distribution of Joint Spike Events
Cited In (2)
This page was built for publication: Entropy factor for randomness quantification in neuronal data