Nonparametric estimation of information-based measures of statistical dispersion
From MaRDI portal
Publication: 406103
DOI: 10.3390/e14071221
zbMath: 1306.62089
OpenAlex: W1989597311
MaRDI QID: Q406103
Publication date: 8 September 2014
Published in: Entropy
Full work available at URL: https://doi.org/10.3390/e14071221
Density estimation (62G07) ⋮ Nonparametric estimation (62G05) ⋮ Statistical aspects of information-theoretic topics (62B10)
Related Items (3)
Non-parametric estimation of mutual information through the entropy of the linkage ⋮ The effect of interspike interval statistics on the information gain under the rate coding hypothesis ⋮ Measures of statistical dispersion based on Shannon and Fisher information concepts
Uses Software
Cites Work
- Wavelet Fisher's information measure of \(1/f^\alpha\) signals
- Parametric Bayesian estimation of differential entropy and relative entropy
- On minimum Fisher information distributions with restricted support and fixed variance
- Fisher information and semiclassical treatments
- Sample estimate of the entropy of a random vector
- On the estimation of entropy
- Fisher information and spline interpolation
- Similarity of interspike interval distributions and information gain in a stationary neuronal firing
- Entropy-based measure of uncertainty in past lifetime distributions
- Best asymptotic normality of the kernel density entropy estimator for smooth densities
- Spiking Neuron Models
- Monte Carlo comparison of four normality tests using different entropy estimates
- Differences in Spiking Patterns Among Cortical Neurons
- Firing Variability Is Higher than Deduced from the Empirical Coefficient of Variation
- Nonparametric Roughness Penalties for Probability Densities