Estimation of Entropy and Mutual Information
Publication: 4816848
DOI: 10.1162/089976603321780272
zbMath: 1052.62003
MaRDI QID: Q4816848
Publication date: 14 September 2004
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/089976603321780272
62G05: Nonparametric estimation
60F05: Central limit and other weak theorems
62B10: Statistical aspects of information-theoretic topics
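For context, the publication analyzes estimators of discrete entropy and mutual information, including the plug-in (maximum-likelihood) estimator and bias-corrected variants such as Miller-Madow. The following is a minimal illustrative sketch in Python (assuming NumPy; function names are illustrative and not taken from the paper):

```python
import numpy as np

def plugin_entropy(counts):
    """Plug-in (maximum-likelihood) entropy estimate in nats."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts[counts > 0] / n  # empirical probabilities of observed symbols
    return -np.sum(p * np.log(p))

def miller_madow_entropy(counts):
    """Plug-in estimate plus the Miller-Madow bias correction (m - 1) / (2N),
    where m is the number of symbols observed at least once."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    m = np.count_nonzero(counts)
    return plugin_entropy(counts) + (m - 1) / (2.0 * n)

# Example: histogram of 1000 draws from a uniform 10-symbol distribution
rng = np.random.default_rng(0)
samples = rng.integers(0, 10, size=1000)
counts = np.bincount(samples, minlength=10)
print(plugin_entropy(counts), miller_madow_entropy(counts))
```

The plug-in estimator is negatively biased for finite samples; the Miller-Madow term is a first-order correction, one of the estimators whose bias and variance properties the paper studies.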
Related Items
- Estimating the Entropy Rate of Spike Trains via Lempel-Ziv Complexity
- Analyzing Neural Responses to Natural Signals: Maximally Informative Dimensions
- Sequential Fixed-Point ICA Based on Mutual Information Minimization
- Tight Data-Robust Bounds to Mutual Information Combining Shuffling and Model Selection Techniques
- Estimating Information Rates with Confidence Intervals in Neural Spike Trains
- Edgeworth Approximation of Multivariate Differential Entropy
- Data-Robust Tight Lower Bounds to the Information Carried by Spike Times of a Neuronal Population
- Encoding stimulus information by spike numbers and mean response time in primary auditory cortex
- Estimation of generalized entropies with sample spacing
- Similarity of interspike interval distributions and information gain in a stationary neuronal firing
- Encoding uncertainty in the hippocampus
- Estimating Entropy Rates with Bayesian Confidence Intervals
- The Spike-Triggered Average of the Integrate-and-Fire Cell Driven by Gaussian White Noise
- Variance estimators for the Lempel-Ziv entropy rate estimator
Cites Work
- Achieving information bounds in non and semiparametric models
- An Efron-Stein inequality for nonsymmetric statistics
- The jackknife estimate of variance
- Geometrizing rates of convergence. II
- Minimax lower bounds and moduli of continuity
- Consistency of data-driven histogram methods for density estimation and classification
- Limit theorems for the logarithm of sample spacings
- Weighted sums of certain dependent random variables
- On the mathematical foundations of learning
- Convergence properties of functional estimates for discrete distributions
- Second-order noiseless source coding theorems
- Estimation of the information by an adaptive partitioning of the observation space
- On Choosing and Bounding Probability Metrics
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- A Measure of Asymptotic Efficiency for Tests of a Hypothesis Based on the Sum of Observations
- On a Class of Problems Related to the Random Division of an Interval
- Cramér-Rao type integral inequalities for general loss functions