Information in the Nonstationary Case
From MaRDI portal
Publication:3613610
Recommendations
- Estimation of Entropy and Mutual Information
- Tight Data-Robust Bounds to Mutual Information Combining Shuffling and Model Selection Techniques
- Estimate of mutual information carried by neuronal responses from small data samples
- Nonparametric estimation of information-based measures of statistical dispersion
- Calculating the mutual information between two spike trains
Cites work
- Scientific article (no title available); zbMATH DE number 847242
- A Mathematical Theory of Communication
- Convergence properties of functional estimates for discrete distributions
- Discrimination between monotonic trends and long-range dependence
- Dynamic Analyses of Information Encoding in Neural Ensembles
- Estimating Entropy Rates with Bayesian Confidence Intervals
Cited in (3)