Tight Data-Robust Bounds to Mutual Information Combining Shuffling and Model Selection Techniques
From MaRDI portal
Publication: 5441302
DOI: 10.1162/neco.2007.19.11.2913
zbMath: 1129.92023
OpenAlex: W1987669950
Wikidata: Q31130197 (Scholia: Q31130197)
MaRDI QID: Q5441302
Riccardo Senatore, Stefano Panzeri, Marcelo A. Montemurro
Publication date: 11 February 2008
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco.2007.19.11.2913
MSC classifications: Applications of statistics to biology and medical sciences; meta analysis (62P10); Neural biology (92C20)
Related Items (12)
- Estimation bias in maximum entropy models
- A kernel-based calculation of information on a metric space
- Mutual Information Expansion for Studying the Role of Correlations in Population Codes: How Important Are Autocorrelations?
- The impact of high-order interactions on the rate of synchronous discharge and information transmission in somatosensory cortex
- General Poisson Exact Breakdown of the Mutual Information to Study the Role of Correlations in Populations of Neurons
- Pursuit of food \textit{versus} pursuit of information in a Markovian perception-action loop model of foraging
- A Finite-Sample, Distribution-Free, Probabilistic Lower Bound on Mutual Information
- Bayesian and quasi-Bayesian estimators for mutual information from discrete data
- Modeling the Correlated Activity of Neural Populations: A Review
- Calculating the Mutual Information between Two Spike Trains
- Unnamed Item
- Unnamed Item
Cites Work
- A Mathematical Theory of Communication
- A Unified Approach to the Study of Temporal, Correlational, and Rate Coding
- Estimating Entropy Rates with Bayesian Confidence Intervals
- Metric-space analysis of spike trains: theory, algorithms and application
- Information geometry on hierarchy of probability distributions
- Neural coding and decoding: communication channels and quantization
- Estimation of Entropy and Mutual Information
- On information rates for mismatched decoders
- Data-Robust Tight Lower Bounds to the Information Carried by Spike Times of a Neuronal Population