Information-Theoretic Bounds and Approximations in Neural Population Coding
Publication: 5157152
DOI: 10.1162/neco_a_01056
zbMATH: 1472.92022
arXiv: 1611.01414
OpenAlex: W2551959291
Wikidata: Q50000079 (Scholia: Q50000079)
MaRDI QID: Q5157152
Authors: Wentao Huang, Kechen Zhang
Publication date: 12 October 2021
Published in: Neural Computation
Full work available at URL: https://arxiv.org/abs/1611.01414
Mathematics Subject Classification:
- Neural networks for/in biological studies, artificial life and related topics (92B20)
- Coding theorems (Shannon theory) (94A24)
Related Items (2)
- Multiple Timescale Online Learning Rules for Information Maximization with Energetic Constraints
- Quantifying Information Conveyed by Large Neuronal Populations
Cites Work
- A Mathematical Theory of Communication
- Fisher and Shannon Information in Finite Neural Populations
- Information-theoretic asymptotics of Bayes methods
- Mutual Information and Minimum Mean-Square Error in Gaussian Channels
- Optimal Short-Term Population Coding: When Fisher Information Fails
- Estimation of Entropy and Mutual Information
- Fisher information and stochastic complexity
- Efficient Sensory Encoding and Bayesian Inference with Heterogeneous Neural Populations
- Mutual Information, Fisher Information, and Efficient Coding
- Elements of Information Theory
- Difficulty of Singularity in Population Coding