Mutual Information, Fisher Information, and Efficient Coding
Publication:5380393
DOI: 10.1162/NECO_A_00804
zbMath: 1414.92056
DBLP: journals/neco/WeiS16
OpenAlex: W2203957096
Wikidata: Q50758217 (Scholia: Q50758217)
MaRDI QID: Q5380393
Publication date: 4 June 2019
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco_a_00804
Classification:
- Neural networks for/in biological studies, artificial life and related topics (92B20)
- Measures of information, entropy (94A17)
Related Items (4)
- Fisher information in Poissonian model neurons
- Heterogeneous Synaptic Weighting Improves Neural Coding in the Presence of Common Noise
- Multiple Timescale Online Learning Rules for Information Maximization with Energetic Constraints
- Information-Theoretic Bounds and Approximations in Neural Population Coding
Cites Work
- Fisher and Shannon Information in Finite Neural Populations
- Some inequalities satisfied by the quantities of information of Fisher and Shannon
- Information-theoretic asymptotics of Bayes methods
- Is the Homunculus “Aware” of Sensory Adaptation?
- Information theoretic inequalities
- Could information theory provide an ecological theory of sensory processing?
- Optimal Short-Term Population Coding: When Fisher Information Fails
- Nonlinear neurons in the low-noise limit: a factorial code maximizes information transfer
- Fisher information and stochastic complexity
- Efficient Sensory Encoding and Bayesian Inference with Heterogeneous Neural Populations
- Extensions of Fisher Information and Stam's Inequality
- Optimal Population Codes for Space: Grid Cells Outperform Place Cells
- On Information and Sufficiency