Combinatorial neural codes from a mathematical coding theory perspective

From MaRDI portal
Publication:5378236

DOI: 10.1162/NECO_A_00459
zbMATH Open: 1448.94290
arXiv: 1212.5188
OpenAlex: W2127780167
Wikidata: Q48952466
Scholia: Q48952466
MaRDI QID: Q5378236
FDO: Q5378236


Authors: Vladimir Itskov, Katherine Morrison, Zachary Roth, Judy L. Walker, Carina Curto


Publication date: 12 June 2019

Published in: Neural Computation

Abstract: Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of RF codes "catches up" to that of random comparison codes when a small tolerance to error is introduced. On the other hand, RF codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure must serve not only error correction but also reflect relationships between stimuli.


Full work available at URL: https://arxiv.org/abs/1212.5188
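
The abstract speaks of redundancy, error correction, and the preservation of distances between stimuli in receptive field (RF) codes. The short Python sketch below is a toy illustration of these notions, not the paper's construction: the number of neurons, receptive field placement, field width, and single-bit noise model are all illustrative assumptions. It builds a 1-D RF code on a circular stimulus space, reports its size, minimum Hamming distance, and a common redundancy measure, and performs nearest-codeword decoding of a corrupted response pattern.

import math
import random

# Toy 1-D receptive field (RF) code on the circle [0, 1): each of n neurons
# fires whenever the stimulus lies within `width` of its RF center, so every
# stimulus position yields a binary codeword. All numbers here are
# illustrative assumptions, not parameters from the paper.
n = 10
centers = [i / n for i in range(n)]
width = 0.25

def codeword(stimulus):
    # Circular distance on [0, 1): neuron fires if its center is within `width`.
    return tuple(1 if min(abs(stimulus - c), 1 - abs(stimulus - c)) < width else 0
                 for c in centers)

# The combinatorial code C: the set of distinct codewords over a stimulus grid.
code = sorted({codeword(s / 200) for s in range(200)})

def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

min_dist = min(hamming(u, v) for u in code for v in code if u != v)
redundancy = 1 - math.log2(len(code)) / n   # one common notion of redundancy
print(f"|C| = {len(code)}, min Hamming distance = {min_dist}, redundancy = {redundancy:.2f}")

# Nearest-codeword decoding of a corrupted codeword (one flipped bit of "neural noise").
word = list(random.choice(code))
word[random.randrange(n)] ^= 1
decoded = min(code, key=lambda c: hamming(c, tuple(word)))
print("corrupted:", tuple(word), "-> decoded:", decoded)

In this toy code, neighboring stimuli give codewords differing in a single bit, so the minimum distance is small despite high redundancy, and one flipped bit can already be decoded to the wrong codeword. This is in the spirit of the abstract's observation that redundancy alone need not yield good error correction, while nearby stimuli do map to nearby codewords.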









Cited In (18)





This page was built for publication: Combinatorial neural codes from a mathematical coding theory perspective
