Encoding binary neural codes in networks of threshold-linear neurons
DOI: 10.1162/NECO_a_00504
zbMATH Open: 1415.92014
DBLP: journals/neco/CurtoDI13
arXiv: 1212.0031
OpenAlex: W2147103442
Wikidata: Q44829448 (Scholia: Q44829448)
MaRDI QID: Q5378281
FDO: Q5378281
Carina Curto, Anda Degeratu, Vladimir Itskov
Publication date: 12 June 2019
Published in: Neural Computation
Abstract: Networks of neurons in the brain encode preferred patterns of neural activity via their synaptic connections. Despite receiving considerable attention, the precise relationship between network connectivity and encoded patterns is still poorly understood. Here we consider this problem for networks of threshold-linear neurons whose computational function is to learn and store a set of binary patterns (e.g., a neural code) as "permitted sets" of the network. We introduce a simple Encoding Rule that selectively turns "on" synapses between neurons that co-appear in one or more patterns. The rule uses synapses that are binary, in the sense of having only two states ("on" or "off"), but also heterogeneous, with weights drawn from an underlying synaptic strength matrix S. Our main results precisely describe the stored patterns that result from the Encoding Rule -- including unintended "spurious" states -- and give an explicit characterization of the dependence on S. In particular, we find that binary patterns are successfully stored in these networks when the excitatory connections between neurons are geometrically balanced -- i.e., they satisfy a set of geometric constraints. Furthermore, we find that certain types of neural codes are "natural" in the context of these networks, meaning that the full code can be accurately learned from a highly undersampled set of patterns. Interestingly, many commonly observed neural codes in cortical and hippocampal areas are natural in this sense. As an application, we construct networks that encode hippocampal place field codes nearly exactly, following presentation of only a small fraction of patterns. To obtain our results, we prove new theorems using classical ideas from convex and distance geometry, such as Cayley-Menger determinants, revealing a novel connection between these areas of mathematics and coding properties of neural networks.
Full work available at URL: https://arxiv.org/abs/1212.0031
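
The Encoding Rule and the permitted-set criterion described in the abstract are concrete enough to sketch. The Python fragment below is a minimal illustration, not the paper's reference implementation: the function names `encode` and `is_permitted`, the toy code, and the illustrative choice of S are assumptions made here. The weight values (-1 + eps*S_ij for co-appearing pairs, -1 - delta otherwise, 0 on the diagonal) follow the rule as stated in the paper, and the stability test on principal submatrices of -I + W follows the permitted/forbidden-set criterion of the Hahnloser et al. work cited under "Cites Work".

```python
import itertools
import numpy as np

def encode(n, code, S, eps=0.1, delta=0.1):
    """Sketch of the Encoding Rule: turn a synapse 'on' (weight
    -1 + eps*S[i, j]) whenever neurons i and j co-appear in some
    codeword, leave it 'off' (-1 - delta) otherwise; no self-synapses."""
    W = np.full((n, n), -1.0 - delta)
    for sigma in code:
        for i, j in itertools.permutations(sigma, 2):
            W[i, j] = -1.0 + eps * S[i, j]
    np.fill_diagonal(W, 0.0)
    return W

def is_permitted(W, sigma):
    """A subset sigma is a 'permitted set' when the principal submatrix
    of -I + W on sigma is stable, i.e. every eigenvalue has negative
    real part (the criterion used in the threshold-linear setting)."""
    idx = np.ix_(list(sigma), list(sigma))
    A = (-np.eye(W.shape[0]) + W)[idx]
    return np.all(np.linalg.eigvals(A).real < 0)

# Toy example (hypothetical): 5 neurons, two overlapping binary patterns.
n = 5
code = [{0, 1, 2}, {2, 3, 4}]
rng = np.random.default_rng(0)
P = rng.standard_normal((n, 3))
S = P @ P.T  # an arbitrary symmetric strength matrix, for illustration only
W = encode(n, code, S)
for sigma in code:
    print(sorted(sigma), "permitted:", is_permitted(W, sigma))
```

Whether a given codeword comes out permitted depends on S, eps, and delta; the paper's main result characterizes exactly when encoding succeeds (the excitatory weights must be "geometrically balanced"), so a run with an arbitrary S, as above, may store some patterns and not others.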
Recommendations
- Pattern completion in symmetric threshold-linear networks
- Permitted sets and convex coding in nonthreshold-linear networks
- Precise capacity analysis in binary networks with multiple coding level inputs
- Sparse coding for layered neural networks
- What determines the capacity of autoassociative memories in the brain?
Cites Work
- Matrix Analysis
- Remarks to Maurice Fréchet's article ``Sur la définition axiomatique d'une classe d'espaces vectoriels distanciés applicables vectoriellement sur l'espace de Hilbert''
- Neural networks and physical systems with emergent collective computational abilities.
- Mathematical foundations of neuroscience
- Dynamics of pattern formation in lateral-inhibition type neural fields
- Spatiotemporal dynamics of continuum neural fields
- Fundamentals of Error-Correcting Codes
- Neurons with graded response have collective computational properties like those of two-state neurons.
- Geometry of cuts and metrics
- Absolute stability of global pattern formation and parallel memory storage by competitive neural networks
- The neural ring: an algebraic tool for analyzing the intrinsic structure of neural codes
- Permitted and Forbidden Sets in Symmetric Threshold-Linear Networks
- Modeling Brain Function
- Rate Models for Conductance-Based Cortical Neuronal Networks
- Combinatorial Neural Codes from a Mathematical Coding Theory Perspective
- Selectively Grouping Neurons in Recurrent Networks of Lateral Inhibition
- Flexible memory networks
Cited In (16)
- Linearly decodable functions from neural population codes
- Stable fixed points of combinatorial threshold-linear networks
- Heteroclinic Cycles in a Competitive Network
- Fixed Points of Competitive Threshold-Linear Networks
- Dynamic switching of neural codes in networks with gap junctions
- Nonlinear neurons in the low-noise limit: a factorial code maximizes information transfer
- Asymmetry of Neuronal Combinatorial Codes Arises from Minimizing Synaptic Weight Change
- Sequential Attractors in Combinatorial Threshold-Linear Networks
- Periodic Solutions in Threshold-Linear Networks and Their Entrainment
- Pattern Completion in Symmetric Threshold-Linear Networks
- Oscillatory networks: insights from piecewise-linear modeling
- Diversity of emergent dynamics in competitive threshold-linear networks
- Efficient Neural Codes That Minimize Lp Reconstruction Error
- The combinatorial code and the graph rules of Dale networks
- Nerve theorems for fixed points of neural networks
- Synchronicity in Non-smooth Competitive Networks with Threshold Nonlinearities