Robust exponential memory in Hopfield networks
Abstract: The Hopfield recurrent neural network is a classical auto-associative model of memory, in which collections of symmetrically coupled McCulloch-Pitts neurons interact to perform emergent computation. Although previous researchers have explored the potential of this network to solve combinatorial optimization problems and store memories as attractors of its deterministic dynamics, a basic open problem is to design a family of Hopfield networks with a number of noise-tolerant memories that grows exponentially with neural population size. Here, we discover such networks by minimizing probability flow, a recently proposed objective for estimating parameters in discrete maximum entropy models. By descending the gradient of the convex probability flow, our networks adapt synaptic weights to achieve robust exponential storage, even when presented with vanishingly small numbers of training patterns. In addition to providing a new set of error-correcting codes that achieve Shannon's channel capacity bound, these networks also efficiently solve a variant of the hidden clique problem in computer science, opening new avenues for real-world applications of computational models originating from biology.
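The auto-associative recall described in the abstract can be illustrated with a minimal sketch. Note the assumptions: this uses the classical Hebbian (outer-product) learning rule rather than the paper's probability-flow minimization, and toy binary (±1) patterns chosen for the example; it shows only the basic mechanism of storing patterns as attractors and recovering one from a corrupted cue.

```python
import numpy as np

def train_hebbian(patterns):
    # Classical Hebbian (outer-product) rule; the paper instead adapts
    # weights by minimizing probability flow, which is not reproduced here.
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)  # no self-coupling
    return W

def recall(W, state, sweeps=20):
    # Asynchronous McCulloch-Pitts updates: each neuron takes the sign
    # of its local field until the state settles at a fixed point.
    state = state.copy()
    for _ in range(sweeps):
        for i in range(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# Two orthogonal toy patterns (hypothetical example data).
patterns = np.array([[1, 1, 1, 1, -1, -1, -1, -1],
                     [1, -1, 1, -1, 1, -1, 1, -1]])
W = train_hebbian(patterns)

noisy = patterns[0].copy()
noisy[0] *= -1  # corrupt one bit
recovered = recall(W, noisy)
print(np.array_equal(recovered, patterns[0]))  # → True
```

For this small, orthogonal pattern set the corrupted cue falls back into the stored attractor; the paper's contribution is constructing weight matrices for which the number of such noise-tolerant attractors grows exponentially in the number of neurons.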
Recommendations
- Typical error pattern recovery of the Hopfield memory under error-tolerant conditions
- scientific article; zbMATH DE number 563663
- Information storage in Hopfield model with reduced complexity
- On a model of associative memory with huge storage capacity
- Exact memory association in a simple Hopfield neural network with two parameters
Cites work
- scientific article; zbMATH DE number 1952026
- scientific article; zbMATH DE number 1380608
- scientific article; zbMATH DE number 3231758
- A Mathematical Theory of Communication
- A logical calculus of the ideas immanent in nervous activity
- Beitrag zur Theorie des Ferromagnetismus
- Capacity of neural networks with discrete synaptic couplings
- Collective Computation With Continuous Variables
- Combinatorial neural codes from a mathematical coding theory perspective
- Efficient Associative Computation with Discrete Synapses
- Enumeration of Seven-Argument Threshold Functions
- First draft of a report on the EDVAC
- Learning Patterns and Pattern Sequences by Self-Organizing Nets of Threshold Elements
- Logarithmic Regret Algorithms for Online Convex Optimization
- Neural networks and physical systems with emergent collective computational abilities
- On Computable Numbers, with an Application to the Entscheidungsproblem
- On the capacity of neural networks with binary weights
- Polynomial-Time Approximation Algorithms for the Ising Model
- Statistical physics of irregular low-density parity-check codes
- The capacity of the Hopfield associative memory
- Updating Quasi-Newton Matrices with Limited Storage
Cited in (4)