Dreaming neural networks: rigorous results
From MaRDI portal
Publication:5134375
Abstract: Recently, a daily routine for associative neural networks has been proposed: the network learns in a Hebbian fashion during its awake state (thus behaving as a standard Hopfield model); then, during its sleep state, it optimizes information storage by consolidating pure patterns and removing spurious ones. This forces the synaptic matrix to collapse onto the projector matrix (ultimately approaching the Kanter-Sompolinsky model). The procedure keeps learning Hebbian-based (a biological must) yet, by exploiting a (properly stylized) sleep phase, still reaches the maximal critical capacity (for symmetric interactions). So far, this emerging picture (as well as the bulk of papers on unlearning techniques) has been supported solely by heuristic, mathematically non-rigorous routes, mainly replica-trick analyses and numerical simulations. Here we rely extensively on Guerra's interpolation techniques developed for neural networks and, in particular, we extend the generalized stochastic stability approach to this case. Confining our description to the replica-symmetric approximation (where the previous results lie), the picture painted by this generalization (and by the previously existing variations on the theme) is entirely confirmed. Further, still relying on Guerra's schemes, we develop a systematic fluctuation analysis to check where ergodicity is broken (an analysis entirely absent from previous investigations). We find that, as long as the network is awake, ergodicity is bounded by the Amit-Gutfreund-Sompolinsky critical line (as it should be), but sleeping destroys spin-glass states by extending both the retrieval and the ergodic regions: after an entire sleeping session, the only surviving regions are the retrieval and ergodic ones, which allows the network to achieve the perfect-retrieval regime (the number of storable patterns equals the number of neurons in the network).
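As a numerical illustration of the two endpoints described above (a minimal sketch, not the paper's derivation): with binary patterns xi, the Hebbian coupling is J = xi.T xi / N, while the projector (Kanter-Sompolinsky, pseudo-inverse) coupling is J = xi.T C^{-1} xi / N, where C is the pattern-correlation matrix. The interpolating "dreaming" kernel (1 + t)(I + t C)^{-1}, with t the sleep duration, is taken from the related dreaming-networks literature; all variable names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 20                              # neurons, patterns (P < N)
xi = rng.choice([-1.0, 1.0], size=(P, N))   # P binary patterns of N spins

C = xi @ xi.T / N                           # P x P pattern-correlation matrix

def dreaming_coupling(t):
    """Synaptic matrix after a sleep of duration t:
    t = 0    -> Hebbian (Hopfield) matrix,
    t -> inf -> projector (Kanter-Sompolinsky) matrix."""
    K = (1.0 + t) * np.linalg.inv(np.eye(P) + t * C)
    return xi.T @ K @ xi / N

J_hebb = dreaming_coupling(0.0)             # equals xi.T @ xi / N
J_proj = xi.T @ np.linalg.inv(C) @ xi / N   # pseudo-inverse limit

# After sleep, every stored pattern is an exact fixed point:
# J_proj @ xi_mu = xi_mu, with no crosstalk noise.
assert np.allclose(J_proj @ xi.T, xi.T)

# A long sleep drives the dreaming matrix onto the projector one.
assert np.allclose(dreaming_coupling(1e6), J_proj, atol=1e-3)
```

The exact fixed-point property of the projector coupling is what underlies the perfect-retrieval regime mentioned in the abstract; the awake (Hebbian, t = 0) matrix only satisfies it approximately, up to interference between patterns.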
Recommendations
- Hebbian learning, its correlation catastrophe, and unlearning
- Unlearning in the paramagnetic phase of neural network models
- On the unlearning procedure yielding a high-performance associative memory neural network
- Self-organization of day cycle and hierarchical associative memory in live neural network
- Sleeping Our Way to Weight Normalization and Stable Learning
Cites work
- scientific article; zbMATH DE number 1273988 (no title available)
- scientific article; zbMATH DE number 1121933 (no title available)
- scientific article; zbMATH DE number 2211481 (no title available)
- A new mechanical approach to handle generalized Hopfield neural networks
- About the ergodic regime in the analogical Hopfield neural networks: Moments of the partition function
- Determining computational complexity from characteristic 'phase transitions'
- Entropy landscape of solutions in the binary perceptron problem
- Equilibrium statistical mechanics of bipartite spin systems
- Exponential inequalities and convergence of moments in the replica-symmetric regime of the Hopfield model
- Free energies of Boltzmann machines: self-averaging, annealed and replica symmetric approximations in the thermodynamic limit
- Gibbs states and the set of solutions of random constraint satisfaction problems
- Gibbs states of the Hopfield model in the regime of perfect memory
- Gibbs states of the Hopfield model with extensively many patterns
- Immune networks: multitasking capabilities near saturation
- Modeling Brain Function
- Multitasking attractor networks with neuronal threshold noise
- Neural networks and physical systems with emergent collective computational abilities
- Neural networks retrieving Boolean patterns in a sea of Gaussian ones
- On the equivalence of Hopfield networks and Boltzmann machines
- On the replica symmetric equations for the Hopfield model
- On the stability of the quenched state in mean-field spin-glass models
- Parallel retrieval of correlated patterns: from Hopfield networks to Boltzmann machines
- Replica symmetry breaking in neural networks with modified pseudo-inverse interactions
- Rigorous results for the Hopfield model with many patterns
- Spin-glass stochastic stability: a rigorous proof
- Statistical mechanics of Hopfield-like neural networks with modified interactions
- Statistical mechanics of learning
- The replica symmetric approximation of the analogical neural network
- The replica-symmetric solution without replica trick for the Hopfield model
- The space of interactions in neural network models
- Universality in bipartite mean field spin glasses
- Unlearning in the paramagnetic phase of neural network models
Cited in (15)
- Interpolating between boolean and extremely high noisy patterns through minimal dense associative memories
- Generalized Guerra's interpolation schemes for dense associative neural networks
- Replica symmetry breaking in neural networks: a few steps toward rigorous results
- A spectral approach to Hebbian-like neural networks
- Eigenvector dreaming
- Interacting dreaming neural networks
- The relativistic Hopfield model with correlated patterns
- On the effective initialisation for restricted Boltzmann machines via duality with Hopfield model
- PDE/statistical mechanics duality: relation between Guerra's interpolated p-spin ferromagnets and the Burgers hierarchy
- The decimation scheme for symmetric matrix factorization
- Sleeping Our Way to Weight Normalization and Stable Learning
- Replica symmetry breaking in dense Hebbian neural networks
- Learning and retrieval operational modes for three-layer restricted Boltzmann machines
- Self-organization of day cycle and hierarchical associative memory in live neural network
- Hebbian dreaming for small datasets