Graph degree sequence solely determines the expected Hopfield network pattern stability
DOI: 10.1162/NECO_A_00685 · zbMATH Open: 1414.92008 · OpenAlex: W2130596241 · Wikidata: Q51014817 · Scholia: Q51014817 · MaRDI QID: Q5380189 · FDO: Q5380189
Authors: Daniel Berend, Shlomi Dolev, Ariel Hanemann
Publication date: 4 June 2019
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco_a_00685
Recommendations
- High storage capacity in the Hopfield model with auto-interactions -- stability analysis
- Power law decay of stored pattern stability in sparse Hopfield neural networks
- Analogue neural networks on correlated random graphs
- Asymptotic eigenvectors, topological patterns and recurrent networks
MSC classification:
- Learning and adaptive systems in artificial intelligence (68T05)
- Applications of graph theory (05C90)
- Neural networks for/in biological studies, artificial life and related topics (92B20)
Cites Work
- Emergence of Scaling in Random Networks
- Collective dynamics of `small-world' networks
- Error detecting and error correcting codes
- Neural networks and physical systems with emergent collective computational abilities
- Rigorous results for the Hopfield model with many patterns
- The capacity of the Hopfield associative memory
- Capacity of an associative memory model on random graph architectures
- Rigorous bounds on the storage capacity of the dilute Hopfield model
- The Hopfield model on a sparse Erdös-Renyi graph
- On the critical capacity of the Hopfield model
- Lower bounds on the restitution error in the Hopfield model
Cited In (1)