Minimal perceptrons for memorizing complex patterns
Abstract: Feedforward neural networks have been investigated to understand learning and memory, and have been applied to numerous practical problems in pattern classification. A common rule of thumb holds that more complex tasks require larger networks, yet designing optimal network architectures for specific tasks remains an unsolved fundamental problem. In this study, we consider three-layer neural networks for memorizing binary patterns. We developed a new complexity measure for binary patterns and estimated the minimal network size required to memorize them as a function of their complexity. We formulated the minimal network size for regular, random, and complex patterns. In particular, the minimal size for complex patterns, which are neither ordered nor disordered, was predicted by measuring their Hamming distances from known ordered patterns. Our predictions agreed with simulations based on the back-propagation algorithm.
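The abstract names two concrete ingredients: a complexity measure tied to Hamming distance from ordered patterns, and a minimal hidden-layer size found via back-propagation simulations. The sketch below is a minimal illustration under stated assumptions, not the paper's actual method: `hamming_complexity` is a crude proxy (minimal Hamming distance to a hand-picked set of ordered reference patterns), and `minimal_hidden_size` searches for the smallest sigmoid hidden layer that plain back-propagation can train to memorize a binary map exactly. All function names, reference patterns, and hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)


def hamming_complexity(pattern, ordered_refs):
    # Proxy for the paper's complexity measure (an assumption here):
    # minimal Hamming distance from a set of known ordered patterns.
    return min(int(np.sum(pattern != ref)) for ref in ordered_refs)


def train_memorizer(X, Y, hidden, epochs=10000, lr=0.5):
    # Three-layer perceptron (one hidden layer of sigmoid units),
    # trained by plain squared-error back-propagation.
    n_in, n_out = X.shape[1], Y.shape[1]
    n = len(X)
    W1 = rng.normal(0.0, 1.0, (n_in, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 1.0, (hidden, n_out))
    b2 = np.zeros(n_out)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        H = sigmoid(X @ W1 + b1)            # hidden activations
        O = sigmoid(H @ W2 + b2)            # network outputs
        dO = (O - Y) * O * (1.0 - O)        # output-layer delta
        dH = (dO @ W2.T) * H * (1.0 - H)    # hidden-layer delta
        W2 -= lr * (H.T @ dO) / n           # averaged gradient steps
        b2 -= lr * dO.mean(axis=0)
        W1 -= lr * (X.T @ dH) / n
        b1 -= lr * dH.mean(axis=0)
    return lambda x: sigmoid(sigmoid(x @ W1 + b1) @ W2 + b2)


def minimal_hidden_size(X, Y, max_hidden=8, restarts=3):
    # Smallest hidden-layer width whose trained network reproduces every
    # binary target exactly (outputs thresholded at 0.5). Back-propagation
    # can miss a memorizing solution, so try several random restarts.
    for h in range(1, max_hidden + 1):
        for _ in range(restarts):
            net = train_memorizer(X, Y, h)
            if np.array_equal((net(X) > 0.5).astype(int), Y):
                return h
    return None


# Example: memorize 4-bit parity, a classic "complex" binary pattern.
X = np.array([[int(b) for b in f"{i:04b}"] for i in range(16)])
Y = (X.sum(axis=1) % 2).reshape(-1, 1)

# Illustrative ordered references: uniform and alternating labelings.
ordered = [np.zeros(16, int), np.ones(16, int),
           np.tile([0, 1], 8), np.tile([1, 0], 8)]
print("complexity proxy:", hamming_complexity(Y.ravel(), ordered))
print("minimal hidden width found:", minimal_hidden_size(X, Y))
```

The restart loop reflects the abstract's framing: the minimal size is a property of the pattern, while any single back-propagation run may fail to reach a memorizing solution at that size, so the search reports the smallest width at which some run succeeds.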
Recommendations
- Neural networks as systems for recognizing patterns
- The sample complexity of pattern classification with neural networks: the size of the weights is more important than the size of the network
- scientific article; zbMATH DE number 4211542
- scientific article; zbMATH DE number 939873
- scientific article; zbMATH DE number 67614
Cites work
- scientific article; zbMATH DE number 1392848
- A Mathematical Theory of Communication
- Approximation by superpositions of a sigmoidal function
- Elements of Information Theory
- Feedforward nets for interpolation and classification
- Learning representations by back-propagating errors
- Multi-class pattern classification using neural networks
- Neural networks and physical systems with emergent collective computational abilities
MaRDI item: Q1619854