Resonator networks. I: An efficient solution for factoring high-dimensional, distributed representations of data structures
From MaRDI portal
Publication:3386435
- Learning and adaptive systems in artificial intelligence (68T05)
- Artificial neural networks and deep learning (68T07)
- Data structures (68P05)
- Neural networks for/in biological studies, artificial life and related topics (92B20)
- Machine vision and scene understanding (68T45)
- Networks and circuits as models of computation; circuit complexity (68Q06)
Abstract: The ability to encode and manipulate data structures with distributed neural representations could qualitatively enhance the capabilities of traditional neural networks by supporting rule-based symbolic reasoning, a central property of cognition. Here we show how this may be accomplished within the framework of Vector Symbolic Architectures (VSAs) (Plate, 1991; Gayler, 1998; Kanerva, 1996), whereby data structures are encoded by combining high-dimensional vectors with operations that together form an algebra on the space of distributed representations. In particular, we propose an efficient solution to a hard combinatorial search problem that arises when decoding elements of a VSA data structure: the factorization of products of multiple code vectors. Our proposed algorithm, called a resonator network, is a new type of recurrent neural network that interleaves VSA multiplication operations and pattern completion. We show in two examples -- parsing of a tree-like data structure and parsing of a visual scene -- how the factorization problem arises and how the resonator network can solve it. More broadly, resonator networks open up the possibility of applying VSAs to myriad artificial intelligence problems in real-world domains. A companion paper (Kent et al., 2020) presents a rigorous analysis and evaluation of the performance of resonator networks, showing that it outperforms alternative approaches.
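The abstract describes the resonator network only at a high level. The sketch below illustrates one common formulation of the idea: bipolar (±1) code vectors, Hadamard-product binding, and cleanup of each factor estimate by projecting through its codebook. The dimensionality `D`, codebook size `K`, and all variable names are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
D, K = 2000, 5  # vector dimensionality and per-factor codebook size (assumed)

# Random bipolar codebooks for three factors; rows are code vectors.
X = rng.choice([-1, 1], size=(K, D))
Y = rng.choice([-1, 1], size=(K, D))
Z = rng.choice([-1, 1], size=(K, D))

# Composite vector: elementwise (Hadamard) product of one code from each book.
# Because bipolar codes are self-inverse under this product, multiplying s by
# estimates of two factors yields a noisy estimate of the third.
ix, iy, iz = 1, 3, 0
s = X[ix] * Y[iy] * Z[iz]

def bipolar(v):
    """Quantize to ±1 (ties broken toward +1)."""
    return np.where(v >= 0, 1, -1)

def resonate(s, X, Y, Z, iters=100):
    """Resonator iterations: unbind the other two estimates from s, then
    clean up through the codebook (pattern completion via projection)."""
    # Initialize each estimate as the superposition of its whole codebook.
    xh = bipolar(X.sum(0))
    yh = bipolar(Y.sum(0))
    zh = bipolar(Z.sum(0))
    for _ in range(iters):
        xh = bipolar(X.T @ (X @ (s * yh * zh)))
        yh = bipolar(Y.T @ (Y @ (s * xh * zh)))
        zh = bipolar(Z.T @ (Z @ (s * xh * yh)))
    return xh, yh, zh

xh, yh, zh = resonate(s, X, Y, Z)
# Decode each factor by nearest codebook entry (largest dot product).
print(np.argmax(X @ xh), np.argmax(Y @ yh), np.argmax(Z @ zh))
```

With a small search space (K³ = 125 combinations) and high dimensionality, the iterations settle quickly on the correct factors; the paper's contribution is analyzing when and how reliably this dynamics converges as the problem scales.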
Recommendations
- Resonator Networks, 2: Factorization Performance and Capacity Compared to Optimization-Based Methods
- Representing objects, relations, and sequences
- Vector-derived transformation binding: an improved binding operation for deep symbol-like processing in neural networks
- scientific article; zbMATH DE number 1784866
- Linear recursive distributed representations
Cites work
- scientific article; zbMATH DE number 67438
- A theory of sequence indexing and working memory in recurrent neural networks
- Binding and normalization of binary sparse distributed representations by context-dependent thinning
- Learning to represent spatial transformations with factored higher-order Boltzmann machines
- Neural networks and physical systems with emergent collective computational abilities
- Randomly connected sigma–pi neurons can form associator networks
- Resonator Networks, 2: Factorization Performance and Capacity Compared to Optimization-Based Methods
- Tensor product variable binding and the representation of symbolic structures in connectionist systems
- The concentration of measure phenomenon
Cited in (3)