Representing Objects, Relations, and Sequences
Publication:5378243
DOI: 10.1162/NECO_A_00467
zbMATH Open: 1414.68056
arXiv: 1501.07627
OpenAlex: W2042588857
Wikidata: Q45959317
Scholia: Q45959317
MaRDI QID: Q5378243
FDO: Q5378243
Stephen I. Gallant, T. Wendy Okaywe
Publication date: 12 June 2019
Published in: Neural Computation
Abstract: Vector Symbolic Architectures (VSAs) are high-dimensional vector representations of objects (e.g., words, image parts), relations (e.g., sentence structures), and sequences for use with machine learning algorithms. They consist of a vector addition operator for representing a collection of unordered objects, a Binding operator for associating groups of objects, and a methodology for encoding complex structures. We first develop Constraints that machine learning imposes upon VSAs: for example, similar structures must be represented by similar vectors. The Constraints suggest that current VSAs should represent phrases ("The smart Brazilian girl") by binding sums of terms, in addition to simply binding the terms directly. We show that matrix multiplication can be used as the binding operator for a VSA, and that matrix elements can be chosen at random. A consequence for living systems is that binding is mathematically possible without the need to specify, in advance, precise neuron-to-neuron connection properties for large numbers of synapses. We describe a VSA that incorporates these ideas, MBAT (Matrix Binding of Additive Terms), which satisfies all Constraints. With respect to machine learning, for some types of problems appropriate VSA representations permit us to prove learnability, rather than relying on simulations. We also propose dividing machine (and neural) learning and representation into three Stages, with differing roles for learning in each Stage. For neural modeling, we give "representational reasons" for nervous systems to have many recurrent connections, as well as for the importance of phrases in language processing. Sizing simulations and analyses suggest that VSAs in general, and MBAT in particular, are ready for real-world applications.
Full work available at URL: https://arxiv.org/abs/1501.07627
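As an illustration of the binding scheme the abstract describes, below is a minimal NumPy sketch of MBAT-style encoding: terms are summed into a phrase vector, and the sum is then bound by multiplication with a random matrix. The vocabulary, the dimension D = 1000, and the helper names (phrase, bind, cos) are illustrative assumptions, not the authors' code.

    # Minimal sketch of MBAT-style binding (assumed names, not the paper's code).
    import numpy as np

    rng = np.random.default_rng(0)
    D = 1000  # high dimension: random vectors are then nearly orthogonal,
              # so distinct items barely interfere with one another

    # Random high-dimensional vectors for objects (e.g., words).
    vocab = {w: rng.standard_normal(D) for w in
             ["the", "smart", "brazilian", "girl", "boy"]}

    # Random binding matrix: per the abstract, matrix elements can be chosen
    # at random, so no precise connection pattern is specified in advance.
    M = rng.standard_normal((D, D)) / np.sqrt(D)  # scaling keeps norms stable

    def phrase(words):
        """Additive representation of an unordered collection of terms."""
        return sum(vocab[w] for w in words)

    def bind(v):
        """Bind a (summed) phrase vector via matrix multiplication."""
        return M @ v

    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    girl = bind(phrase(["the", "smart", "brazilian", "girl"]))
    boy = bind(phrase(["the", "smart", "brazilian", "boy"]))

    # Similar structures map to similar vectors (one of the paper's Constraints):
    print(f"similar phrases:   {cos(girl, boy):.2f}")                    # high
    print(f"unrelated vectors: {cos(girl, rng.standard_normal(D)):.2f}")  # ~0

Because a random square matrix in high dimensions approximately preserves angles, phrases sharing most of their terms remain similar after binding, which is the learnability-friendly property the abstract requires of a VSA.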
Cites Work
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- Title not available
- Enumeration of Seven-Argument Threshold Functions
- Title not available
- Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
- Title not available
- Title not available
- Binding and normalization of binary sparse distributed representations by context-dependent thinning
- Tensor product variable binding and the representation of symbolic structures in connectionist systems
- Self-organizing maps
- Title not available
- Title not available
- On the Boundedness of an Iterative Procedure for Solving a System of Linear Inequalities
- Title not available
- Linear recursive distributed representations
- Title not available
Cited In (9)
- A Theory of Sequence Indexing and Working Memory in Recurrent Neural Networks
- Sequence Memory Based on Coherent Spin-Interaction Neural Networks
- A Theoretical Perspective on Hyperdimensional Computing
- Title not available
- Formation of similarity-reflecting binary vectors with random binary projections
- Symbolic Computation Using Cellular Automata-Based Hyperdimensional Computing
- Estimation of vectors similarity by their randomized binary projections
- Vector data transformation using random binary matrices
- Modeling Occurrences of Objects in Relations