Representing objects, relations, and sequences
From MaRDI portal
Publication:5378243
Abstract: Vector Symbolic Architectures (VSAs) are high-dimensional vector representations of objects (e.g., words, image parts), relations (e.g., sentence structures), and sequences for use with machine learning algorithms. They consist of a vector addition operator for representing a collection of unordered objects, a binding operator for associating groups of objects, and a methodology for encoding complex structures. We first develop constraints that machine learning imposes upon VSAs; for example, similar structures must be represented by similar vectors. The constraints suggest that current VSAs should represent phrases ("The smart Brazilian girl") by binding sums of terms, in addition to simply binding the terms directly. We show that matrix multiplication can be used as the binding operator for a VSA, and that matrix elements can be chosen at random. A consequence for living systems is that binding is mathematically possible without the need to specify, in advance, precise neuron-to-neuron connection properties for large numbers of synapses. A VSA that incorporates these ideas, MBAT (Matrix Binding of Additive Terms), is described and shown to satisfy all of the constraints. With respect to machine learning, appropriate VSA representations permit us to prove learnability for some types of problems, rather than relying on simulations. We also propose dividing machine (and neural) learning and representation into three stages, with differing roles for learning in each stage. For neural modeling, we give "representational reasons" for nervous systems to have many recurrent connections, as well as for the importance of phrases in language processing. Sizing simulations and analyses suggest that VSAs in general, and MBAT in particular, are ready for real-world applications.
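The binding scheme the abstract describes can be sketched in a few lines of NumPy: terms are random high-dimensional vectors, a phrase is encoded by multiplying the *sum* of its term vectors by a fixed random matrix, and similar phrases then map to similar vectors. This is a minimal illustration under assumed parameters (dimensionality, vocabulary, Gaussian entries), not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 1000  # vector dimensionality (assumed; the paper analyzes sizing separately)

# Hypothetical vocabulary: each term is a random high-dimensional vector.
terms = {w: rng.standard_normal(d) for w in ["the", "smart", "brazilian", "girl"]}

# Binding operator: multiplication by a fixed matrix whose elements are
# chosen at random, as the abstract proposes. Scaling by 1/sqrt(d) keeps
# vector norms roughly stable after binding.
M = rng.standard_normal((d, d)) / np.sqrt(d)

def cos(a, b):
    """Cosine similarity between two vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# MBAT-style phrase encoding: bind the sum of the terms.
phrase = M @ (terms["the"] + terms["smart"] + terms["brazilian"] + terms["girl"])

# A phrase sharing three of four terms yields a similar vector
# (random matrix multiplication approximately preserves similarity),
# while an unrelated random vector does not.
phrase2 = M @ (terms["the"] + terms["smart"] + terms["girl"] + rng.standard_normal(d))
unrelated = rng.standard_normal(d)

print(cos(phrase, phrase2))   # high (around 0.75: three of four terms shared)
print(cos(phrase, unrelated)) # near zero
```

This illustrates the "similar structures must be represented by similar vectors" constraint: overlap in the underlying term sums survives the random binding step.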
Recommendations
- Vector-derived transformation binding: an improved binding operation for deep symbol-like processing in neural networks
- Tensor product variable binding and the representation of symbolic structures in connectionist systems
- Resonator networks. I: An efficient solution for factoring high-dimensional, distributed representations of data structures
- scientific article; zbMATH DE number 67438
- Optimal Quadratic Binding for Relational Reasoning in Vector Symbolic Neural Architectures
Cites work
- scientific article; zbMATH DE number 3829300
- scientific article; zbMATH DE number 42973
- scientific article; zbMATH DE number 3551942
- scientific article; zbMATH DE number 2033142
- scientific article; zbMATH DE number 795581
- scientific article; zbMATH DE number 3314813
- Artificial general intelligence 2008. Proceedings of the 1st AGI conference.
- Binding and normalization of binary sparse distributed representations by context-dependent thinning
- Enumeration of Seven-Argument Threshold Functions
- Linear recursive distributed representations
- Natural language processing (almost) from scratch
- On the Boundedness of an Iterative Procedure for Solving a System of Linear Inequalities
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
- Self-organizing maps
- Tensor product variable binding and the representation of symbolic structures in connectionist systems
Cited in (16)
- Vector-derived transformation binding: an improved binding operation for deep symbol-like processing in neural networks
- Sequence Memory Based on Coherent Spin-Interaction Neural Networks
- Combinatorial representations of token sequences
- Embedding probabilities in predication space with Hermitian holographic reduced representations
- scientific article; zbMATH DE number 4203702
- Formation of similarity-reflecting binary vectors with random binary projections
- Resonator networks. I: An efficient solution for factoring high-dimensional, distributed representations of data structures
- A theory of sequence indexing and working memory in recurrent neural networks
- A theoretical perspective on hyperdimensional computing
- Estimation of vectors similarity by their randomized binary projections
- Vector data transformation using random binary matrices
- Tensor product variable binding and the representation of symbolic structures in connectionist systems
- Modeling occurrences of objects in relations
- Symbolic computation using cellular automata-based hyperdimensional computing
- Tensor representation of topographically organized semantic spaces
- Representing types as neural events