On optimal nonlinear associative recall (Q1223269)

From MaRDI portal
Cites:
A simple neural network generating an interactive memory
Convolution and correlation algebras
Q5565769
The design of nonlinear filters and control systems. Part 1
Correlation Matrix Memories
Measurement of the Kernels of a Non-linear System of Finite Order
The Eigenfunctions of Laplace's Tidal Equations over a Sphere
Q5849795
Q3231532
Correlation memory models - a first approximation in a general learning scheme
On optimal nonlinear associative recall


Language: English
Label: On optimal nonlinear associative recall
Description: scientific article

    Statements

    On optimal nonlinear associative recall (English)
    1975
The paper [see also the author, ``On optimal discrete estimation,'' in: Symposium on Nonlinear Systems, Caltech, G. D. McCann (ed.) and P. Z. Marmarelis (ed.) (1975)] considers the problem of determining the nonlinear mapping which optimally associates, in the least-squares sense, two sets of discrete, finite, column vectors, forming two matrices \(X\) (``input'') and \(Y\) (``output'') with the same number of columns and an arbitrary number of rows. The optimum mapping is sought in the class of polynomial mappings of degree \(k\), defined by \[ Y = L_0 + L_1(X) + L_2(X, X) + \ldots + L_k(X, \ldots,X) \tag{1} \] where \(L_i\) \((i = 1, \ldots, k)\) is an \(i\)-linear, symmetric mapping \(V\times V\times\cdots\times V\rightarrow W\), and \(V\) and \(W\) are vector spaces over the real field. With \((L_m)_{i,\alpha_1\cdots\alpha_m}\) defined as the \(m\)-way matrix associated to the mapping \(L_m\), equation (1) can be rewritten columnwise as \[ Y_{ij} = (L_0)_{i} + \sum_{\alpha_1}(L_1)_{i,\alpha_1}X_{\alpha_1 j}+ \sum_{\alpha_1\alpha_2}(L_2)_{i,\alpha_1\alpha_2}X_{\alpha_1 j} X_{\alpha_2 j} + \ldots + \sum_{\alpha_1\cdots\alpha_k} (L_k)_{i,\alpha_1\cdots\alpha_k} X_{\alpha_1 j}\cdots X_{\alpha_k j}. \tag{2} \] Since equation (2) is a linear mapping on appropriate cross products of the elements of \(X\), the optimal polynomial of degree \(k\) can be found in terms of the pseudo-inverse of the corresponding matrix containing all the cross products of \(X\) up to degree \(k\). It is shown in the paper that an alternative iteration scheme, based on cycles of successive optimal corrections up to order \(k\), converges to the optimal polynomial estimator of order \(k\). Various properties of polynomial estimators are further characterized, and conditions on the input matrix \(X\) under which the optimal estimate is linear are derived.
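The pseudo-inverse construction just described can be sketched numerically. The following is a minimal NumPy illustration, not the paper's own algorithm: it stacks all monomials of each input column up to degree \(k\) into a feature ("cross-product") matrix \(Z\), so that the polynomial map (1)-(2) becomes linear in these features, and then solves for the coefficient matrix with the pseudo-inverse. The names `monomial_features` and `optimal_polynomial_map` are illustrative, not from the paper.

```python
import numpy as np

def monomial_features(X, k):
    """For each column x of X, stack all monomials of its entries up to
    degree k (constant term included), so the degree-k polynomial map of
    equation (1) becomes linear in these features.  Symmetric monomials
    are duplicated here (x1*x2 and x2*x1 both appear); a careful
    implementation would keep one representative per symmetric term."""
    n, p = X.shape
    cols = []
    for j in range(p):
        x = X[:, j]
        feats = [np.ones(1)]                 # constant term (L_0)
        cur = x.copy()
        feats.append(cur)                    # degree-1 monomials (L_1)
        for _ in range(2, k + 1):
            cur = np.outer(cur, x).ravel()   # all next-degree monomials
            feats.append(cur)
        cols.append(np.concatenate(feats))
    return np.column_stack(cols)

def optimal_polynomial_map(X, Y, k):
    """Least-squares-optimal degree-k polynomial estimator: the coefficient
    matrix L with Y ~ L @ Z, obtained via the pseudo-inverse of the
    cross-product (feature) matrix Z."""
    Z = monomial_features(X, k)
    return Y @ np.linalg.pinv(Z)

# Toy check: an elementwise quadratic association lies in the model class,
# so the degree-2 estimator reproduces it on the training pairs.
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 40))   # 40 input vectors in R^3
Y = X**2 + 2.0 * X                 # quadratic "output" association
L = optimal_polynomial_map(X, Y, k=2)
print(np.allclose(L @ monomial_features(X, 2), Y))
```

With more stored pairs than features the fit is a genuine least-squares estimate; with fewer, the pseudo-inverse yields the minimum-norm interpolant.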
``Stationarity'' of the random processes of which \(X\) and \(Y\) are assumed to represent sets of sample sequences implies that the optimal linear estimate is given by a discrete counterpart of the Wiener-Hopf equation. If the input signals are ``noise-like,'' the ``classical'' holographic scheme of associative memory is obtained as the optimal estimator. Nonlinear encoding of the input \(X\) either before or after the polynomial estimator affects, and in some cases simplifies, the estimation problem: for instance, a suitable input (output) coding followed (preceded) by a linear mapping can be equivalent to equations (1), (2). The problem of identifying nonlinear mappings from input-output experiments is also easily treated in this discrete setting and yields a method essentially equivalent to the ``white-noise'' identification method for functionals. Finally, the paper suggests some implications for models of associative distributed memory in the brain sciences.
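The reduction of the optimal linear estimate to the holographic (correlation-matrix) memory for noise-like inputs can likewise be sketched. This is an illustrative NumPy comparison under the assumption that ``noise-like'' means random, normalized, nearly orthogonal input columns; it is not the paper's derivation.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 64, 8
# "noise-like" stored inputs: random normalized columns, nearly orthogonal
X = rng.standard_normal((n, p))
X /= np.linalg.norm(X, axis=0)
Y = rng.standard_normal((16, p))   # associated outputs

L_opt = Y @ np.linalg.pinv(X)      # optimal linear associative map
M_corr = Y @ X.T                   # correlation-matrix (holographic) memory

# Recall of the stored pairs: the optimal map is exact here (X has full
# column rank), while the correlation memory errs only through the
# cross-talk term Y (X^T X - I), which vanishes as the columns of X
# approach orthonormality -- then pinv(X) = X^T and the two coincide.
err_opt = np.linalg.norm(L_opt @ X - Y)
err_corr = np.linalg.norm(M_corr @ X - Y)
print(err_opt, err_corr)
```

The gap between the two maps shrinks as the stored patterns become longer and more nearly orthogonal, which is the sense in which the holographic scheme is optimal for noise-like inputs.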
