Qualitative behavior of differential equations associated with artificial neural networks (Q702415)

scientific article
Language: English
Label: Qualitative behavior of differential equations associated with artificial neural networks
Description: scientific article

    Statements

    Qualitative behavior of differential equations associated with artificial neural networks (English)
    17 January 2005
    Here, the following matrix differential equation is considered \[ \frac{dW}{dt}=T_{\varepsilon}CW-WW^TCW, \tag{1} \] where \(W(t)\) is a real \(n\times k\)-matrix, \(C\) is a symmetric matrix and \(T_{\varepsilon}\) is a tridiagonal matrix. Equation (1) was proposed by Oja, Kingsley and Adams as a learning model, more precisely, as a model that describes the dynamics of an artificial neural network. The elements of the matrix \(W\) are related to the exchange of information between the neurons, while the parameter \(\varepsilon\) describes the probability of temporary synaptic formation. Using techniques from matrix calculus, the authors establish conditions under which (1) is a gradient, a semi-gradient or a gradient-like system. They also study the asymptotic behavior of solutions of system (1).
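    The paper itself is analytical, but a brief numerical sketch of equation (1) may help illustrate the dynamics under study. All concrete choices below (the values of \(n\), \(k\) and \(\varepsilon\), the random symmetric \(C\), and the particular tridiagonal form of \(T_{\varepsilon}\)) are illustrative assumptions of this sketch, not data from the paper.

# Minimal numerical sketch of equation (1): dW/dt = T_eps C W - W W^T C W.
# The parameters and matrices here are assumed for illustration only.
import numpy as np
from scipy.integrate import solve_ivp

n, k, eps = 5, 2, 0.1
rng = np.random.default_rng(0)

# Symmetric matrix C (random, purely illustrative).
A = rng.standard_normal((n, n))
C = (A + A.T) / 2

# One assumed tridiagonal form for T_eps: identity plus eps on the off-diagonals.
T_eps = np.eye(n) + eps * (np.eye(n, k=1) + np.eye(n, k=-1))

def rhs(t, w_flat):
    """Right-hand side of (1), with the n-by-k matrix W stored as a flat vector."""
    W = w_flat.reshape(n, k)
    dW = T_eps @ C @ W - W @ W.T @ C @ W
    return dW.ravel()

W0 = 0.1 * rng.standard_normal((n, k))   # small random initial condition
sol = solve_ivp(rhs, (0.0, 100.0), W0.ravel(), rtol=1e-8, atol=1e-10)

W_T = sol.y[:, -1].reshape(n, k)
print("W at final time:\n", W_T)
# For T_eps = I (eps = 0) this reduces to the classical Oja subspace flow, whose
# trajectories typically approach matrices with (nearly) orthonormal columns:
print("W^T W at final time:\n", W_T.T @ W_T)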
    learning models
    matrix differential equations
    gradient systems
    asymptotic behaviour

    Identifiers

    DOI: 10.1023/B:JODY.0000041285.36221.bf