A recurrent neural network computing the largest imaginary or real part of eigenvalues of real matrices (Q2469912)

From MaRDI portal
scientific article
Language: English
Label: A recurrent neural network computing the largest imaginary or real part of eigenvalues of real matrices
Description: scientific article

    Statements

    A recurrent neural network computing the largest imaginary or real part of eigenvalues of real matrices (English)
    11 February 2008
    The authors introduce a recurrent neural network (RNN) to extract an eigenpair of a real matrix. The RNN, whose connection weights depend on the matrix, can be transformed into a complex differential system whose variable \(z(t)\) is a complex vector (a schematic sketch of such a system is given below). From the analytic expression for \(|z(t)|^2\), the convergence properties of the RNN are analysed in detail. For a generic nonzero initial complex vector, the RNN yields the largest imaginary part among all eigenvalues. A numerical example with a \(7\times 7\) matrix illustrates the validity of the method.
    complex differential system
    real matrix
    eigenvalues
    imaginary part
    real part
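    A minimal numerical sketch of the idea summarized in the review, assuming the simplest possible complex differential system (an illustrative stand-in, not the authors' RNN; the function name, parameters, and the 2×2 test matrix are hypothetical, unlike the paper's 7×7 example): under the linear flow dz/dt = Mz with a generic nonzero complex start vector, ln|z(t)| grows asymptotically at the rate max Re λ(M). Taking M = A then estimates the largest real part of the eigenvalues of a real matrix A, while M = -iA estimates the largest imaginary part, since Re(-iλ) = Im(λ).

    # Sketch only: a plain linear flow dz/dt = M z as a stand-in for the RNN dynamics.
    import numpy as np
    from scipy.integrate import solve_ivp

    def dominant_rate(M, t0=10.0, t1=60.0, seed=0):
        """Estimate max Re(lambda(M)) from the growth rate of ln|z(t)| under dz/dt = M z."""
        rng = np.random.default_rng(seed)
        n = M.shape[0]
        # generic nonzero complex start vector, as the convergence analysis requires
        z0 = rng.standard_normal(n) + 1j * rng.standard_normal(n)
        sol = solve_ivp(lambda t, z: M @ z, (0.0, t1), z0,
                        t_eval=[t0, t1], rtol=1e-9, atol=1e-12)
        n0, n1 = np.linalg.norm(sol.y[:, 0]), np.linalg.norm(sol.y[:, 1])
        # slope of ln|z(t)| over [t0, t1] approximates the dominant real part
        return (np.log(n1) - np.log(n0)) / (t1 - t0)

    if __name__ == "__main__":
        A = np.array([[0.0, -2.0],
                      [2.0,  0.5]])          # eigenvalues 0.25 +/- 1.9843i (illustrative matrix)
        print(dominant_rate(A))              # approx. 0.25 (largest real part)
        print(dominant_rate(-1j * A))        # approx. 1.98 (largest imaginary part)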
