A recurrent neural network computing the largest imaginary or real part of eigenvalues of real matrices (Q2469912)

From MaRDI portal

Language: English
Label: A recurrent neural network computing the largest imaginary or real part of eigenvalues of real matrices
Description: scientific article

    Statements

    A recurrent neural network computing the largest imaginary or real part of eigenvalues of real matrices (English)
    11 February 2008
    The authors introduce a recurrent neural network (RNN) to extract a particular eigenpair of a real matrix. The RNN, whose connection weights depend on the matrix, can be transformed into a complex differential system whose variable \(z(t)\) is a complex vector. Using the analytic expression of \(|z(t)|^2\), the convergence properties of the RNN are analysed in detail. Starting from a general nonzero initial complex vector, the RNN yields the largest imaginary part of all eigenvalues. A numerical example with a \(7\times 7\) matrix illustrates the validity of the method.
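    The review does not reproduce the network equations, so the following is only a minimal sketch of the underlying idea, not the authors' RNN: for the linear flow \(\dot z = Az\) with a random complex start, the long-run growth rate of \(|z(t)|\) equals the largest real part of the eigenvalues of \(A\); the largest imaginary part can then be read off by applying the same flow to the real block matrix \(M = [[0, -A], [A, 0]]\), whose eigenvalues are \(\pm i\lambda\) for each eigenvalue \(\lambda\) of \(A\). The function names, step size, and iteration count below are illustrative assumptions, and the integration uses the exact propagator \(e^{hA}\) rather than a neural-network discretization.

```python
import numpy as np
from scipy.linalg import expm


def largest_real_part(A, t_step=0.1, n_steps=2000, seed=None):
    """Estimate max Re(lambda) over the eigenvalues of A from the long-run
    growth rate of |z(t)| along the linear flow dz/dt = A z."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    # random complex start so every eigendirection is (generically) excited
    z = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    step = expm(t_step * A)            # exact propagator over one time step
    log_growth = 0.0
    for _ in range(n_steps):
        z = step @ z
        norm = np.linalg.norm(z)
        log_growth += np.log(norm)     # accumulate the growth of |z|
        z /= norm                      # renormalise to avoid overflow
    return log_growth / (n_steps * t_step)


def largest_imag_part(A, **kwargs):
    """max Im(lambda) of a real matrix A via the real block matrix
    M = [[0, -A], [A, 0]], whose eigenvalues are +/- i*lambda, so the
    largest real part of M's spectrum is the largest imaginary part of A's."""
    n = A.shape[0]
    Z = np.zeros((n, n))
    M = np.block([[Z, -A], [A, Z]])
    return largest_real_part(M, **kwargs)


if __name__ == "__main__":
    A = np.random.default_rng(0).standard_normal((7, 7))   # a 7x7 test matrix
    lam = np.linalg.eigvals(A)
    print("flow estimates:", largest_real_part(A, seed=1), largest_imag_part(A, seed=1))
    print("eigvals check :", lam.real.max(), lam.imag.max())
```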
    complex differential system
    real matrix
    eigenvalues
    imaginary part
    real part
