A recurrent neural network computing the largest imaginary or real part of eigenvalues of real matrices



scientific article
Language: English

    Statements

    A recurrent neural network computing the largest imaginary or real part of eigenvalues of real matrices (English)
    Published: 11 February 2008
    The authors introduce a recurrent neural network (RNN) for extracting certain eigenpairs of a real matrix. The RNN, whose connection weights depend on the given matrix, can be transformed into a complex differential system whose variable \(z(t)\) is a complex vector. Using the analytic expression of \(| z(t)| ^2\), the convergence properties of the RNN are analysed in detail. Starting from a generic nonzero initial complex vector, the RNN yields the largest imaginary part among all eigenvalues. A numerical example with a \(7\times 7\) matrix demonstrates the validity of the method.
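    The general idea behind such networks can be illustrated with a minimal sketch (not the authors' exact network): integrating the continuous dynamics \(\dot z = Bz\) with renormalization is a continuous-time power iteration whose Rayleigh quotient converges to the eigenvalue of \(B\) with the largest real part. Since the eigenvalues of \(-iA\) are \(-i\lambda\), their real parts equal \(\operatorname{Im}\lambda\), so applying the same dynamics to \(-iA\) recovers the largest imaginary part of the eigenvalues of a real matrix \(A\). The function name, matrix, and step sizes below are illustrative assumptions.

    ```python
    import numpy as np

    def dominant_eigenvalue(B, t_end=50.0, dt=1e-3, seed=0):
        """Continuous-time power iteration (illustrative sketch, not the
        paper's network): integrate dz/dt = B z from a random complex
        start with renormalization; the Rayleigh quotient z*Bz / z*z
        approaches the eigenvalue of B whose real part is largest
        (assumed simple)."""
        rng = np.random.default_rng(seed)
        n = B.shape[0]
        z = rng.standard_normal(n) + 1j * rng.standard_normal(n)
        for _ in range(int(t_end / dt)):
            z = z + dt * (B @ z)    # explicit Euler step of dz/dt = B z
            z /= np.linalg.norm(z)  # renormalize to keep the trajectory bounded
        return (z.conj() @ B @ z) / (z.conj() @ z)

    # Example matrix (an assumption for illustration): eigenvalues are +/- 3i.
    A = np.array([[0.0, -3.0],
                  [3.0,  0.0]])

    # Eigenvalues of -iA are -i*lam, so the dominant eigenvalue of -iA
    # (largest real part) recovers the lam of A with largest imaginary part.
    lam = 1j * dominant_eigenvalue(-1j * A)
    print(lam)  # approaches 3i, i.e. the largest imaginary part is 3
    ```

    The same routine applied to \(A\) itself (rather than \(-iA\)) estimates the eigenvalue with the largest real part, which is the other quantity the article's network computes.
    
    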
    Keywords: complex differential system; real matrix; eigenvalues; imaginary part; real part
