A recurrent neural network computing the largest imaginary or real part of eigenvalues of real matrices
DOI: 10.1016/j.camwa.2006.09.004 · zbMath: 1135.15006 · OpenAlex: W2057255661 · MaRDI QID: Q2469912
Yiguang Liu, Liping Cao, Zhisheng You
Publication date: 11 February 2008
Published in: Computers & Mathematics with Applications
Full work available at URL: https://doi.org/10.1016/j.camwa.2006.09.004
- Numerical computation of eigenvalues and eigenvectors of matrices (65F15)
- Learning and adaptive systems in artificial intelligence (68T05)
- Neural networks for/in biological studies, artificial life and related topics (92B20)
- Eigenvalues, singular values, and eigenvectors (15A18)
- Asymptotics and summation methods for ordinary differential equations in the complex domain (34M30)
Cites Work
- Training spatially homogeneous fully recurrent neural networks in eigenvalue space
- A neural network for computing eigenvectors and eigenvalues
- Neural networks for computing eigenvalues and eigenvectors
- Complex recurrent neural network for computing the inverse and pseudo-inverse of the complex matrix
- A matrix inverse eigenvalue problem and its application
- Neural networks based approach for computing eigenvectors and eigenvalues of symmetric matrix
- Generalized neural networks for spectral analysis: dynamics and Liapunov functions
- A principal component analysis algorithm with invariant norm
- Real-time neural computation of the eigenvector corresponding to the largest eigenvalue of positive matrix
- Recurrent Neural Networks for Computing Pseudoinverses of Rank-Deficient Matrices
- A nonlinear neural network model of mixture of local principal component analysis: Application to handwritten digits recognition