Training spatially homogeneous fully recurrent neural networks in eigenvalue space.
Publication: 676596
DOI: 10.1016/S0893-6080(96)00079-2
zbMath: 1067.68589
OpenAlex: W1997442072
Wikidata: Q52196857 (Scholia: Q52196857)
MaRDI QID: Q676596
Renzo Perfetti, Emanuele Massarelli
Publication date: 7 August 1997
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/s0893-6080(96)00079-2
Keywords: Recurrent neural networks; Global asymptotic stability; Discrete Fourier transform; Feature extraction; Eigenvalue space; Learning algorithm; Spatial homogeneity
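The keywords point to a standard connection: if spatial homogeneity makes the recurrent weight matrix circulant (every neuron sees the same cyclically shifted weight pattern), then the discrete Fourier transform diagonalizes it, so the network's spectrum can be read off and constrained directly. The sketch below is a generic illustration of that fact only, not the training algorithm of the paper; the circulant weight model is an assumption made for the example.

import numpy as np

# Illustration of the DFT/eigenvalue-space connection suggested by the
# keywords; NOT the algorithm of the cited paper.  Assumed model: spatial
# homogeneity makes the recurrent weight matrix W circulant.
n = 8
rng = np.random.default_rng(0)
row = rng.normal(size=n)                      # shared weight pattern of one neuron

# Circulant matrix: row j is the first row cyclically shifted by j.
W = np.array([np.roll(row, j) for j in range(n)])

# The DFT of the shared row gives the eigenvalues of W.
eigs = np.fft.fft(row)

# DFT matrix whose k-th column is the k-th Fourier mode, an eigenvector of W.
F = np.exp(-2j * np.pi * np.outer(np.arange(n), np.arange(n)) / n)

# Check the diagonalization: W scales each Fourier mode by its eigenvalue.
print(np.allclose(W @ F, F * eigs))           # True

# A spectral quantity one might monitor or constrain in eigenvalue space:
print(np.max(np.abs(eigs)))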
Related Items (4)
- Neural-network-based approach for extracting eigenvectors and eigenvalues of real normal matrices and some extension to real matrices
- A concise functional neural network computing the largest modulus eigenvalues and their corresponding eigenvectors of a real skew matrix
- A recurrent neural network computing the largest imaginary or real part of eigenvalues of real matrices
- A functional neural network computing some eigenvalues and eigenvectors of a special real matrix
Cites Work
- Application of adjoint operators to neural learning
- Dynamics and architecture for neural computation
- Cellular neural networks: theory
- Cellular neural networks: Theory and circuit design
- Sparsely interconnected neural networks for associative memories with applications to cellular neural networks
- Robust Stability and Diagonal Liapunov Functions
- The Relaxation Method for Linear Inequalities