Training spatially homogeneous fully recurrent neural networks in eigenvalue space.
DOI: 10.1016/S0893-6080(96)00079-2
zbMATH Open: 1067.68589
OpenAlex: W1997442072
Wikidata: Q52196857
MaRDI QID: Q676596
Authors: Renzo Perfetti, Emanuele Massarelli
Publication date: 7 August 1997
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/s0893-6080(96)00079-2
Keywords: global asymptotic stability; recurrent neural networks; discrete Fourier transform; eigenvalue space; feature extraction; learning algorithm; spatial homogeneity
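The keywords (spatial homogeneity, discrete Fourier transform, eigenvalue space) point to a standard linear-algebra fact: a shift-invariant (spatially homogeneous) connection matrix on a ring of neurons is circulant, so the DFT diagonalizes it and its eigenvalues are the DFT of one column. A minimal numpy sketch of that fact, with the ring size `N` and weights `c` chosen purely for illustration (not taken from the paper):

```python
import numpy as np

# Hypothetical 1-D ring of N neurons; spatial homogeneity means the
# weight from neuron j to neuron i depends only on (i - j) mod N,
# i.e. the weight matrix W is circulant with first column c.
N = 8
rng = np.random.default_rng(0)
c = rng.standard_normal(N)                      # illustrative weights
idx = (np.arange(N)[:, None] - np.arange(N)[None, :]) % N
W = c[idx]                                      # W[i, j] = c[(i - j) % N]

# Claimed eigenvalues: the DFT of the first column.
eigs = np.fft.fft(c)

# Check: the k-th Fourier mode is an eigenvector with eigenvalue eigs[k].
omega = np.exp(2j * np.pi / N)
errs = []
for k in range(N):
    v = omega ** (k * np.arange(N))             # k-th Fourier mode
    errs.append(np.max(np.abs(W @ v - eigs[k] * v)))
max_err = max(errs)                             # numerically ~ machine epsilon
```

In this eigenvalue space the coupled network dynamics decouple into N independent scalar modes, which is what makes training and stability analysis tractable for such networks.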
Cites Work
- Title not available
- Cellular neural networks: theory
- The Relaxation Method for Linear Inequalities
- Sparsely interconnected neural networks for associative memories with applications to cellular neural networks
- Robust Stability and Diagonal Liapunov Functions
- Title not available
- Title not available
- Dynamics and architecture for neural computation
- Cellular neural networks: Theory and circuit design
- Application of adjoint operators to neural learning
Cited In (4)
- Neural-network-based approach for extracting eigenvectors and eigenvalues of real normal matrices and some extension to real matrices
- A concise functional neural network computing the largest modulus eigenvalues and their corresponding eigenvectors of a real skew matrix
- A recurrent neural network computing the largest imaginary or real part of eigenvalues of real matrices
- A functional neural network computing some eigenvalues and eigenvectors of a special real matrix