Learning Nonregular Languages: A Comparison of Simple Recurrent Networks and LSTM
From MaRDI portal
Publication:3149504
DOI: 10.1162/089976602320263980
zbMATH Open: 1010.68857
DBLP: journals/neco/SchmidhuberGE02
OpenAlex: W2154039517
Wikidata: Q52011065 (Scholia: Q52011065)
MaRDI QID: Q3149504
FDO: Q3149504
Authors: Jürgen Schmidhuber, Felix Gers, Douglas Eck
Publication date: 13 May 2003
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/089976602320263980
Cited In (6)
- Learning to imitate stochastic time series in a compositional way by chaos
- Training Recurrent Networks by Evolino
- Hierarchical linear dynamical systems for unsupervised musical note recognition
- LSTM
- A model for learning to segment temporal sequences, utilizing a mixture of RNN experts together with adaptive variance
- Stack-like and queue-like dynamics in recurrent neural networks