Organization of the state space of a simple recurrent network before and after training on recursive linguistic structures
DOI: 10.1016/j.neunet.2006.01.020 | zbMATH Open: 1111.68097 | OpenAlex: W2109101264 | Wikidata: Q51941965 | Scholia: Q51941965 | MaRDI QID: Q872410
Authors: Michal Čerňanský, Matej Makula, Ľubica Beňušková
Publication date: 27 March 2007
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2006.01.020
Recommendations
- Simple recurrent networks learn context-free and context-sensitive languages by counting.
- Stack-like and queue-like dynamics in recurrent neural networks
- Scientific article (zbMATH DE number 1843095)
- Comparing simple recurrent networks and \(n\)-grams in a large corpus
- Scientific article (zbMATH DE number 1843098)
Keywords: recurrent neural networks; variable length Markov model; language processing; linguistic structures; Markovian architectural bias; neural prediction machines; next-symbol prediction; state space analysis
Classification: Learning and adaptive systems in artificial intelligence (68T05); Natural language processing (68T50)
Cites Work
- The power of amnesia: Learning probabilistic automata with variable memory length
- On the Emergence of Rules in Neural Networks
- Recurrent Neural Networks with Small Weights Implement Definite Memory Machines
- Architectural Bias in Recurrent Neural Networks: Fractal Analysis
- Predicting the future of discrete sequences from fractal representations of the past
Cited In (11)
- Simple recurrent networks learn context-free and context-sensitive languages by counting.
- Learning grammatical structure with Echo State Networks
- Comparing simple recurrent networks and \(n\)-grams in a large corpus
- Learning the dynamics of embedded clauses
- Title not available
- Title not available
- Elman Backpropagation as Reinforcement for Simple Recurrent Networks
- Title not available
- Pre-wiring and pre-training: what does a neural network need to learn truly general identity rules?
- Title not available
- Stack-like and queue-like dynamics in recurrent neural networks