Stack-like and queue-like dynamics in recurrent neural networks
Publication: 3375554
DOI: 10.1080/09540090500317291
zbMATH Open: 1084.68097
OpenAlex: W2158191188
MaRDI QID: Q3375554
FDO: Q3375554
Author: André Grüning
Publication date: 14 March 2006
Published in: Connection Science
Full work available at URL: http://epubs.surrey.ac.uk/22858/2/gruening_05_connection_science.pdf
Recommendations
- Simple recurrent networks learn context-free and context-sensitive languages by counting.
- Organization of the state space of a simple recurrent network before and after training on recursive linguistic structures
- Learning the dynamics of embedded clauses
- Elements for a general memory structure: properties of recurrent neural networks used to form situation models
- Comparing simple recurrent networks and \(n\)-grams in a large corpus
Classifications
- Learning and adaptive systems in artificial intelligence (68T05)
- Formal languages and automata (68Q45)
Cites Work
- Learning Nonregular Languages: A Comparison of Simple Recurrent Networks and LSTM
- Three models for the description of language
- Symbolic dynamics. One-sided, two-sided and countable state Markov shifts
- A logical calculus of the ideas immanent in nervous activity
- Rules and arithmetics
- On the computational power of neural nets
- QRT FIFO automata, breadth-first grammars and their relations
- The calculi of emergence: Computation, dynamics and induction
- Spatiotemporal connectionist networks: A taxonomy and review
- Rule Extraction from Recurrent Neural Networks: A Taxonomy and Review
- Recurrent Neural Networks with Small Weights Implement Definite Memory Machines
- Architectural Bias in Recurrent Neural Networks: Fractal Analysis
- Dynamical recognizers: real-time language recognition by analog computers
- Simple recurrent networks learn context-free and context-sensitive languages by counting.
- Why it might pay to assume that languages are infinite
- Learning the dynamics of embedded clauses
Cited In (5)
- Simple recurrent networks learn context-free and context-sensitive languages by counting.
- Learning the dynamics of embedded clauses
- Organization of the state space of a simple recurrent network before and after training on recursive linguistic structures
- Elman Backpropagation as Reinforcement for Simple Recurrent Networks
- \(G\)-networks: A unifying model for neural and queueing networks