Stack-like and queue-like dynamics in recurrent neural networks
From MaRDI portal
Publication:3375554
Recommendations
- Simple recurrent networks learn context-free and context-sensitive languages by counting.
- Organization of the state space of a simple recurrent network before and after training on recursive linguistic structures
- Learning the dynamics of embedded clauses
- Elements for a general memory structure: properties of recurrent neural networks used to form situation models
- Comparing simple recurrent networks and n-grams in a large corpus
Cites work
- Scientific article (no title available), zbMATH DE number 3664335
- Scientific article (no title available), zbMATH DE number 1330031
- Scientific article (no title available), zbMATH DE number 1010621
- A logical calculus of the ideas immanent in nervous activity
- Architectural Bias in Recurrent Neural Networks: Fractal Analysis
- Dynamical recognizers: real-time language recognition by analog computers
- Learning Nonregular Languages: A Comparison of Simple Recurrent Networks and LSTM
- Learning the dynamics of embedded clauses
- On the computational power of neural nets
- QRT FIFO automata, breadth-first grammars and their relations
- Recurrent Neural Networks with Small Weights Implement Definite Memory Machines
- Rule Extraction from Recurrent Neural Networks: A Taxonomy and Review
- Rules and arithmetics
- Simple recurrent networks learn context-free and context-sensitive languages by counting.
- Spatiotemporal connectionist networks: A taxonomy and review
- Symbolic dynamics. One-sided, two-sided and countable state Markov shifts
- The calculi of emergence: Computation, dynamics and induction
- Three models for the description of language
- Why it might pay to assume that languages are infinite
Cited in (5)
- Simple recurrent networks learn context-free and context-sensitive languages by counting.
- Elman Backpropagation as Reinforcement for Simple Recurrent Networks
- Organization of the state space of a simple recurrent network before and after training on recursive linguistic structures
- Learning the dynamics of embedded clauses
- \(G\)-networks: A unifying model for neural and queueing networks
This page was built for publication: Stack-like and queue-like dynamics in recurrent neural networks