Reservoir computing approaches to recurrent neural network training
DOI: 10.1016/j.cosrev.2009.03.005
zbMath: 1302.68235
OpenAlex: W2171865010
Wikidata: Q105583675
Scholia: Q105583675
MaRDI QID: Q458488
Mantas Lukoševičius, Herbert Jaeger
Publication date: 7 October 2014
Published in: Computer Science Review
Full work available at URL: https://doi.org/10.1016/j.cosrev.2009.03.005
MSC classification:
- Learning and adaptive systems in artificial intelligence (68T05)
- Neural networks for/in biological studies, artificial life and related topics (92B20)
- Research exposition (monographs, survey articles) pertaining to computer science (68-02)
Related Items
Uses Software
Cites Work
- Isolated word recognition with the Liquid State Machine: a case study
- Complex sensory-motor sequence learning based on recurrent state representation and reinforcement learning
- The cerebellum as a liquid state machine
- Fading memory and time series prediction in recurrent networks with different forms of plasticity
- Edge of chaos and prediction of computational performance for neural circuit models
- Optimization and applications of echo state networks with leaky-integrator neurons
- Online reservoir adaptation by intrinsic plasticity for backpropagation-decorrelation and echo state learning
- Decoupled echo state networks with lateral inhibition
- An associative memory readout for ESNs with applications to dynamical pattern recognition
- An experimental unification of reservoir computing methods
- Automatic speech recognition using a predictive echo state network classifier
- Emergence of Scaling in Random Networks
- Reducing the Dimensionality of Data with Neural Networks
- Synergies Between Intrinsic and Synaptic Plasticity Mechanisms
- Reservoir optimization in recurrent neural networks using properties of Kronecker product
- Training Recurrent Networks by Evolino
- Spiking Neurons Can Learn to Solve Information Bottleneck Problems and Extract Independent Components
- Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
- Dynamical Working Memory and Timed Responses: The Role of Reverberating Loops in the Olivo-Cerebellar System
- Real-Time Computation at the Edge of Chaos in Recurrent Neural Networks
- Collective dynamics of ‘small-world’ networks
- Neural networks and physical systems with emergent collective computational abilities.
- Analysis and Design of Echo State Networks