Memory in linear recurrent neural networks in continuous time
DOI: 10.1016/j.neunet.2009.08.008 · zbMATH Open: 1396.68093 · OpenAlex: W2062004915 · Wikidata: Q39900109 · Scholia: Q39900109 · MaRDI QID: Q1784560 · FDO: Q1784560
Authors: Michiel Hermans, Benjamin Schrauwen
Publication date: 27 September 2018
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2009.08.008
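For context (this sketch is not part of the original record): the short-term memory capacity studied in this paper and in several of the works listed below is commonly defined, following Jaeger, as the sum over delays k of the squared correlation between a delayed input u(t-k) and its best linear reconstruction from the network state. The paper treats continuous-time linear networks; the minimal sketch below uses a discrete-time linear reservoir instead, and all parameter values (reservoir size, spectral radius, run length) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not from the paper).
N = 50          # reservoir size; memory capacity is bounded by N
T = 5000        # number of time steps
washout = 100   # initial transient discarded before fitting readouts
max_delay = 2 * N

# Random linear reservoir, rescaled to spectral radius 0.9 for stability.
W = rng.standard_normal((N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
w_in = rng.standard_normal(N)

# Drive the purely linear network with i.i.d. Gaussian input.
u = rng.standard_normal(T)
X = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = W @ x + w_in * u[t]   # linear state update, no nonlinearity
    X[t] = x

# Memory capacity: for each delay k, fit a linear readout that
# reconstructs u(t-k) from x(t) and accumulate the squared correlation.
MC = 0.0
for k in range(1, max_delay + 1):
    y = u[washout - k:T - k]          # target: input delayed by k steps
    Xk = X[washout:T]
    w, *_ = np.linalg.lstsq(Xk, y, rcond=None)
    r = np.corrcoef(y, Xk @ w)[0, 1]
    MC += r ** 2

print(f"memory capacity ≈ {MC:.1f} (upper bound N = {N})")
```

For a linear network driven by i.i.d. input, the accumulated capacity approaches the reservoir size N, which is the theme the paper extends to continuous time.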
Recommendations
- Memory and forecasting capacities of nonlinear recurrent networks
- Neural networks with memory
- Associative memory by recurrent neural networks with delay elements
- scientific article; zbMATH DE number 7714085
- scientific article; zbMATH DE number 790955
- Learning Beyond Finite Memory in Recurrent Networks of Spiking Neurons
- scientific article; zbMATH DE number 50688
- scientific article; zbMATH DE number 49141
- LINEAR PROGRAMMING AND RECURRENT ASSOCIATIVE MEMORIES
MSC classification
- Learning and adaptive systems in artificial intelligence (68T05)
- Modes of computation (nondeterministic, parallel, interactive, probabilistic, etc.) (68Q10)
Cites Work
- Principal component analysis.
- Title not available
- Title not available
- Neural networks and physical systems with emergent collective computational abilities
- On the computational power of circuits of spiking neurons
- Spiking Neuron Models
- Optimization and applications of echo state networks with leaky-integrator neurons
- Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
- Edge of chaos and prediction of computational performance for neural circuit models
- Real-Time Computation at the Edge of Chaos in Recurrent Neural Networks
- Analysis and Design of Echo State Networks
Cited In (20)
- Evanescent coupling of nonlinear integrated cavities for all-optical reservoir computing
- On the characteristics and structures of dynamical systems suitable for reservoir computing
- Short-term memory capacity in networks via the restricted isometry property
- Recurrent Neural Networks with Small Weights Implement Definite Memory Machines
- An experimental unification of reservoir computing methods
- Modelling memory functions with recurrent neural networks consisting of input compensation units: I. Static situations
- Input-anticipating critical reservoirs show power law forgetting of unexpected input events
- A theory of sequence indexing and working memory in recurrent neural networks
- Deep time-delay reservoir computing: dynamics and memory capacity
- Title not available
- Stability analysis of reservoir computers dynamics via Lyapunov functions
- Memory and forecasting capacities of nonlinear recurrent networks
- Title not available
- Connectivity, dynamics, and memory in reservoir computing with binary and analog neurons
- Synthesis of recurrent neural networks for dynamical system simulation
- Revisiting the memory capacity in reservoir computing of directed acyclic network
- Universal discrete-time reservoir computers with stochastic inputs and linear readouts using non-homogeneous state-affine systems
- Dimension reduction in recurrent networks by canonicalization
- Memristor models for machine learning
- Generalised synchronisations, embeddings, and approximations for continuous time reservoir computers