Memory and forecasting capacities of nonlinear recurrent networks
DOI: 10.1016/J.PHYSD.2020.132721 · zbMATH Open: 1484.68185 · arXiv: 2004.11234 · OpenAlex: W3018848489 · MaRDI QID: Q2116285 · FDO: Q2116285
Authors: Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega
Publication date: 16 March 2022
Published in: Physica D
Full work available at URL: https://arxiv.org/abs/2004.11234
Recommendations
- Distributed sequence memory of multidimensional inputs in recurrent networks
- scientific article; zbMATH DE number 1843097
- scientific article; zbMATH DE number 2033224
- Programmed interactions in higher-order neural networks: Maximal capacity
- Elements for a general memory structure: properties of recurrent neural networks used to form situation models
Keywords: machine learning; recurrent neural network; reservoir computing; memory capacity; echo state network (ESN); forecasting capacity
MSC classification:
- Time series, auto-correlation, regression, etc. in statistics (GARCH) (62M10)
- Learning and adaptive systems in artificial intelligence (68T05)
- Neural nets and related approaches to inference from stochastic processes (62M45)
- Networks and circuits as models of computation; circuit complexity (68Q06)
Cites Work
- Toeplitz and circulant matrices: a review
- On some properties of positive definite Toeplitz matrices and their possible applications
- Approximating nonlinear fading-memory operators using neural network models
- A local echo state property through the largest Lyapunov exponent
- Memory in linear recurrent neural networks in continuous time
- The identification of nonlinear discrete-time fading-memory systems using neural network models
- Re-visiting the echo state property
- Echo state networks are universal
- Stochastic nonlinear time series forecasting using time-delay reservoir computers: performance and universality
- The asymptotic performance of linear echo state neural networks
- Distributed Sequence Memory of Multidimensional Inputs in Recurrent Networks
- Nonlinear Memory Capacity of Parallel Time-Delay Reservoir Computers in the Processing of Multidimensional Signals
- Echo State Property Linked to an Input: Exploring a Fundamental Characteristic of Recurrent Neural Networks
- Short-Term Memory Capacity in Networks via the Restricted Isometry Property
Cited In (5)
- Learn to synchronize, synchronize to learn
- Learning strange attractors with reservoir systems
- Modelling memory functions with recurrent neural networks consisting of input compensation units: I. Static situations
- Memory in linear recurrent neural networks in continuous time
- Dimension reduction in recurrent networks by canonicalization