Embedding and approximation theorems for echo state networks
DOI: 10.1016/j.neunet.2020.05.013 · zbMath: 1468.68098 · arXiv: 1908.05202 · OpenAlex: W3025101483 · Wikidata: Q95647877 (Scholia: Q95647877) · MaRDI QID: Q1982435
James Hook, Allen Hart, Jonathan H. P. Dawes
Publication date: 8 September 2021
Published in: Neural Networks
Full work available at URL: https://arxiv.org/abs/1908.05202
dynamical system · Lorenz equations · recurrent neural networks · persistent homology · reservoir computing · delay embedding
Persistent homology and applications, topological data analysis (55N31) · Simulation of dynamical systems (37M05) · Networks and circuits as models of computation; circuit complexity (68Q06)
Related Items (15)
Uses Software
Cites Work
- The structure of Lorenz attractors
- Embedology
- Robust estimation of tangent maps and Liapunov spectra
- Exploring the topology of dynamical reconstructions
- Echo state networks are universal
- Data-driven reconstruction of nonlinear dynamics from sparse observation
- Topology from time series
- Automatic speech recognition using a predictive echo state network classifier
- Learning grammatical structure with Echo State Networks
- The self-intersections of a smooth \(n\)-manifold in \(2n\)-space
- javaPlex: A Research Software Package for Persistent (Co)Homology
- Deblurring Images
- Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
- Using machine learning to replicate chaotic attractors and calculate Lyapunov exponents from data
- Deterministic Nonperiodic Flow
- Barcodes: The persistent topology of data
- Invariant manifolds
- Approximation by superpositions of a sigmoidal function