Learning strange attractors with reservoir systems
DOI: 10.1088/1361-6544/ace492 · zbMath: 1525.37020 · arXiv: 2108.05024 · OpenAlex: W3187680631 · MaRDI QID: Q6169729
Lyudmila Grigoryeva, Juan-Pablo Ortega, Allen Hart
Publication date: 15 August 2023
Published in: Nonlinearity
Full work available at URL: https://arxiv.org/abs/2108.05024
Keywords: dynamical systems, attractor, generalized synchronization, reservoir computing, echo state property, Takens embedding, fading memory property
MSC classification: Attractors and repellers of smooth dynamical systems and their topological structure (37C70); Strange attractors, chaotic dynamics of systems with hyperbolic behavior (37D45); Dynamical systems involving smooth mappings and diffeomorphisms (37C05)
Cites Work
- Reservoir computing approaches to recurrent neural network training
- Manifolds, tensor analysis, and applications.
- Approximating nonlinear fading-memory operators using neural network models
- Embedology
- The synchronization of chaotic systems
- Embedding and approximation theorems for echo state networks
- Dimension reduction in recurrent networks by canonicalization
- Echo state networks trained by Tikhonov least squares are \(L^2(\mu)\) approximators of ergodic dynamical systems
- Memory and forecasting capacities of nonlinear recurrent networks
- Echo state networks are universal
- Differential Topology
- Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
- Using machine learning to replicate chaotic attractors and calculate Lyapunov exponents from data
- Deterministic Nonperiodic Flow
- Chaos in Dynamical Systems
- Fundamentals of synchronization in chaotic systems, concepts, and applications
- Invertible generalized synchronization: A putative mechanism for implicit learning in neural systems
- Stability and memory-loss go hand-in-hand: three results in dynamics and computation
- Echo State Property Linked to an Input: Exploring a Fundamental Characteristic of Recurrent Neural Networks
- Fading memory echo state networks are universal