Learning strange attractors with reservoir systems
DOI: 10.1088/1361-6544/ACE492 · zbMATH Open: 1525.37020 · arXiv: 2108.05024 · OpenAlex: W3187680631 · MaRDI QID: Q6169729 · FDO: Q6169729
Authors: Lyudmila Grigoryeva, Allen Hart, Juan-Pablo Ortega
Publication date: 15 August 2023
Published in: Nonlinearity
Full work available at URL: https://arxiv.org/abs/2108.05024
Recommendations
- Learning dynamics by reservoir computing (In Memory of Prof. Pavol Brunovský)
- Generalised synchronisations, embeddings, and approximations for continuous time reservoir computers
- Learning Theory for Dynamical Systems
- Embedding and approximation theorems for echo state networks
- Using machine learning to replicate chaotic attractors and calculate Lyapunov exponents from data
Keywords: dynamical systems; attractor; generalized synchronization; reservoir computing; echo state property; Takens embedding; fading memory property
MSC classification: Strange attractors, chaotic dynamics of systems with hyperbolic behavior (37D45); Dynamical systems involving smooth mappings and diffeomorphisms (37C05); Attractors and repellers of smooth dynamical systems and their topological structure (37C70)
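The keywords above name the standard reservoir-computing pipeline the paper studies: a randomly generated recurrent reservoir with the echo state property, driven by observations of a dynamical system, and a linear readout trained by Tikhonov (ridge) least squares, run in closed loop to replicate the attractor. The following is a minimal sketch of that generic pipeline, not the authors' construction; the Lorenz system as training data and all hyperparameters (reservoir size, spectral radius 0.9, regularization strength) are illustrative assumptions.

# Minimal echo state network sketch: tanh reservoir, Tikhonov-trained
# linear readout, closed-loop replication of a chaotic attractor.
# All hyperparameters below are illustrative, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Generate a Lorenz trajectory (deterministic nonperiodic flow).
def lorenz_step(x, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    dx = np.array([sigma * (x[1] - x[0]),
                   x[0] * (rho - x[2]) - x[1],
                   x[0] * x[1] - beta * x[2]])
    return x + dt * dx  # forward Euler integration, adequate for a sketch

T = 5000
traj = np.empty((T, 3))
traj[0] = [1.0, 1.0, 1.0]
for t in range(T - 1):
    traj[t + 1] = lorenz_step(traj[t])

# Random reservoir, rescaled below unit spectral radius: a common
# heuristic for enforcing the echo state property.
n = 300
W = rng.normal(size=(n, n))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, size=(n, 3))

def run_reservoir(inputs):
    # Drive the reservoir with the observed trajectory, recording states.
    r = np.zeros(n)
    states = np.empty((len(inputs), n))
    for t, u in enumerate(inputs):
        r = np.tanh(W @ r + W_in @ u)
        states[t] = r
    return states

washout = 200  # discard transient states before fitting the readout
states = run_reservoir(traj[:-1])
X, Y = states[washout:], traj[1 + washout:]

# Tikhonov (ridge) least-squares readout mapping states to next observations.
lam = 1e-6
W_out = np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ Y).T

# Closed-loop autonomous run: feed predictions back as input, so the
# trained system generates its own approximation of the attractor.
r, x = states[-1], traj[-1]
for _ in range(1000):
    r = np.tanh(W @ r + W_in @ x)
    x = W_out @ r
print("final autonomous state:", x)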
Cites Work
- Title not available
- Deterministic Nonperiodic Flow
- Manifolds, tensor analysis, and applications
- Embedology
- The synchronization of chaotic systems
- Title not available
- Differential Topology
- Title not available
- Title not available
- Chaos in Dynamical Systems
- Title not available
- Fundamentals of synchronization in chaotic systems, concepts, and applications
- Title not available
- Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
- Reservoir computing approaches to recurrent neural network training
- Title not available
- Title not available
- Title not available
- Using machine learning to replicate chaotic attractors and calculate Lyapunov exponents from data
- Approximating nonlinear fading-memory operators using neural network models
- Memory and forecasting capacities of nonlinear recurrent networks
- Embedding and approximation theorems for echo state networks
- Echo state networks are universal
- Dimension reduction in recurrent networks by canonicalization
- Echo state networks trained by Tikhonov least squares are \(L^2(\mu)\) approximators of ergodic dynamical systems
- Fading memory echo state networks are universal
- Echo state property linked to an input: exploring a fundamental characteristic of recurrent neural networks
- Invertible generalized synchronization: a putative mechanism for implicit learning in neural systems
- Stability and memory-loss go hand-in-hand: three results in dynamics and computation
Cited In (5)
- A robust method for distinguishing between learned and spurious attractors
- Learning Theory for Dynamical Systems
- Data-driven cold starting of good reservoirs
- Learning dynamics by reservoir computing (In Memory of Prof. Pavol Brunovský)
- Generalised synchronisations, embeddings, and approximations for continuous time reservoir computers