Learning strange attractors with reservoir systems
MaRDI publication 6169729
Abstract: This paper shows that the celebrated Embedding Theorem of Takens is a particular case of a much more general statement, according to which randomly generated linear state-space representations of generic observations of an invertible dynamical system carry in their wake an embedding of the phase-space dynamics into the chosen Euclidean state space. This embedding coincides with a natural generalized synchronization that arises in this setup and yields a topological conjugacy between the state-space dynamics driven by the generic observations and the dynamical system itself. This result provides additional tools for the representation, learning, and analysis of chaotic attractors, and sheds further light on the reservoir computing phenomenon that appears in the context of recurrent neural networks.
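For intuition, the following is a minimal numerical sketch of the mechanism the abstract describes, not the paper's own construction: a randomly generated linear state-space system \(s_{t+1} = A s_t + C y_t\) is driven by scalar observations \(y_t = h(z_t)\) of the Lorenz system, and generically the resulting generalized synchronization \(z_t \mapsto s_t\) embeds the attractor in the Euclidean state space. The Lorenz parameters, the state dimension \(N = 7\), the spectral radius 0.9, and the polynomial readout used as a sanity check are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def lorenz_step(z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One RK4 step of the Lorenz system."""
    def f(z):
        x, y, w = z
        return np.array([sigma * (y - x),
                         x * (rho - w) - y,
                         x * y - beta * w])
    k1 = f(z)
    k2 = f(z + 0.5 * dt * k1)
    k3 = f(z + 0.5 * dt * k2)
    k4 = f(z + dt * k3)
    return z + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

h = lambda z: z[0]           # generic scalar observation of the phase point

# Randomly generated linear state map, rescaled to spectral radius 0.9 so
# that the driven system has the echo state property and hence a unique
# generalized synchronization s_t = G(z_t).
N = 7                        # illustrative state dimension, 2 * 3 + 1
A = rng.normal(size=(N, N))
A *= 0.9 / np.max(np.abs(np.linalg.eigvals(A)))
C = rng.normal(size=N)

z = np.array([1.0, 1.0, 1.0])
s = np.zeros(N)
states, points = [], []
for t in range(20000):
    z = lorenz_step(z)
    s = A @ s + C * h(z)     # linear reservoir driven by the observation
    if t >= 1000:            # discard the synchronization transient
        states.append(s.copy())
        points.append(z.copy())
states, points = np.array(states), np.array(points)

# If G is an embedding, z_t is a continuous (generally nonlinear) function
# of s_t.  As a crude check, compare a linear readout with a quadratic one:
# the quadratic features should recover the full Lorenz state markedly
# better than the purely linear ones.
def rel_err(feats):
    W, *_ = np.linalg.lstsq(feats, points, rcond=None)
    return np.linalg.norm(feats @ W - points) / np.linalg.norm(points)

ones = np.ones((len(states), 1))
quad = np.einsum("ti,tj->tij", states, states).reshape(len(states), -1)
print("linear readout error:   ", rel_err(np.hstack([ones, states])))
print("quadratic readout error:", rel_err(np.hstack([ones, states, quad])))
```

The gap between the two readouts reflects that the conjugacy readout is continuous but in general nonlinear: a linear functional of the observation history is blind to the Lorenz symmetry \((x, y) \mapsto (-x, -y)\), under which the third coordinate is invariant, so the quadratic readout typically recovers that coordinate far better.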
Recommendations
- Learning dynamics by reservoir computing (In Memory of Prof. Pavol Brunovský)
- Generalised synchronisations, embeddings, and approximations for continuous time reservoir computers
- Learning Theory for Dynamical Systems
- Embedding and approximation theorems for echo state networks
- Using machine learning to replicate chaotic attractors and calculate Lyapunov exponents from data
Cites work
- Scientific article, zbMATH DE number 3810550 (no title available)
- Scientific article, zbMATH DE number 52737 (no title available)
- Scientific article, zbMATH DE number 3484308 (no title available)
- Scientific article, zbMATH DE number 1182386 (no title available)
- Scientific article, zbMATH DE number 1745905 (no title available)
- Scientific article, zbMATH DE number 3241861 (no title available)
- Scientific article, zbMATH DE number 3273748 (no title available)
- Scientific article, zbMATH DE number 3331185 (no title available)
- Scientific article, zbMATH DE number 3184207 (no title available)
- Approximating nonlinear fading-memory operators using neural network models
- Chaos in Dynamical Systems
- Deterministic Nonperiodic Flow
- Differential Topology
- Dimension reduction in recurrent networks by canonicalization
- Echo state networks are universal
- Echo state networks trained by Tikhonov least squares are \(L^2(\mu)\) approximators of ergodic dynamical systems
- Echo state property linked to an input: exploring a fundamental characteristic of recurrent neural networks
- Embedding and approximation theorems for echo state networks
- Embedology
- Fading memory echo state networks are universal
- Fundamentals of synchronization in chaotic systems, concepts, and applications
- Invertible generalized synchronization: a putative mechanism for implicit learning in neural systems
- Manifolds, tensor analysis, and applications
- Memory and forecasting capacities of nonlinear recurrent networks
- Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
- Reservoir computing approaches to recurrent neural network training
- Stability and memory-loss go hand-in-hand: three results in dynamics and computation
- The synchronization of chaotic systems
- Using machine learning to replicate chaotic attractors and calculate Lyapunov exponents from data
Cited in (5)
- Data-driven cold starting of good reservoirs
- Learning dynamics by reservoir computing (In Memory of Prof. Pavol Brunovský)
- Learning Theory for Dynamical Systems
- Generalised synchronisations, embeddings, and approximations for continuous time reservoir computers
- A robust method for distinguishing between learned and spurious attractors