Echo state networks trained by Tikhonov least squares are \(L^2(\mu)\) approximators of ergodic dynamical systems
DOI: 10.1016/j.physd.2021.132882; zbMath: 1491.37075; arXiv: 2005.06967; OpenAlex: W3024515584; Wikidata: Q114141965; Scholia: Q114141965; MaRDI QID: Q2077652
Allen Hart, James Hook, Jonathan H. P. Dawes
Publication date: 21 February 2022
Published in: Physica D
Full work available at URL: https://arxiv.org/abs/2005.06967
Lorenz equations; time series analysis; recurrent neural networks; reservoir computing; liquid state machine; delay embedding
Time series analysis of dynamical systems (37M10); Computational methods for ergodic theory (approximation of invariant measures, computation of Lyapunov exponents, entropy, etc.) (37M25)
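The paper's title refers to echo state networks whose linear readout is fitted by Tikhonov-regularised least squares (ridge regression) and used to approximate an ergodic dynamical system such as the Lorenz equations. The following is a minimal, hypothetical sketch of that setup in Python/NumPy, not the authors' code: the reservoir size, spectral radius, washout length, integration step, and regularisation parameter `lam` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def lorenz_series(n, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz equations with explicit Euler; return the x-coordinate."""
    v = np.array([1.0, 1.0, 1.0])
    xs = np.empty(n)
    for i in range(n):
        dv = np.array([sigma * (v[1] - v[0]),
                       v[0] * (rho - v[2]) - v[1],
                       v[0] * v[1] - beta * v[2]])
        v = v + dt * dv
        xs[i] = v[0]
    return xs

# Scalar input u_t and one-step-ahead target y_t = u_{t+1}.
u = lorenz_series(5000)
y = u[1:]
u = u[:-1]

# Random reservoir x_{t+1} = tanh(A x_t + c u_t + b), rescaled so the
# spectral radius is below 1 (so the echo state property plausibly holds).
N = 300
A = rng.normal(size=(N, N))
A *= 0.9 / np.max(np.abs(np.linalg.eigvals(A)))
c = rng.normal(size=N)
b = 0.1 * rng.normal(size=N)

X = np.zeros((len(u), N))
x = np.zeros(N)
for t, ut in enumerate(u):
    x = np.tanh(A @ x + c * ut + b)
    X[t] = x

# Tikhonov (ridge) least-squares readout: w = (X^T X + lam I)^{-1} X^T y,
# computed on the states after a washout period that discards the transient.
lam = 1e-6
washout = 100
Xw, yw = X[washout:], y[washout:]
w = np.linalg.solve(Xw.T @ Xw + lam * np.eye(N), Xw.T @ yw)

pred = X @ w
print("train RMSE:", np.sqrt(np.mean((pred[washout:] - y[washout:]) ** 2)))
```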
Cites Work
- The Lorenz attractor is mixing
- Structural stability of Lorenz attractors
- The structure of Lorenz attractors
- Mathematical problems for the next century
- A rigorous ODE solver and Smale's 14th problem
- Re-visiting the echo state property
- Data-driven reconstruction of nonlinear dynamics from sparse observation
- Automatic speech recognition using a predictive echo state network classifier
- Learning grammatical structure with Echo State Networks
- Deblurring Images
- Dynamical systems with generalized hyperbolic attractors: hyperbolic, ergodic and topological properties
- The Lorenz attractor exists
- The rate of convergence in ergodic theorems
- Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
- Using machine learning to replicate chaotic attractors and calculate Lyapunov exponents from data
- Deterministic Nonperiodic Flow