Echo state networks trained by Tikhonov least squares are \(L^2(\mu)\) approximators of ergodic dynamical systems
From MaRDI portal
Publication: 2077652
DOI: 10.1016/j.physd.2021.132882
zbMath: 1491.37075
arXiv: 2005.06967
Wikidata: Q114141965
Scholia: Q114141965
MaRDI QID: Q2077652
Allen Hart, James Hook, Jonathan H. P. Dawes
Publication date: 21 February 2022
Published in: Physica D
Full work available at URL: https://arxiv.org/abs/2005.06967
Keywords: Lorenz equations; time series analysis; recurrent neural networks; reservoir computing; liquid state machine; delay embedding
37M10: Time series analysis of dynamical systems
37M25: Computational methods for ergodic theory (approximation of invariant measures, computation of Lyapunov exponents, entropy, etc.)
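As a hedged illustration of the method named in the title — an echo state network whose linear readout is trained by Tikhonov-regularized (ridge) least squares — here is a minimal NumPy sketch. All sizes, the spectral radius, and the regularization strength are illustrative assumptions, not values taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the paper).
n_in, n_res, T = 1, 100, 500

# Random reservoir, rescaled to spectral radius 0.9 so that the
# echo state property plausibly holds.
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))

# Drive the reservoir with a scalar input signal u_t.
u = np.sin(0.1 * np.arange(T + 1)).reshape(-1, 1)
x = np.zeros(n_res)
states = np.empty((T, n_res))
for t in range(T):
    x = np.tanh(W @ x + W_in @ u[t])
    states[t] = x

# One-step-ahead prediction target.
y = u[1 : T + 1]

# Tikhonov least squares: W_out = (X^T X + beta I)^{-1} X^T y.
beta = 1e-6
A = states.T @ states + beta * np.eye(n_res)
W_out = np.linalg.solve(A, states.T @ y)

pred = states @ W_out
print(float(np.mean((pred - y) ** 2)))  # training MSE
```

Only the readout `W_out` is trained; the reservoir weights `W` and `W_in` stay fixed at their random initialization, which is what makes the training problem a single regularized linear least-squares solve.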
Related Items
Symmetry kills the square in a multifunctional reservoir computer, Reservoir Computing with an Inertial Form, Dimension reduction in recurrent networks by canonicalization
Cites Work
- The Lorenz attractor is mixing
- Structural stability of Lorenz attractors
- The structure of Lorenz attractors
- Mathematical problems for the next century
- A rigorous ODE solver and Smale's 14th problem
- Re-visiting the echo state property
- Data-driven reconstruction of nonlinear dynamics from sparse observation
- Automatic speech recognition using a predictive echo state network classifier
- Learning grammatical structure with Echo State Networks
- Deblurring Images
- Dynamical systems with generalized hyperbolic attractors: hyperbolic, ergodic and topological properties
- The Lorenz attractor exists
- The rate of convergence in ergodic theorems
- Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
- Using machine learning to replicate chaotic attractors and calculate Lyapunov exponents from data