A local echo state property through the largest Lyapunov exponent
From MaRDI portal
Publication: 2418123
DOI: 10.1016/j.neunet.2015.12.013
zbMath: 1418.62358
arXiv: 1402.1619
Wikidata: Q50718791 (Scholia: Q50718791)
MaRDI QID: Q2418123
Gilles Wainrib, Mathieu N. Galtier
Publication date: 3 June 2019
Published in: Neural Networks
Full work available at URL: https://arxiv.org/abs/1402.1619
Keywords: Lyapunov exponents; mean field theory; reservoir computing; echo state networks; time-series predictors
MSC classes:
- 62M10: Time series, auto-correlation, regression, etc. in statistics (GARCH)
- 37N25: Dynamical systems in biology
- 92B20: Neural networks for/in biological studies, artificial life and related topics
- 62M45: Neural nets and related approaches to inference from stochastic processes
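The work connects the echo state property of a reservoir to the sign of its largest Lyapunov exponent. As a rough illustration of that quantity (a minimal sketch, not the authors' method or code; the matrix scaling, network size, and step count below are illustrative assumptions), one can estimate the largest Lyapunov exponent of an autonomous tanh reservoir by tracking the log-growth of a tangent vector along a trajectory:

```python
import numpy as np

def largest_lyapunov(W, n_steps=2000, seed=0):
    """Estimate the largest Lyapunov exponent of the map
    x_{t+1} = tanh(W @ x_t) by averaging the log-growth of a
    tangent vector, renormalized at every step."""
    rng = np.random.default_rng(seed)
    n = W.shape[0]
    x = rng.standard_normal(n)
    v = rng.standard_normal(n)
    v /= np.linalg.norm(v)
    log_growth = 0.0
    for _ in range(n_steps):
        pre = W @ x
        x = np.tanh(pre)
        # Jacobian of tanh(W x) is diag(1 - tanh(pre)^2) @ W,
        # applied here to the tangent vector v.
        v = (1.0 - np.tanh(pre) ** 2) * (W @ v)
        norm = np.linalg.norm(v)
        log_growth += np.log(norm)
        v /= norm
    return log_growth / n_steps

# Random reservoir scaled so its spectral radius is roughly sigma
# (by the circular law, a Gaussian matrix divided by sqrt(n) has
# spectral radius close to 1 for large n).
rng = np.random.default_rng(1)
n, sigma = 200, 0.5
W = sigma * rng.standard_normal((n, n)) / np.sqrt(n)
lam = largest_lyapunov(W)
print(lam)  # expected negative: contracting dynamics
```

A negative estimate indicates contracting dynamics, the regime in which an echo state property is expected; pushing sigma above 1 typically drives the exponent positive.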
Related Items
- Unnamed Item
- A novel method based on the pseudo-orbits to calculate the largest Lyapunov exponent from chaotic equations
- Echo state queueing networks: a combination of reservoir computing and random neural networks
- On the use of interval extensions to estimate the largest Lyapunov exponent from chaotic data
- Memory and forecasting capacities of nonlinear recurrent networks
- Simple estimation method for the second-largest Lyapunov exponent of chaotic differential equations
- Fractional multiscale phase permutation entropy for quantifying the complexity of nonlinear time series
- Echo state networks are universal
- Reservoir Computing with Computational Matter
Cites Work
- Unnamed Item
- Reservoir computing approaches to recurrent neural network training
- Relative entropy minimizing noisy non-linear neural network to approximate stochastic processes
- Spectral analysis of large dimensional random matrices
- A limit theorem for the norm of random matrices
- On a certain Banach space in connection with minimax series
- On contraction analysis for non-linear systems
- Mean-field equations, bifurcation map and route to chaos in discrete time neural networks
- Large deviations and mean-field theory for asymmetric random recurrent neural networks
- Large deviations for Langevin spin glass dynamics
- Random matrices: universality of ESDs and the circular law
- Optimization and applications of echo state networks with leaky-integrator neurons
- Optimal system size for complex dynamics in random neural networks near criticality
- Recurrent Kernel Machines: Computing with Infinite Echo State Networks
- Oscillation and Chaos in Physiological Control Systems
- Real-Time Computation at the Edge of Chaos in Recurrent Neural Networks
- Echo State Property Linked to an Input: Exploring a Fundamental Characteristic of Recurrent Neural Networks
- A Biological Gradient Descent for Prediction Through a Combination of STDP and Homeostatic Plasticity
- Distribution of eigenvalues for some sets of random matrices