Approximation bounds for random neural networks and reservoir systems
DOI: 10.1214/22-aap1806 · arXiv: 2002.05933 · OpenAlex: W3006049408 · MaRDI QID: Q6103961
Juan-Pablo Ortega, Lyudmila Grigoryeva, Lukas Gonon
Publication date: 5 June 2023
Published in: The Annals of Applied Probability
Full work available at URL: https://arxiv.org/abs/2002.05933
Keywords: neural networks; approximation error; reservoir computing; echo state networks; random function approximation
MSC classifications:
- Artificial neural networks and deep learning (68T07)
- Computational methods for problems pertaining to probability theory (60-08)
- Random operators and equations (aspects of stochastic analysis) (60H25)
- Stochastic learning and adaptive control (93E35)
- Approximation by other special function classes (41A30)
Related Items (3)
Cites Work
- Probability in Banach spaces. Isoperimetry and processes
- Approximating nonlinear fading-memory operators using neural network models
- On the near optimality of the stochastic approximation of smooth functions by neural networks
- Embedding and approximation theorems for echo state networks
- Echo state networks are universal
- Multidimensional Stochastic Processes as Rough Paths
- Universal approximation bounds for superpositions of a sigmoidal function
- Approximation by Combinations of ReLU and Squared ReLU Ridge Functions With $\ell^1$ and $\ell^0$ Controls
- Using machine learning to replicate chaotic attractors and calculate Lyapunov exponents from data
- The identification of nonlinear discrete-time fading-memory systems using neural network models
- Fading memory echo state networks are universal