scientific article; zbMATH DE number 7306919
From MaRDI portal
Publication:5149035
Lyudmila Grigoryeva, Lukas Gonon, Juan-Pablo Ortega
Publication date: 5 February 2021
Full work available at URL: https://arxiv.org/abs/1910.13886
Title: not displayed (zbMATH Open Web Interface contents unavailable due to conflicting licenses)
Keywords: weak dependence; Rademacher complexity; empirical risk minimization; SAS; risk bounds; reservoir computing; PAC bounds; echo state networks; ESN; state affine systems; random reservoirs; RC
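The keywords above refer to the reservoir computing (RC) paradigm studied in the cited work: a fixed random recurrent "reservoir" (here an echo state network) is driven by the input, and only a linear readout is trained. The following is a minimal illustrative sketch of that setup; the reservoir size, input length, spectral-radius scaling, and delay task are arbitrary choices for illustration, not taken from the publication.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random reservoir: internal weights W scaled so the spectral radius is
# below 1, a common heuristic for the echo state property.
n_res, n_in = 50, 1
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()
W_in = rng.standard_normal((n_res, n_in))

def run_reservoir(inputs):
    """Drive the reservoir with a 1-D input sequence; return all states."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
        states.append(x.copy())
    return np.array(states)

# Only the linear readout is trained (by least squares) -- the defining
# feature of RC. Toy task: reproduce the one-step-delayed input.
u = rng.standard_normal(200)
y = np.roll(u, 1)           # y[t] = u[t-1]; index 0 wraps, so skip it below
X = run_reservoir(u)
w_out, *_ = np.linalg.lstsq(X[1:], y[1:], rcond=None)
pred = X[1:] @ w_out
```

Because the recurrent weights stay fixed and random, learning reduces to a linear regression on the reservoir states, which is what makes generalization (PAC/Rademacher-type) bounds for such systems tractable.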
Related Items
Fading memory echo state networks are universal ⋮ Transport in reservoir computing ⋮ Universal regular conditional distributions via probabilistic transformers ⋮ Approximation bounds for random neural networks and reservoir systems ⋮ The Mori-Zwanzig formulation of deep learning ⋮ The Khinchin inequality for multiple sums revisited ⋮ Designing universal causal deep learning models: The geometric (Hyper)transformer ⋮ Stability and memory-loss go hand-in-hand: three results in dynamics and computation ⋮ Dimension reduction in recurrent networks by canonicalization ⋮ Error bounds of the invariant statistics in machine learning of ergodic Itô diffusions
Cites Work
- 13 unnamed items (metadata unavailable)
- Model selection for weakly dependent time series forecasting
- Reservoir computing approaches to recurrent neural network training
- Polynomial response maps
- A new approach to equations with memory
- Some limit theorems for empirical processes (with discussion)
- Uniform convergence of Vapnik-Chervonenkis classes under ergodic sampling
- Vapnik-Chervonenkis dimension of recurrent neural networks
- Multilayer feedforward networks are universal approximators
- Memory in linear recurrent neural networks in continuous time
- Generalized autoregressive conditional heteroscedasticity
- Long memory processes and fractional integration in econometrics
- Re-visiting the echo state property
- Echo state networks are universal
- Sequential complexities and uniform martingale laws of large numbers
- Generalization bounds for non-stationary mixing processes
- Learning theory: stability is sufficient for generalization and necessary and sufficient for consistency of empirical risk minimization
- On the general theory of fading memory
- Weak dependence. With examples and applications.
- On the mathematical foundations of learning
- The Asymptotic Performance of Linear Echo State Neural Networks
- Learning Theory
- Support Vector Machines
- Fading memory and the problem of approximating nonlinear operators with Volterra series
- The stochastic equation Y_{n+1} = A_n Y_n + B_n with stationary coefficients
- Fractional differencing
- Autoregressive Conditional Heteroscedasticity with Estimates of the Variance of United Kingdom Inflation
- Realization theory of discrete-time nonlinear systems: Part I-The bounded case
- Scale-sensitive dimensions, uniform convergence, and learnability
- Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
- ESTIMATING THE APPROXIMATION ERROR IN LEARNING THEORY
- Using machine learning to replicate chaotic attractors and calculate Lyapunov exponents from data
- DOI: 10.1162/153244302760200704
- DOI: 10.1162/153244303321897690
- Neural Network Learning
- Uniform Central Limit Theorems
- Echo State Property Linked to an Input: Exploring a Fundamental Characteristic of Recurrent Neural Networks
- Nonparametric risk bounds for time-series forecasting
- Nonlinear Memory Capacity of Parallel Time-Delay Reservoir Computers in the Processing of Multidimensional Signals
- Understanding Machine Learning
- Convexity, Classification, and Risk Bounds
- Approximation by superpositions of a sigmoidal function