scientific article; zbMATH DE number 6982315
From MaRDI portal
Publication: Q4558166
zbMath: 1439.68010 · arXiv: 1712.00754 · MaRDI QID: Q4558166
Lyudmila Grigoryeva, Juan-Pablo Ortega
Publication date: 21 November 2018
Full work available at URL: https://arxiv.org/abs/1712.00754
Title: Universal discrete-time reservoir computers with stochastic inputs and linear readouts using non-homogeneous state-affine systems
Keywords: universality; machine learning; SAS; reservoir computing; echo state networks; ESN; state-affine systems; linear training; fading memory property; echo state affine systems; stochastic signal treatment
MSC classification: Learning and adaptive systems in artificial intelligence (68T05); Biologically inspired models of computation (DNA computing, membrane computing, etc.) (68Q07)
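The keywords above describe the record's setting: reservoir computers whose internal state evolves under a fixed recurrent map and whose only trained component is a linear readout. As a rough illustration of that setting (a sketch, not code from the paper; the reservoir sizes, scalings, and the delay-recall task are all illustrative assumptions), a minimal echo state network with a ridge-regression readout can be written as:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and scalings; these are assumptions, not values from the paper.
N, T = 100, 1000                      # reservoir dimension, number of time steps
u = rng.uniform(-1.0, 1.0, T)         # scalar input signal
delay = 3
y_target = np.roll(u, delay)          # task: recall the input from 3 steps earlier

# Fixed random reservoir; rescaling to spectral radius < 1 is a common
# heuristic related to the echo state property.
W = rng.normal(0.0, 1.0, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.uniform(-0.5, 0.5, N)      # input weights

# Run the reservoir: x_{t} = tanh(W x_{t-1} + w_in u_t)
x = np.zeros(N)
X = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])
    X[t] = x

# Only the linear readout is trained (ridge regression), which is what
# "linear training" refers to in the keywords.
warmup, lam = 50, 1e-6
A, b = X[warmup:], y_target[warmup:]
w_out = np.linalg.solve(A.T @ A + lam * np.eye(N), A.T @ b)

mse = np.mean((A @ w_out - b) ** 2)
```

The state-affine systems (SAS) studied in the paper replace the tanh reservoir with state maps that are affine in the state, with input-dependent coefficients; the linear-readout training step is the same.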
Related Items
- Echo state networks are universal
- Fading memory echo state networks are universal
- Transport in reservoir computing
- Approximation bounds for random neural networks and reservoir systems
- Designing universal causal deep learning models: The geometric (Hyper)transformer
- Dimension reduction in recurrent networks by canonicalization
- Error bounds of the invariant statistics in machine learning of ergodic Itô diffusions
- Learning nonlinear input-output maps with dissipative quantum systems
- Memory and forecasting capacities of nonlinear recurrent networks
Cites Work
- Reservoir computing approaches to recurrent neural network training
- Learning with generalization capability by kernel methods of bounded complexity
- Polynomial response maps
- Relative entropy minimizing noisy non-linear neural network to approximate stochastic processes
- Time series: Theory and methods
- Manifolds, tensor analysis, and applications.
- A simple lemma on greedy approximation in Hilbert space and convergence rates for projection pursuit regression and neural network training
- Approximating nonlinear fading-memory operators using neural network models
- Fading memory and stability.
- Multilayer feedforward networks are universal approximators
- Memory in linear recurrent neural networks in continuous time
- Generalized autoregressive conditional heteroscedasticity
- Re-visiting the echo state property
- Echo state networks are universal
- Optimization and applications of echo state networks with leaky-integrator neurons
- An experimental unification of reservoir computing methods
- A local echo state property through the largest Lyapunov exponent
- Martingales in Banach Spaces
- The Asymptotic Performance of Linear Echo State Neural Networks
- Fading memory and the problem of approximating nonlinear operators with Volterra series
- Closedness of sum spaces and the generalized Schrödinger problem
- Autoregressive Conditional Heteroscedasticity with Estimates of the Variance of United Kingdom Inflation
- Realization theory of discrete-time nonlinear systems: Part I: The bounded case
- Universal approximation bounds for superpositions of a sigmoidal function
- Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
- Using machine learning to replicate chaotic attractors and calculate Lyapunov exponents from data
- Echo State Property Linked to an Input: Exploring a Fundamental Characteristic of Recurrent Neural Networks
- Nonlinear Memory Capacity of Parallel Time-Delay Reservoir Computers in the Processing of Multidimensional Signals
- A Representation Theorem for Continuous Functions of Several Variables
- Approximation by superpositions of a sigmoidal function