Recurrent Kernel Machines: Computing with Infinite Echo State Networks
DOI: 10.1162/NECO_a_00200 · zbMath: 1238.68125 · Wikidata: Q43784190 · Scholia: Q43784190 · MaRDI QID: Q2885086
Michiel Hermans, Benjamin Schrauwen
Publication date: 21 May 2012
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco_a_00200
68T05: Learning and adaptive systems in artificial intelligence
92B20: Neural networks for/in biological studies, artificial life and related topics
Related Items
- On Kernel Method–Based Connectionist Models and Supervised Deep Learning Without Backpropagation
- Direct adaptive control for nonlinear systems using a TSK fuzzy echo state network based on fractional-order learning algorithm
- Neural kernels for recursive support vector regression as a model for episodic memory
- A local echo state property through the largest Lyapunov exponent
Cites Work
- Reservoir computing approaches to recurrent neural network training
- The spectral radius of large random matrices
- Optimization and applications of echo state networks with leaky-integrator neurons
- An experimental unification of reservoir computing methods
- Automatic speech recognition using a predictive echo state network classifier
- Large-Margin Classification in Infinite Neural Networks
- Training Recurrent Networks by Evolino
- Fading memory and the problem of approximating nonlinear operators with Volterra series
- Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
- Training a Support Vector Machine in the Primal