A constrained regularization approach for input-driven recurrent neural networks
From MaRDI portal
Recommendations
- scientific article; zbMATH DE number 1928643
- Improving Generalization Capabilities of Dynamic Neural Networks
- Training recurrent neural networks by sequential least squares and the alternating direction method of multipliers
- Using Fourier-neural recurrent networks to fit sequential input/output data
- Distributed sequence memory of multidimensional inputs in recurrent networks
Cites work
- scientific article; zbMATH DE number 3914081
- An experimental unification of reservoir computing methods
- Analysis and Design of Echo State Networks
- Asymptotic properties of incrementally stable systems
- Local stability of recurrent networks with time-varying weights and inputs
- New conditions for global stability of neural networks with application to linear and quadratic programming problems
- On global asymptotic stability of a class of nonlinear systems arising in neural network theory
- Online reservoir adaptation by intrinsic plasticity for backpropagation-decorrelation and echo state learning
- Optimization and applications of echo state networks with leaky-integrator neurons
- Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations