Reservoir Computing with Computational Matter
Publication: 3295753
DOI: 10.1007/978-3-319-65826-1_14 · zbMath: 1436.68115 · OpenAlex: W2883733869 · MaRDI QID: Q3295753
Zoran Konkoli, Susan Stepney, Stefano Nichele, Matthew Dale
Publication date: 10 July 2020
Published in: Natural Computing Series
Full work available at URL: https://doi.org/10.1007/978-3-319-65826-1_14
MSC classification:
- Neural networks for/in biological studies, artificial life and related topics (92B20)
- Biologically inspired models of computation (DNA computing, membrane computing, etc.) (68Q07)
Cites Work
- Reservoir computing approaches to recurrent neural network training
- Re-visiting the echo state property
- A practical method for calculating largest Lyapunov exponents from small data sets
- Edge of chaos and prediction of computational performance for neural circuit models
- A local echo state property through the largest Lyapunov exponent
- Evolving Carbon Nanotube Reservoir Computers
- When does a physical system compute?
- Heterotic Computing
- Connectivity, Dynamics, and Memory in Reservoir Computing with Binary and Analog Neurons
- Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
- Real-Time Computation at the Edge of Chaos in Recurrent Neural Networks