Input-anticipating critical reservoirs show power law forgetting of unexpected input events
Publication: Q5380243
DOI: 10.1162/NECO_A_00730
zbMATH Open: 1475.92017
arXiv: 1404.6334
OpenAlex: W2028324893
Wikidata: Q50596067 (Scholia Q50596067)
MaRDI QID: Q5380243
FDO: Q5380243
Publication date: 4 June 2019
Published in: Neural Computation
Abstract: Reservoir computing usually shows exponential memory decay. This paper investigates under which circumstances echo state networks can show power law forgetting, meaning that traces of earlier events can be found in the reservoir over very long time spans. Such a setting requires critical connectivity exactly at the limit of what is permissible according to the echo state condition. However, for general matrices this limit cannot be determined exactly from theory. In addition, the behavior of the network is strongly influenced by the input flow. Results are presented for certain types of restricted recurrent connectivity combined with anticipation learning with respect to the input, for which power law forgetting can indeed be achieved.
Full work available at URL: https://arxiv.org/abs/1404.6334
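The following is a minimal sketch, not the authors' construction (which relies on restricted recurrent connectivity and anticipation learning): a standard echo state reservoir scaled to the edge of the echo state condition, probed for how fast the trace of a single "unexpected" input event decays. All parameter values (reservoir size, input scaling, perturbation time) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200                                      # reservoir size (assumed)
W = rng.normal(size=(N, N))
W *= 1.0 / max(abs(np.linalg.eigvals(W)))    # spectral radius = 1: critical regime
w_in = rng.normal(size=N)

def run(u):
    """Drive the tanh reservoir with input sequence u and return all states."""
    x = np.zeros(N)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + w_in * u_t)
        states.append(x.copy())
    return np.array(states)

T = 5000
u = rng.normal(scale=0.1, size=T)            # background input flow
u_pert = u.copy()
u_pert[100] += 1.0                           # one unexpected event at t = 100

# Distance between perturbed and unperturbed trajectories = trace of the event.
d = np.linalg.norm(run(u) - run(u_pert), axis=1)

# Exponential forgetting is linear on a semi-log plot of d after t = 100;
# power law forgetting is linear on a log-log plot. Rough log-log slope:
lags = np.arange(1, 501)
print(np.polyfit(np.log(lags), np.log(d[101:601] + 1e-15), 1))
```

For a generic random matrix, such an experiment typically still shows exponential decay; the paper's point is that reaching the power law regime requires the critical limit together with structured connectivity and input anticipation.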
Recommendations
- An experimental unification of reservoir computing methods
- Memory in linear recurrent neural networks in continuous time
- scientific article; zbMATH DE number 7164779
- Re-visiting the echo state property
- Echo state property linked to an input: exploring a fundamental characteristic of recurrent neural networks
MSC classification: Neural networks for/in biological studies, artificial life and related topics (92B20); Memory and learning in psychology (91E40)