Input-anticipating critical reservoirs show power law forgetting of unexpected input events
From MaRDI portal
Publication:5380243
Abstract: Reservoir computing usually shows an exponential memory decay. This paper investigates under which circumstances echo state networks can instead show power-law forgetting, meaning that traces of earlier events remain in the reservoir for very long time spans. Such a setting requires critical connectivity exactly at the limit of what is permissible according to the echo state condition. For general matrices, however, this limit cannot be determined exactly from theory, and the behavior of the network is strongly influenced by the input flow. Results are presented using certain types of restricted recurrent connectivity together with anticipation learning with respect to the input, for which power-law forgetting can indeed be achieved.
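The setting the abstract describes can be illustrated with a minimal sketch (not the paper's actual construction): a random reservoir rescaled so its spectral radius sits exactly at 1.0, the critical limit of the echo state condition, driven by a single unexpected input pulse whose trace is then tracked over time. Reservoir size, seed, and the use of a tanh nonlinearity are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100  # reservoir size (assumption, not from the paper)

# Random recurrent weights, rescaled to spectral radius 1.0 --
# the critical limit of the echo state condition.
W = rng.standard_normal((n, n))
W *= 1.0 / max(abs(np.linalg.eigvals(W)))
W_in = rng.standard_normal(n)

def step(x, u):
    """One reservoir update with a tanh nonlinearity."""
    return np.tanh(W @ x + W_in * u)

# Compare a run with one input event at t = 0 against an unperturbed run;
# the norm of the state difference is the trace of that event.
x_a = step(np.zeros(n), 1.0)  # perturbed: one unexpected input pulse
x_b = step(np.zeros(n), 0.0)  # unperturbed
trace = []
for t in range(200):
    x_a = step(x_a, 0.0)
    x_b = step(x_b, 0.0)
    trace.append(np.linalg.norm(x_a - x_b))
# For a generic random W this trace decays roughly exponentially even at
# criticality; the paper's contribution is that restricted recurrent
# connectivity plus input anticipation can slow the decay to a power law.
```

Plotting `trace` on log-log versus log-linear axes distinguishes the two decay regimes: a power law appears as a straight line only on log-log axes.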
Recommendations
- An experimental unification of reservoir computing methods
- Memory in linear recurrent neural networks in continuous time
- Re-visiting the echo state property
- Echo state property linked to an input: exploring a fundamental characteristic of recurrent neural networks
Cites work
- scientific article; zbMATH DE number 194139
- scientific article; zbMATH DE number 1130404
- Connectivity, dynamics, and memory in reservoir computing with binary and analog neurons
- Criticality in neural systems
- Re-visiting the echo state property