Continuous online sequence learning with an unsupervised neural network model
From MaRDI portal
Publication:5380592
Abstract: The ability to recognize and predict temporal sequences of sensory inputs is vital for survival in natural environments. Based on many known properties of cortical neurons, hierarchical temporal memory (HTM) sequence memory has recently been proposed as a theoretical framework for sequence learning in the cortex. In this paper, we analyze the properties of HTM sequence memory and apply it to sequence learning and prediction problems with streaming data. We show that the model can continuously learn a large number of variable-order temporal sequences using an unsupervised Hebbian-like learning rule. The sparse temporal codes formed by the model can robustly handle branching temporal sequences by maintaining multiple predictions until sufficient disambiguating evidence arrives. We compare HTM sequence memory with other sequence learning algorithms on sequence prediction problems with both artificial and real-world data, including a statistical method (autoregressive integrated moving average, ARIMA), a feedforward neural network (online sequential extreme learning machine, ELM), and recurrent neural networks (long short-term memory, LSTM, and echo-state networks, ESN). The HTM model achieves accuracy comparable to these state-of-the-art algorithms. It also exhibits properties that are critical for sequence learning: continuous online learning, the ability to handle multiple predictions and branching sequences with high-order statistics, robustness to sensor noise and fault tolerance, and good performance without task-specific hyperparameter tuning. The HTM sequence memory therefore not only advances our understanding of how the brain may solve the sequence learning problem, but is also applicable to a wide range of real-world problems such as discrete and continuous sequence prediction, anomaly detection, and sequence classification.
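Two ideas from the abstract — variable-order context and maintaining multiple predictions for branching sequences until disambiguating evidence arrives — can be illustrated with a toy predictor. This is a minimal hedged sketch, not the HTM algorithm itself (no sparse distributed representations or Hebbian learning); the class name and structure are illustrative inventions:

```python
from collections import defaultdict

class ToySequencePredictor:
    """Toy variable-order predictor: records the elements that followed
    every context suffix up to max_order, then predicts from the longest
    context seen during learning, keeping multiple candidates when the
    context is ambiguous (a branching sequence)."""

    def __init__(self, max_order=3):
        self.max_order = max_order
        # context tuple -> set of elements observed to follow it
        self.table = defaultdict(set)

    def learn(self, sequence):
        for i in range(1, len(sequence)):
            for order in range(1, self.max_order + 1):
                if i - order < 0:
                    break
                context = tuple(sequence[i - order:i])
                self.table[context].add(sequence[i])

    def predict(self, context):
        # Match the longest suffix of the context that was learned;
        # a longer (higher-order) context disambiguates branches.
        for order in range(min(self.max_order, len(context)), 0, -1):
            suffix = tuple(context[-order:])
            if suffix in self.table:
                return self.table[suffix]
        return set()

p = ToySequencePredictor()
p.learn("ABCD")   # one branch: ...BC -> D
p.learn("XBCY")   # other branch: ...BC -> Y
print(sorted(p.predict("BC")))   # ['D', 'Y']  ambiguous: both kept
print(sorted(p.predict("ABC")))  # ['D']       high-order context resolves it
```

The point of the toy is only the behavior the paper highlights: with the short, shared context "BC" the predictor keeps both branch continuations alive, and only the longer context collapses the prediction to one.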
Recommendations
- Learning a sparse code for temporal sequences using STDP and sequence compression
- Complex sensory-motor sequence learning based on recurrent state representation and reinforcement learning
- Sequence memory based on an oscillatory neural network
- On-line learning on temporal manifolds
- Implicit sequence learning in recurrent neural networks
Cited in 2 documents