Gated Orthogonal Recurrent Units: On Learning to Forget
DOI: 10.1162/NECO_A_01174 · zbMATH Open: 1476.68232 · arXiv: 1706.02761 · OpenAlex: W2626194254 · Wikidata: Q91588964 (Scholia: Q91588964) · MaRDI QID: Q5154142 · FDO: Q5154142
Authors: Li Jing, Çağlar Gülçehre, John Peurifoy, Yi-Chen Shen, Max Tegmark, Marin Soljačić, Yoshua Bengio
Publication date: 1 October 2021
Published in: Neural Computation
Full work available at URL: https://arxiv.org/abs/1706.02761
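For context, the cell introduced in the linked arXiv preprint combines a GRU-style gate with an orthogonal (norm-preserving) recurrent transition. A minimal sketch, assuming a single update gate that interpolates between the carried state and a candidate built from an orthogonal matrix; all weight names and initializations here are hypothetical, not the paper's exact parameterization:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_orthogonal(n):
    # QR decomposition of a random Gaussian matrix yields an orthogonal Q
    q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return q

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

hidden, inputs = 4, 3
U = random_orthogonal(hidden)                       # orthogonal recurrent matrix
Wx = 0.1 * rng.standard_normal((hidden, inputs))    # input projection (hypothetical)
Wz = 0.1 * rng.standard_normal((hidden, inputs))    # gate input weights (hypothetical)
Uz = 0.1 * rng.standard_normal((hidden, hidden))    # gate recurrent weights (hypothetical)

def gated_orthogonal_step(h, x):
    """One recurrent step: a gate interpolates between the previous state and a
    candidate computed with the orthogonal transition (sketch, not the exact cell)."""
    z = sigmoid(Wz @ x + Uz @ h)          # update/forget gate in (0, 1)
    candidate = np.tanh(U @ h + Wx @ x)   # orthogonal transition keeps gradients stable
    return z * h + (1.0 - z) * candidate

h = np.zeros(hidden)
for x in rng.standard_normal((5, inputs)):
    h = gated_orthogonal_step(h, x)
```

The orthogonality of `U` (its columns are orthonormal, so multiplication preserves vector norms) is what mitigates vanishing or exploding gradients, while the gate `z` supplies the forgetting mechanism the title refers to.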
Recommendations
- Sgornn: Combining scalar gates and orthogonal constraints in recurrent networks
- Gated Graph Recurrent Neural Networks
- Overcoming catastrophic forgetting in neural networks
- On duality of regularized exponential and linear forgetting
- A family of universal recurrent networks
- Scientific article (title not available; zbMATH DE number 2042276)
- A recalling-enhanced recurrent neural network: conjugate gradient learning algorithm and its convergence analysis
- Toward training recurrent neural networks for lifelong learning
Classification (MSC):
- 68T05 Learning and adaptive systems in artificial intelligence
- 92B20 Neural networks for/in biological studies, artificial life and related topics
Cited In (5)
- A Review of Recurrent Neural Networks: LSTM Cells and Network Architectures
- Stock market predictions using FastRNN-based model
- Divide and conquer: learning chaotic dynamical systems with multistep penalty neural ordinary differential equations
- Sgornn: Combining scalar gates and orthogonal constraints in recurrent networks
- Title not available