Gated Orthogonal Recurrent Units: On Learning to Forget

From MaRDI portal
Publication: 5154142

DOI: 10.1162/NECO_A_01174
zbMATH Open: 1476.68232
arXiv: 1706.02761
OpenAlex: W2626194254
Wikidata: Q91588964 (Scholia: Q91588964)
MaRDI QID: Q5154142
FDO: Q5154142


Authors: Li Jing, Çağlar Gülçehre, John Peurifoy, Yi-Chen Shen, Max Tegmark, Marin Soljačić, Yoshua Bengio


Publication date: 1 October 2021

Published in: Neural Computation

Abstract: We present a novel recurrent neural network (RNN) model that combines the remembering ability of unitary RNNs with the ability of gated RNNs to forget redundant or irrelevant information in their memory. We achieve this by extending unitary RNNs with a gating mechanism. Our model outperforms LSTMs, GRUs, and unitary RNNs on several long-term dependency benchmark tasks. We show empirically both that orthogonal/unitary RNNs lack the ability to forget and that GORU can simultaneously remember long-term dependencies while forgetting irrelevant information, an ability that plays an important role in recurrent neural networks. We provide competitive results, along with an analysis of our model, on many natural sequential tasks, including bAbI question answering, TIMIT speech spectrum prediction, Penn TreeBank, and synthetic tasks that involve long-term dependencies such as the algorithmic, parenthesis, denoising, and copying tasks.
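
To make the architecture described in the abstract concrete, below is a minimal NumPy sketch of a gated orthogonal recurrent update: GRU-style reset and update gates wrapped around an orthogonal recurrent matrix with a modReLU nonlinearity. This is an illustrative reading of the abstract, not the authors' reference implementation: the class name GORUCell and all weight names are made up for this sketch, and the orthogonal matrix is simply fixed at initialization via a QR factorization, whereas the paper keeps it orthogonal throughout training via a structured parametrization.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def modrelu(x, bias):
    # modReLU: rescales each unit's magnitude by a learned bias, keeping its sign.
    mag = np.abs(x) + bias
    return np.where(mag >= 0, (mag / (np.abs(x) + 1e-8)) * x, 0.0)

def random_orthogonal(n, rng):
    # A fixed orthogonal matrix from the QR decomposition of a Gaussian matrix.
    # (Simplification: the paper parametrizes this matrix so it stays
    # orthogonal while being trained.)
    q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return q

class GORUCell:
    """Hypothetical GORU-style cell: GRU gates around an orthogonal recurrence."""
    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        self.U = random_orthogonal(hidden_size, rng)      # orthogonal recurrence
        self.W = rng.uniform(-s, s, (hidden_size, input_size))
        self.b = np.zeros(hidden_size)
        # Reset and update gates, as in a GRU.
        self.W_r = rng.uniform(-s, s, (hidden_size, input_size))
        self.U_r = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.b_r = np.zeros(hidden_size)
        self.W_z = rng.uniform(-s, s, (hidden_size, input_size))
        self.U_z = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.b_z = np.zeros(hidden_size)

    def step(self, x, h):
        r = sigmoid(self.W_r @ x + self.U_r @ h + self.b_r)  # reset gate
        z = sigmoid(self.W_z @ x + self.U_z @ h + self.b_z)  # update gate
        # Candidate state: the reset gate acts on the orthogonally rotated memory.
        h_tilde = modrelu(self.W @ x + r * (self.U @ h), self.b)
        # The update gate interpolates between old memory and the candidate,
        # which is what gives the model its ability to forget.
        return z * h + (1.0 - z) * h_tilde

# Usage: run the cell over a short random sequence.
cell = GORUCell(input_size=4, hidden_size=8)
h = np.zeros(8)
for x in np.random.default_rng(1).standard_normal((5, 4)):
    h = cell.step(x, h)
print(h)

Because the recurrence through U is norm-preserving, gradients flowing through the orthogonal path neither vanish nor explode, while the gates provide the forgetting mechanism that purely orthogonal/unitary RNNs lack.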


Full work available at URL: https://arxiv.org/abs/1706.02761




Cited In (5)
