Recurrent Neural Networks With Auxiliary Memory Units

IEEE Trans Neural Netw Learn Syst. 2018 May;29(5):1652-1661. doi: 10.1109/TNNLS.2017.2677968. Epub 2017 Mar 21.

Abstract

Memory is one of the most important mechanisms in recurrent neural network (RNN) learning and plays a crucial role in practical applications such as sequence learning. With a good memory mechanism, long-term history can be fused with current information, thereby improving RNN learning, so developing a suitable memory mechanism has long been a goal in the field of RNNs. This paper proposes a novel memory mechanism for RNNs. The main contributions of this paper are: 1) an auxiliary memory unit (AMU) is proposed, resulting in a new RNN model (AMU-RNN) that explicitly separates memory from output and 2) an efficient learning algorithm is developed by employing the technique of error flow truncation. The proposed AMU-RNN model, together with the developed learning algorithm, can learn and maintain stable memory over long time ranges, overcoming both the learning-conflict problem and the vanishing-gradient problem. Unlike traditional methods, which mix memory and output in a single neuron within a recurrent unit, the AMU provides a dedicated auxiliary memory neuron for maintaining memory. By separating memory and output in a recurrent unit, learning conflicts can be eliminated easily. Moreover, by using error flow truncation, each auxiliary memory neuron ensures constant error flow during learning. The experiments demonstrate the good performance of the proposed AMU-RNNs and the developed learning algorithm: AMU-RNN learning is efficient, converges stably, and outperforms state-of-the-art RNN models on sequence generation and sequence classification tasks.
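
The abstract does not give the AMU update equations, but the core idea it describes is a recurrent unit in which a dedicated memory neuron is kept separate from the output neuron. The following is a minimal NumPy sketch of one possible reading of that idea, assuming an additive memory update (to suggest near-constant error flow) and an output neuron that reads the memory without overwriting it; the class name AMUCellSketch and the weight names (W_m, U_m, W_h, V_h) are illustrative and are not the paper's actual formulation.

```python
import numpy as np

class AMUCellSketch:
    """Hypothetical recurrent cell with an auxiliary memory neuron.

    Assumed (illustrative) update, not the paper's exact AMU equations:
      m_t = m_{t-1} + tanh(W_m x_t + U_m h_{t-1} + b_m)  # additive memory update
      h_t = tanh(W_h x_t + V_h m_t + b_h)                # separate output neuron
    """

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        self.W_m = rng.uniform(-s, s, (hidden_size, input_size))
        self.U_m = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.b_m = np.zeros(hidden_size)
        self.W_h = rng.uniform(-s, s, (hidden_size, input_size))
        self.V_h = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.b_h = np.zeros(hidden_size)

    def step(self, x, h_prev, m_prev):
        # Auxiliary memory neuron: additive update keeps a direct error path
        # through time, in the spirit of "constant error flow".
        m = m_prev + np.tanh(self.W_m @ x + self.U_m @ h_prev + self.b_m)
        # Output neuron: reads the memory but does not overwrite it,
        # so memory maintenance and output computation do not conflict.
        h = np.tanh(self.W_h @ x + self.V_h @ m + self.b_h)
        return h, m


# Usage: roll the cell over a short random sequence.
if __name__ == "__main__":
    cell = AMUCellSketch(input_size=4, hidden_size=8)
    h = np.zeros(8)
    m = np.zeros(8)
    for x in np.random.default_rng(1).normal(size=(10, 4)):
        h, m = cell.step(x, h, m)
    print("final output:", h.round(3))
```

In this sketch, error flow truncation would correspond to backpropagating gradients through the additive memory path while cutting them off along the other recurrent connections, but the precise truncation scheme used in the paper is not specified in the abstract.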

Publication types

  • Research Support, Non-U.S. Gov't