Abstract
An increasing number of neural memory networks have been developed, creating the need for a systematic way to analyze and compare their underlying memory structures. In this paper, we first create a framework for memory organization and then compare four popular dynamic models: the vanilla recurrent neural network, long short-term memory, the neural stack, and neural RAM. This analysis helps to open the black box of dynamic neural networks from the memory-usage perspective. Accordingly, a taxonomy for these networks and their variants is proposed and proved using a unifying architecture. With this taxonomy, both network architectures and learning tasks are classified into four classes, and a one-to-one mapping is built between them to help practitioners select the appropriate architecture. To exemplify each task type, four synthetic tasks with different memory requirements are selected. Moreover, we use several signal processing applications and two natural language processing applications to evaluate the methodology in a realistic setting.
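To make the memory distinction among these architectures concrete, the following minimal NumPy sketch (an illustration, not code from the paper) contrasts their per-step update rules: a vanilla RNN overwrites its single hidden vector, an LSTM edits a gated cell state selectively, and a stack-style memory grows unboundedly but is accessed only at the top. All weight shapes and names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_h = 3, 4  # hypothetical input / hidden sizes

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Vanilla RNN: one hidden vector is the entire memory,
# fully overwritten at every time step.
Wx = rng.normal(size=(d_h, d_in))
Wh = rng.normal(size=(d_h, d_h))
def rnn_step(h, x):
    return np.tanh(Wx @ x + Wh @ h)

# LSTM: forget/input/output gates let the network keep or
# erase parts of the cell state instead of overwriting it.
W = {g: (rng.normal(size=(d_h, d_in)), rng.normal(size=(d_h, d_h)))
     for g in "fioc"}
def lstm_step(h, c, x):
    f = sigmoid(W["f"][0] @ x + W["f"][1] @ h)  # forget gate
    i = sigmoid(W["i"][0] @ x + W["i"][1] @ h)  # input gate
    o = sigmoid(W["o"][0] @ x + W["o"][1] @ h)  # output gate
    c_new = f * c + i * np.tanh(W["c"][0] @ x + W["c"][1] @ h)
    return o * np.tanh(c_new), c_new

# Neural-stack-style memory: unbounded capacity, but access is
# restricted to push/pop at the top (hard actions shown here;
# a neural stack makes them differentiable).
def stack_step(stack, action, v=None):
    return stack + [v] if action == "push" else stack[:-1]

# Run each memory mechanism over the same short input sequence.
h_rnn = np.zeros(d_h)
h, c = np.zeros(d_h), np.zeros(d_h)
stack = []
for t in range(5):
    x = rng.normal(size=d_in)
    h_rnn = rnn_step(h_rnn, x)
    h, c = lstm_step(h, c, x)
    stack = stack_step(stack, "push", x)
```

The fixed-size states (`h_rnn`, `c`) versus the growing `stack` illustrate the memory-organization axis along which the paper's taxonomy separates these models.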
| Original language | English |
|---|---|
| Article number | 8807370 |
| Pages (from-to) | 1780-1793 |
| Number of pages | 14 |
| Journal | IEEE Transactions on Neural Networks and Learning Systems |
| Volume | 31 |
| Issue number | 6 |
| DOIs | |
| Publication status | Published - Jun 2020 |
| Externally published | Yes |
Keywords
- Long short-term memory (LSTM)
- neural RAM
- neural stack
- recurrent neural network (RNN)