A Taxonomy for Neural Memory Networks

  • Ying Ma*
  • Jose C. Principe

  *Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

19 Citations (Scopus)

Abstract

An increasing number of neural memory networks have been developed, creating the need for a systematic approach to analyzing and comparing their underlying memory structures. Thus, in this paper, we first create a framework for memory organization and then compare four popular dynamic models: the vanilla recurrent neural network, the long short-term memory, the neural stack, and the neural RAM. This analysis helps to open the dynamic neural networks' black box from the memory usage perspective. Accordingly, a taxonomy for these networks and their variants is proposed and proven using a unifying architecture. With the taxonomy, both network architectures and learning tasks are classified into four classes, and a one-to-one mapping is built between them to help practitioners select the appropriate architecture. To exemplify each task type, four synthetic tasks with different memory requirements are selected. Moreover, we use several signal processing applications and two natural language processing applications to evaluate the methodology in a realistic setting.

Original language: English
Article number: 8807370
Pages (from-to): 1780-1793
Number of pages: 14
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 31
Issue number: 6
DOIs
Publication status: Published - Jun 2020
Externally published: Yes

Keywords

  • Long short-term memory (LSTM)
  • neural RAM
  • neural stack
  • recurrent neural network (RNN)
