Stochastic graph recurrent neural network

Tijin Yan, Hongwei Zhang, Zirui Li, Yuanqing Xia*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Representation learning over dynamic graphs has attracted much attention because of its wide range of applications. Recently, sequential probabilistic generative models have achieved impressive results because they can model data distributions. However, modeling the distribution of dynamic graphs remains extremely challenging. Existing methods usually ignore the mutual interference of stochastic states and deterministic states, and the common assumption that latent variables follow Gaussian distributions is often inappropriate. To address these problems, we propose the stochastic graph recurrent neural network (SGRNN), a sequential generative model for representation learning over dynamic graphs that separates stochastic states and deterministic states in the iterative process. To improve the flexibility of the latent variables, we set the prior and posterior distributions to be semi-implicit distributions, yielding DSI-SGRNN. In addition, to alleviate the KL-vanishing problem in SGRNN, we propose a simple and interpretable structure based on the lower bound of the KL-divergence; it introduces only a few extra parameters and can be implemented with a few lines of code. Extensive experiments on real-world datasets demonstrate the effectiveness of the proposed model.
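The core design idea in the abstract — keeping the deterministic recurrent state and the stochastic latent state separate so that sampling noise does not feed back into the deterministic update — can be illustrated with a toy recurrent cell. This is a minimal NumPy sketch under assumed dimensions and update rules, not the paper's SGRNN implementation (which operates on graphs and uses semi-implicit distributions); all weight names and shapes here are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def init_params(x_dim, h_dim, z_dim):
    """Random weights for a toy stochastic recurrent cell (illustrative only)."""
    p = lambda *s: rng.standard_normal(s) * 0.1
    return {
        "W_h": p(h_dim, h_dim + x_dim),    # deterministic state update
        "W_mu": p(z_dim, h_dim),           # prior mean, conditioned on h only
        "W_sig": p(z_dim, h_dim),          # prior log-std, conditioned on h only
        "W_out": p(x_dim, h_dim + z_dim),  # emission uses both states
    }

def step(params, h, x):
    """One time step. The deterministic state h_next depends only on (h, x);
    the stochastic state z is sampled from a prior conditioned on h_next,
    so the sample never interferes with the deterministic recurrence."""
    h_next = np.tanh(params["W_h"] @ np.concatenate([h, x]))
    mu = params["W_mu"] @ h_next
    log_sig = params["W_sig"] @ h_next
    z = mu + np.exp(log_sig) * rng.standard_normal(mu.shape)  # reparameterized sample
    y = params["W_out"] @ np.concatenate([h_next, z])         # emission
    return h_next, z, y

params = init_params(x_dim=4, h_dim=8, z_dim=3)
h = np.zeros(8)
for t in range(5):
    h, z, y = step(params, h, rng.standard_normal(4))
```

In an entangled design, z would be concatenated into the input of `W_h`, so prior samples would perturb the deterministic trajectory; the separation above is one way to avoid that mutual interference.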

Original language: English
Pages (from-to): 1003-1015
Number of pages: 13
Journal: Neurocomputing
Volume: 500
DOIs
Publication status: Published - 21 Aug 2022

Keywords

  • Dynamic graph
  • Posterior collapse
  • Representation learning
  • Variational inference

