Variational Gridded Graph Convolution Network for Node Classification

Xiaobin Hong, Tong Zhang, Zhen Cui*, Jian Yang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

17 Citations (Scopus)

Abstract

Existing graph convolution methods usually suffer from high computational burdens, large memory requirements, and intractable batch processing. In this paper, we propose a highly efficient variational gridded graph convolution network (VG-GCN) to encode non-regular graph data, which overcomes all of these problems. To capture graph topology efficiently, the proposed framework introduces a hierarchically-coarsened random walk (hcr-walk) that takes advantage of the classic random walk together with node/edge encapsulation. The hcr-walk greatly mitigates the exponentially explosive number of samples required by the classic version, while preserving graph structures well. To efficiently encode the local hcr-walks around a reference node, we project them into an ordered space to form image-like grid data, which suits conventional convolutional networks. Instead of direct 2-D convolution filtering, a variational convolution block (VCB) is designed to model the distribution of the randomly sampled hcr-walks, inspired by well-formulated variational inference. We experimentally validate the efficiency and effectiveness of the proposed VG-GCN, which achieves high computation speed and comparable or even better performance than baseline GCNs.
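The gridding idea described in the abstract — sampling fixed-length walks around a reference node and stacking them into image-like grid data — can be sketched as follows. This is a minimal illustration of plain random-walk gridding only; the paper's hierarchical coarsening, node/edge encapsulation, and variational convolution block are not reproduced here, and all function names and the toy graph are hypothetical.

```python
import random

def random_walk(adj, start, length, rng):
    """Sample one fixed-length random walk from `start` over adjacency dict `adj`."""
    walk = [start]
    for _ in range(length - 1):
        nbrs = adj.get(walk[-1], [])
        if not nbrs:
            break
        walk.append(rng.choice(nbrs))
    # Pad by repeating the last node if the walk terminated early (dead end).
    walk += [walk[-1]] * (length - len(walk))
    return walk

def walks_to_grid(adj, feats, start, num_walks, walk_len, seed=0):
    """Stack `num_walks` walks around `start` into an image-like grid of
    node features with shape (num_walks, walk_len, feature_dim), which a
    conventional 2-D convolution could then process."""
    rng = random.Random(seed)
    grid = []
    for _ in range(num_walks):
        walk = random_walk(adj, start, walk_len, rng)
        grid.append([feats[v] for v in walk])
    return grid

# Toy 4-node graph with 2-D node features (illustrative only).
adj = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1]}
feats = {0: [1.0, 0.0], 1: [0.0, 1.0], 2: [0.5, 0.5], 3: [1.0, 1.0]}
grid = walks_to_grid(adj, feats, start=0, num_walks=3, walk_len=4)
```

Each row of the resulting grid is one walk, each column a step, and the channel dimension carries node features, so standard convolution filters apply directly.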

Original language: English
Article number: 9497877
Pages (from-to): 1697-1708
Number of pages: 12
Journal: IEEE/CAA Journal of Automatica Sinica
Volume: 8
Issue number: 10
DOIs
Publication status: Published - Oct 2021
Externally published: Yes

Keywords

  • Graph coarsening
  • gridding
  • node classification
  • random walk
  • variational convolution
