HeGCL: Advance Self-Supervised Learning in Heterogeneous Graph-Level Representation

Gen Shi, Yifan Zhu, Jian K. Liu, Xuesong Li*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

5 Citations (Scopus)

Abstract

Representation learning in heterogeneous graphs with massive unlabeled data has attracted great interest. The heterogeneity of graphs not only carries rich information but also raises substantial barriers to designing unsupervised or self-supervised learning (SSL) strategies. Existing methods, such as random-walk-based approaches, depend mainly on the proximity information of neighbors and lack the ability to integrate node features into a higher-level representation. Furthermore, previous self-supervised or unsupervised frameworks are usually designed for node-level tasks; they commonly fail to capture global graph properties and may not perform well on graph-level tasks. Therefore, a label-free framework that better captures the global properties of heterogeneous graphs is urgently required. In this article, we propose a self-supervised heterogeneous graph neural network (GNN) based on cross-view contrastive learning (HeGCL). HeGCL presents two views for encoding heterogeneous graphs: the meta-path view and the outline view. Compared with the meta-path view, which provides semantic information, the outline view encodes the complex edge relations and captures graph-level properties by using a nonlocal block. HeGCL thus learns node embeddings by maximizing the mutual information (MI) between the global and semantic representations obtained from the outline and meta-path views, respectively. Experiments on both node-level and graph-level tasks show the superiority of the proposed model over other methods, and further exploration studies show that introducing the nonlocal block brings a significant contribution to graph-level tasks.
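To make the cross-view MI objective described in the abstract more concrete, below is a minimal PyTorch sketch of a DGI-style contrastive loss between a graph-level summary vector (standing in for the outline view) and per-node embeddings (standing in for the meta-path view). The class name `CrossViewMILoss`, the bilinear discriminator, and the shuffled-node negatives are illustrative assumptions for a generic MI estimator, not the paper's exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CrossViewMILoss(nn.Module):
    """Jensen-Shannon-style MI estimator between a global summary vector and
    per-node embeddings, using row-shuffled embeddings as negative pairs.
    Hypothetical sketch; HeGCL's actual objective may differ in its details."""
    def __init__(self, dim):
        super().__init__()
        # Bilinear discriminator scoring (node embedding, global summary) pairs.
        self.discriminator = nn.Bilinear(dim, dim, 1)

    def forward(self, semantic_emb, global_summary):
        # semantic_emb: (N, dim) node embeddings from the semantic (meta-path) view.
        # global_summary: (dim,) graph-level vector from the global (outline) view.
        n = semantic_emb.size(0)
        summary = global_summary.expand(n, -1)

        # Negatives: corrupt the pairing by shuffling node embeddings.
        neg_emb = semantic_emb[torch.randperm(n)]

        pos_logits = self.discriminator(semantic_emb, summary).squeeze(-1)
        neg_logits = self.discriminator(neg_emb, summary).squeeze(-1)

        # Positives should score high, negatives low.
        return (F.binary_cross_entropy_with_logits(pos_logits, torch.ones(n))
                + F.binary_cross_entropy_with_logits(neg_logits, torch.zeros(n)))

# Toy usage with random tensors standing in for the two views' outputs.
sem = torch.randn(32, 64)   # meta-path-view node embeddings
glob = torch.randn(64)      # outline-view global summary
print(CrossViewMILoss(64)(sem, glob))
```

In practice, the global summary would come from an encoder with a nonlocal (self-attention-like) block over the outline view, and the loss would be minimized jointly with that encoder so that the learned node embeddings carry graph-level information.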

Original language: English
Pages (from-to): 13914-13925
Number of pages: 12
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 35
Issue number: 10
DOIs
Publication status: Published - 2024

Keywords

  • Graph neural networks (GNNs)
  • heterogeneous graphs
  • self-supervised learning (SSL)
