Graph attention network with dynamic representation of relations for knowledge graph completion

Xin Zhang, Chunxia Zhang*, Jingtao Guo, Cheng Peng, Zhendong Niu, Xindong Wu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

21 Citations (Scopus)

Abstract

Knowledge graph completion (KGC) aims to predict the missing element of a triple based on known triples or facts. Recently, many representation learning methods for KGC have achieved promising performance, especially those based on graph neural networks and their variants. These methods exploit local neighborhood information to update the embeddings of target entities. However, existing works have two problems. First, they focus on the representation learning of entities, while the relation representation usually adopts a simple linear transformation, which cannot capture the distinct semantics that the same relation carries in different triples. Second, different types of entity information are simply combined, which loses global properties such as the type and the global importance of entities and is prone to cause the over-smoothing phenomenon. To address these two problems, we propose a Graph Attention Network with Dynamic Representation of Relations and global information (DRR-GAT) for knowledge graph completion. Specifically, dynamic representation of relations learns a distinct representation of the same relation in each triple. This goal is achieved via a path Transformer, which takes path information as input and considers only the paths from the target entity to the neighborhood relations of the same type as the target relation. Subsequently, a global embedding mechanism is incorporated into the graph attention network to capture the global information of entities and relations. Experimental results show that our approach outperforms state-of-the-art methods, demonstrating its effectiveness.
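As a rough illustration of the two components described in the abstract, the sketch below shows (i) a small path Transformer that re-encodes a relation per triple from the embeddings of paths to same-typed neighborhood relations, and (ii) an attention step over neighbors whose scores also use a global entity embedding. This is not the authors' implementation; all module names, dimensions, and the exact scoring form are assumptions made only for illustration.

# Illustrative sketch (PyTorch), not the DRR-GAT code from the paper.
import torch
import torch.nn as nn


class DynamicRelationEncoder(nn.Module):
    """Re-encode the target relation from a sequence of path embeddings."""

    def __init__(self, dim: int, num_heads: int = 4, num_layers: int = 2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=num_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, rel_emb: torch.Tensor, path_emb: torch.Tensor) -> torch.Tensor:
        # rel_emb:  (batch, dim)            static embedding of the target relation
        # path_emb: (batch, path_len, dim)  embeddings of paths from the target
        #           entity to neighborhood relations of the same type
        seq = torch.cat([rel_emb.unsqueeze(1), path_emb], dim=1)
        out = self.encoder(seq)
        return out[:, 0]  # triple-specific (dynamic) relation representation


class GlobalAwareAttention(nn.Module):
    """Attention over neighbors that also conditions on a global entity embedding."""

    def __init__(self, dim: int):
        super().__init__()
        self.score = nn.Linear(3 * dim, 1)
        self.leaky_relu = nn.LeakyReLU(0.2)

    def forward(self, target: torch.Tensor, neighbours: torch.Tensor,
                global_emb: torch.Tensor) -> torch.Tensor:
        # target:     (batch, dim)     local embedding of the target entity
        # neighbours: (batch, n, dim)  neighbor entity/relation features
        # global_emb: (batch, dim)     global embedding of the target entity
        n = neighbours.size(1)
        query = torch.cat([target, global_emb], dim=-1)         # (batch, 2*dim)
        query = query.unsqueeze(1).expand(-1, n, -1)            # (batch, n, 2*dim)
        logits = self.score(torch.cat([query, neighbours], dim=-1)).squeeze(-1)
        alpha = torch.softmax(self.leaky_relu(logits), dim=-1)  # attention weights
        return torch.einsum("bn,bnd->bd", alpha, neighbours)    # aggregated message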

Original language: English
Article number: 119616
Journal: Expert Systems with Applications
Volume: 219
Publication status: Published - 1 Jun 2023

Keywords

  • Dynamic representation of relation
  • Global information embedding
  • Graph attention network
  • Knowledge graph completion
  • Transformer encoder
