Predicting Information Diffusion Cascades Using Graph Attention Networks

Meng Wang, Kan Li*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

6 Citations (Scopus)

Abstract

Effective information cascade prediction plays an important role in suppressing the spread of rumors in social networks and in providing accurate recommendations on social platforms. This paper improves on existing models and proposes an end-to-end deep learning method called CasGAT. A graph attention network is used to make the processing of large networks tractable: attention needs to be computed only over the features of each node's neighbors, which greatly reduces the model's processing complexity. We demonstrate the effectiveness of the model on realistic datasets and compare the improved model with three baselines. Extensive results show that our model outperforms all three baselines in prediction accuracy.
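The paper itself is not reproduced here, but the mechanism the abstract refers to — computing attention only over a node's neighbors — is the standard graph attention operation. The following is a minimal illustrative sketch of a single-head graph attention layer in NumPy, not the authors' CasGAT implementation; all names and shapes are assumptions for illustration.

```python
import numpy as np

def gat_attention(H, A, W, a, alpha=0.2):
    """Single-head graph attention sketch (illustrative, not CasGAT itself).

    H: (N, F) node features
    A: (N, N) adjacency matrix (1 = edge; self-loops included)
    W: (F, Fp) shared linear projection
    a: (2*Fp,) attention vector
    Returns updated node features of shape (N, Fp).
    """
    Z = H @ W                                    # project features: (N, Fp)
    N = Z.shape[0]
    e = np.zeros((N, N))
    for i in range(N):
        for j in range(N):
            # raw attention score: LeakyReLU(a^T [z_i || z_j])
            s = np.concatenate([Z[i], Z[j]]) @ a
            e[i, j] = s if s > 0 else alpha * s
    # mask so each node attends ONLY to its neighbors (the key point
    # the abstract makes about reducing processing complexity)
    e = np.where(A > 0, e, -np.inf)
    # row-wise softmax over neighbors -> attention coefficients
    exp_e = np.exp(e - e.max(axis=1, keepdims=True))
    att = exp_e / exp_e.sum(axis=1, keepdims=True)
    return att @ Z                               # aggregate neighbor features
```

Because non-neighbor entries are masked to negative infinity before the softmax, each output row is a convex combination of neighboring nodes' projected features only, so the cost of attention scales with the number of edges rather than all node pairs.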

Original language: English
Title of host publication: Neural Information Processing - 27th International Conference, ICONIP 2020, Proceedings
Editors: Haiqin Yang, Kitsuchart Pasupa, Andrew Chi-Sing Leung, James T. Kwok, Jonathan H. Chan, Irwin King
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 104-112
Number of pages: 9
ISBN (Print): 9783030638191
DOIs
Publication status: Published - 2020
Event: 27th International Conference on Neural Information Processing, ICONIP 2020 - Bangkok, Thailand
Duration: 18 Nov 2020 - 22 Nov 2020

Publication series

Name: Communications in Computer and Information Science
Volume: 1332
ISSN (Print): 1865-0929
ISSN (Electronic): 1865-0937

Conference

Conference: 27th International Conference on Neural Information Processing, ICONIP 2020
Country/Territory: Thailand
City: Bangkok
Period: 18/11/20 - 22/11/20

Keywords

  • Graph attention network
  • Information cascade prediction
  • Social network

