自注意力超图池化网络

Translated title of the contribution: Self-attention Hypergraph Pooling Network

Ying Fu Zhao, Fu Sheng Jin*, Rong Hua Li, Hong Chao Qin, Peng Cui, Guo Ren Wang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Recently, graph convolutional networks (GCNs) have attracted much attention by generalizing convolutional neural networks to graph data, which requires redefining the convolution and pooling operations on graphs. However, because ordinary graphs can only model dyadic relations, they often perform poorly in practice. In contrast, a hypergraph can capture high-order interactions in data, and its flexible hyperedges make complex data representations easy to handle. Nevertheless, existing hypergraph convolutional networks are still immature, and there is currently no effective pooling operation for hypergraphs. This paper therefore proposes a hypergraph pooling network with a self-attention mechanism. Modeling the data as a hypergraph, the model learns hidden node features that carry high-order information through a hyper-convolution operation incorporating self-attention, selects nodes that are important in both structure and content through a hyper-pooling operation, and thereby obtains a more accurate hypergraph representation. Experiments on text classification, dish classification, and protein classification tasks show that the proposed method outperforms recent state-of-the-art methods.
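The pipeline the abstract describes, hyper-convolution over a hypergraph incidence structure followed by attention-scored node selection, can be sketched in NumPy. This is a minimal illustration, not the paper's exact formulation: it assumes the common HGNN-style normalized hypergraph convolution (with unit hyperedge weights) and a SAGPool-style scoring/top-k gating step; the function names and the `ratio` parameter are hypothetical.

```python
import numpy as np

def hypergraph_conv(X, H, Theta):
    """One hypergraph convolution layer (HGNN-style normalization, assumed):
    X' = ReLU(Dv^{-1/2} H De^{-1} H^T Dv^{-1/2} X Theta), unit hyperedge weights.
    X: (n_nodes, d) features, H: (n_nodes, n_edges) binary incidence matrix."""
    Dv = H.sum(axis=1)                      # node degrees
    De = H.sum(axis=0)                      # hyperedge degrees
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(Dv))
    De_inv = np.diag(1.0 / De)
    A = Dv_inv_sqrt @ H @ De_inv @ H.T @ Dv_inv_sqrt
    return np.maximum(A @ X @ Theta, 0.0)   # ReLU

def self_attention_pool(X, H, ratio=0.5):
    """Hyper-pooling sketch: score each node with a single-output
    hyper-convolution (so the score reflects both structure, via H, and
    content, via X), keep the top-`ratio` nodes, and gate their features."""
    score = hypergraph_conv(X, H, np.ones((X.shape[1], 1))).ravel()
    k = max(1, int(round(ratio * X.shape[0])))
    idx = np.argsort(-score)[:k]            # indices of the k highest-scored nodes
    return X[idx] * np.tanh(score[idx])[:, None], H[idx], idx
```

Stacking such conv/pool blocks and reading out the surviving node features would give a coarsened hypergraph representation for classification, in the spirit of the architecture the abstract outlines.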

Original language: Chinese (Traditional)
Journal: Ruan Jian Xue Bao/Journal of Software
Volume: 34
Issue number: 10
Publication status: Published - 2023

