Self-attention Hypergraph Pooling Network

Ying Fu Zhao, Fu Sheng Jin*, Rong Hua Li, Hong Chao Qin, Peng Cui, Guo Ren Wang

*Corresponding author of this work

Research output: Contribution to journal, Article, peer-reviewed

2 Citations (Scopus)

Abstract

Recently, graph convolutional networks (GCNs) have attracted much attention by generalizing convolutional neural networks to graph data, which requires redefining the convolution and pooling operations on graphs. However, because graph data can only capture dyadic relations, GCNs often fall short in real practice. In contrast, a hypergraph can capture high-order data interactions and can easily represent complex data through its flexible hyperedges. Existing methods for hypergraph convolutional networks are still immature, and there is currently no effective pooling operation for hypergraphs. Therefore, a hypergraph pooling network with a self-attention mechanism is proposed. Modeling data with a hypergraph structure, the model learns hidden node features carrying high-order information through a hyper-convolution operation that incorporates self-attention, selects nodes that are important in both structure and content through a hyper-pooling operation, and thereby obtains a more accurate hypergraph representation. Experiments on text classification, dish classification, and protein classification tasks show that the proposed method outperforms recent state-of-the-art methods.
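The two operations the abstract describes can be sketched in NumPy: an HGNN-style hyper-convolution that propagates features through the node-hyperedge incidence matrix, and a score-based hyper-pooling that retains the top-k nodes. The score function (a feature-norm stand-in for the paper's self-attention) and all shapes below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def hyper_conv(X, H, Theta):
    """One hyper-convolution step, X' = Dv^{-1/2} H De^{-1} H^T Dv^{-1/2} X Theta
    (HGNN-style propagation; hyperedge weights taken as the identity here)."""
    Dv = H.sum(axis=1)                       # node degrees
    De = H.sum(axis=0)                       # hyperedge degrees
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(Dv))
    De_inv = np.diag(1.0 / De)
    A = Dv_inv_sqrt @ H @ De_inv @ H.T @ Dv_inv_sqrt
    return np.maximum(A @ X @ Theta, 0.0)    # ReLU nonlinearity

def hyper_pool(X, H, k):
    """Keep the top-k nodes by a per-node importance score and slice the
    incidence matrix to the retained nodes (illustrative score: feature norm)."""
    score = np.linalg.norm(X, axis=1)        # stand-in for a learned attention score
    idx = np.argsort(-score)[:k]             # indices of the k most salient nodes
    gate = np.tanh(score[idx])[:, None]      # gate retained features by their score
    return X[idx] * gate, H[idx]

rng = np.random.default_rng(0)
H = np.array([[1, 0], [1, 1], [0, 1], [1, 1]], dtype=float)  # 4 nodes, 2 hyperedges
X = rng.standard_normal((4, 3))              # initial node features
Theta = rng.standard_normal((3, 8))          # learnable projection (random here)
Z = hyper_conv(X, H, Theta)                  # hidden features with high-order info
Zp, Hp = hyper_pool(Z, H, k=2)               # keep the 2 most salient nodes
print(Z.shape, Zp.shape, Hp.shape)           # (4, 8) (2, 8) (2, 2)
```

Stacking hyper-conv and hyper-pool layers coarsens the hypergraph step by step, so a final readout over the surviving nodes yields the hypergraph-level representation used for classification.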

Translated title of the contribution: Self-attention Hypergraph Pooling Network
Original language: Traditional Chinese
Journal: Ruan Jian Xue Bao/Journal of Software
Volume: 34
Issue number: 10
DOI
Publication status: Published - 2023

Keywords

  • convolutional neural network
  • graph neural network
  • hypergraph
  • hypergraph neural network
  • pooling
