TY - JOUR
T1 - Self-Attention Hypergraph Pooling Network
AU - Zhao, Ying Fu
AU - Jin, Fu Sheng
AU - Li, Rong Hua
AU - Qin, Hong Chao
AU - Cui, Peng
AU - Wang, Guo Ren
N1 - Publisher Copyright:
© 2023 Chinese Academy of Sciences. All rights reserved.
PY - 2023
Y1 - 2023
N2 - Recently, graph convolutional networks (GCNs) have attracted much attention by generalizing convolutional neural networks to graph data, which includes redefining the convolution and pooling operations on graphs. Because graphs can only model dyadic relations, GCNs may not perform well in practice. In contrast, a hypergraph can capture high-order data interactions, and its flexible hyperedges make complex data easy to represent. However, existing methods for hypergraph convolutional networks are still immature, and there is currently no effective operation for hypergraph pooling. Therefore, a hypergraph pooling network with a self-attention mechanism is proposed. Modeling data as a hypergraph, the model learns hidden node features that carry high-order information through a hyper-convolution operation that introduces self-attention, and selects nodes that are important in both structure and content through a hyper-pooling operation, thereby obtaining a more accurate hypergraph representation. Experiments on text classification, dish classification, and protein classification tasks show that the proposed method outperforms recent state-of-the-art methods.
AB - Recently, graph convolutional networks (GCNs) have attracted much attention by generalizing convolutional neural networks to graph data, which includes redefining the convolution and pooling operations on graphs. Because graphs can only model dyadic relations, GCNs may not perform well in practice. In contrast, a hypergraph can capture high-order data interactions, and its flexible hyperedges make complex data easy to represent. However, existing methods for hypergraph convolutional networks are still immature, and there is currently no effective operation for hypergraph pooling. Therefore, a hypergraph pooling network with a self-attention mechanism is proposed. Modeling data as a hypergraph, the model learns hidden node features that carry high-order information through a hyper-convolution operation that introduces self-attention, and selects nodes that are important in both structure and content through a hyper-pooling operation, thereby obtaining a more accurate hypergraph representation. Experiments on text classification, dish classification, and protein classification tasks show that the proposed method outperforms recent state-of-the-art methods.
KW - convolutional neural network
KW - graph neural network
KW - hypergraph
KW - hypergraph neural network
KW - pooling
UR - http://www.scopus.com/inward/record.url?scp=85161431042&partnerID=8YFLogxK
U2 - 10.13328/j.cnki.jos.006881
DO - 10.13328/j.cnki.jos.006881
M3 - Article
AN - SCOPUS:85161431042
SN - 1000-9825
VL - 34
JO - Ruan Jian Xue Bao/Journal of Software
JF - Ruan Jian Xue Bao/Journal of Software
IS - 10
ER -