TY - GEN
T1 - SAIL: Self-Augmented Graph Contrastive Learning
T2 - 36th AAAI Conference on Artificial Intelligence, AAAI 2022
AU - Yu, Lu
AU - Pei, Shichao
AU - Ding, Lizhong
AU - Zhou, Jun
AU - Li, Longfei
AU - Zhang, Chuxu
AU - Zhang, Xiangliang
N1 - Publisher Copyright:
Copyright © 2022, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.
PY - 2022/6/30
Y1 - 2022/6/30
N2 - This paper studies learning node representations with graph neural networks (GNNs) in the unsupervised scenario. Specifically, we derive a theoretical analysis and provide an empirical demonstration of the unstable performance of GNNs over different graph datasets when the supervision signals are not appropriately defined. The performance of GNNs depends on both the node feature smoothness and the locality of the graph structure. To smooth the discrepancy between node proximity measured by graph topology and that measured by node features, we propose SAIL, a novel Self-Augmented graph contrastive Learning framework with two complementary self-distilling regularization modules, i.e., intra- and inter-graph knowledge distillation. We demonstrate the competitive performance of SAIL on a variety of graph applications. Even with a single GNN layer, SAIL consistently achieves competitive or better performance on various benchmark datasets compared with state-of-the-art baselines.
AB - This paper studies learning node representations with graph neural networks (GNNs) in the unsupervised scenario. Specifically, we derive a theoretical analysis and provide an empirical demonstration of the unstable performance of GNNs over different graph datasets when the supervision signals are not appropriately defined. The performance of GNNs depends on both the node feature smoothness and the locality of the graph structure. To smooth the discrepancy between node proximity measured by graph topology and that measured by node features, we propose SAIL, a novel Self-Augmented graph contrastive Learning framework with two complementary self-distilling regularization modules, i.e., intra- and inter-graph knowledge distillation. We demonstrate the competitive performance of SAIL on a variety of graph applications. Even with a single GNN layer, SAIL consistently achieves competitive or better performance on various benchmark datasets compared with state-of-the-art baselines.
UR - http://www.scopus.com/inward/record.url?scp=85137150052&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85137150052
T3 - Proceedings of the 36th AAAI Conference on Artificial Intelligence, AAAI 2022
SP - 8927
EP - 8935
BT - AAAI-22 Technical Tracks 8
PB - Association for the Advancement of Artificial Intelligence
Y2 - 22 February 2022 through 1 March 2022
ER -