Scalable graph neural networks via bidirectional propagation

Ming Chen, Zhewei Wei*, Bolin Ding, Yaliang Li, Ye Yuan, Xiaoyong Du, Ji-Rong Wen

*Corresponding author for this work

Research output: Contribution to journal › Conference article › Peer-reviewed

75 Citations (Scopus)

Abstract

Graph Neural Networks (GNNs) are an emerging tool for learning on non-Euclidean data. Recently, there has been increased interest in designing GNNs that scale to large graphs. Most existing methods use "graph sampling" or "layer-wise sampling" techniques to reduce training time. However, these methods still suffer from degrading performance and scalability problems when applied to graphs with billions of edges. This paper presents GBP, a scalable GNN that utilizes a localized bidirectional propagation process from both the feature vectors and the training/testing nodes. Theoretical analysis shows that GBP is the first method that achieves sub-linear time complexity for both the precomputation and the training phases. An extensive empirical study demonstrates that GBP achieves state-of-the-art performance with significantly less training/testing time. Most notably, GBP can deliver superior performance on a graph with over 60 million nodes and 1.8 billion edges in less than half an hour on a single machine.
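The decoupled propagation the abstract describes precomputes a generalized PageRank-style feature matrix P = Σ_l w_l (D⁻ʳ A Dʳ⁻¹)ˡ X and then trains a simple model on P. Below is a minimal, dense sketch of that exact quantity (the function name, `weights` parameter, and exact dense computation are illustrative assumptions; GBP itself only *approximates* P via localized bidirectional propagation to reach sub-linear time):

```python
import numpy as np

def generalized_pagerank_features(adj, features, weights, r=0.5):
    """Exact (non-scalable) sketch: P = sum_l weights[l] * T^l @ X,
    where T = D^{-r} A D^{-(1-r)} is the normalized adjacency matrix.
    GBP approximates this quantity instead of forming it densely."""
    deg = adj.sum(axis=1)
    d_left = np.power(deg, -r)          # D^{-r}
    d_right = np.power(deg, -(1.0 - r)) # D^{-(1-r)}
    T = d_left[:, None] * adj * d_right[None, :]
    P = np.zeros_like(features, dtype=float)
    h = features.astype(float)          # h holds T^l @ X at step l
    for w in weights:
        P += w * h
        h = T @ h
    return P
```

With `r=0.5` this uses the symmetric normalization D^{-1/2} A D^{-1/2}; choosing the `weights` sequence recovers familiar special cases, e.g. geometric weights give a PPR-style diffusion.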

Original language: English
Journal: Advances in Neural Information Processing Systems
Volume: 2020-December
Publication status: Published - 2020
Event: 34th Conference on Neural Information Processing Systems, NeurIPS 2020 - Virtual, Online
Duration: 6 Dec 2020 - 12 Dec 2020