TY - JOUR
T1 - Subgraph-aware graph structure revision for spatial–temporal graph modeling
AU - Wang, Yuhu
AU - Zhang, Chunxia
AU - Xiang, Shiming
AU - Pan, Chunhong
N1 - Publisher Copyright:
© 2022 Elsevier Ltd
PY - 2022/10
Y1 - 2022/10
N2 - Spatial–temporal graph modeling has been widely studied in many fields, such as traffic forecasting and energy analysis, where data has both temporal and spatial properties. Existing methods focus on capturing stable and dynamic spatial correlations by constructing physical and virtual graphs and combining graph convolution with temporal modeling. However, these methods tend to smooth node features, which may obscure the spatial–temporal patterns among nodes. Worse, the graph structure is not always available in some fields, and manually constructed stable or dynamic graphs do not necessarily reflect the true spatial correlations. This paper proposes a Subgraph-Aware Graph Structure Revision network (SAGSR) to overcome these limitations. Architecturally, a subgraph-aware structure revision graph convolution module (SASR-GCM) is designed, which revises the learned stable graph into a dynamic one to automatically infer the dynamics of spatial correlations. Each of these two graphs is separated into a homophilic subgraph and a heterophilic subgraph by a subgraph-aware graph convolution mechanism, which aggregates similar nodes in the homophilic subgraph with positive weights while pushing apart nodes with dissimilar features in the heterophilic subgraph via negative aggregation weights, thereby avoiding pattern obfuscation. Combined with a gated multi-scale temporal convolution module (GMS-TCM) for temporal modeling, SAGSR can efficiently capture spatial–temporal correlations and extract complex spatial–temporal graph features. Extensive experiments on two tasks, traffic flow forecasting and energy consumption forecasting, demonstrate the effectiveness and superiority of the proposed approach over several competitive baselines.
AB - Spatial–temporal graph modeling has been widely studied in many fields, such as traffic forecasting and energy analysis, where data has both temporal and spatial properties. Existing methods focus on capturing stable and dynamic spatial correlations by constructing physical and virtual graphs and combining graph convolution with temporal modeling. However, these methods tend to smooth node features, which may obscure the spatial–temporal patterns among nodes. Worse, the graph structure is not always available in some fields, and manually constructed stable or dynamic graphs do not necessarily reflect the true spatial correlations. This paper proposes a Subgraph-Aware Graph Structure Revision network (SAGSR) to overcome these limitations. Architecturally, a subgraph-aware structure revision graph convolution module (SASR-GCM) is designed, which revises the learned stable graph into a dynamic one to automatically infer the dynamics of spatial correlations. Each of these two graphs is separated into a homophilic subgraph and a heterophilic subgraph by a subgraph-aware graph convolution mechanism, which aggregates similar nodes in the homophilic subgraph with positive weights while pushing apart nodes with dissimilar features in the heterophilic subgraph via negative aggregation weights, thereby avoiding pattern obfuscation. Combined with a gated multi-scale temporal convolution module (GMS-TCM) for temporal modeling, SAGSR can efficiently capture spatial–temporal correlations and extract complex spatial–temporal graph features. Extensive experiments on two tasks, traffic flow forecasting and energy consumption forecasting, demonstrate the effectiveness and superiority of the proposed approach over several competitive baselines.
KW - Graph neural network
KW - Graph structure learning
KW - Spatial–temporal graph modeling
UR - http://www.scopus.com/inward/record.url?scp=85134877893&partnerID=8YFLogxK
U2 - 10.1016/j.neunet.2022.07.017
DO - 10.1016/j.neunet.2022.07.017
M3 - Article
C2 - 35905653
AN - SCOPUS:85134877893
SN - 0893-6080
VL - 154
SP - 190
EP - 202
JO - Neural Networks
JF - Neural Networks
ER -