TY - GEN
T1 - SPAC
T2 - 39th Annual AAAI Conference on Artificial Intelligence, AAAI 2025
AU - Yang, Chuhong
AU - Li, Bin
AU - Wu, Nan
N1 - Publisher Copyright:
Copyright © 2025, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.
PY - 2025/4/11
Y1 - 2025/4/11
N2 - Tensor decomposition (TD) models are promising solutions for knowledge graph completion due to their simple structures yet powerful representation capacities. These TD models typically adopt Tucker decomposition with a structured core tensor. Some models with a sparse core tensor, such as DistMult and ComplEx, are too simple and thus limit the interaction between embedding components, while other models with a dense core tensor are too complex and may lead to significant overfitting. To address these issues, we propose a new TD model called SPAC (Sparse Partitioning and Adaptive Core tensor pruning). Specifically, SPAC captures coarse- and fine-grained semantic information using a hybrid core tensor, where auxiliary cores are used to model sparse interactions and main cores to model dense interactions. Moreover, SPAC introduces a gating mechanism to control the output of intermediate variables, enhancing the interaction between different partition groups. Furthermore, SPAC employs an adaptive pruning approach to dynamically adjust the shape of the core tensor. The proposed TD model enhances expressive capacity and reduces the number of parameters in the core tensor. Experiments are conducted on the FB15k-237, WN18RR, and YAGO3-10 datasets. The results demonstrate that SPAC outperforms state-of-the-art tensor decomposition models, including MEIM and Tucker models. A series of ablation studies shows that the gating mechanism and adaptive pruning strategy in SPAC are crucial to the performance improvement.
AB - Tensor decomposition (TD) models are promising solutions for knowledge graph completion due to their simple structures yet powerful representation capacities. These TD models typically adopt Tucker decomposition with a structured core tensor. Some models with a sparse core tensor, such as DistMult and ComplEx, are too simple and thus limit the interaction between embedding components, while other models with a dense core tensor are too complex and may lead to significant overfitting. To address these issues, we propose a new TD model called SPAC (Sparse Partitioning and Adaptive Core tensor pruning). Specifically, SPAC captures coarse- and fine-grained semantic information using a hybrid core tensor, where auxiliary cores are used to model sparse interactions and main cores to model dense interactions. Moreover, SPAC introduces a gating mechanism to control the output of intermediate variables, enhancing the interaction between different partition groups. Furthermore, SPAC employs an adaptive pruning approach to dynamically adjust the shape of the core tensor. The proposed TD model enhances expressive capacity and reduces the number of parameters in the core tensor. Experiments are conducted on the FB15k-237, WN18RR, and YAGO3-10 datasets. The results demonstrate that SPAC outperforms state-of-the-art tensor decomposition models, including MEIM and Tucker models. A series of ablation studies shows that the gating mechanism and adaptive pruning strategy in SPAC are crucial to the performance improvement.
UR - http://www.scopus.com/inward/record.url?scp=105003999694&partnerID=8YFLogxK
U2 - 10.1609/aaai.v39i14.33671
DO - 10.1609/aaai.v39i14.33671
M3 - Conference contribution
AN - SCOPUS:105003999694
T3 - Proceedings of the AAAI Conference on Artificial Intelligence
SP - 15230
EP - 15238
BT - Special Track on AI Alignment
A2 - Walsh, Toby
A2 - Shah, Julie
A2 - Kolter, Zico
PB - Association for the Advancement of Artificial Intelligence
Y2 - 25 February 2025 through 4 March 2025
ER -