TY - GEN
T1 - A Relation-aware Attention Neural Network for Modeling the Usage of Scientific Online Resources
AU - Xu, Yongxiu
AU - Huang, Heyan
AU - Feng, Chong
AU - Zhou, Chuan
AU - Zhang, Jiarui
AU - Hu, Yue
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021/7/18
Y1 - 2021/7/18
N2 - In recent years, more and more online resources for computer science have been introduced, used, and released in the scientific literature. Knowledge about the usage of these online resources can help researchers easily find the applicable resources for their work. However, most existing methods ignore the importance of the content of online resource citations. To this end, we manually create SciR, a dataset that contains 3,012 annotated sentences for this task, and introduce a multi-task learning framework to automatically extract entities and relations from the context of online resource citations in scientific papers. Furthermore, since the words in a sentence usually play different roles under different relations, we treat different relations as distinctive sub-spaces and model the correlations between words in a sentence for each relation type with a supervised biaffine attention network. Based on this relation-aware attention network, our model can not only effectively capture the word-level correlations under each relation, but also naturally avoid the problem of overlapping relations. To evaluate the effectiveness of our model, we conduct comprehensive experiments on three datasets, and the experimental results demonstrate that our model outperforms other state-of-the-art methods on the two tasks of entity recognition and relation extraction.
AB - In recent years, more and more online resources for computer science have been introduced, used, and released in the scientific literature. Knowledge about the usage of these online resources can help researchers easily find the applicable resources for their work. However, most existing methods ignore the importance of the content of online resource citations. To this end, we manually create SciR, a dataset that contains 3,012 annotated sentences for this task, and introduce a multi-task learning framework to automatically extract entities and relations from the context of online resource citations in scientific papers. Furthermore, since the words in a sentence usually play different roles under different relations, we treat different relations as distinctive sub-spaces and model the correlations between words in a sentence for each relation type with a supervised biaffine attention network. Based on this relation-aware attention network, our model can not only effectively capture the word-level correlations under each relation, but also naturally avoid the problem of overlapping relations. To evaluate the effectiveness of our model, we conduct comprehensive experiments on three datasets, and the experimental results demonstrate that our model outperforms other state-of-the-art methods on the two tasks of entity recognition and relation extraction.
UR - http://www.scopus.com/inward/record.url?scp=85116488719&partnerID=8YFLogxK
U2 - 10.1109/IJCNN52387.2021.9533951
DO - 10.1109/IJCNN52387.2021.9533951
M3 - Conference contribution
AN - SCOPUS:85116488719
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - IJCNN 2021 - International Joint Conference on Neural Networks, Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2021 International Joint Conference on Neural Networks, IJCNN 2021
Y2 - 18 July 2021 through 22 July 2021
ER -