TY - GEN
T1 - Learning Dynamic Coherence with Graph Attention Network for Biomedical Entity Linking
AU - Bo, Mumeng
AU - Zhang, Meihui
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021/7/18
Y1 - 2021/7/18
N2 - Biomedical entity linking, which aligns various disease mentions in unstructured documents to their corresponding standardized entities in a knowledge base (KB), is an essential task in biomedical natural language processing. Unlike in the general domain, the specific challenge is that biomedical entities often have many variations in their surface forms, and there are only limited biomedical corpora for learning the correspondence. Recently, biomedical entity linking has been shown to benefit significantly from neural deep learning approaches. However, most existing works have not exploited topical coherence in their models. Moreover, most collective models use a sequence-based approach, which may accumulate errors and perform unnecessary computation over irrelevant entities. Most importantly, these models ignore the relationships among mentions within a single document, which are very useful for linking entities. In this paper, we propose an effective graph attention neural network that dynamically captures the relationships between entity mentions and learns a coherence representation. In addition, unlike graph-based models in the general domain, our model does not require large external resources to learn representations. We conduct extensive experiments on two biomedical datasets, and the results show that our model achieves promising performance.
AB - Biomedical entity linking, which aligns various disease mentions in unstructured documents to their corresponding standardized entities in a knowledge base (KB), is an essential task in biomedical natural language processing. Unlike in the general domain, the specific challenge is that biomedical entities often have many variations in their surface forms, and there are only limited biomedical corpora for learning the correspondence. Recently, biomedical entity linking has been shown to benefit significantly from neural deep learning approaches. However, most existing works have not exploited topical coherence in their models. Moreover, most collective models use a sequence-based approach, which may accumulate errors and perform unnecessary computation over irrelevant entities. Most importantly, these models ignore the relationships among mentions within a single document, which are very useful for linking entities. In this paper, we propose an effective graph attention neural network that dynamically captures the relationships between entity mentions and learns a coherence representation. In addition, unlike graph-based models in the general domain, our model does not require large external resources to learn representations. We conduct extensive experiments on two biomedical datasets, and the results show that our model achieves promising performance.
UR - http://www.scopus.com/inward/record.url?scp=85116412224&partnerID=8YFLogxK
U2 - 10.1109/IJCNN52387.2021.9533687
DO - 10.1109/IJCNN52387.2021.9533687
M3 - Conference contribution
AN - SCOPUS:85116412224
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - IJCNN 2021 - International Joint Conference on Neural Networks, Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2021 International Joint Conference on Neural Networks, IJCNN 2021
Y2 - 18 July 2021 through 22 July 2021
ER -