TY - GEN
T1 - Incorporating instance correlations in distantly supervised relation extraction
AU - Zhang, Luhao
AU - Hu, Linmei
AU - Shi, Chuan
N1 - Publisher Copyright:
© Springer Nature Switzerland AG 2020.
PY - 2020
Y1 - 2020
N2 - Distantly supervised relation extraction has proven effective for finding relational facts in text. However, existing approaches treat the instances in the same bag independently and ignore semantic structural information. In this paper, we propose a graph convolutional network (GCN) model with an attention mechanism to improve relation extraction. For each bag, the model first builds a graph from the dependency trees of the instances in the bag; in this way, correlations between instances are established through their common words. The learned node (word) embeddings, which encode the bag information, are then fed into the sentence encoder, i.e., a text CNN, to obtain better sentence representations. In addition, an instance-level attention mechanism is introduced to select valid instances and learn the textual relation embedding. Finally, the learned embedding is used to train our relation classifier. Experiments on two benchmark datasets demonstrate that our model significantly outperforms the compared baselines.
AB - Distantly supervised relation extraction has proven effective for finding relational facts in text. However, existing approaches treat the instances in the same bag independently and ignore semantic structural information. In this paper, we propose a graph convolutional network (GCN) model with an attention mechanism to improve relation extraction. For each bag, the model first builds a graph from the dependency trees of the instances in the bag; in this way, correlations between instances are established through their common words. The learned node (word) embeddings, which encode the bag information, are then fed into the sentence encoder, i.e., a text CNN, to obtain better sentence representations. In addition, an instance-level attention mechanism is introduced to select valid instances and learn the textual relation embedding. Finally, the learned embedding is used to train our relation classifier. Experiments on two benchmark datasets demonstrate that our model significantly outperforms the compared baselines.
KW - Graph convolution network
KW - Knowledge graph
KW - Relation extraction
UR - http://www.scopus.com/inward/record.url?scp=85080870995&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-41407-8_12
DO - 10.1007/978-3-030-41407-8_12
M3 - Conference contribution
AN - SCOPUS:85080870995
SN - 9783030414061
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 177
EP - 191
BT - Semantic Technology - 9th Joint International Conference, JIST 2019, Proceedings
A2 - Wang, Xin
A2 - Lisi, Francesca Alessandra
A2 - Xiao, Guohui
A2 - Botoeva, Elena
PB - Springer
T2 - 9th Joint International Semantic Technology Conference, JIST 2019
Y2 - 25 November 2019 through 27 November 2019
ER -