Abstract
The core objective of distantly supervised relation extraction (DSRE) is to automatically extract, at scale, relational information between two entities from unstructured text. However, noisy data is inevitably generated in the automated labeling process. To address the limitations of DSRE with respect to contextual information, current mainstream methods mainly enrich entity representations with external knowledge bases. Nevertheless, a wealth of untapped information remains within the text itself, and it plays a crucial role in helping the model better understand the relationships between entities. In response, we propose a novel strategy that requires no additional external knowledge; instead, it exploits the principle of semantic similarity to construct contextually rich information that is highly semantically aligned with individual sentences. Furthermore, we incorporate information from knowledge graphs (KGs) to achieve a deep integration of textual and KG information. Specifically, we share contextual information at any level of the text with the KG information and merge them as the final bag-level representation for relation extraction. Experimental results demonstrate that updating the KG encoder using only the text encoder yields better performance. In addition, experiments on two domain-specific relation extraction datasets confirm that our model achieves superior performance.
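A minimal sketch (not the authors' released code) of the fusion idea described in the abstract: sentence representations from a text encoder are combined with a semantically similar context vector and with KG entity embeddings, then pooled into a single bag-level vector for relation classification. All module names, dimensions, and the attention-based pooling below are illustrative assumptions.

```python
import torch
import torch.nn as nn


class BagLevelFusion(nn.Module):
    def __init__(self, text_dim: int = 768, kg_dim: int = 200, num_relations: int = 53):
        super().__init__()
        # Project KG entity-pair embeddings into the text space so the two
        # sources can be merged into one bag-level representation.
        self.kg_proj = nn.Linear(2 * kg_dim, text_dim)
        self.attn = nn.Linear(text_dim, 1)           # selective attention over sentences in a bag
        self.classifier = nn.Linear(2 * text_dim, num_relations)

    def forward(self, sent_reps, context_reps, head_kg, tail_kg):
        # sent_reps:       (bag_size, text_dim) sentence vectors from the text encoder
        # context_reps:    (bag_size, text_dim) semantically similar context vectors
        # head_kg/tail_kg: (kg_dim,)            KG embeddings of the entity pair
        enriched = sent_reps + context_reps          # enrich each sentence with its retrieved context
        weights = torch.softmax(self.attn(enriched).squeeze(-1), dim=0)
        bag_text = weights @ enriched                # (text_dim,) attention-pooled bag vector
        bag_kg = self.kg_proj(torch.cat([head_kg, tail_kg], dim=-1))
        bag_rep = torch.cat([bag_text, bag_kg], dim=-1)
        return self.classifier(bag_rep)              # relation logits


# Usage with random tensors, just to show the expected shapes.
model = BagLevelFusion()
logits = model(torch.randn(5, 768), torch.randn(5, 768),
               torch.randn(200), torch.randn(200))
print(logits.shape)  # torch.Size([53])
```

The hypothetical `kg_proj` and attention pooling stand in for whatever fusion and bag aggregation the paper actually uses; the sketch only illustrates how text-level context and KG embeddings could be merged into one bag-level representation.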
Original language | English |
---|---|
Article number | 129858 |
Journal | Neurocomputing |
Volume | 634 |
DOI | |
Publication status | Published - 14 Jun 2025 |