Graph-based reasoning model for multiple relation extraction

Research output: Contribution to journal › Article › peer-review

14 Citations (Scopus)

Abstract

Linguistic knowledge is useful for various NLP tasks, but the difficulty lies in how to represent and apply it. We consider that linguistic knowledge is implied in a large-scale corpus, while classification knowledge, i.e., the knowledge related to the definitions of entity and relation types, is implied in the labeled training data. Therefore, a corpus subgraph is proposed to mine more linguistic knowledge from easily accessible unlabeled data, and sentence subgraphs are used to acquire classification knowledge. In this paper, the two jointly constitute a relation knowledge graph (RKG) for extracting relations from sentences. On the RKG, entity recognition can be regarded as a property-value filling problem, and relation classification can be regarded as a link prediction problem. Thus, multiple relation extraction can be treated as a reasoning process for knowledge completion. We combine statistical reasoning and neural network reasoning to segment sentences into entity chunks and non-entity chunks, then propose a novel Chunk Graph LSTM network to learn the representations of entity chunks and infer the relations among them. Experiments on two standard datasets demonstrate that our model outperforms previous models for multiple relation extraction.
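The reasoning view sketched in the abstract (sentences segmented into entity and non-entity chunks, relation classification cast as link prediction over pairs of entity chunks) can be illustrated with a minimal toy sketch. All function names and the pluggable `score_fn` below are hypothetical illustrations, not the paper's actual Chunk Graph LSTM:

```python
# Toy sketch: multiple relation extraction as link prediction over
# entity chunks. The chunking here is driven by given spans; the real
# model infers chunks via statistical + neural reasoning.

def segment_chunks(tokens, entity_spans):
    """Split a token list into entity chunks ("ENT") and
    non-entity chunks ("O").
    entity_spans: list of half-open (start, end) token index pairs."""
    chunks, pos = [], 0
    for start, end in sorted(entity_spans):
        if pos < start:
            chunks.append(("O", tokens[pos:start]))   # non-entity chunk
        chunks.append(("ENT", tokens[start:end]))     # entity chunk
        pos = end
    if pos < len(tokens):
        chunks.append(("O", tokens[pos:]))
    return chunks

def predict_links(chunks, score_fn, threshold=0.5):
    """Link prediction: score every ordered pair of distinct entity
    chunks and keep the pairs whose score exceeds the threshold."""
    entities = [c[1] for c in chunks if c[0] == "ENT"]
    links = []
    for i, head in enumerate(entities):
        for j, tail in enumerate(entities):
            if i != j and score_fn(head, tail) > threshold:
                links.append((" ".join(head), " ".join(tail)))
    return links
```

A quick usage example: `segment_chunks("John works at Acme Corp".split(), [(0, 1), (3, 5)])` yields the chunks `[("ENT", ["John"]), ("O", ["works", "at"]), ("ENT", ["Acme", "Corp"])]`, and `predict_links` over those chunks with any pairwise scorer enumerates candidate relation edges between "John" and "Acme Corp" in both directions.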

Original language: English
Pages (from-to): 162-170
Number of pages: 9
Journal: Neurocomputing
Volume: 420
DOIs
Publication status: Published - 8 Jan 2021
Externally published: Yes

Keywords

  • Information extraction
  • Natural language processing
  • Neural networks
  • Relation extraction
