Extending Embedding Representation by Incorporating Latent Relations

Gao Yang*, Wang Wenbo, Liu Qian, Huang Heyan, Yuefeng Li

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

The semantic representation of words is a fundamental task in natural language processing and text mining. Learning word embeddings has proven effective across a variety of tasks. Most studies aim to generate the embedding representation of a word by encoding its context information. However, many latent relations, such as co-occurring associative patterns and semantic conceptual relations, are not well captured. In this paper, we propose an extensible model that incorporates these valuable latent relations to increase the semantic relatedness of word pairs when learning word embeddings. To assess the effectiveness of our model, we conduct experiments on both information retrieval and text classification tasks. The results demonstrate the effectiveness of our model as well as its flexibility across different tasks.
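The abstract does not spell out the training objective, but the core idea of increasing the semantic relatedness of related word pairs during embedding learning can be illustrated with a regularized skip-gram objective. The sketch below is an illustration under stated assumptions, not the authors' actual formulation: it adds a hypothetical L2 attraction term (weighted by `lam`) between word pairs linked by latent relations, on top of standard skip-gram with negative sampling. The toy vocabulary, relation pairs, and all hyperparameters are invented for the example.

```python
# A minimal sketch, NOT the paper's exact method: skip-gram with negative
# sampling, extended by a latent-relation term that pulls the embeddings
# of related word pairs together. Corpus, relation pairs, and
# hyperparameters (lr, lam, neg_k) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

vocab = ["king", "queen", "man", "woman", "crown", "royal"]
idx = {w: i for i, w in enumerate(vocab)}
dim = 16
W = rng.normal(scale=0.1, size=(len(vocab), dim))  # word (input) vectors
C = rng.normal(scale=0.1, size=(len(vocab), dim))  # context (output) vectors

# Observed (word, context) pairs, as a plain skip-gram model would see them.
contexts = [("king", "crown"), ("queen", "crown"), ("royal", "king")]
# Hypothetical latent relations, e.g. co-occurring associative patterns or
# conceptual relations mined from an external resource.
relations = [("king", "royal"), ("queen", "royal"), ("man", "woman")]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr, lam, neg_k = 0.05, 0.1, 2  # learning rate, relation weight, negatives
for epoch in range(200):
    # Standard skip-gram with negative sampling on the observed pairs.
    for w, c in contexts:
        wi, ci = idx[w], idx[c]
        g = sigmoid(W[wi] @ C[ci]) - 1.0        # gradient of -log sigma(w.c)
        gw, gc = g * C[ci], g * W[wi]
        W[wi] -= lr * gw
        C[ci] -= lr * gc
        # Uniform negative samples (a toy stand-in for unigram sampling;
        # may occasionally hit the true context in this tiny vocabulary).
        for ni in rng.integers(0, len(vocab), size=neg_k):
            g = sigmoid(W[wi] @ C[ni])          # gradient of -log sigma(-w.c)
            gw, gc = g * C[ni], g * W[wi]
            W[wi] -= lr * gw
            C[ni] -= lr * gc
    # Relation term: an L2 pull between related words, which increases the
    # semantic relatedness (cosine similarity) of those pairs.
    for a, b in relations:
        ai, bi = idx[a], idx[b]
        diff = W[ai] - W[bi]
        W[ai] -= lr * lam * diff
        W[bi] += lr * lam * diff

def cos(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print("sim(king, royal) =", round(cos(W[idx["king"]], W[idx["royal"]]), 3))
```

In a full implementation, the relation pairs would come from mined associative patterns or a semantic resource rather than a hand-written list, and the relation weight would be tuned against the corpus objective on downstream tasks such as the retrieval and classification experiments the abstract mentions.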

Original language: English
Article number: 8444048
Pages (from-to): 52682-52690
Number of pages: 9
Journal: IEEE Access
Volume: 6
DOIs:
Publication status: Published - 21 Aug 2018

Keywords

  • Word embedding
  • natural language processing
  • text mining
