Extending Embedding Representation by Incorporating Latent Relations

Gao Yang*, Wang Wenbo, Liu Qian, Huang Heyan, Yuefeng Li

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)

Abstract

The semantic representation of words is a fundamental task in natural language processing and text mining. Learning word embeddings has shown its power on various tasks. Most studies aim to generate the embedding representation of a word by encoding its context information. However, many latent relations, such as co-occurring associative patterns and semantic conceptual relations, are not well considered. In this paper, we propose an extensible model that incorporates these kinds of valuable latent relations to increase the semantic relatedness of word pairs when learning word embeddings. To assess the effectiveness of our model, we conduct experiments on both information retrieval and text classification tasks. The results indicate the effectiveness of our model as well as its flexibility on different tasks.
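To make the general idea concrete, the following is a minimal, hypothetical sketch of one way such latent relations could be folded into embedding learning: a skip-gram-style context objective augmented with a term that pulls embeddings of related word pairs closer together. This is not the authors' model; the vocabulary, word pairs, and hyperparameters (`lr`, `lam`) are invented purely for illustration, and negative sampling is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["bank", "finance", "river", "money", "water"]
idx = {w: i for i, w in enumerate(vocab)}

dim = 8
W = rng.normal(scale=0.1, size=(len(vocab), dim))  # target embeddings
C = rng.normal(scale=0.1, size=(len(vocab), dim))  # context embeddings

# Observed (word, context) pairs, as in plain skip-gram training.
context_pairs = [("bank", "money"), ("river", "water"), ("finance", "money")]
# Hypothetical latent relations (e.g. associative or conceptual links).
latent_pairs = [("bank", "finance"), ("money", "finance")]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr, lam = 0.1, 0.5  # learning rate and weight of the relational term
for _ in range(200):
    # Skip-gram-style positive updates (negative sampling omitted).
    for w, c in context_pairs:
        i, j = idx[w], idx[c]
        g = 1.0 - sigmoid(W[i] @ C[j])  # gradient of the log-sigmoid term
        W[i] += lr * g * C[j]
        C[j] += lr * g * W[i]
    # Relational term: pull embeddings of latently related words together.
    for w, r in latent_pairs:
        i, j = idx[w], idx[r]
        diff = W[i] - W[j]
        W[i] -= lr * lam * diff
        W[j] += lr * lam * diff

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print("sim(bank, finance) =", round(cos(W[idx["bank"]], W[idx["finance"]]), 3))
```

Under this toy joint objective, the relational term increases the cosine similarity of the related pair beyond what the context pairs alone would produce, which is the effect the paper targets with its latent relations.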

Original language: English
Article number: 8444048
Pages (from-to): 52682-52690
Number of pages: 9
Journal: IEEE Access
Volume: 6
DOI: 10.1109/ACCESS.2018.2866531
Publication status: Published - 21 Aug 2018


Cite this

Yang, G., Wenbo, W., Qian, L., Heyan, H., & Li, Y. (2018). Extending Embedding Representation by Incorporating Latent Relations. IEEE Access, 6, 52682-52690. Article 8444048. https://doi.org/10.1109/ACCESS.2018.2866531