Abstract
Learning semantic representations of words is a fundamental task in natural language processing and text mining, and word embeddings have proven powerful across a variety of such tasks. Most studies aim to generate the embedding of a word by encoding its context information. However, many latent relations, such as co-occurring associative patterns and semantic conceptual relations, are not well captured. In this paper, we propose an extensible model that incorporates these kinds of valuable latent relations into word embedding learning to increase the semantic relatedness of word pairs. To assess the effectiveness of our model, we conduct experiments on both information retrieval and text classification tasks. The results demonstrate the effectiveness of our model as well as its flexibility across different tasks.
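To make the abstract's core idea concrete, the following is a minimal sketch, not the authors' model, of combining context-based embedding learning with latent word relations. It assumes a skip-gram-style objective with one negative sample per pair plus a simple regularizer that pulls related word pairs together; the toy corpus, the relation pairs, and the weight `lam` are illustrative assumptions, not details from the paper.

```python
# Minimal sketch: context-based embedding learning (skip-gram with negative
# sampling) augmented with a latent-relation term that increases the
# semantic relatedness of related word pairs. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

corpus = ["the cat sat on the mat".split(),
          "a dog sat on the rug".split()]
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

dim, lam, lr = 16, 0.1, 0.05
W = rng.normal(scale=0.1, size=(len(vocab), dim))  # target vectors
C = rng.normal(scale=0.1, size=(len(vocab), dim))  # context vectors

# Hypothetical latent-relation pairs (e.g. conceptually related words).
relations = [("cat", "dog"), ("mat", "rug")]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for epoch in range(200):
    # Context term: skip-gram over a +/-2 window, one random negative sample.
    for sent in corpus:
        for i, w in enumerate(sent):
            for j in range(max(0, i - 2), min(len(sent), i + 3)):
                if i == j:
                    continue
                wi, ci = idx[w], idx[sent[j]]
                ni = int(rng.integers(len(vocab)))  # crude negative sample
                for tgt, label in ((ci, 1.0), (ni, 0.0)):
                    g = sigmoid(W[wi] @ C[tgt]) - label
                    grad_w = g * C[tgt]
                    grad_c = g * W[wi]
                    W[wi] -= lr * grad_w
                    C[tgt] -= lr * grad_c
    # Relation term: gradient step on lam/2 * ||W[a] - W[b]||^2,
    # nudging each related pair toward each other.
    for a, b in relations:
        d = W[idx[a]] - W[idx[b]]
        W[idx[a]] -= lr * lam * d
        W[idx[b]] += lr * lam * d

def cos(a, b):
    va, vb = W[idx[a]], W[idx[b]]
    return va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb))

print(f"cos(cat, dog) = {cos('cat', 'dog'):.2f}")
```

Because the relation term is a separate additive objective, other relation types (e.g. associative co-occurrence patterns) can be plugged in as further pair lists, which is one way to read the "extensible" claim in the abstract.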
| Original language | English |
| --- | --- |
| Article number | 8444048 |
| Pages (from-to) | 52682-52690 |
| Number of pages | 9 |
| Journal | IEEE Access |
| Volume | 6 |
| DOIs | |
| Publication status | Published - 21 Aug 2018 |
Keywords
- Word embedding
- natural language processing
- text mining