Abstract
Two learning models based on the Zolu function, Zolu continuous bag of words (ZL-CBOW) and Zolu skip-gram (ZL-SG), are proposed. In these models, the Zolu function replaces the slope of the ReLU function used in word2vec. The proposed models can process extremely large data sets as efficiently as word2vec, without increasing complexity, and they outperform several word embedding methods in both word similarity and syntactic accuracy. ZL-CBOW outperforms CBOW in accuracy by 8.43% on the capital-world training set and by 1.24% on the plural-verbs training set. Moreover, experiments on word similarity and syntactic accuracy show that ZL-CBOW and ZL-SG are superior to LL-CBOW and LL-SG, respectively.
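To make the activation-swap idea concrete, below is a minimal sketch of a CBOW-style forward pass with a pluggable hidden-layer nonlinearity, which is where ZL-CBOW would insert the Zolu function. The abstract does not give the Zolu function's closed form, so the `zolu` helper here is a hypothetical placeholder (a smooth ReLU-like gate is used purely as a stand-in), not the paper's definition.

```python
import numpy as np

def zolu(x):
    # HYPOTHETICAL placeholder: the paper defines the Zolu function,
    # but its exact form is not given in this abstract. A smooth
    # ReLU-like gate, x * sigmoid(x), stands in here only to show
    # where the activation would plug in.
    return x * (1.0 / (1.0 + np.exp(-x)))

def cbow_forward(context_ids, W_in, W_out, activation=zolu):
    """Score the vocabulary for a CBOW-style model.

    context_ids : indices of the context words
    W_in        : (vocab, dim) input embedding matrix
    W_out       : (vocab, dim) output embedding matrix
    activation  : hidden-layer nonlinearity
                  (identity in plain CBOW; Zolu in ZL-CBOW)
    """
    h = W_in[context_ids].mean(axis=0)  # average the context embeddings
    h = activation(h)                   # ZL-CBOW applies Zolu here
    scores = W_out @ h                  # one score per vocabulary word
    e = np.exp(scores - scores.max())   # numerically stable softmax
    return e / e.sum()

# Toy usage: 10-word vocabulary, 4-dimensional embeddings
rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(10, 4))
W_out = rng.normal(scale=0.1, size=(10, 4))
probs = cbow_forward([1, 3, 5, 7], W_in, W_out)
print(probs.shape, probs.sum())  # (10,) 1.0
```

Because the activation only touches the hidden layer, the per-step cost is unchanged relative to plain CBOW, which is consistent with the abstract's claim that complexity does not increase.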
| Original language | English |
|---|---|
| Pages (from-to) | 526-530 |
| Number of pages | 5 |
| Journal | Journal of Beijing Institute of Technology (English Edition) |
| Volume | 29 |
| Issue number | 4 |
| DOIs | |
| Publication status | Published - Dec 2020 |
Keywords
- Accuracy
- Continuous bags of words
- Word embedding
- Word similarity
- Zolu function