Novel Representations of Word Embedding Based on the Zolu Function

Jihua Lu*, Youcheng Zhang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Two learning models based on the Zolu function, Zolu-continuous bags of words (ZL-CBOW) and Zolu-skip-grams (ZL-SG), are proposed. The Zolu function changes the slope of the ReLU activation used in word2vec. The proposed models can process extremely large data sets as efficiently as word2vec does, without increasing complexity, and they outperform several word embedding methods in both word similarity and syntactic accuracy. ZL-CBOW outperforms CBOW in accuracy by 8.43% on the capital-world training set and by 1.24% on the plural-verbs training set. Moreover, experimental simulations on word similarity and syntactic accuracy show that ZL-CBOW and ZL-SG are superior to LL-CBOW and LL-SG, respectively.
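
The abstract does not give the Zolu function's closed form, so the sketch below only illustrates where such an activation would slot into a CBOW-style model: standard CBOW averages the context word vectors with no nonlinearity on the hidden layer, and that hidden layer is the natural point of variation. The `zolu` definition used here (a smooth, SiLU-like variant of ReLU) is a placeholder assumption, not the paper's actual formula.

```python
import numpy as np

# Placeholder assumption: the abstract does not define the Zolu function,
# so this smooth ReLU-like curve (x * sigmoid(x), SiLU-style) stands in
# for wherever the paper's actual definition would go.
def zolu(x):
    return x / (1.0 + np.exp(-x))

def cbow_hidden(context_ids, W_in, activation=zolu):
    """Average the context word vectors (standard CBOW step),
    then pass the hidden layer through the chosen activation."""
    h = W_in[context_ids].mean(axis=0)  # shape: (embedding_dim,)
    return activation(h)

# Toy usage: vocabulary of 10 words, 4-dimensional embeddings.
rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(10, 4))  # input embedding matrix
context = [1, 3, 5, 7]                      # ids of surrounding words
print(cbow_hidden(context, W_in))
```

Swapping `activation` for a plain ReLU (or the identity, as in vanilla word2vec) reproduces the baseline the paper compares against; the Zolu-based variants differ only in this choice.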

Original language: English
Pages (from-to): 526-530
Number of pages: 5
Journal: Journal of Beijing Institute of Technology (English Edition)
Volume: 29
Issue number: 4
DOIs
Publication status: Published - Dec 2020

Keywords

  • Accuracy
  • Continuous bags of words
  • Word embedding
  • Word similarity
  • Zolu function
