Word2State: Modeling Word Representations as States with Density Matrices

Chenchen Zhang, Qiuchi Li, Zhan Su, Dawei Song*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Polysemy is a common phenomenon in linguistics. Quantum-inspired complex word embeddings based on a Semantic Hilbert Space play an important role in natural language processing, as they can define a genuine probability distribution over the word space. However, existing quantum-inspired works compose complex-valued word embeddings from real-valued vectors and lack directly pre-trained complex-valued word representations. Motivated by quantum-inspired complex word embeddings, we propose Word2State, a complex-valued pre-trained word embedding based on density matrices. Unlike existing static word embeddings, our proposed model provides non-linear semantic composition in the form of amplitude and phase, and it defines an authentic probability distribution. We evaluate the model on twelve datasets from the word similarity task and six datasets from relevant downstream tasks. The experimental results across these tasks demonstrate that our pre-trained word embedding captures richer semantic information and exhibits greater flexibility in expressing uncertainty.
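The abstract's claim that a density matrix defines a genuine probability distribution can be illustrated with a minimal sketch (this is an illustration of the general density-matrix formalism, not the paper's actual Word2State implementation; the dimensionality and random state are assumptions):

```python
import numpy as np

# Hedged sketch: a word state as a complex vector with amplitude and
# phase parts, and the pure-state density matrix it induces.
rng = np.random.default_rng(0)
dim = 4  # toy dimensionality for illustration only

amplitudes = rng.random(dim)
amplitudes /= np.linalg.norm(amplitudes)      # unit-norm amplitude part
phases = rng.uniform(0, 2 * np.pi, dim)
state = amplitudes * np.exp(1j * phases)      # complex word state |w>

rho = np.outer(state, state.conj())           # density matrix rho = |w><w|

# rho defines a genuine probability distribution over the basis:
# its diagonal entries are non-negative reals that sum to one,
# and rho is Hermitian positive semidefinite with unit trace.
probs = np.real(np.diag(rho))
print(np.isclose(probs.sum(), 1.0))           # trace-one check
print(np.allclose(rho, rho.conj().T))         # Hermitian check
```

Because the phases cancel on the diagonal, the induced distribution depends only on the amplitudes, while the phases still influence the off-diagonal (interference) terms used in semantic composition.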

Original language: English
Pages (from-to): 649-660
Number of pages: 12
Journal: Chinese Journal of Electronics
Volume: 34
Issue number: 2
DOIs
Publication status: Published - 2025
Externally published: Yes

Keywords

  • Density matrix
  • Natural language processing
  • Quantum language model
  • Word embedding
