Complex-valued Neural Network-based Quantum Language Models

Peng Zhang, Wenjie Hui, Benyou Wang*, Donghao Zhao, Dawei Song, Christina Lioma, Jakob Grue Simonsen

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

13 Citations (Scopus)

Abstract

Language modeling is essential in Natural Language Processing and Information Retrieval tasks. Following statistical language models, the Quantum Language Model (QLM) was proposed to unify single words and compound terms in the same probability space without extending the term space exponentially. Although QLM achieved good performance in ad hoc retrieval, it has two major limitations: (1) QLM cannot make use of supervised information, mainly because the estimation of the density matrix, which represents both queries and documents in QLM, is iterative and non-differentiable. (2) QLM assumes the exchangeability of words or word dependencies, neglecting the order or position information of words.

This article aims to generalize QLM and make it applicable to more complicated matching tasks (e.g., Question Answering) beyond ad hoc retrieval. We propose a complex-valued neural network-based QLM, called C-NNQLM, which employs an end-to-end approach to build and train density matrices in a light-weight and differentiable manner, and can therefore make use of external well-trained word vectors and supervised labels. Furthermore, C-NNQLM adopts complex-valued word vectors whose phase vectors can directly encode the order (or position) information of words. Note that complex numbers are also essential in quantum theory. We show that the real-valued NNQLM (R-NNQLM) is a special case of C-NNQLM.

The experimental results on the QA task show that both R-NNQLM and C-NNQLM achieve much better performance than the vanilla QLM, and C-NNQLM's performance is on par with state-of-the-art neural network models. We also evaluate the proposed C-NNQLM on text classification and document retrieval tasks. The results on most datasets show that C-NNQLM outperforms R-NNQLM, which demonstrates the usefulness of the complex representation of words and sentences in C-NNQLM.
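The core construction the abstract describes, complex-valued word states whose phases encode position, mixed into a density matrix, can be sketched as follows. This is a minimal NumPy illustration of the general idea, not the paper's actual architecture; the per-dimension frequencies, uniform mixture weights, and function names here are illustrative assumptions.

```python
import numpy as np

def complex_word_state(amplitude, freq, position):
    """Build a unit-norm complex word state |psi>.

    The phase of each dimension encodes the word's position
    (phase = position * per-dimension frequency), so word order
    is carried by the complex argument rather than the magnitude.
    """
    z = amplitude * np.exp(1j * position * freq)
    return z / np.linalg.norm(z)

def density_matrix(states, weights=None):
    """Mix pure word states into a density matrix rho = sum_j w_j |psi_j><psi_j|."""
    if weights is None:
        weights = np.full(len(states), 1.0 / len(states))  # uniform mixture
    dim = states[0].size
    rho = np.zeros((dim, dim), dtype=complex)
    for w, psi in zip(weights, states):
        rho += w * np.outer(psi, psi.conj())  # rank-1 projector per word
    return rho

# Toy sentence of 5 "words" in an 8-dimensional space.
rng = np.random.default_rng(0)
dim = 8
freq = rng.uniform(0.0, np.pi, dim)  # assumed fixed frequencies
states = [complex_word_state(rng.uniform(0.1, 1.0, dim), freq, pos)
          for pos in range(5)]
rho = density_matrix(states)

# A valid density matrix is Hermitian with unit trace.
print(np.isclose(np.trace(rho).real, 1.0))  # True
print(np.allclose(rho, rho.conj().T))       # True
```

Because every operation above (outer products, weighted sums) is differentiable, the same construction can be trained end-to-end when the amplitudes and frequencies are learnable parameters, which is the property that distinguishes C-NNQLM from the iterative density-matrix estimation in the original QLM.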

Original language: English
Article number: 84
Journal: ACM Transactions on Information Systems
Volume: 40
Issue: 4
DOI: 10.1145/3505138
Publication status: Published - Oct 2022


Cite this

Zhang, P., Hui, W., Wang, B., Zhao, D., Song, D., Lioma, C., & Simonsen, J. G. (2022). Complex-valued Neural Network-based Quantum Language Models. ACM Transactions on Information Systems, 40(4), Article 84. https://doi.org/10.1145/3505138