TY - JOUR
T1 - Neural Variational Gaussian Mixture Topic Model
AU - Tang, Yi Kun
AU - Huang, Heyan
AU - Shi, Xuewen
AU - Mao, Xian Ling
N1 - Publisher Copyright:
© 2023 Association for Computing Machinery.
PY - 2023/3/25
Y1 - 2023/3/25
AB - Neural variational inference-based topic modeling has achieved great success in mining abstract topics from documents. However, these topic models typically focus on optimizing the topic proportions of documents, while the quality and internal structure of the topics themselves are neglected. Specifically, such models offer no guarantee that semantically related words are assigned to the same topic, which makes the interpretability of topics difficult to ensure. Moreover, many topical words recur among the top words of different topics, which makes the learned topics semantically redundant and similar to one another, and of little value for further study. To address these problems, we propose a novel neural topic model called the Neural Variational Gaussian Mixture Topic Model (NVGMTM), which uses Gaussian distributions to capture the semantic relevance between words within a topic. Each topic in NVGMTM is modeled as a multivariate Gaussian distribution over words in the word-embedding space, so semantically related words receive similar probabilities under each topic, making the topics more coherent and interpretable. Experimental results on two public corpora show that the proposed model outperforms state-of-the-art baselines.
KW - Neural variational Gaussian mixture topic model
KW - topic discrimination
KW - topic quality
UR - http://www.scopus.com/inward/record.url?scp=85161501200&partnerID=8YFLogxK
U2 - 10.1145/3578583
DO - 10.1145/3578583
M3 - Article
AN - SCOPUS:85161501200
SN - 2375-4699
VL - 22
JO - ACM Transactions on Asian and Low-Resource Language Information Processing
JF - ACM Transactions on Asian and Low-Resource Language Information Processing
IS - 4
M1 - 110
ER -
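
Note: the abstract describes each topic as a multivariate Gaussian over words in a word-embedding space, so that semantically related words receive similar probabilities. The following is a minimal illustrative sketch of that idea only, not the authors' NVGMTM implementation; the random embeddings, topic means, covariances, and all variable names are assumptions made for illustration.

import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical setup: V words with d-dimensional embeddings, K topics.
rng = np.random.default_rng(0)
V, d, K = 1000, 50, 10
embeddings = rng.normal(size=(V, d))    # word-embedding matrix (assumed given)
topic_means = rng.normal(size=(K, d))   # per-topic Gaussian means (learned in NVGMTM)
topic_covs = np.stack([np.eye(d)] * K)  # per-topic covariances (identity here for simplicity)

# Each topic's word distribution: evaluate the topic's Gaussian density at every
# word embedding and normalize over the vocabulary, so words whose embeddings lie
# close together (semantically related words) get similar probabilities.
topic_word = np.zeros((K, V))
for k in range(K):
    dens = multivariate_normal(mean=topic_means[k], cov=topic_covs[k]).pdf(embeddings)
    topic_word[k] = dens / dens.sum()

# Indices of each topic's top-10 words, as one would inspect for topic coherence.
top_words = topic_word.argsort(axis=1)[:, ::-1][:, :10]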