CSE: Conceptual sentence embeddings based on attention model

Yashen Wang, Heyan Huang, Chong Feng, Qiang Zhou, Jiahui Gu, Xiong Gao

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

50 Citations (Scopus)

Abstract

Most sentence embedding models represent each sentence using only the surface forms of its words, which leaves these models unable to discriminate the ubiquitous cases of homonymy and polysemy. To enhance the representation capability of sentences, we employ a conceptualization model to assign associated concepts to each sentence in the text corpus, and then learn conceptual sentence embeddings (CSE). This semantic representation is therefore more expressive than widely used text representation models such as the latent topic model, especially for short texts. Moreover, we further extend the CSE models with a local attention-based model that selects the relevant words within the context to make more efficient predictions. In the experiments, we evaluate the CSE models on two tasks, text classification and information retrieval. The experimental results show that the proposed models outperform typical sentence embedding models.
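The abstract gives no implementation details, but the core idea it describes (an attention-weighted combination of context words plus a concept vector assigned to the sentence, used to predict a target word) can be illustrated with a minimal sketch. The Python/NumPy code below is an assumption, not the authors' architecture: the function name predict_scores, the per-position attention parameters, and all dimensions are hypothetical.

```python
import numpy as np

# Minimal sketch (not the paper's implementation) of attention-based
# prediction with a concept vector: score every vocabulary word as the
# target, given (a) context words weighted by a local attention model and
# (b) the concept assigned to the sentence. All sizes are illustrative.

rng = np.random.default_rng(0)
vocab_size, n_concepts, dim, max_window = 1000, 50, 100, 10

word_emb = rng.normal(scale=0.1, size=(vocab_size, dim))       # input word vectors
concept_emb = rng.normal(scale=0.1, size=(n_concepts, dim))    # concept vectors
attn_pos = rng.normal(scale=0.1, size=(max_window, dim))       # per-position attention parameters
out_emb = rng.normal(scale=0.1, size=(vocab_size, dim))        # output (prediction) vectors

def predict_scores(context_ids, concept_id):
    """Return a score for each vocabulary word as the predicted target."""
    ctx = word_emb[context_ids]                                  # (k, dim)
    # local attention: one logit per context position, softmax-normalized
    logits = np.sum(ctx * attn_pos[:len(context_ids)], axis=1)  # (k,)
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()
    # attention-weighted context representation plus the sentence's concept vector
    h = weights @ ctx + concept_emb[concept_id]                  # (dim,)
    return out_emb @ h                                           # (vocab_size,)

scores = predict_scores(context_ids=[3, 17, 42, 7], concept_id=5)
print(scores.shape)  # (1000,)
```

In a real model these scores would feed a softmax (or hierarchical softmax / negative sampling) objective, and the embeddings and attention parameters would be trained jointly over the concept-annotated corpus.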

Original language: English
Title of host publication: 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016 - Long Papers
Publisher: Association for Computational Linguistics (ACL)
Pages: 505-515
Number of pages: 11
ISBN (Electronic): 9781510827585
DOI
Publication status: Published - 2016
Event: 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016 - Berlin, Germany
Duration: 7 Aug 2016 → 12 Aug 2016

Publication series

Name: 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016 - Long Papers
Volume: 1

Conference

Conference: 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016
Country/Territory: Germany
City: Berlin
Period: 7/08/16 → 12/08/16
