CSE: Conceptual sentence embeddings based on attention model

Yashen Wang, Heyan Huang, Chong Feng, Qiang Zhou, Jiahui Gu, Xiong Gao

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

51 Citations (Scopus)
  • Citation Indexes: 51
  • Readers: 146

Abstract

Most sentence embedding models represent each sentence only by its surface word forms, which leaves them unable to discriminate ubiquitous homonymy and polysemy. In order to enhance the representational capability of sentences, we employ a conceptualization model to assign associated concepts to each sentence in the text corpus, and then learn conceptual sentence embeddings (CSE). This semantic representation is therefore more expressive than widely-used text representation models such as the latent topic model, especially for short texts. Moreover, we further extend the CSE models with a local attention-based model that selects relevant words within the context to make more efficient predictions. In the experiments, we evaluate the CSE models on two tasks: text classification and information retrieval. The experimental results show that the proposed models outperform typical sentence embedding models.
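The abstract describes two ingredients: concepts assigned to each sentence by a conceptualization step, and a local attention over context words used when predicting a target word. The following is a minimal sketch of that idea, assuming a PV-DM-style prediction objective; the function names, dimensions, and the way sentence, concept, and attention-weighted context vectors are combined are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Illustrative sketch (not the authors' code): a sentence vector, an averaged
# concept vector, and an attention-weighted context vector are combined to
# score candidate target words. All sizes below are toy values.
rng = np.random.default_rng(0)
dim, vocab_size, n_sents, n_concepts = 50, 1000, 10, 20

W = rng.normal(scale=0.1, size=(vocab_size, dim))   # input word vectors
S = rng.normal(scale=0.1, size=(n_sents, dim))      # sentence vectors
C = rng.normal(scale=0.1, size=(n_concepts, dim))   # concept vectors
U = rng.normal(scale=0.1, size=(vocab_size, dim))   # output (softmax) weights
a = rng.normal(scale=0.1, size=dim)                 # attention scoring vector

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def predict_target(sent_id, concept_ids, context_word_ids):
    """Score every vocabulary word as the target, given the sentence,
    its associated concepts, and the local context words."""
    ctx = W[context_word_ids]                  # (k, dim) context word vectors
    attn = softmax(ctx @ a)                    # local attention over context words
    context_vec = attn @ ctx                   # attention-weighted context summary
    concept_vec = C[concept_ids].mean(axis=0)  # averaged concept representation
    h = S[sent_id] + concept_vec + context_vec # combined hidden representation
    return softmax(U @ h)                      # distribution over target words

# Toy usage: sentence 3, concepts {2, 7}, context words {11, 42, 99}.
probs = predict_target(3, [2, 7], [11, 42, 99])
print(probs.shape, probs.sum())                # (1000,) 1.0
```

In an actual training loop, the embedding tables would be updated by gradient descent on the prediction loss, and the learned sentence (and concept) vectors would then serve as features for classification or retrieval.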

Original language: English
Title of host publication: 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016 - Long Papers
Publisher: Association for Computational Linguistics (ACL)
Pages: 505-515
Number of pages: 11
ISBN (Electronic): 9781510827585
DOIs: https://doi.org/10.18653/v1/p16-1048
Publication status: Published - 2016
Event: 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016 - Berlin, Germany
Duration: 7 Aug 2016 - 12 Aug 2016

Publication series

Name: 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016 - Long Papers
Volume: 1

Conference

Conference: 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016
Country/Territory: Germany
City: Berlin
Period: 7/08/16 - 12/08/16

Cite this

Wang, Y., Huang, H., Feng, C., Zhou, Q., Gu, J., & Gao, X. (2016). CSE: Conceptual sentence embeddings based on attention model. In 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016 - Long Papers (pp. 505-515). (54th Annual Meeting of the Association for Computational Linguistics, ACL 2016 - Long Papers; Vol. 1). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/p16-1048