A Local Self-Attention Sentence Model for Answer Selection Task in CQA Systems

Donglei Liu, Hao Lu, Yong Yuan, Rui Qin, Yifan Zhu, Chunxia Zhang, Zhendong Niu*

*Corresponding author of this work

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)

Abstract

Current evidence indicates that, for community answer selection tasks, deep neural network-based sentence models generate better semantic representations of question and answer sentences than traditional methods. In particular, the self-attention model, a widely recognized language model, computes the similarity between a specific word and the whole set of words in the same sentence and generates a new semantic representation as the similarity-weighted sum of the representations of all words. However, because the self-attention operation aggregates all signals with a weighted sum, it disperses the attention distribution and may overlook the relations among neighboring signals. This issue becomes serious when applying the self-attention model to online community question answering platforms because of the varied length of user-generated questions and answers. To address this problem, we introduce an attention-mechanism-enhanced local self-attention (LSA), which restricts the range of the original self-attention with a local window mechanism, thereby scaling linearly with sequence length. Furthermore, we propose stacking multiple LSA layers to model relationships among multiscale n-gram features: the first layer captures word-to-word relationships, while deeper layers capture chunk-to-chunk (e.g., lexical n-gram phrase) relationships. We also test the effectiveness of the proposed model by applying the representations learned by the LSA model to a Siamese network and a classification network in community question answer selection tasks. Experiments on public datasets show that the proposed LSA achieves good performance.
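The core idea described in the abstract, restricting each position's attention to a fixed-radius local window so that cost grows linearly with sequence length, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function name `local_self_attention`, the window radius `w`, and the omission of learned query/key/value projections are simplifying assumptions for illustration.

```python
import numpy as np

def local_self_attention(X, w):
    """Single-head self-attention restricted to a local window.

    Each position i attends only to positions j with |i - j| <= w,
    so for a fixed window radius w the cost grows linearly with
    sequence length. X is a (seq_len, d) matrix of word embeddings.
    """
    seq_len, d = X.shape
    # For illustration, the embeddings serve directly as queries, keys,
    # and values (a real model would apply learned projections).
    scores = X @ X.T / np.sqrt(d)  # (seq_len, seq_len) similarity scores
    # Band mask: positions outside the local window are excluded
    # from the softmax by setting their scores to -inf.
    idx = np.arange(seq_len)
    mask = np.abs(idx[:, None] - idx[None, :]) > w
    scores[mask] = -np.inf
    # Numerically stable softmax over each row.
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ X  # similarity-weighted sum of local representations

# Example: 6 tokens, 4-dim embeddings, window radius 1 (neighbors only).
rng = np.random.default_rng(0)
X = rng.standard_normal((6, 4))
out = local_self_attention(X, w=1)
print(out.shape)  # (6, 4)
```

Stacking such layers, as the abstract proposes, widens the effective receptive field with depth: the first layer relates adjacent words, and each further layer relates progressively larger chunks.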

Original language: English
Pages (from-to): 3283-3294
Number of pages: 12
Journal: IEEE Transactions on Computational Social Systems
Volume: 10
Issue number: 6
DOI
Publication status: Published - 1 Dec 2023
