融合序列语法知识的卷积-自注意力生成式摘要方法

Translated title of the contribution: A Convolution-Self Attention Abstractive Summarization Method Fusing Sequential Grammar Knowledge

Senlin Luo, Ruiyi Wang, Qian Wu*, Limin Pan, Zhouting Wu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Abstractive summarization analyzes the core ideas of a document and rephrases them, or uses new words, to generate a summary that covers the whole document. However, encoder-decoder models cannot fully exploit syntactic information, so the generated summaries may violate grammar rules. Recurrent neural networks also tend to forget historical information and cannot be parallelized during training, which makes the main idea of the summary less prominent and the encoding slow. To address these problems, an abstractive summarization method that fuses sequential syntax into a convolution-self-attention model was proposed. First, a phrase structure tree is constructed for the document and its sequential syntax is embedded into the encoder, so that syntactic information can be better exploited during encoding. Then, the convolution-self-attention model replaces the recurrent neural network as the encoder, learning both global and local information from the document. Experimental results on the CNN/Daily Mail dataset show that the proposed method outperforms state-of-the-art methods; the generated summaries are more grammatical, their main ideas are more prominent, and the model encodes faster.
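The following is a minimal sketch of the encoder idea described in the abstract: a syntax-tag embedding (e.g. tags linearized from a constituency/phrase structure parse) is added to the token embedding, then a 1-D convolution models local context and self-attention models global context. It assumes PyTorch; the class name, layer sizes, and the way syntax tags are serialized are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class SyntaxAwareConvSelfAttnEncoder(nn.Module):
    """Illustrative encoder block: fuse sequential syntax tags with word
    embeddings, then apply convolution (local) + self-attention (global)."""

    def __init__(self, vocab_size, tag_size, d_model=256, kernel_size=3, n_heads=4):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.tag_emb = nn.Embedding(tag_size, d_model)   # sequential syntax tags (assumption)
        self.conv = nn.Conv1d(d_model, d_model, kernel_size, padding=kernel_size // 2)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, token_ids, tag_ids):
        # Fuse word and syntax information at the input layer.
        x = self.tok_emb(token_ids) + self.tag_emb(tag_ids)      # (B, T, d_model)
        # Convolution over the time axis captures local patterns.
        c = self.conv(x.transpose(1, 2)).transpose(1, 2)
        x = self.norm1(x + c)
        # Self-attention captures long-range dependencies in parallel,
        # avoiding the sequential bottleneck of an RNN encoder.
        a, _ = self.attn(x, x, x)
        return self.norm2(x + a)
```

Because both the convolution and the attention operate on the whole sequence at once, such an encoder can be trained in parallel over time steps, which is the source of the speed advantage over a recurrent encoder claimed in the abstract.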

Translated title of the contribution: A Convolution-Self Attention Abstractive Summarization Method Fusing Sequential Grammar Knowledge
Original language: Chinese (Traditional)
Pages (from-to): 93-101
Number of pages: 9
Journal: Beijing Ligong Daxue Xuebao/Transaction of Beijing Institute of Technology
Volume: 41
Issue number: 1
DOIs
Publication status: Published - Jan 2021
