Sharing Pre-trained BERT Decoder for a Hybrid Summarization

Ran Wei, Heyan Huang*, Yang Gao

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

7 Citations (Scopus)

Abstract

Sentence selection and summary generation are the two main steps in producing informative and readable summaries. However, most previous works treat them as two separate subtasks. In this paper, we propose a novel extractive-and-abstractive hybrid framework for the single-document summarization task that jointly learns to select sentences and rewrite summaries. It first selects sentences with an extractive decoder and then generates a summary from each selected sentence with an abstractive decoder. Moreover, we apply the pre-trained BERT model as the document encoder, sharing its context representations with both decoders. Experiments on the CNN/DailyMail dataset show that the proposed framework outperforms both state-of-the-art extractive and abstractive models.
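To make the shared-encoder design concrete, below is a minimal PyTorch sketch, not the authors' implementation: class and parameter names such as HybridSummarizer and cls_positions are illustrative assumptions. It encodes the document once with BERT, scores sentence-boundary representations for extractive selection, and lets a Transformer decoder attend over the same encoder memory for abstractive generation. (The paper's abstractive decoder rewrites each selected sentence individually; this sketch simplifies that by attending over the full document memory.)

```python
# Illustrative sketch only (assumed names and hyperparameters), showing one
# BERT encoding pass shared by an extractive and an abstractive decoder.
import torch
import torch.nn as nn
from transformers import BertModel


class HybridSummarizer(nn.Module):
    def __init__(self, vocab_size: int, hidden: int = 768):
        super().__init__()
        # Shared document encoder: contextual token representations from BERT.
        self.encoder = BertModel.from_pretrained("bert-base-uncased")
        # Extractive decoder: scores each sentence for selection.
        self.sent_scorer = nn.Linear(hidden, 1)
        # Abstractive decoder: generates summary tokens over the shared memory.
        layer = nn.TransformerDecoderLayer(d_model=hidden, nhead=8, batch_first=True)
        self.abs_decoder = nn.TransformerDecoder(layer, num_layers=4)
        self.tok_embed = nn.Embedding(vocab_size, hidden)
        self.lm_head = nn.Linear(hidden, vocab_size)

    def forward(self, input_ids, attention_mask, cls_positions, dec_input_ids):
        # One encoding pass; both decoders consume the same representations.
        memory = self.encoder(input_ids, attention_mask=attention_mask).last_hidden_state
        # Extractive step: gather the [CLS]-style tokens at sentence boundaries
        # (cls_positions: LongTensor [batch, num_sents]) as sentence vectors.
        idx = cls_positions.unsqueeze(-1).expand(-1, -1, memory.size(-1))
        sent_vecs = memory.gather(1, idx)
        select_logits = self.sent_scorer(sent_vecs).squeeze(-1)
        # Abstractive step: causal decoding attending over the shared memory.
        tgt_mask = nn.Transformer.generate_square_subsequent_mask(dec_input_ids.size(1))
        dec_out = self.abs_decoder(self.tok_embed(dec_input_ids), memory, tgt_mask=tgt_mask)
        gen_logits = self.lm_head(dec_out)
        # Selection scores train against extractive labels; generation logits
        # train with cross-entropy against reference summary tokens.
        return select_logits, gen_logits
```

Because the encoder runs once per document, both objectives backpropagate into the same BERT parameters, which is the sharing the title refers to.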

Original language: English
Title of host publication: Chinese Computational Linguistics - 18th China National Conference, CCL 2019, Proceedings
Editors: Maosong Sun, Yang Liu, Zhiyuan Liu, Xuanjing Huang, Heng Ji
Publisher: Springer
Pages: 169-180
Number of pages: 12
ISBN (Print): 9783030323806
DOIs
Publication status: Published - 2019
Event: 18th China National Conference on Computational Linguistics, CCL 2019 - Kunming, China
Duration: 18 Oct 2019 - 20 Oct 2019

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11856 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 18th China National Conference on Computational Linguistics, CCL 2019
Country/Territory: China
City: Kunming
Period: 18/10/19 - 20/10/19

Keywords

  • Extractive and abstractive
  • Pre-trained based
  • Text summarization
