Conceptual multi-layer neural network model for headline generation

Yidi Guo*, Heyan Huang, Yang Gao, Chi Lu

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

4 Citations (Scopus)

Abstract

Neural attention-based models have recently been widely used for headline generation, mapping a source document to a target headline. However, traditional neural headline generation models use only the first sentence of the document as training input and ignore the impact of the document's concept information on headline generation. In this work, a new neural attention-based model, the concept-sensitive neural headline model, is proposed; it connects the concept information of the document to the input text for headline generation and achieves satisfactory results. In addition, we use a multi-layer Bi-LSTM in the encoder instead of a single layer. Experiments show that our model outperforms state-of-the-art systems on the DUC-2004 and Gigaword test sets.
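The abstract describes the architecture only at a high level. The PyTorch snippet below is a minimal sketch of that idea, not the authors' implementation: it assumes the concept information is a single document-level concept id whose embedding is concatenated to every token embedding before a multi-layer Bi-LSTM encoder, whose outputs would then be attended to by a decoder. All names, dimensions, and the concatenation scheme are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's code): a multi-layer Bi-LSTM
# encoder over token embeddings augmented with a document-level concept embedding.
import torch
import torch.nn as nn


class ConceptSensitiveEncoder(nn.Module):
    def __init__(self, vocab_size, concept_vocab_size, emb_dim=256,
                 concept_dim=64, hidden_dim=256, num_layers=2):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, emb_dim)
        self.concept_emb = nn.Embedding(concept_vocab_size, concept_dim)
        # Multi-layer bidirectional LSTM over concept-augmented embeddings.
        self.bilstm = nn.LSTM(emb_dim + concept_dim, hidden_dim,
                              num_layers=num_layers, bidirectional=True,
                              batch_first=True)

    def forward(self, tokens, concept_ids):
        # tokens: (batch, src_len); concept_ids: (batch,) one concept per document.
        x = self.token_emb(tokens)                          # (B, T, emb_dim)
        c = self.concept_emb(concept_ids).unsqueeze(1)      # (B, 1, concept_dim)
        c = c.expand(-1, x.size(1), -1)                     # repeat over time steps
        outputs, _ = self.bilstm(torch.cat([x, c], dim=-1)) # (B, T, 2*hidden_dim)
        return outputs                                      # attended to by the decoder
```

Usage would mirror a standard attention-based summarizer: the decoder computes attention weights over `outputs` at each generation step; only the encoder input differs by carrying the concept signal.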

Original language: English
Title of host publication: Chinese Computational Linguistics and Natural Language Processing Based on Naturally Annotated Big Data - 16th China National Conference, CCL 2017 and 5th International Symposium, NLP-NABD 2017, Proceedings
Editors: Maosong Sun, Baobao Chang, Xiaojie Wang, Deyi Xiong
Publisher: Springer Verlag
Pages: 355-367
Number of pages: 13
ISBN (Print): 9783319690049
DOIs
Publication status: Published - 2017
Event: 16th China National Conference on Computational Linguistics, CCL 2017 and 5th International Symposium on Natural Language Processing Based on Naturally Annotated Big Data, NLP-NABD 2017 - Nanjing, China
Duration: 13 Oct 2017 - 15 Oct 2017

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 10565 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 16th China National Conference on Computational Linguistics, CCL 2017 and 5th International Symposium on Natural Language Processing Based on Naturally Annotated Big Data, NLP-NABD 2017
Country/Territory: China
City: Nanjing
Period: 13/10/17 - 15/10/17

Keywords

  • Attention-based
  • Concept
  • Multi-layer Bi-LSTM
