CPLR-SFS: Contrastive Prompt Learning to Reduce Redundancy for Scientific Faceted Summarization

Jing Wen Xiong, Xian Ling Mao*, Yizhe Yang, Heyan Huang

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

Abstract

Scientific faceted summarization is the task of generating four summaries for a scientific article, one for each of four facets: Purpose, Method, Findings, and Value. Existing works usually generate the summary for each facet independently, using pre-training or prompt-training paradigms. However, these works tend to produce duplicated content across the facets of the same scientific article, because they do not consider the relations among the four facets. To address this redundancy problem, we propose a novel Contrastive Prompt Learning method to Reduce redundancy for Scientific Faceted Summarization, named CPLR-SFS, which generates concise, less-overlapping faceted summaries. Specifically, CPLR-SFS receives a facet-specific prompt to guide generation and employs a faceted contrastive loss to better separate the different faceted summaries in semantic space. Extensive experiments on the FacetSum dataset demonstrate that the proposed model generates better faceted summaries than state-of-the-art baselines, with less redundancy.
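The abstract does not specify the exact form of the faceted contrastive loss, so the following is only a minimal sketch of one plausible instantiation: an InfoNCE-style objective in which each generated facet summary's embedding is pulled toward the reference summary of the same facet and pushed away from the references of the other three facets. The function name, the use of cosine similarity, and the temperature value are all assumptions, not details from the paper.

```python
import numpy as np

FACETS = ["Purpose", "Method", "Findings", "Value"]

def cosine_sim(a, b):
    # Row-wise cosine similarity matrix between two sets of embeddings.
    a = a / np.linalg.norm(a, axis=-1, keepdims=True)
    b = b / np.linalg.norm(b, axis=-1, keepdims=True)
    return a @ b.T

def faceted_contrastive_loss(gen, ref, temperature=0.1):
    """Illustrative InfoNCE-style faceted contrastive loss (an assumption,
    not the paper's exact formulation).

    gen: (4, d) embeddings of the generated Purpose/Method/Findings/Value
         summaries of one article.
    ref: (4, d) embeddings of the corresponding reference summaries.

    Each generated facet embedding treats its own reference as the positive
    and the other facets' references as negatives, which discourages
    duplicated content across facets.
    """
    logits = cosine_sim(gen, ref) / temperature          # (4, 4)
    logits -= logits.max(axis=1, keepdims=True)          # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Average negative log-probability of the matching (diagonal) pairs.
    return -np.mean(np.diag(log_probs))
```

In training, a loss of this shape would typically be added to the standard generation (cross-entropy) loss with a weighting coefficient; well-separated facet embeddings drive the loss toward zero, while overlapping facets are penalized.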

Original language: English
Article number: 012006
Journal: Journal of Physics: Conference Series
Volume: 2506
Issue number: 1
DOIs
Publication status: Published - 2023
Event: 2022 International Joint Conference on Robotics and Artificial Intelligence, JCRAI 2022 - Virtual, Online
Duration: 14 Oct 2022 - 17 Oct 2022
