TY - GEN
T1 - Unsupervised Style Transfer in News Headlines via Discrete Style Space
AU - Liu, Qianhui
AU - Gao, Yang
AU - Yang, Yizhe
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd 2023.
PY - 2023
Y1 - 2023
N2 - The goal of headline style transfer in this paper is to make a headline more attractive while maintaining its meaning. The absence of parallel training data is one of the main problems in this field. In this work, we design a discrete style space for unsupervised headline style transfer, abbreviated as D-HST. The model decomposes style-dependent text generation into content-feature extraction and style modelling; the generation decoder then receives input from the content, the style, and their mixing components. In particular, we consider the textual style signal to be more abstract than the text itself. We therefore propose to model the style representation space as a discrete space, in which each discrete point corresponds to a particular style category elicited by syntactic structure. Finally, we provide a new style-transfer dataset, named TechST, which focuses on transferring news headlines into ones that are more eye-catching on technical social media. In the experiments, we develop two automatic evaluation metrics, style transfer rate (STR) and style-content trade-off (SCT), along with several traditional criteria to assess the overall effectiveness of the style transfer. In addition, a thorough human evaluation assesses generation quality and creatively mimics a scenario in which users click on appealing headlines to determine the click-through rate. Our results indicate that D-HST achieves state-of-the-art results in these comprehensive evaluations.
AB - The goal of headline style transfer in this paper is to make a headline more attractive while maintaining its meaning. The absence of parallel training data is one of the main problems in this field. In this work, we design a discrete style space for unsupervised headline style transfer, abbreviated as D-HST. The model decomposes style-dependent text generation into content-feature extraction and style modelling; the generation decoder then receives input from the content, the style, and their mixing components. In particular, we consider the textual style signal to be more abstract than the text itself. We therefore propose to model the style representation space as a discrete space, in which each discrete point corresponds to a particular style category elicited by syntactic structure. Finally, we provide a new style-transfer dataset, named TechST, which focuses on transferring news headlines into ones that are more eye-catching on technical social media. In the experiments, we develop two automatic evaluation metrics, style transfer rate (STR) and style-content trade-off (SCT), along with several traditional criteria to assess the overall effectiveness of the style transfer. In addition, a thorough human evaluation assesses generation quality and creatively mimics a scenario in which users click on appealing headlines to determine the click-through rate. Our results indicate that D-HST achieves state-of-the-art results in these comprehensive evaluations.
UR - http://www.scopus.com/inward/record.url?scp=85174448563&partnerID=8YFLogxK
U2 - 10.1007/978-981-99-6207-5_6
DO - 10.1007/978-981-99-6207-5_6
M3 - Conference contribution
AN - SCOPUS:85174448563
SN - 9789819962068
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 91
EP - 105
BT - Chinese Computational Linguistics - 22nd China National Conference, CCL 2023, Proceedings
A2 - Sun, Maosong
A2 - Qin, Bing
A2 - Qiu, Xipeng
A2 - Jiang, Jing
A2 - Han, Xianpei
A2 - Rao, Gaoqi
A2 - Chen, Yubo
PB - Springer Science and Business Media Deutschland GmbH
T2 - 22nd China National Conference on Computational Linguistics, CCL 2023
Y2 - 3 August 2023 through 5 August 2023
ER -