A Survey of Controllable Text Generation Using Transformer-based Pre-trained Language Models

Hanqing Zhang, Haolin Song, Shaoyu Li, Ming Zhou, Dawei Song

Research output: Contribution to journal › Article › peer-review

80 Citations (Scopus)

Abstract

Controllable Text Generation (CTG) is an emerging area in the field of natural language generation (NLG). It is regarded as crucial for the development of advanced text generation technologies that better meet the specific constraints of practical applications. In recent years, methods using large-scale pre-trained language models (PLMs), in particular the widely used Transformer-based PLMs, have become a new paradigm of NLG, allowing the generation of more diverse and fluent text. However, due to the limited interpretability of deep neural networks, the controllability of these methods needs to be guaranteed. To this end, controllable text generation using Transformer-based PLMs has become a rapidly growing yet challenging research hotspot. A diverse range of approaches has emerged in the past three to four years, targeting different CTG tasks that require different types of controlled constraints. In this article, we present a systematic critical review of the common tasks, main approaches, and evaluation methods in this area. Finally, we discuss the challenges the field is facing and put forward various promising future directions. To the best of our knowledge, this is the first survey article to summarize the state-of-the-art CTG techniques from the perspective of Transformer-based PLMs. We hope it can help researchers and practitioners in related fields to quickly track the academic and technological frontier, providing them with a landscape of the area and a roadmap for future research.

Original language: English
Article number: 64
Journal: ACM Computing Surveys
Volume: 56
Issue number: 3
DOI
Publication status: Published - 6 Oct 2023
