A Survey of Controllable Text Generation Using Transformer-based Pre-trained Language Models

Hanqing Zhang, Haolin Song, Shaoyu Li, Ming Zhou, Dawei Song

Research output: Contribution to journal › Article › peer-review

80 Citations (Scopus)

Abstract

Controllable Text Generation (CTG) is an emerging area in the field of natural language generation (NLG). It is regarded as crucial for the development of advanced text generation technologies that better meet the specific constraints of practical applications. In recent years, methods using large-scale pre-trained language models (PLMs), in particular the widely used Transformer-based PLMs, have become a new paradigm of NLG, allowing generation of more diverse and fluent text. However, because deep neural networks offer only limited interpretability, the controllability of these methods still needs to be guaranteed. To this end, controllable text generation using Transformer-based PLMs has become a rapidly growing yet challenging research hotspot. A diverse range of approaches has emerged in the past 3 to 4 years, targeting different CTG tasks that require different types of controlled constraints. In this article, we present a systematic critical review of the common tasks, main approaches, and evaluation methods in this area. Finally, we discuss the challenges that the field is facing, and put forward various promising future directions. To the best of our knowledge, this is the first survey article to summarize the state-of-the-art CTG techniques from the perspective of Transformer-based PLMs. We hope it can help researchers and practitioners in the related fields quickly track the academic and technological frontier, providing them with a landscape of the area and a roadmap for future research.

Original language: English
Article number: 64
Journal: ACM Computing Surveys
Volume: 56
Issue number: 3
Publication status: Published - 6 Oct 2023

Keywords

  • Controllable text generation
  • Transformer
  • controllability
  • pre-trained language models
  • systematic review
