Enabling controllable table-to-text generation via prompting large language models with guided planning

Shuo Zhao, Xin Sun*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Recently, Large Language Models (LLMs) have demonstrated unparalleled capabilities in understanding and generation, which holds promise for applying them to table-to-text generation. However, the generation process with LLMs lacks a high degree of controllability, which hinders their use for table-to-text generation. In this paper, we introduce Poised, an effective method that prompts LLMs with guided planning to achieve controllable table-to-text generation. Specifically, we first employ prefix-tuning on BART to derive a plan from the given table. Then, we combine the plan with guided instructions to create a comprehensive prompt, which is input into LLMs to generate the description of the table. Experiments across three domains of the few-shot Wiki dataset show that Poised achieves or approaches a plan completion rate of 100%, with an average hallucination frequency of less than 10%. Furthermore, Poised allows fine-grained control over the generated content by intentionally modifying the prompt, enabling precise control over aspects such as attribute realization order.
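The two-stage process described above (deriving a plan, then combining it with guided instructions into a single prompt) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name `build_prompt`, the prompt wording, and the example table are all assumptions, and the plan-derivation stage (prefix-tuned BART) is represented by a precomputed attribute order.

```python
# Hypothetical sketch of plan-guided prompt assembly for table-to-text.
# Stage 1 (plan derivation via prefix-tuned BART) is mocked by a
# precomputed plan; Stage 2 assembles the prompt fed to an LLM.

def build_prompt(table, plan, instruction):
    """Combine a linearized table, a derived plan, and a guided
    instruction into one prompt string (assumed format)."""
    table_str = "; ".join(f"{k}: {v}" for k, v in table.items())
    plan_str = " -> ".join(plan)  # desired attribute realization order
    return (
        f"Table: {table_str}\n"
        f"Plan (realize attributes in this order): {plan_str}\n"
        f"Instruction: {instruction}"
    )

# Illustrative table and plan (not from the paper's dataset).
table = {"name": "Ada Lovelace", "occupation": "mathematician", "born": "1815"}
plan = ["name", "born", "occupation"]  # e.g. output of the plan model
instruction = "Describe the table faithfully, following the plan order."

prompt = build_prompt(table, plan, instruction)
print(prompt)
```

Editing `plan` before assembly is how fine-grained control such as attribute realization order would be exercised under this sketch.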

Original language: English
Article number: 112571
Journal: Knowledge-Based Systems
Volume: 304
DOIs
Publication status: Published - 25 Nov 2024

Keywords

  • Controllable text generation
  • Few-shot table-to-text generation
  • Large language models

