TY - JOUR
T1 - Enabling controllable table-to-text generation via prompting large language models with guided planning
AU - Zhao, Shuo
AU - Sun, Xin
N1 - Publisher Copyright:
© 2024 Elsevier B.V.
PY - 2024/11/25
Y1 - 2024/11/25
N2 - Recently, Large Language Models (LLMs) have demonstrated unparalleled capabilities in understanding and generation, holding promising prospects for applying LLMs to table-to-text generation. However, the generation process with LLMs lacks a high degree of controllability, which hinders the utilization of LLMs for table-to-text generation. In this paper, we introduce Poised, an effective method that prompts LLMs with guided planning to achieve controllable table-to-text generation. Specifically, we first employ prefix-tuning on BART to derive a plan from the given table. Then, we combine the plan with guided instructions to create a comprehensive prompt, which is then fed into LLMs to generate the description of the table. Experiments across three domains of the few-shot Wiki dataset show that Poised achieves or approaches a plan completion rate of 100%, with an average hallucination frequency of less than 10%. Furthermore, Poised allows fine-grained control over the generated content by intentionally modifying the prompt, enabling precise control over aspects such as attribute realization order.
AB - Recently, Large Language Models (LLMs) have demonstrated unparalleled capabilities in understanding and generation, holding promising prospects for applying LLMs to table-to-text generation. However, the generation process with LLMs lacks a high degree of controllability, which hinders the utilization of LLMs for table-to-text generation. In this paper, we introduce Poised, an effective method that prompts LLMs with guided planning to achieve controllable table-to-text generation. Specifically, we first employ prefix-tuning on BART to derive a plan from the given table. Then, we combine the plan with guided instructions to create a comprehensive prompt, which is then fed into LLMs to generate the description of the table. Experiments across three domains of the few-shot Wiki dataset show that Poised achieves or approaches a plan completion rate of 100%, with an average hallucination frequency of less than 10%. Furthermore, Poised allows fine-grained control over the generated content by intentionally modifying the prompt, enabling precise control over aspects such as attribute realization order.
KW - Controllable text generation
KW - Few-shot table-to-text generation
KW - Large language models
UR - http://www.scopus.com/inward/record.url?scp=85205563576&partnerID=8YFLogxK
U2 - 10.1016/j.knosys.2024.112571
DO - 10.1016/j.knosys.2024.112571
M3 - Article
AN - SCOPUS:85205563576
SN - 0950-7051
VL - 304
JO - Knowledge-Based Systems
JF - Knowledge-Based Systems
M1 - 112571
ER -