TY - JOUR
T1 - Enhancing math reasoning ability of large language models via computation logic graphs
AU - Zhao, Deji
AU - Han, Donghong
AU - Wu, Jia
AU - He, Zhongjiang
AU - Ning, Bo
AU - Yuan, Ye
AU - Li, Yongxiang
AU - Wang, Chao
AU - Song, Shuangyong
N1 - Publisher Copyright:
© 2025 Elsevier B.V.
PY - 2025/9/5
Y1 - 2025/9/5
N2 - The reasoning capabilities of large language models (LLMs) are essential for a wide range of tasks, particularly in the domain of mathematical reasoning. Common chain-of-thought methods perform well on simple reasoning problems, but for complex problems a single-dimensional chain of thought is inadequate to capture multi-layered logical relationships. To tackle this challenge, this paper introduces the concept of a Computation Logic Graph (CLG), designed to enhance the logical reasoning abilities of LLMs when solving complex mathematical problems. The CLG decomposes a complex mathematical problem into multiple simple intermediate computational units, and the final answer is obtained through multiple iterations over these units. On the one hand, the CLG improves the model's ability to decompose and solve complex mathematical problems step by step from a global perspective. On the other hand, the local inference process within the CLG helps improve the model's accuracy in single-step calculations. To develop models that can construct Computation Logic Graphs automatically, we create a dataset of computation logic graphs for complex mathematical problems, called the Computation-intensive Math Logic Graph (CMLG) dataset. We fine-tune several open-source LLMs on the CMLG dataset. Experimental results demonstrate that the proposed CLG method significantly enhances the performance of LLMs on complex mathematical reasoning tasks, outperforming baseline methods on both the CMLG dataset and six other publicly available datasets from diverse domains.
AB - The reasoning capabilities of large language models (LLMs) are essential for a wide range of tasks, particularly in the domain of mathematical reasoning. Common chain-of-thought methods perform well on simple reasoning problems, but for complex problems a single-dimensional chain of thought is inadequate to capture multi-layered logical relationships. To tackle this challenge, this paper introduces the concept of a Computation Logic Graph (CLG), designed to enhance the logical reasoning abilities of LLMs when solving complex mathematical problems. The CLG decomposes a complex mathematical problem into multiple simple intermediate computational units, and the final answer is obtained through multiple iterations over these units. On the one hand, the CLG improves the model's ability to decompose and solve complex mathematical problems step by step from a global perspective. On the other hand, the local inference process within the CLG helps improve the model's accuracy in single-step calculations. To develop models that can construct Computation Logic Graphs automatically, we create a dataset of computation logic graphs for complex mathematical problems, called the Computation-intensive Math Logic Graph (CMLG) dataset. We fine-tune several open-source LLMs on the CMLG dataset. Experimental results demonstrate that the proposed CLG method significantly enhances the performance of LLMs on complex mathematical reasoning tasks, outperforming baseline methods on both the CMLG dataset and six other publicly available datasets from diverse domains.
KW - Graph of thought
KW - Large language models
KW - Logical reasoning
KW - Mathematical reasoning
KW - Problem decomposition
UR - https://www.scopus.com/pages/publications/105008791107
U2 - 10.1016/j.knosys.2025.113905
DO - 10.1016/j.knosys.2025.113905
M3 - Article
AN - SCOPUS:105008791107
SN - 0950-7051
VL - 325
JO - Knowledge-Based Systems
JF - Knowledge-Based Systems
M1 - 113905
ER -