TY - GEN
T1 - Optimization Design of Adaptive Loss Function Using Evolutionary Neural Networks
AU - Meng, Xiang
AU - Hai, Zhaoyang
AU - Liu, Xiabi
AU - Pei, Yan
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2026.
PY - 2026
Y1 - 2026
N2 - This study introduces the evolutionary loss function (ELF), a novel framework that dynamically optimizes loss functions using evolutionary computation. Unlike traditional loss functions defined by fixed, predefined formulas, ELF employs a parameterized neural network that adapts to diverse data distributions and task-specific requirements. By leveraging evolutionary operations such as mutation and selection, ELF explores a broad parameter space, addressing the inherent limitations of gradient-based optimization methods. Such methods require differentiable objectives, often struggle with non-smooth functions, and are prone to local optima, limiting their effectiveness in complex or irregular optimization landscapes. In contrast, ELF performs a global evolutionary search across the parameter space, enabling it to overcome these challenges and dynamically optimize loss functions. To validate its effectiveness, ELF is evaluated across multiple tasks, with experimental results consistently demonstrating superior performance over both traditional and state-of-the-art dynamic loss functions.
AB - This study introduces the evolutionary loss function (ELF), a novel framework that dynamically optimizes loss functions using evolutionary computation. Unlike traditional loss functions defined by fixed, predefined formulas, ELF employs a parameterized neural network that adapts to diverse data distributions and task-specific requirements. By leveraging evolutionary operations such as mutation and selection, ELF explores a broad parameter space, addressing the inherent limitations of gradient-based optimization methods. Such methods require differentiable objectives, often struggle with non-smooth functions, and are prone to local optima, limiting their effectiveness in complex or irregular optimization landscapes. In contrast, ELF performs a global evolutionary search across the parameter space, enabling it to overcome these challenges and dynamically optimize loss functions. To validate its effectiveness, ELF is evaluated across multiple tasks, with experimental results consistently demonstrating superior performance over both traditional and state-of-the-art dynamic loss functions.
KW - Adaptive Optimization Design
KW - Dynamic Loss Function
KW - Evolutionary Computation
KW - Neural Networks
KW - Optimization
UR - https://www.scopus.com/pages/publications/105023586018
U2 - 10.1007/978-981-95-4367-0_22
DO - 10.1007/978-981-95-4367-0_22
M3 - Conference contribution
AN - SCOPUS:105023586018
SN - 9789819543663
T3 - Lecture Notes in Computer Science
SP - 321
EP - 335
BT - Neural Information Processing - 32nd International Conference, ICONIP 2025, Proceedings
A2 - Taniguchi, Tadahiro
A2 - Leung, Chi Sing Andrew
A2 - Kozuno, Tadashi
A2 - Yoshimoto, Junichiro
A2 - Mahmud, Mufti
A2 - Doborjeh, Maryam
A2 - Doya, Kenji
PB - Springer Science and Business Media Deutschland GmbH
T2 - 32nd International Conference on Neural Information Processing, ICONIP 2025
Y2 - 20 November 2025 through 24 November 2025
ER -