Back-propagation extreme learning machine

Weidong Zou, Yuanqing Xia, Weipeng Cao*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

5 Citations (Scopus)

Abstract

Incremental Extreme Learning Machine (I-ELM) is a typical constructive feed-forward neural network with random hidden nodes, which can automatically determine the appropriate number of hidden nodes. However, I-ELM and its variants suffer from a notorious problem: their input parameters are randomly assigned and kept fixed throughout the training process, which makes the model's performance highly unstable. To solve this problem, we propose a novel Back-Propagation ELM (BP-ELM) in this study, which dynamically assigns the most appropriate input parameters according to the current residual error of the model as hidden nodes are added. In this way, BP-ELM greatly improves the quality of newly added nodes, thereby accelerating convergence and improving model performance. Moreover, at the same error level, the network structure obtained by BP-ELM is more compact than that of I-ELM. We also prove the universal approximation ability of BP-ELM. Experimental results on three benchmark regression problems and a real-life traffic flow prediction problem empirically show that BP-ELM has better stability and generalization ability than other I-ELM-based algorithms.
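The idea sketched in the abstract — add hidden nodes one at a time, but tune each new node's input parameters against the current residual before fixing its output weight — can be illustrated as follows. This is a minimal, hedged sketch assuming a sigmoid activation, a few gradient steps per node, and a closed-form output weight as in I-ELM; all names, step counts, and learning rates here are illustrative, not the paper's exact procedure.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bp_elm_sketch(X, y, max_nodes=20, bp_steps=50, lr=0.1, seed=None):
    """Illustrative BP-ELM-style incremental training (assumed details).

    For each new hidden node, the input weights (w, b) start random and are
    refined by a few gradient steps so the node fits the current residual;
    the output weight beta is then set in closed form and the residual is
    updated, as in I-ELM.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    residual = y.astype(float).copy()
    nodes = []
    for _ in range(max_nodes):
        w = rng.standard_normal(d)
        b = rng.standard_normal()
        # Back-propagation refinement of the new node's input parameters
        for _ in range(bp_steps):
            h = sigmoid(X @ w + b)
            beta = (residual @ h) / (h @ h + 1e-12)  # optimal output weight
            err = beta * h - residual                # per-sample fitting error
            grad_z = err * beta * h * (1.0 - h)      # chain rule through sigmoid
            w -= lr * (X.T @ grad_z) / n
            b -= lr * grad_z.mean()
        # Fix the node and subtract its contribution from the residual
        h = sigmoid(X @ w + b)
        beta = (residual @ h) / (h @ h + 1e-12)
        residual -= beta * h
        nodes.append((w, b, beta))
    return nodes, residual
```

Because each beta is the least-squares-optimal output weight for its node, the residual norm is non-increasing as nodes are added; the back-propagation steps simply make each node a better match for the residual, which is the mechanism the abstract credits for the more compact network.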

Original language: English
Pages (from-to): 9179-9188
Number of pages: 10
Journal: Soft Computing
Volume: 26
Issue number: 18
DOI
Publication status: Published - Sept 2022
