Back-propagation extreme learning machine

Weidong Zou, Yuanqing Xia, Weipeng Cao*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

5 Citations (Scopus)

Abstract

Incremental Extreme Learning Machine (I-ELM) is a typical constructive feed-forward neural network with random hidden nodes that can automatically determine the appropriate number of hidden nodes. However, I-ELM and its variants share a notorious drawback: their input parameters are randomly assigned and kept fixed throughout training, which makes model performance highly unstable. To solve this problem, we propose a novel Back-Propagation ELM (BP-ELM) in this study, which dynamically assigns the most appropriate input parameters according to the current residual error of the model as hidden nodes are added. In this way, BP-ELM greatly improves the quality of newly added nodes, thereby accelerating convergence and improving model performance. Moreover, at the same error level, the network structure obtained by BP-ELM is more compact than that of I-ELM. We also prove the universal approximation ability of BP-ELM. Experimental results on three benchmark regression problems and a real-life traffic flow prediction problem show empirically that BP-ELM has better stability and generalization ability than other I-ELM-based algorithms.
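To make the idea concrete, below is a minimal sketch of BP-ELM-style incremental training, not the paper's exact update rules: all names, the sigmoid activation, the analytic output-weight formula from I-ELM, and the gradient-descent settings are assumptions. Each new hidden node's input parameters are tuned by gradient descent against the current residual before being frozen, and its output weight is then computed in closed form.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bp_elm_fit(X, y, n_nodes=20, bp_steps=50, lr=0.1, seed=None):
    """Add hidden nodes one at a time; before fixing each node,
    tune its input weights/bias by gradient descent so the node
    fits the current residual as well as possible (hypothetical
    sketch of the BP-ELM idea)."""
    rng = np.random.default_rng(seed)
    n_samples, n_features = X.shape
    residual = y.astype(float).copy()
    W, b, beta = [], [], []
    for _ in range(n_nodes):
        w = rng.uniform(-1.0, 1.0, n_features)
        bias = rng.uniform(-1.0, 1.0)
        for _ in range(bp_steps):
            h = sigmoid(X @ w + bias)
            # Optimal output weight for this node (as in I-ELM):
            # beta_n = <e, h> / ||h||^2
            out_w = (residual @ h) / (h @ h + 1e-12)
            err = residual - out_w * h
            # Gradient of 0.5*||err||^2 w.r.t. the input parameters
            dh = -err * out_w * h * (1.0 - h)
            w -= lr * (X.T @ dh) / n_samples
            bias -= lr * dh.mean()
        h = sigmoid(X @ w + bias)
        out_w = (residual @ h) / (h @ h + 1e-12)
        residual = residual - out_w * h  # shrink residual, keep node fixed
        W.append(w); b.append(bias); beta.append(out_w)
    return np.array(W), np.array(b), np.array(beta)

def bp_elm_predict(X, W, b, beta):
    return sigmoid(X @ W.T + b) @ beta
```

With `bp_steps=0` this reduces to plain I-ELM (purely random, fixed input parameters), which is why the optimized variant can reach the same residual level with fewer hidden nodes.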

Original language: English
Pages (from-to): 9179-9188
Number of pages: 10
Journal: Soft Computing
Volume: 26
Issue number: 18
DOIs
Publication status: Published - Sept 2022

Keywords

  • Convergence rate
  • Generalization performance
  • I-ELM
  • Input parameters
  • Residual error
  • Stability

