Modified bidirectional extreme learning machine with Gram–Schmidt orthogonalization method

Guoqiang Zeng, Baihai Zhang, Fenxi Yao*, Senchun Chai

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

4 Citations (Scopus)

Abstract

The incremental extreme learning machine (I-ELM) has been proved to be an efficient and simple universal approximator. However, the network architecture may grow very large owing to inefficient nodes that have only a tiny effect on reducing the residual error. Moreover, the output weights are not the least-squares solution. To reduce such inefficient nodes, a method called bidirectional ELM (B-ELM), which analytically calculates the input weights of the even-numbered nodes, was proposed. Analysis shows that B-ELM can be further improved to achieve a more compact structure. This paper proposes the modified B-ELM (MB-ELM), in which the Gram–Schmidt orthogonalization method is incorporated into B-ELM to orthogonalize the output vectors of the hidden nodes, and the resulting vectors are taken as the output vectors. MB-ELM greatly diminishes the number of inefficient nodes and yields an output weight vector that is the least-squares solution, so it attains a faster convergence rate and a more compact network architecture. Specifically, it is proved that, in theory, MB-ELM can reduce the residual error to zero by adding only two nodes to the network. Simulation results verify these conclusions and show that MB-ELM reaches a lower floor of residual error than other I-ELM methods.
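The core idea described in the abstract, orthogonalizing each new hidden node's output vector against those already in the network so that its least-squares output weight has a closed form, can be illustrated with a minimal sketch. This is not the authors' MB-ELM algorithm (which also computes input weights analytically in the B-ELM fashion); it only demonstrates, with randomly generated hidden nodes and toy data, how Gram–Schmidt orthogonalization makes each output weight a least-squares solution and lets near-redundant ("inefficient") nodes be detected and skipped. All names and parameters here are illustrative assumptions.

```python
import numpy as np

def gram_schmidt_step(H_orth, h_new):
    """Orthogonalize a new hidden-node output vector h_new against the
    columns of H_orth, which are assumed already mutually orthogonal."""
    v = h_new.astype(float).copy()
    for j in range(H_orth.shape[1]):
        q = H_orth[:, j]
        v -= (q @ v) / (q @ q) * q  # subtract the component along q
    return v

# Toy regression data (illustration only, not the paper's benchmarks)
rng = np.random.default_rng(0)
N = 50
X = rng.uniform(-1.0, 1.0, (N, 1))
t = np.sin(3.0 * X[:, 0])

H = np.empty((N, 0))   # orthogonalized hidden-output vectors
residual = t.copy()
for i in range(10):
    # Random sigmoid hidden node, as in incremental ELM
    w, b = rng.normal(size=1), rng.normal()
    h = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    v = gram_schmidt_step(H, h)
    if np.linalg.norm(v) < 1e-8:
        continue  # node adds (almost) nothing new: an "inefficient" node
    H = np.column_stack([H, v])
    # Because v is orthogonal to all earlier columns, its least-squares
    # output weight decouples into a simple ratio of inner products.
    beta = (v @ residual) / (v @ v)
    residual = residual - beta * v
```

Since each accepted vector is orthogonal to its predecessors, adding a node can never increase the residual norm, which is the mechanism behind the faster convergence the abstract claims.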

Original language: English
Pages (from-to): 405-414
Number of pages: 10
Journal: Neurocomputing
Volume: 316
DOIs
Publication status: Published - 17 Nov 2018

Keywords

  • Incremental extreme learning machine
  • Inefficient nodes
  • Least square solution
  • Orthogonalization method
  • Universal approximator
