Modified bidirectional extreme learning machine with Gram–Schmidt orthogonalization method

Guoqiang Zeng, Baihai Zhang, Fenxi Yao*, Senchun Chai

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

4 Citations (Scopus)

Abstract

The incremental extreme learning machine (I-ELM) has been proven to be an efficient and simple universal approximator. However, the network architecture may become very large because of inefficient nodes that contribute little to reducing the residual error. Moreover, the output weights are not the least-squares solution. To reduce such inefficient nodes, a method called bidirectional ELM (B-ELM), which analytically calculates the input weights of the even-numbered nodes, was proposed. Analysis shows that B-ELM can be further improved to achieve a more compact structure. This paper proposes the modified B-ELM (MB-ELM), which incorporates the Gram–Schmidt orthogonalization method into B-ELM: the output vectors of the hidden nodes are orthogonalized, and the resulting vectors are taken as the hidden-node output vectors. MB-ELM greatly reduces the number of inefficient nodes and obtains an output weight vector that is the least-squares solution, so that it achieves a faster convergence rate and a more compact network architecture. Specifically, it is proven that, in theory, MB-ELM can reduce the residual error to zero by adding only two nodes to the network. Simulation results verify these conclusions and show that MB-ELM reaches a lower limit of residual error than other I-ELM methods.
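A minimal sketch of the orthogonalization step described in the abstract, assuming classical Gram–Schmidt projection of a new hidden node's output vector against the previously orthogonalized ones and a squared-error objective; the names (orthogonalize_node, output_weight, H_orth) are illustrative assumptions, not the authors' reference code.

```python
import numpy as np

def orthogonalize_node(h_new, H_orth):
    """Remove from h_new the components along the columns of H_orth
    (the already-orthogonalized hidden-node output vectors)."""
    v = h_new.astype(float).copy()
    for j in range(H_orth.shape[1]):
        q = H_orth[:, j]
        v -= (q @ v) / (q @ q) * q
    return v

def output_weight(v, residual):
    """Least-squares output weight for the orthogonalized node: since v is
    orthogonal to all previous node outputs, the 1-D projection of the
    current residual onto v is also the global least-squares update."""
    return (v @ residual) / (v @ v)

# Toy usage: three samples, two existing orthogonalized nodes, one new node.
rng = np.random.default_rng(0)
H_orth = np.linalg.qr(rng.normal(size=(3, 2)))[0]   # orthonormal columns
h_new = rng.normal(size=3)
residual = rng.normal(size=3)

v = orthogonalize_node(h_new, H_orth)
beta = output_weight(v, residual)
new_residual = residual - beta * v                  # strictly smaller norm
```

Because each new output vector is orthogonal to all previous ones, adding a node never disturbs the earlier output weights, which is why the incremental weights coincide with the batch least-squares solution.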

Original language: English
Pages (from-to): 405-414
Number of pages: 10
Journal: Neurocomputing
Volume: 316
DOI
Publication status: Published - 17 Nov 2018
