Extreme Learning Machine under Minimum Information Divergence Criterion

Chengtian Song*, Lizhi Pan, Qiang Liu, Zhihong Jiang*, Jianguang Jia

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

In recent years, the extreme learning machine (ELM) and its improved variants have been successfully applied to a wide range of classification and regression tasks. In these algorithms, the mean square error (MSE) criterion is commonly used to control the training error. However, the MSE criterion is not well suited to handling outliers, which may be present in general regression or classification tasks. In this paper, a novel extreme learning machine under the minimum information divergence criterion (ELM-MinID) is proposed to deal with training sets corrupted by noise. In the minimum information divergence criterion, the Gaussian kernel function and the Euclidean information divergence are used in place of the MSE criterion to enhance the noise robustness of ELM. Experimental results on two synthetic datasets and eleven benchmark datasets show that the proposed method outperforms traditional ELMs.
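As a rough illustration of the idea sketched in the abstract, the snippet below trains a standard ELM (random hidden layer, least-squares output weights) and then refines the output weights by gradient steps on a Gaussian-kernel-weighted error surrogate instead of MSE, so that large errors from outliers receive little weight. This is only a minimal sketch under assumptions, not the authors' ELM-MinID algorithm: the tanh activation, the correntropy-style kernel surrogate standing in for the paper's Euclidean information divergence, and all names and hyperparameters (n_hidden, sigma, lr, epochs) are illustrative choices.

```python
# Minimal sketch (assumptions noted above), not the paper's exact formulation.
import numpy as np

def elm_minid_fit(X, y, n_hidden=50, sigma=1.0, lr=0.01, epochs=200, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))  # random input weights (standard ELM)
    b = rng.normal(size=n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                       # hidden-layer output matrix

    beta = np.linalg.pinv(H) @ y                 # classic ELM: least-squares (MSE) solution

    for _ in range(epochs):
        e = y - H @ beta                         # training errors
        # Gaussian kernel of the errors: outliers with large |e| get weights near 0,
        # which is the intuition behind replacing MSE with a divergence-type criterion.
        k = np.exp(-e**2 / (2 * sigma**2))
        # Gradient ascent on the kernel-weighted objective sum_i k_i w.r.t. beta.
        grad = H.T @ (k * e) / (sigma**2 * len(y))
        beta += lr * grad
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```

In this toy setup the least-squares solution serves only as an initialization; the subsequent gradient updates down-weight samples whose residuals are large relative to the kernel bandwidth sigma, which is the anti-noise behaviour the abstract attributes to the minimum information divergence criterion.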

Original language: English
Article number: 9133571
Pages (from-to): 122026-122035
Number of pages: 10
Journal: IEEE Access
Volume: 8
DOIs
Publication status: Published - 2020

Keywords

  • Extreme learning machine
  • gradient algorithm
  • kernel method
  • minimum information divergence criterion
