Deep supervised learning with mixture of neural networks

Yaxian Hu, Senlin Luo, Longfei Han, Limin Pan*, Tiemei Zhang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

10 Citations (Scopus)

Abstract

Deep Neural Networks (DNNs), as deep architectures, have shown excellent performance on classification tasks. However, when the data come from different distributions or contain latent, non-observed factors, it is difficult to train a single DNN model that performs well on the classification task. In this paper, we propose a mixture model based on DNNs (MoNNs), a supervised approach that performs classification with a gating network and multiple local expert models. We use a neural network as the gating function and DNNs as the local expert models. The gating network splits the heterogeneous data into several homogeneous components, and the DNNs are combined to perform classification within each component. Moreover, we use Expectation Maximization (EM) as the optimization algorithm. Experiments show that MoNNs outperformed the other compared methods on determination of diabetes, determination of benign or malignant breast cancer, and handwriting recognition. Therefore, MoNNs can address the problem of data heterogeneity and perform well on classification tasks.
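The forward pass the abstract describes (a gating network that soft-assigns samples to expert networks, whose outputs are then combined) can be sketched as follows. This is a minimal NumPy illustration of the general mixture-of-experts idea, not the paper's implementation; the layer sizes, the one-hidden-layer experts, and all parameter names are assumptions.

```python
import numpy as np

def softmax(z, axis=-1):
    """Numerically stable softmax along the given axis."""
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

class MixtureOfNetworks:
    """Sketch of a mixture-of-experts forward pass: a gating network
    assigns each sample soft weights over K expert networks, and the
    final class probabilities are the gate-weighted sum of the experts'
    outputs. (Hypothetical structure; not the paper's exact model.)"""

    def __init__(self, n_features, n_classes, n_experts, hidden=16, seed=0):
        rng = np.random.default_rng(seed)
        # Gating network: one linear layer followed by softmax over experts.
        self.Wg = rng.normal(0, 0.1, (n_features, n_experts))
        # Each expert: a one-hidden-layer network with tanh activation.
        self.W1 = rng.normal(0, 0.1, (n_experts, n_features, hidden))
        self.W2 = rng.normal(0, 0.1, (n_experts, hidden, n_classes))

    def predict_proba(self, X):
        gates = softmax(X @ self.Wg)                            # (n, K)
        h = np.tanh(np.einsum('nf,kfh->nkh', X, self.W1))       # (n, K, hidden)
        expert_probs = softmax(np.einsum('nkh,khc->nkc', h, self.W2))
        # Mixture output: gate-weighted combination of expert predictions.
        return np.einsum('nk,nkc->nc', gates, expert_probs)
```

Because the gate weights and each expert's class probabilities both sum to one, the combined output is itself a valid probability distribution over classes. In an EM-style fit, the E-step would use these gate responsibilities and the M-step would update the gating and expert parameters.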

Original language: English
Article number: 101764
Journal: Artificial Intelligence in Medicine
Volume: 102
DOI
Publication status: Published - Jan 2020
