TY - JOUR
T1 - Deep supervised learning with mixture of neural networks
AU - Hu, Yaxian
AU - Luo, Senlin
AU - Han, Longfei
AU - Pan, Limin
AU - Zhang, Tiemei
N1 - Publisher Copyright:
© 2019
PY - 2020/1
Y1 - 2020/1
N2 - The Deep Neural Network (DNN), as a deep architecture, has shown excellent performance on classification tasks. However, when the data has different distributions or contains latent unobserved factors, it is difficult for a single DNN model to perform well on classification tasks. In this paper, we propose a mixture model based on DNNs (MoNNs), a supervised approach that performs classification with a gating network and multiple local expert models. We use a neural network as the gating function and DNNs as the local expert models. The gating network splits the heterogeneous data into several homogeneous components, and the DNNs are combined to perform classification within each component. Moreover, we use Expectation Maximization (EM) as the optimization algorithm. Experiments show that MoNNs outperform the compared methods on diabetes determination, benign/malignant breast cancer determination, and handwriting recognition. Therefore, MoNNs can address the problem of data heterogeneity and perform well on classification tasks.
AB - The Deep Neural Network (DNN), as a deep architecture, has shown excellent performance on classification tasks. However, when the data has different distributions or contains latent unobserved factors, it is difficult for a single DNN model to perform well on classification tasks. In this paper, we propose a mixture model based on DNNs (MoNNs), a supervised approach that performs classification with a gating network and multiple local expert models. We use a neural network as the gating function and DNNs as the local expert models. The gating network splits the heterogeneous data into several homogeneous components, and the DNNs are combined to perform classification within each component. Moreover, we use Expectation Maximization (EM) as the optimization algorithm. Experiments show that MoNNs outperform the compared methods on diabetes determination, benign/malignant breast cancer determination, and handwriting recognition. Therefore, MoNNs can address the problem of data heterogeneity and perform well on classification tasks.
KW - Deep neural network
KW - Diabetes determination
KW - Expectation maximization
KW - Mixture model
UR - http://www.scopus.com/inward/record.url?scp=85075757287&partnerID=8YFLogxK
U2 - 10.1016/j.artmed.2019.101764
DO - 10.1016/j.artmed.2019.101764
M3 - Article
C2 - 31980101
AN - SCOPUS:85075757287
SN - 0933-3657
VL - 102
JO - Artificial Intelligence in Medicine
JF - Artificial Intelligence in Medicine
M1 - 101764
ER -