
Deep supervised learning with mixture of neural networks

Yaxian Hu, Senlin Luo, Longfei Han, Limin Pan*, Tiemei Zhang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Deep Neural Networks (DNNs), as deep architectures, have shown excellent performance on classification tasks. However, when the data come from different distributions or contain latent, unobserved factors, it is difficult to train a single DNN model that performs well across the whole task. In this paper, we propose a mixture model based on DNNs (MoNNs), a supervised approach that performs classification with a gating network and multiple local expert models. We use a neural network as the gating function and DNNs as the local experts. The gating network splits the heterogeneous data into several homogeneous components, and the DNNs are combined to perform classification within each component. Moreover, we use Expectation Maximization (EM) as the optimization algorithm. Experiments showed that MoNNs outperformed the other compared methods on diabetes determination, determination of benign versus malignant breast cancer, and handwriting recognition. The MoNNs can therefore address data heterogeneity and perform well on classification tasks.
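The gating-plus-experts scheme described in the abstract can be sketched as follows. This is a minimal illustrative simplification, not the paper's implementation: logistic-regression experts stand in for the DNN experts, a linear softmax layer stands in for the gating neural network, and the two-cluster toy data, expert count, and learning rate are all assumptions made for the sketch.

```python
import numpy as np

# EM-style mixture-of-experts sketch (assumed simplification of MoNNs:
# logistic experts instead of DNNs, a linear softmax gate instead of a
# gating neural network; data and hyperparameters are illustrative).
rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # subtract max for stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Heterogeneous toy data: two clusters, each with its own labeling rule.
n = 400
X = np.vstack([rng.normal(-2.0, 1.0, (n // 2, 2)),
               rng.normal(2.0, 1.0, (n // 2, 2))])
y = np.concatenate([(X[:n // 2, 0] + X[:n // 2, 1] > -4.0).astype(float),
                    (X[n // 2:, 0] - X[n // 2:, 1] > 0.0).astype(float)])
Xb = np.hstack([X, np.ones((n, 1))])  # append a bias feature

K = 2                              # number of local experts
Wg = rng.normal(0.0, 0.1, (3, K))  # gating weights
We = rng.normal(0.0, 0.1, (3, K))  # one logistic expert per column
lr = 0.1

for _ in range(500):
    gate = softmax(Xb @ Wg)        # (n, K) mixing proportions
    p = sigmoid(Xb @ We)           # (n, K) per-expert P(y=1 | x)
    like = np.where(y[:, None] == 1.0, p, 1.0 - p)
    # E-step: responsibility of each expert for each sample.
    r = gate * like
    r /= r.sum(axis=1, keepdims=True)
    # M-step (one gradient step): each expert fits samples weighted by
    # its responsibility; the gate is pushed toward the responsibilities.
    We += lr * Xb.T @ (r * (y[:, None] - p)) / n
    Wg += lr * Xb.T @ (r - gate) / n

gate = softmax(Xb @ Wg)
pred = (np.sum(gate * sigmoid(Xb @ We), axis=1) > 0.5).astype(float)
acc = (pred == y).mean()
print(f"mixture accuracy: {acc:.2f}")
```

The E-step computes how responsible each expert is for each sample; the M-step then fits each expert on its responsibility-weighted data and moves the gate toward those responsibilities. This is how the gating network ends up splitting heterogeneous data into more homogeneous components, each handled by its own expert.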

Original language: English
Article number: 101764
Journal: Artificial Intelligence in Medicine
Volume: 102
DOIs
Publication status: Published - Jan 2020

UN SDGs

This output contributes to the following UN Sustainable Development Goals (SDGs)

  1. SDG 3 - Good Health and Well-being

Keywords

  • Deep neural network
  • Diabetes determination
  • Expectation maximization
  • Mixture model
