Cell dynamic morphology classification using deep convolutional neural networks

Heng Li, Fengqian Pang, Yonggang Shi, Zhiwen Liu*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Cell morphology is often used as a proxy measurement of cell status to understand cell physiology. Hence, the interpretation of cell dynamic morphology is a meaningful task in biomedical research. Inspired by the recent success of deep learning, we explore the application of convolutional neural networks (CNNs) to cell dynamic morphology classification and introduce a strategy for implementing CNNs on this task. Mouse lymphocytes were collected to observe their dynamic morphology, and two datasets were set up to investigate the performance of CNNs. To make the problem tractable for deep learning, the classification task was simplified from video data to image data and then solved by CNNs in a self-taught manner on the generated images. CNNs were evaluated separately in three implementation scenarios and compared with existing methods. The experimental results demonstrate the potential of CNNs for cell dynamic morphology classification and validate the effectiveness of the proposed strategy: CNNs were successfully applied to the classification problem and outperformed the existing methods in classification accuracy. Among the implementation schemes considered, transfer learning proved to be a promising choice.
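To make the transfer-learning scheme mentioned in the abstract concrete, the following is a minimal, hypothetical sketch of fine-tuning an ImageNet-pretrained CNN on frame images extracted from cell videos. The dataset path, class layout, backbone choice, and hyperparameters are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch only: transfer learning for cell image classification.
# Paths, class count, backbone, and hyperparameters are hypothetical.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

# Frames extracted from cell videos, organised one folder per class
# (hypothetical directory layout).
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
train_set = datasets.ImageFolder("data/cell_frames/train", transform=preprocess)
train_loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

# Start from an ImageNet-pretrained backbone and replace the classifier head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False          # freeze the pretrained feature extractor
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# Fine-tune only the new classification layer on the cell-frame images.
model.train()
for epoch in range(5):
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```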

Original language: English
Pages (from-to): 628-638
Number of pages: 11
Journal: Cytometry Part A: the journal of the International Society for Analytical Cytology
Volume: 93
Issue number: 6
DOIs
Publication status: Published - Jun 2018

Keywords

  • cell dynamic morphology
  • cell status prediction
  • convolutional neural networks
  • deep learning
  • transfer learning
