StainCNNs: An efficient stain feature learning method

Gaoyi Lei, Yuanqing Xia*, Di Hua Zhai, Wei Zhang, Duanduan Chen, Defeng Wang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

15 Citations (Scopus)

Abstract

Color variation in stained histopathology images hinders the development of computer-assisted diagnosis (CAD) algorithms for whole slide imaging systems. Stain normalization methods are therefore studied to reduce the influence of color variation before digital image processing algorithms are applied. The Structure-Preserving Color Normalization (SPCN) method is a promising stain normalization method that uses sparse non-negative matrix factorization to estimate the stain appearance matrix. However, SPCN suffers from the high computational complexity of dictionary learning, and its official implementation relies on Matlab and the CPU. This research proposes the StainCNNs method to simplify stain feature extraction and provides a GPU-enabled implementation that accelerates the learning of stain features in the TensorFlow framework. Moreover, the StainCNNs method can perform stain normalization quickly at the dataset level, making it more efficient than the SPCN method, which cannot exploit the stain feature distribution within a dataset. Stain normalization experiments are conducted on the Camelyon16 dataset and the ICPR2014 dataset and evaluated by the QSSIM and FSIM scores. Results demonstrate that the proposed StainCNNs method achieves state-of-the-art performance compared with many conventional stain normalization methods.
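
The stain separation underlying SPCN rests on the Beer-Lambert law: RGB pixels are mapped to optical density, and a non-negative factorization recovers a 3×2 stain appearance matrix together with per-pixel stain concentrations. Below is a minimal sketch of that idea, using plain NMF from scikit-learn as a stand-in for the sparse dictionary learning that SPCN actually performs; the function `estimate_stain_matrix` and all parameter choices are illustrative, not the authors' code.

```python
# Minimal sketch of Beer-Lambert stain separation (illustrative, not SPCN itself):
# pixels are mapped to optical density, then factored as OD ~ concentrations @ W^T.
import numpy as np
from sklearn.decomposition import NMF

def estimate_stain_matrix(rgb_image, n_stains=2, background=255.0):
    """Estimate a (3, n_stains) stain appearance matrix from an RGB patch."""
    pixels = rgb_image.reshape(-1, 3).astype(np.float64)
    # Beer-Lambert transform; the +1 avoids log(0) on fully dark pixels.
    od = -np.log10((pixels + 1.0) / background)
    od = np.maximum(od, 1e-6)  # NMF requires non-negative input
    # Plain NMF stands in for the sparse NMF used by SPCN.
    model = NMF(n_components=n_stains, init="nndsvda", max_iter=500, random_state=0)
    concentrations = model.fit_transform(od)   # (N, n_stains) stain densities
    stain_matrix = model.components_.T         # (3, n_stains) appearance matrix
    # Normalize each stain vector to unit length, as is conventional.
    stain_matrix /= np.linalg.norm(stain_matrix, axis=0, keepdims=True)
    return stain_matrix, concentrations

patch = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)  # stand-in patch
W, C = estimate_stain_matrix(patch)
print(W.shape, C.shape)  # (3, 2) (4096, 2)
```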
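
The abstract does not spell out the StainCNNs architecture, but the stated idea, learning the stain appearance matrix with a GPU-enabled convolutional network in TensorFlow, can be sketched as a small Keras model that regresses the six entries of a 3×2 matrix from an RGB patch. Every layer size, the loss, and the training target below are assumptions for illustration only.

```python
# Hypothetical sketch of CNN-based stain feature learning in TensorFlow;
# layer sizes, loss, and targets are assumptions, not the paper's model.
import tensorflow as tf

def build_stain_cnn(patch_size=64, n_stains=2):
    inputs = tf.keras.Input(shape=(patch_size, patch_size, 3))
    x = tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)
    x = tf.keras.layers.MaxPooling2D()(x)
    x = tf.keras.layers.Conv2D(64, 3, padding="same", activation="relu")(x)
    x = tf.keras.layers.GlobalAveragePooling2D()(x)
    # ReLU keeps the predicted appearance matrix non-negative.
    x = tf.keras.layers.Dense(3 * n_stains, activation="relu")(x)
    outputs = tf.keras.layers.Reshape((3, n_stains))(x)
    return tf.keras.Model(inputs, outputs)

model = build_stain_cnn()
# One plausible training setup: regress toward stain matrices precomputed
# with sparse NMF, so inference replaces the slow dictionary learning step.
model.compile(optimizer="adam", loss="mse")
model.summary()
```

Once such a network is trained, a single forward pass per image replaces per-image dictionary learning, which is consistent with the abstract's claim of fast, dataset-level normalization on a GPU.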

Original language: English
Pages (from-to): 267-273
Number of pages: 7
Journal: Neurocomputing
Volume: 406
Publication status: Published - 17 Sept 2020

Keywords

  • Convolutional neural network
  • Deep learning
  • Histopathological image analysis
  • Stain normalization
