Hierarchical classification framework for HEp-2 cell images

Zhenyu Ji, Wei Li*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Currently, indirect immunofluorescence imaging of human epithelial type 2 (HEp-2) cells provides effective evidence for diagnosing autoimmune diseases. In this work, a novel hierarchical classification model is developed for HEp-2 cell images. More specifically, in the first step the six-class task is recast as a two-class task: the category that is most difficult to distinguish is separated out, and the remaining five categories are merged into a single class. In the second step, the five merged categories are distinguished from one another. Throughout this process, the codebookless model (CLM) is used to extract image features, and feature mapping is applied to narrow the gap between the training and test sets. The hierarchical classification framework is evaluated systematically on the IEEE International Conference on Image Processing (ICIP) 2013 contest dataset. Experimental results demonstrate the effectiveness of the proposed method.
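The two-stage scheme described in the abstract can be illustrated with a minimal sketch, not the authors' code: it assumes CLM feature vectors have already been extracted for each cell image, uses scikit-learn linear SVMs as placeholder classifiers, and the index of the "hard" class is an illustrative assumption.

```python
# Minimal sketch of a two-stage hierarchical classifier, assuming
# precomputed CLM feature vectors X and integer labels y in {0..5}.
# HARD_CLASS, the use of LinearSVC, and all names are assumptions,
# not the method's actual implementation details.
import numpy as np
from sklearn.svm import LinearSVC

HARD_CLASS = 0  # assumed index of the category hardest to distinguish


def train_hierarchy(X, y):
    """Stage 1: hard class vs. the merged rest; stage 2: 5-way among the rest."""
    y_binary = (y == HARD_CLASS).astype(int)
    stage1 = LinearSVC().fit(X, y_binary)          # binary separation

    rest = y != HARD_CLASS
    stage2 = LinearSVC().fit(X[rest], y[rest])     # distinguish the five merged classes
    return stage1, stage2


def predict_hierarchy(stage1, stage2, X):
    """Route each sample through stage 1, then stage 2 if it is not the hard class."""
    pred = np.empty(len(X), dtype=int)
    is_hard = stage1.predict(X) == 1
    pred[is_hard] = HARD_CLASS
    if (~is_hard).any():
        pred[~is_hard] = stage2.predict(X[~is_hard])
    return pred
```

The design point is that the binary stage isolates the category that confuses a flat six-way classifier, so the second stage only has to separate the five easier categories.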

Original language: English
Title of host publication: International Conference on Biological Information and Biomedical Engineering, BIBE 2018
Editors: Chengyu Liu
Publisher: VDE VERLAG GMBH
Pages: 331-334
Number of pages: 4
ISBN (Electronic): 9783800747276
Publication status: Published - 2018
Externally published: Yes
Event: 2nd International Conference on Biological Information and Biomedical Engineering, BIBE 2018 - Shanghai, China
Duration: 6 Jul 2018 - 8 Jul 2018

Publication series

Name: International Conference on Biological Information and Biomedical Engineering, BIBE 2018

Conference

Conference: 2nd International Conference on Biological Information and Biomedical Engineering, BIBE 2018
Country/Territory: China
City: Shanghai
Period: 6/07/18 - 8/07/18
