Microscopic Hyperspectral Image Classification Based on Fusion Transformer with Parallel CNN

Weijia Zeng, Wei Li*, Mengmeng Zhang, Hao Wang, Meng Lv, Yue Yang*, Ran Tao

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

11 Citations (Scopus)

Abstract

Microscopic hyperspectral imaging (MHSI) has received considerable attention in the medical field. Its rich spectral information provides potentially powerful identification ability when combined with an advanced convolutional neural network (CNN). However, for high-dimensional MHSI, the local connectivity of a CNN makes it difficult to extract long-range dependencies among spectral bands. The transformer overcomes this problem well thanks to its self-attention mechanism. Nevertheless, the transformer is inferior to the CNN at extracting detailed spatial features. Therefore, a classification framework integrating a transformer and a CNN in parallel, named Fusion Transformer (FUST), is proposed for MHSI classification tasks. Specifically, the transformer branch extracts the overall semantics and captures long-range dependencies among spectral bands to highlight key spectral information. The parallel CNN branch is designed to extract significant multiscale spatial features. Furthermore, a feature fusion module is developed to effectively fuse and process the features extracted by the two branches. Experimental results on three MHSI datasets demonstrate that the proposed FUST achieves superior performance compared with state-of-the-art methods.
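The two-branch design summarized in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration of the idea, not the authors' implementation: the token layout, layer sizes, kernel, and all weights below are placeholder assumptions. The transformer branch applies self-attention across spectral bands (treated as tokens) to model long-range spectral dependencies, the CNN branch applies a spatial convolution, and a fusion step concatenates the two feature vectors before classification.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def spectral_self_attention(tokens, d_k=16):
    """Scaled dot-product self-attention over spectral-band tokens.

    tokens: (num_bands, d) -- each spectral band is one token, so the
    (num_bands x num_bands) attention map captures long-range band
    dependencies. Weights here are random placeholders.
    """
    n, d = tokens.shape
    Wq, Wk, Wv = (rng.standard_normal((d, d_k)) / np.sqrt(d) for _ in range(3))
    Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv
    attn = softmax(Q @ K.T / np.sqrt(d_k))
    return attn @ V  # (num_bands, d_k)

def conv2d_valid(img, kernel):
    """Naive 'valid' 2D convolution standing in for the CNN branch."""
    k = kernel.shape[0]
    H, W = img.shape
    out = np.zeros((H - k + 1, W - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (img[i:i + k, j:j + k] * kernel).sum()
    return out

# Toy MHSI patch: 8x8 spatial pixels, 20 spectral bands (sizes are arbitrary).
patch = rng.standard_normal((8, 8, 20))

# Transformer branch: each band's flattened spatial map is one token.
tokens = patch.reshape(-1, 20).T                          # (20 bands, 64)
spec_feat = spectral_self_attention(tokens).mean(axis=0)  # (16,)

# CNN branch: 3x3 conv on the band-averaged image, then flatten.
spat_feat = conv2d_valid(patch.mean(axis=-1),
                         rng.standard_normal((3, 3))).reshape(-1)  # (36,)

# Fusion module: concatenate branch features and project to class logits.
fused = np.concatenate([spec_feat, spat_feat])            # (52,)
W_cls = rng.standard_normal((fused.size, 3)) / np.sqrt(fused.size)
logits = fused @ W_cls                                    # (3,) class scores
```

The sketch only mirrors the data flow (parallel branches, then fusion); the paper's actual FUST uses learned multiscale CNN features and a trained fusion module rather than random weights.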

Original language: English
Pages (from-to): 2910-2921
Number of pages: 12
Journal: IEEE Journal of Biomedical and Health Informatics
Volume: 27
Issue number: 6
DOIs
Publication status: Published - 1 Jun 2023

Keywords

  • Convolutional neural network (CNN)
  • feature fusion
  • microscopic hyperspectral image (MHSI)
  • transformer
