Hyperspectral target detection based on transform domain adaptive constrained energy minimization

Xiaobin Zhao, Zengfu Hou, Xin Wu*, Wei Li, Pengge Ma, Ran Tao

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

29 Citations (Scopus)

Abstract

Traditional hyperspectral target detection methods rely on spectral-domain information for target recognition. Although spectral features effectively retain the intrinsic characteristics of materials, targets in homogeneous regions still cannot be recognized reliably. To increase the separability of background and target, a fractional-domain revised constrained energy minimization detector is proposed, which projects spectral-domain features into a transform domain. First, the fractional Fourier transform projects the original spectral information into the fractional domain, improving the separability of background and target. Then, a revised constrained energy minimization detector is applied, in which a sliding double-window strategy exploits the local spatial statistics around the pixel under test. To make full use of the inner-window information, the mean Pearson correlation coefficient is computed between the prior target pixel and the pixel under test together with its four neighboring pixels. Extensive experiments on four real hyperspectral scenes show that the proposed algorithm outperforms other related detectors.
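The pipeline described above combines a fractional Fourier projection with a constrained energy minimization (CEM) filter. As a minimal sketch of the CEM core, assuming the input features have already been projected into the fractional domain, the snippet below implements the standard closed-form filter w = R^{-1} d / (d^T R^{-1} d) together with a hypothetical inner-window Pearson weighting; the paper's sliding double-window background estimation is not reproduced here, and all function names are illustrative rather than the authors' code.

```python
import numpy as np

def cem_detector(X, d, eps=1e-6):
    """Classical constrained energy minimization (CEM) filter.

    X : (N, B) pixel features; assumed to already live in the
        fractional (transform) domain, i.e. after an FrFT per spectrum.
    d : (B,) prior target signature in the same domain.
    Returns an (N,) detection score per pixel.
    """
    R = X.T @ X / X.shape[0]          # sample correlation matrix
    R += eps * np.eye(R.shape[0])     # small ridge for numerical stability
    Rinv_d = np.linalg.solve(R, d)    # R^{-1} d without forming the inverse
    w = Rinv_d / (d @ Rinv_d)         # w = R^{-1} d / (d^T R^{-1} d)
    return X @ w                      # filter output y_i = w^T x_i

def inner_window_weight(center, four_neighbors, d):
    """Mean Pearson correlation between the prior target signature d and
    the pixel under test plus its four-neighborhood (one plausible reading
    of the abstract's inner-window measure; names are illustrative)."""
    pixels = np.vstack([center] + list(four_neighbors))   # (5, B)
    corrs = [np.corrcoef(p, d)[0, 1] for p in pixels]
    return float(np.mean(corrs))

# Toy usage on random data (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 30))    # 100 pixels, 30 fractional-domain features
d = rng.normal(size=30)           # prior target signature
scores = cem_detector(X, d)
```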

Original language: English
Article number: 102461
Journal: International Journal of Applied Earth Observation and Geoinformation
Volume: 103
Publication status: Published - 1 Dec 2021

Keywords

  • Constrained energy minimization
  • Fractional Fourier transform
  • Hyperspectral imagery
  • Multi-direction double window
  • Target detection
