基于超像素注意力和孪生结构的半监督高光谱显著性目标检测

Translated title of the contribution: Semi-supervised Hyperspectral Salient Object Detection Using Superpixel Attention and Siamese Structure

Haolin Qin, Tingfa Xu, Jianan Li*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Hyperspectral salient object detection plays a key role in fields such as camouflage recognition and anomaly detection, and has therefore received extensive attention. Neural network models based on deep learning have alleviated the low detection accuracy and weak robustness of traditional algorithms, but the cost of data labeling limits their further development. To address this, a superpixel-attention siamese semi-supervised algorithm is proposed, which is trained on a small amount of fully supervised data together with a large amount of weakly supervised data. The algorithm consists of a siamese prediction module and an attention assistance module: the siamese prediction module captures the implicit constraints of the weak labels and generates a saliency result map, while the attention assistance module refines the prediction with a superpixel-level channel attention mechanism. The proposed semi-supervised algorithm achieves a detection accuracy of 87% on hyperspectral datasets, outperforming other popular algorithms and demonstrating excellent saliency detection performance while effectively reducing annotation costs.
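The abstract only outlines the superpixel-level channel attention mechanism. As a rough illustration of the general idea, the following minimal Python sketch applies a squeeze-and-excitation-style channel gate per superpixel: features are pooled within each superpixel, passed through a small bottleneck, and the resulting sigmoid weights rescale every channel of that region. The function name, bottleneck size, and randomly initialized weights are assumptions for demonstration, not the authors' implementation; in the actual model such weights would be learned.

```python
import numpy as np

def superpixel_channel_attention(features, segments, hidden=16, seed=0):
    """Rescale features with a per-superpixel channel attention gate.

    features: (H, W, C) float array, e.g. hyperspectral feature maps.
    segments: (H, W) int array of superpixel labels (e.g. from SLIC).
    """
    rng = np.random.default_rng(seed)
    H, W, C = features.shape
    # Hypothetical bottleneck weights; learned parameters in a real model.
    W1 = 0.1 * rng.standard_normal((C, hidden))
    W2 = 0.1 * rng.standard_normal((hidden, C))
    out = features.copy()
    for sp in np.unique(segments):
        mask = segments == sp
        # "Squeeze": pool one spectral descriptor per superpixel.
        z = features[mask].mean(axis=0)                            # (C,)
        # "Excite": bottleneck MLP + sigmoid yields channel weights in (0, 1).
        a = 1.0 / (1.0 + np.exp(-(np.maximum(z @ W1, 0.0) @ W2)))  # (C,)
        # Re-weight all pixels of this superpixel channel-wise.
        out[mask] = features[mask] * a
    return out

# Toy usage: an 8x8 map with 4 bands and a 2x2 grid of superpixels.
feat = np.random.rand(8, 8, 4)
seg = np.repeat(np.repeat(np.arange(4).reshape(2, 2), 4, axis=0), 4, axis=1)
print(superpixel_channel_attention(feat, seg).shape)  # (8, 8, 4)
```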

Original language: Chinese (Traditional)
Pages (from-to): 2639-2649
Number of pages: 11
Journal: Binggong Xuebao/Acta Armamentarii
Volume: 44
Issue number: 9
Publication status: Published - 20 Sept 2023
