TY - JOUR
T1 - Cross-Scene Hyperspectral Image Classification via Bidirectional Mamba and Domain Mixing Network
AU - Dang, Junzhe
AU - Guo, Chengwang
AU - Zhang, Mengmeng
AU - Zhang, Yuxiang
AU - Jia, Wen
AU - Li, Wei
N1 - Publisher Copyright:
© 2012 IEEE.
PY - 2026
Y1 - 2026
N2 - To overcome the challenges posed by domain shift in hyperspectral image (HSI) classification, methods based on domain adaptation (DA) have been widely used. Currently, most HSI DA methods focus on designing complex strategies to align the distributions of the source domain (SD) and the target domain (TD) in the feature space after feature extraction, yielding promising results. However, when there exists a large domain shift between SD and TD, it becomes challenging to map them into the same feature space. In this article, we propose the bidirectional mamba and domain mixing network (BMDMnet). Since pure CNN architectures are limited to local feature extraction, while transformer-based models improve global feature capture at the cost of high computational complexity, we propose the bidirectional mamba module (BMM) as an efficient solution for capturing long-range dependencies. In addition, a self-distillation strategy is employed during training. By utilizing a more stable teacher model, reliable predictions can be obtained in the TD. Subsequently, a domain mixing supervised learning (DMSL) module is designed, which creates a mixed domain by selecting low-entropy sample-pseudo-label pairs from the TD and randomly combining them with sample-label pairs from the SD. DMSL aims to introduce a mixed domain to mitigate the inter-domain gap in the data space, thereby enabling the model to learn TD representations more effectively. Experiments demonstrate that BMDMnet outperforms state-of-the-art algorithms across three cross-scene datasets.
AB - To overcome the challenges posed by domain shift in hyperspectral image (HSI) classification, methods based on domain adaptation (DA) have been widely used. Currently, most HSI DA methods focus on designing complex strategies to align the distributions of the source domain (SD) and the target domain (TD) in the feature space after feature extraction, yielding promising results. However, when there exists a large domain shift between SD and TD, it becomes challenging to map them into the same feature space. In this article, we propose the bidirectional mamba and domain mixing network (BMDMnet). Since pure CNN architectures are limited to local feature extraction, while transformer-based models improve global feature capture at the cost of high computational complexity, we propose the bidirectional mamba module (BMM) as an efficient solution for capturing long-range dependencies. In addition, a self-distillation strategy is employed during training. By utilizing a more stable teacher model, reliable predictions can be obtained in the TD. Subsequently, a domain mixing supervised learning (DMSL) module is designed, which creates a mixed domain by selecting low-entropy sample-pseudo-label pairs from the TD and randomly combining them with sample-label pairs from the SD. DMSL aims to introduce a mixed domain to mitigate the inter-domain gap in the data space, thereby enabling the model to learn TD representations more effectively. Experiments demonstrate that BMDMnet outperforms state-of-the-art algorithms across three cross-scene datasets.
KW - Cross-scene
KW - domain mixing
KW - hyperspectral image (HSI) classification
KW - mamba
KW - unsupervised domain adaptation (DA)
UR - https://www.scopus.com/pages/publications/105027391845
U2 - 10.1109/TNNLS.2026.3651563
DO - 10.1109/TNNLS.2026.3651563
M3 - Article
AN - SCOPUS:105027391845
SN - 2162-237X
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
ER -