TY - JOUR
T1 - Hyperspectral and SAR Image Classification via Multiscale Interactive Fusion Network
AU - Wang, Junjie
AU - Li, Wei
AU - Gao, Yunhao
AU - Zhang, Mengmeng
AU - Tao, Ran
AU - Du, Qian
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2023/12/1
Y1 - 2023/12/1
N2 - Due to the limitations of single-source data, joint classification using multisource remote sensing data has received increasing attention. However, existing methods still have shortcomings in feature extraction from single-source data and in feature fusion between multisource data. In this article, a multiscale interactive fusion network (MIFNet) for hyperspectral and synthetic aperture radar (SAR) image classification is proposed. First, a multiscale interactive information extraction (MIIE) block is designed to extract meaningful multiscale information. Compared with traditional multiscale models, it not only obtains richer scale information but also reduces the number of model parameters and the network complexity. Furthermore, a global dependence fusion module (GDFM) is developed to fuse features from multisource data; it performs cross attention between the multisource features from a global perspective and captures long-range dependence. Extensive experiments on three datasets demonstrate the superiority of the proposed method and the necessity of each module for accuracy improvement.
AB - Due to the limitations of single-source data, joint classification using multisource remote sensing data has received increasing attention. However, existing methods still have shortcomings in feature extraction from single-source data and in feature fusion between multisource data. In this article, a multiscale interactive fusion network (MIFNet) for hyperspectral and synthetic aperture radar (SAR) image classification is proposed. First, a multiscale interactive information extraction (MIIE) block is designed to extract meaningful multiscale information. Compared with traditional multiscale models, it not only obtains richer scale information but also reduces the number of model parameters and the network complexity. Furthermore, a global dependence fusion module (GDFM) is developed to fuse features from multisource data; it performs cross attention between the multisource features from a global perspective and captures long-range dependence. Extensive experiments on three datasets demonstrate the superiority of the proposed method and the necessity of each module for accuracy improvement.
KW - Global dependence fusion
KW - multiscale interactive information extraction (MIIE)
KW - multisource remote sensing
UR - http://www.scopus.com/inward/record.url?scp=85132521878&partnerID=8YFLogxK
U2 - 10.1109/TNNLS.2022.3171572
DO - 10.1109/TNNLS.2022.3171572
M3 - Article
C2 - 35544495
AN - SCOPUS:85132521878
SN - 2162-237X
VL - 34
SP - 10823
EP - 10837
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
IS - 12
ER -