TY - GEN
T1 - An attention supervision transformer full-resolution residual network for space satellite image segmentation
AU - Wei, Yihang
AU - Fan, Shangchun
AU - Zhou, Jiale
AU - Hou, Zuoxun
AU - Zheng, Dezhi
AU - Wang, Shuai
AU - Qu, Xiaolei
N1 - Publisher Copyright:
© COPYRIGHT SPIE. Downloading of the abstract is permitted for personal use only.
PY - 2024
Y1 - 2024
N2 - The growing number of satellites in orbit has led to a rise in defunct satellites and space debris, posing a significant risk to valuable spacecraft such as operational satellites and space stations. Therefore, the removal of defunct satellites and space debris has become increasingly crucial. This article presents a segmentation method for satellite images captured in the visible light spectrum in space. Firstly, due to the lack of real space satellite images, we used optical simulation and Unsupervised Generative Attentional Networks with Adaptive Layer-Instance Normalization for Image-to-Image Translation (U-GAT-IT) to generate realistic visible-light space satellite images and constructed a dataset. Secondly, we proposed an Attention Supervision Transformer Full-Resolution Residual Network (ASTransFRRN), which integrates a transformer, an attention mechanism, and deep supervision, to segment satellite bodies, solar panels, and the cosmic background. Finally, we evaluated the proposed method on the U-GAT-IT simulated dataset and compared its performance with state-of-the-art methods. The proposed method achieved a segmentation accuracy of 90.77%±7.04% for satellite bodies, 90.61%±16.48% for satellite solar panels, and 97.66%±1.94% for the cosmic background. The overall pixel segmentation accuracy was 97.22%±2.78%, outperforming the compared methods. The proposed ASTransFRRN significantly improved the segmentation accuracy of the main components of space satellites.
AB - The growing number of satellites in orbit has led to a rise in defunct satellites and space debris, posing a significant risk to valuable spacecraft such as operational satellites and space stations. Therefore, the removal of defunct satellites and space debris has become increasingly crucial. This article presents a segmentation method for satellite images captured in the visible light spectrum in space. Firstly, due to the lack of real space satellite images, we used optical simulation and Unsupervised Generative Attentional Networks with Adaptive Layer-Instance Normalization for Image-to-Image Translation (U-GAT-IT) to generate realistic visible-light space satellite images and constructed a dataset. Secondly, we proposed an Attention Supervision Transformer Full-Resolution Residual Network (ASTransFRRN), which integrates a transformer, an attention mechanism, and deep supervision, to segment satellite bodies, solar panels, and the cosmic background. Finally, we evaluated the proposed method on the U-GAT-IT simulated dataset and compared its performance with state-of-the-art methods. The proposed method achieved a segmentation accuracy of 90.77%±7.04% for satellite bodies, 90.61%±16.48% for satellite solar panels, and 97.66%±1.94% for the cosmic background. The overall pixel segmentation accuracy was 97.22%±2.78%, outperforming the compared methods. The proposed ASTransFRRN significantly improved the segmentation accuracy of the main components of space satellites.
KW - deep learning
KW - satellite component segmentation
KW - space target image simulation
UR - http://www.scopus.com/inward/record.url?scp=85188475239&partnerID=8YFLogxK
U2 - 10.1117/12.2692357
DO - 10.1117/12.2692357
M3 - Conference contribution
AN - SCOPUS:85188475239
T3 - Proceedings of SPIE - The International Society for Optical Engineering
BT - MIPPR 2023
A2 - Liu, Jianguo
A2 - Chen, Zhong
A2 - Gao, Changxin
A2 - Xiao, Yang
A2 - Zhong, Sheng
A2 - Hong, Hanyu
A2 - Yue, Xiaofeng
PB - SPIE
T2 - SPIE 12th International Symposium on Multispectral Image Processing and Pattern Recognition, MIPPR 2023
Y2 - 10 November 2023 through 12 November 2023
ER -