TY - GEN
T1 - Adversarial Attacks Against Object Detection in Remote Sensing Images
AU - Huang, Rong
AU - Chen, Li
AU - Zheng, Jun
AU - Zhang, Quanxin
AU - Yu, Xiao
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2024.
PY - 2024
Y1 - 2024
N2 - With the continuous development of artificial intelligence technology and the increasing richness of remote sensing data, deep convolutional neural networks (DNNs) have been widely used in the field of remote sensing images. Object detection in remote sensing images has achieved considerable progress due to DNNs. However, DNNs have shown their vulnerability to adversarial attacks, and object detection models for remote sensing images share this vulnerability. The complexity of remote sensing object detection models makes it difficult to implement adversarial attacks. In this work, we propose an adversarial attack method against remote sensing object detection models based on the L∞ norm, which can blind the detector; that is, the detector misses a large number of objects in the image. Because some remote sensing images are very large, we also design a pre-processing method that segments and pre-processes these huge images, which is combined with the attack method. Our proposed attack method can effectively perform adversarial attacks on remote sensing object detection models.
AB - With the continuous development of artificial intelligence technology and the increasing richness of remote sensing data, deep convolutional neural networks (DNNs) have been widely used in the field of remote sensing images. Object detection in remote sensing images has achieved considerable progress due to DNNs. However, DNNs have shown their vulnerability to adversarial attacks, and object detection models for remote sensing images share this vulnerability. The complexity of remote sensing object detection models makes it difficult to implement adversarial attacks. In this work, we propose an adversarial attack method against remote sensing object detection models based on the L∞ norm, which can blind the detector; that is, the detector misses a large number of objects in the image. Because some remote sensing images are very large, we also design a pre-processing method that segments and pre-processes these huge images, which is combined with the attack method. Our proposed attack method can effectively perform adversarial attacks on remote sensing object detection models.
KW - Adversarial Attack
KW - Object Detection models
KW - Remote Sensing Images
UR - http://www.scopus.com/inward/record.url?scp=85185722769&partnerID=8YFLogxK
U2 - 10.1007/978-981-99-9785-5_25
DO - 10.1007/978-981-99-9785-5_25
M3 - Conference contribution
AN - SCOPUS:85185722769
SN - 9789819997848
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 358
EP - 367
BT - Artificial Intelligence Security and Privacy - 1st International Conference on Artificial Intelligence Security and Privacy, AIS and P 2023, Proceedings
A2 - Vaidya, Jaideep
A2 - Gabbouj, Moncef
A2 - Li, Jin
PB - Springer Science and Business Media Deutschland GmbH
T2 - 1st International Conference on Artificial Intelligence Security and Privacy, AIS and P 2023
Y2 - 3 December 2023 through 5 December 2023
ER -