An adversarial attack on DNN-based black-box object detectors

Yajie Wang, Yu-an Tan, Wenjiao Zhang, Yuhang Zhao, Xiaohui Kuang*

*Corresponding author for this work

Research output: Contribution to journal › Review article › peer-review

46 Citations (Scopus)

Abstract

Object detection models play an essential role in various IoT devices as one of their core components. Experiments have shown that object detection models are vulnerable to adversarial examples. Several attack methods against object detection models have been proposed, but existing methods can only attack white-box models or a specific type of black-box model. In this paper, we propose a novel black-box attack method called Evaporate Attack, which can successfully attack both regression-based and region-based detection models. To perform an effective attack on different types of object detection models, we design an optimization algorithm that generates adversarial examples using only the position and label information of the model's predictions. Evaporate Attack can hide objects from detection models without any internal information about the model, a scenario that is far more practical for real-world attackers. Our approach achieves an 84% fooling rate on regression-based YOLOv3 and a 48% fooling rate on region-based Faster R-CNN, under the premise that all objects are hidden.
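The abstract does not specify the optimization procedure, but the constraint it states (feedback limited to predicted box positions and labels, no scores or gradients) admits a simple query-based search. Below is a minimal sketch, assuming a hypothetical black-box callable `detector(image)` that returns only boxes and labels; the random-search update, the `eps` budget, and the step size are illustrative assumptions, not the paper's actual Evaporate Attack algorithm.

```python
import numpy as np

def num_detections(detector, image):
    """Query the black box; count objects using only positions/labels."""
    boxes, labels = detector(image)  # assumed interface: positions + labels only
    return len(boxes)

def hide_objects(detector, image, eps=8.0, steps=2000, seed=0):
    """Random-search perturbation that tries to hide all objects.

    eps bounds the L-inf perturbation (pixels in [0, 255]). A candidate
    perturbation is kept only if it lowers the number of objects the
    detector reports -- the sole feedback signal assumed here.
    """
    rng = np.random.default_rng(seed)
    delta = np.zeros_like(image, dtype=np.float32)
    best = num_detections(detector, np.clip(image + delta, 0, 255))
    for _ in range(steps):
        if best == 0:  # all objects hidden; attack succeeded
            break
        # Propose a small random change, projected back into the eps ball.
        cand = np.clip(delta + rng.normal(0.0, 2.0, image.shape), -eps, eps)
        adv = np.clip(image + cand, 0, 255)
        count = num_detections(detector, adv)
        if count < best:  # accept only perturbations that hide more objects
            delta, best = cand, count
    return np.clip(image + delta, 0, 255), best
```

Because the acceptance test uses only the detection count, this loop works unchanged against regression-based and region-based detectors, which mirrors the model-agnostic property the abstract claims.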

Original language: English
Article number: 102634
Journal: Journal of Network and Computer Applications
Volume: 161
DOI
Publication status: Published - 1 Jul 2020
